Jacaranda Bundle

  • 蓝花楹全家桶
  • Jacaranda is a monorepo for the @kitmi/jacaranda JavaScript application framework and its relevant utility libraries.
  • 70%+ of the documents are generated or modified by GPT-4.

Packages

  • @kitmi/jacaranda

    • A feature-rich JavaScript CLI application and HTTP server application framework with pluggable features. It runs on both node.js and bun.sh, and allows switching freely between koa.js (stable) and hono.js (high performance) as the HTTP engine.
    • See Manual
  • @kitmi/validators

    • A dynamic validation library designed to validate objects using a declarative syntax known as Jacaranda Object Modifiers Syntax (JOMS). It allows for dynamic validation strategies by using various types of modifiers that can be combined to form complex validation rules.
    • See Manual
  • @kitmi/tester

    • A JavaScript unit test tool with in-server code coverage, benchmarking, profiling, test report generation, async dump, etc.
    • See Manual
  • @kitmi/config

    • This library provides a simple and flexible way to manage configuration settings for applications across different environments. It supports both JSON and YAML file formats and automatically selects the appropriate configuration based on the current environment. It also supports config values interpolation.
    • See Manual
  • @kitmi/utils

    • A JavaScript common utilities library that enhances lodash with additional functions for manipulating text, URLs, arrays, objects, and names.
    • See Manual
  • @kitmi/types

    • This is a fundamental library that defines semantic data types with serialization, validation and sanitization. It also contains common error definitions.
    • See Manual
  • @kitmi/sys

    • This is a small collection of utility functions designed to interact with the local system using JavaScript. It is designed to work with both node.js and bun.sh.
    • See Manual
  • @kitmi/algo

    • A lightweight JavaScript library for various algorithms and data structures, focusing on graph and tree operations. It provides efficient implementations of Breadth-First Search (BFS), Depth-First Search (DFS), Topological Sorting, Finite State Machines (FSM), and representations for graphs and trees, among other utilities.
    • See Manual
  • @kitmi/jsonv

    • JSON Validation Syntax library
    • See Manual
  • @kitmi/jsonx

    • JSON Expression Syntax library
    • See Manual
  • @kitmi/adapters

    • This library provides a unified interface for interacting with various components that share similar functionalities but may have different underlying implementations or interfaces. By using this library, developers can switch between different components such as HTTP clients or JavaScript package managers without changing the consuming codebase, making the code more modular and easier to maintain.
    • See Manual
  • @kitmi/data

    • This library is the data access layer of the Jacaranda Framework, designed to provide both ORM and non-ORM approaches for database interactions. It encapsulates SQL operations, connection pool management, and routine CRUD (Create, Read, Update, Delete) operations, ensuring efficient and scalable data handling. By offering a flexible and powerful API, @kitmi/data simplifies the complexities of database management, allowing developers to focus on building robust applications without worrying about underlying data access intricacies.
    • See Manual

Command-lines

  • @kitmi/xeml
    • Jacaranda data entity modeling tool
    • See Manual

License

  • MIT
  • Copyright (c) 2023 KITMI PTY LTD

Jacaranda Types & Validation

In the realm of web development, JavaScript has long been the cornerstone language, driving both client-side and server-side interactions. However, JavaScript's type system is fundamentally based on variable types rather than the semantics of the values they hold. This discrepancy can lead to challenges when developers need to work with data that has a specific business context.

For instance, JavaScript treats a string containing a number ("123") as a string type, even though, semantically, it might represent an integer in a business context. Similarly, a date represented as a string ("2021-01-01") remains a string type, despite its semantic meaning as a date.

This discrepancy between type recognition and business logic necessitates a semantic type system that aligns with the context in which data is used. The @kitmi/types library addresses this need by defining semantic data types that include validation and sanitization, tailored to business logic rather than just the form of data. It allows for the creation of an independent and extensible type system that can be customized to fit the unique requirements of a business application.
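The core idea can be shown with a tiny self-contained sketch (illustrative only, not the actual @kitmi/types API): a semantic integer type accepts the string "123" and yields the number 123, because the business meaning, not the variable type, decides how the value is treated.

```javascript
// Illustration of semantic sanitization; not the real @kitmi/types code.
// A semantic "integer" type accepts values whose *meaning* is an integer.
const integerType = {
    name: 'integer',
    sanitize(value) {
        if (typeof value === 'number' && Number.isInteger(value)) return value;
        const n = Number(value);
        if (Number.isNaN(n) || !Number.isInteger(n)) {
            throw new TypeError(`Invalid integer value: ${value}`);
        }
        return n;
    },
};

console.log(integerType.sanitize('123')); // 123 (a number, not a string)
console.log(integerType.sanitize(42));    // 42
```

The real library generalizes this pattern into an extensible type system with metadata, plugins, and serialization.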

Semantic Type System with Validation and Sanitization

The @kitmi/types library is not just a collection of predefined data types; it also allows the creation of an independent and extensible type system. This flexibility is crucial for business applications where the context and semantics of data are paramount. By defining types semantically, developers can ensure that data conforms to business rules and expectations, such as treating a numeric string as an integer or a date string as a datetime object.

Building on the capabilities of @kitmi/types, the @kitmi/validators library introduces a postProcess hook to perform validations and transformations based on the Js Type Modifiers Syntax (JTMS). This approach enables dynamic and complex validation strategies that are defined using data descriptors rather than code, allowing for the configuration and storage of these rules in a standardized format.

Conventions and Types

The @kitmi/types library defines a variety of types, including any, array, bigint, binary, boolean, datetime, integer, number, object, and text. Each type follows a specific interface that includes name, aliases, and defaultValue properties, along with sanitize and serialize methods.

Type Metadata

The type interface provides a blueprint for how each type should be structured. Common metadata properties such as plain, optional, and default allow for additional customization of how values are processed.

Enumerable types (like bigint, integer, number, and text) have an enum property used for specifying a set of allowed values.

Object type has a schema property (an object, or a functor that returns a schema object) used for specifying the schema that verifies and processes the object value.

Array type has an element property (an object, or a functor that returns a schema object) used for specifying the schema that verifies and processes each element value.

Binary type has an encoding text property.

Datetime type has a format text property.

Note: some more specific properties may not be covered here.
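How the optional and default metadata might interact during sanitization can be sketched as follows (a conceptual illustration, not the actual @kitmi/types implementation):

```javascript
// Conceptual sketch of metadata-driven sanitization; not the real library code.
// `meta` carries the common metadata properties described above.
function sanitizeWithMeta(value, meta, sanitize) {
    if (value == null) {
        if ('default' in meta) return meta.default; // fill in the default
        if (meta.optional) return null;             // allowed to be missing
        throw new TypeError('Missing required value');
    }
    return sanitize(value);
}

console.log(sanitizeWithMeta(undefined, { optional: true, default: 5 }, Number)); // 5
console.log(sanitizeWithMeta('42', {}, Number)); // 42
```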

Plugins

The @kitmi/types library also supports plugins that act as serializers, deserializers, pre-processors, and post-processors.

Dynamic Validation with Declarative Syntax

Building on the foundation of @kitmi/types, the @kitmi/validators library introduces a dynamic validation system using the Js Type Modifiers Syntax (JTMS). This declarative syntax allows developers to specify complex validation rules by combining different types of modifiers.

Modifier Syntax

Modifiers in JTMS can be standalone or require arguments, with the latter being expressed in object or array style.

  • Standalone Modifiers: just the prefixed name, e.g. ~ip, ~email, ~strongPassword
  • Modifiers with Arguments: These can be expressed either as objects or arrays:
    • Object Style:
      • name: Modifier name with prefix (e.g., ~mobile)
      • options: Arguments for the modifier (e.g., { locale: 'en-US' })
    • Two-tuple Array Style:
      • Index 0: Modifier name with prefix
      • Index 1: Modifier options argument

Types of Modifiers

There are three types of modifiers with different prefix symbols:

  • Validator (~): Validates the value.
  • Processor (>): Transforms the value.
  • Activator (=): Provides a value if the current value is null.
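For illustration only, here is how the three prefixes could appear together in a field's post chain. The >trim processor and =uuid activator names below are hypothetical, invented for this sketch; only the validator form (e.g. ~minLength) is confirmed by the sample in this document:

```json
{
    "type": "text",
    "post": [["=uuid"], [">trim"], ["~minLength", 8]]
}
```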

Sample

An optional config object for koa is described with JTMS as below:

{
    "type": "object",
    "schema": {
        "trustProxy": { "type": "boolean", "optional": true },
        "subdomainOffset": { "type": "integer", "optional": true, "post": [["~min", 2]] },
        "port": { "type": "integer", "optional": true, "default": 2331 },
        "keys": [
            {
                "type": "text"
            },
            {
                "type": "array",
                "optional": true,
                "element": { "type": "text" },
                "post": [["~minLength", 1]]
            }
        ]
    },
    "optional": true
}

Note: the keys property above can be either a text value or an array of text with at least one element.

Why Not Use Code-Based Validation Libraries?

While libraries like Joi or Yup provide powerful code-based solutions for data validation, @kitmi/validators takes a different approach by using data to describe data formats. This methodology shifts the focus from writing validation code to defining data formats and rules as configurations. As a result, data format definitions, validations, and even processing rules become standardized and can be managed as configurations, enhancing reusability and maintainability.

Validators and Processors Extension

Furthermore, @kitmi/validators incorporates the @kitmi/jsonv and @kitmi/jsonx libraries, which introduce a series of validation and transformation operators inspired by MongoDB query operators. The @kitmi/jsonv library is utilized through the ~jsv validator, and @kitmi/jsonx is applied via the |jsx processor within JTMS. This integration equips @kitmi/validators with a robust set of validators and processors capable of not just validating data but also transforming it, thus creating a comprehensive data processing pipeline.

Use JTMS to Describe JTMS

type: object,
schema:
    type:
        type: text
        enum:
            - "any"
            - "array"
            - "bigint"
            - "binary"
            - "boolean"
            - "datetime"
            - "integer"
            - "number"
            - "object"
    plain:
        type: boolean
        optional: true
    optional:
        type: boolean
        optional: true
    default:
        type: any
        optional: true
    enum:
        onlyWhen: 
            $$PARENT.type: 
                $in:
                    - "bigint"
                    - "integer" 
                    - "number"
                    - "text"
        type: array
        element: 
            type: '$$'

Reflection-Based Features in @kitmi/jacaranda

The @kitmi/jacaranda framework is an advanced JavaScript application framework designed to facilitate the development of both command-line interface (CLI) and HTTP server applications. It is compatible with node.js and bun.sh, and offers the flexibility to choose between koa.js and hono.js for the HTTP engine, catering to the needs for stability and high performance respectively.

Core Concepts

At the heart of @kitmi/jacaranda lies a reflection-based feature system coupled with a dependency-injection pattern. This design philosophy ensures modularity and a clear separation of concerns, which are essential for building scalable and maintainable applications.

Reflection-Based Feature System

The framework treats each top-level node in the configuration file as a distinct feature. This approach allows developers to modularize their application by encapsulating specific functionalities within self-contained features. Each feature is responsible for a particular aspect of the application, such as configuration, logging, or internationalization.

Features in @kitmi/jacaranda are loaded in a specific order, following the stages of Config, Initial, Services, Plugins, and Final. This ordered loading is further refined by dependency relations declared among features, ensuring that dependencies are resolved before a feature is initialized. Topological sorting is employed to manage the loading sequence of features that share the same stage.
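The stage-plus-dependency ordering can be sketched with a plain topological sort over each feature's depends list (an illustration of the idea, not the framework's actual loader):

```javascript
// Illustrative topological sort of features within one stage,
// honoring each feature's `depends` list. Not the framework's code.
function sortFeatures(features) {
    const sorted = [];
    const visited = new Set();
    const visiting = new Set();

    const visit = (name) => {
        if (visited.has(name)) return;
        if (visiting.has(name)) throw new Error(`Cyclic dependency at ${name}`);
        visiting.add(name);
        for (const dep of features[name].depends ?? []) {
            if (dep in features) visit(dep); // resolve dependencies first
        }
        visiting.delete(name);
        visited.add(name);
        sorted.push(name);
    };

    Object.keys(features).forEach(visit);
    return sorted;
}

const order = sortFeatures({
    db: { depends: ['config'] },
    config: { depends: [] },
    logger: { depends: ['config'] },
});
console.log(order); // [ 'config', 'db', 'logger' ]
```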

Moreover, the framework supports both built-in features and custom features placed under the application's features directory. This directory is configurable, allowing developers to structure their application as they see fit. Features can also declare their required npm packages, and the framework provides a script to install these dependencies using the developer's preferred package manager.

Dependency Injection Pattern

@kitmi/jacaranda embraces dependency injection as a core pattern for managing feature dependencies. Each feature acts as an injected dependency, which can be consumed by other parts of the application. This pattern promotes loose coupling and high cohesion, making the application easier to test and maintain.

A feature can either register a service or extend the app prototype. Registering a service is the recommended approach as it aligns with the principles of dependency injection and service-oriented architecture. By registering services, features expose their functionalities to the rest of the application in a decoupled manner.

The service registry is a critical component of the dependency injection system. It maintains a registry of all available services, allowing features to declare their dependencies explicitly. When a feature requires a service, it retrieves the service instance from the registry, rather than creating a direct dependency. This approach simplifies the management of feature interactions and dependencies.
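The registry lookup pattern can be sketched as below (illustrative only, not the framework's actual implementation): features register service instances by name, and consumers look them up instead of constructing their dependencies directly.

```javascript
// Illustrative service registry; not the framework's actual implementation.
class ServiceRegistry {
    constructor() {
        this._services = new Map();
    }
    register(name, instance) {
        this._services.set(name, instance);
    }
    get(name) {
        if (!this._services.has(name)) {
            throw new Error(`Service not registered: ${name}`);
        }
        return this._services.get(name);
    }
}

const registry = new ServiceRegistry();
registry.register('logger', { info: (msg) => `[info] ${msg}` });
console.log(registry.get('logger').info('ready')); // "[info] ready"
```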

Feature Develop Guideline

export default {
    stage: 'Services', // 5 stages: Config, Initial, Services, Plugins, Final

    groupable: true, // optional, true or false; whether the feature can be grouped by serviceGroup, which usually means multiple instances of this service are allowed

    packages: [], // required packages to be installed before starting the server

    depends: [], // other features as dependencies

    load_: async function (app, options, name) {} // feature loader, usually register the service instance under the given name (when grouped will be suffixed with instance id)
};
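A hypothetical feature following this template might look like the sketch below. The app.registerService method name is an assumption made for illustration; consult the framework's service registration API for the real call.

```javascript
// Hypothetical "greeter" feature following the template above.
// `app.registerService` is an assumed API name, used only for illustration.
const greeterFeature = {
    stage: 'Services',
    groupable: false,
    packages: [],
    depends: [],
    load_: async function (app, options, name) {
        const service = {
            greet: (who) => `${options.greeting ?? 'Hello'}, ${who}!`,
        };
        app.registerService(name, service);
    },
};

// Minimal mock app to show how a loader is invoked:
const services = {};
const mockApp = { registerService: (n, s) => { services[n] = s; } };

greeterFeature.load_(mockApp, { greeting: 'Hi' }, 'greeter').then(() => {
    console.log(services.greeter.greet('Jacaranda')); // "Hi, Jacaranda!"
});
```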

Registry system

Global runtime registry

import { runtime } from '@kitmi/jacaranda';
import koaPassport from 'koa-passport';
import pg from 'pg';

runtime.loadModule('pg', pg);
runtime.loadModule('koa-passport', koaPassport);

Module-specific registry

In the export default entry of each app module:

export default {
    ...,
    registry: {
        models,
        features: {
            ...features,
        },
        middlewares: appMiddlewares,
        controllers: {
            actions,
            resources,
        },
    },
};

@kitmi/jacaranda Server Routing

Server & App

A server hosts app modules and library modules.

Request Handling Flow

Routing Flow

App Modules and Library Modules

  • Library Module

    • Library modules usually provide common business logic or common data access models for other app modules.
    • Library modules do not provide an external access interface.
  • App Module

    • App modules usually have a routing config to provide an external access interface via a certain kind of router.

Hereafter, app refers to an app module.

Interoperability

Access lib from app

app.host.getLib('<lib name>')

Access app from another app

  • by route
// suppose the target app is mounted at /cms
app.host.getAppByRoute('/cms')

@kitmi/jacaranda Module Loading Helper

Overview

In the Jacaranda Framework, managing and loading modules can be complex, especially when dealing with different package management tools and the modular structure of projects. Modules might reside outside the project's working directory, creating challenges in locating and loading them efficiently. The Module Loader Helper in Jacaranda addresses these challenges by providing a unified way to load modules from various sources.

Module Loader Helper

Usage

The loadModuleFrom_ function is the core utility provided by the Jacaranda Framework for loading modules. It supports various sources, ensuring that modules can be loaded from different locations as needed.

import { loadModuleFrom_ } from '@kitmi/jacaranda';

const moduleToLoad = await loadModuleFrom_(app, source, moduleName, payloadPath);

// source can be 'runtime', 'registry', 'direct', 'project'

Parameters

  • app: The current application instance.
  • source: The source from which the module should be loaded. It can be one of 'runtime', 'registry', 'direct', or 'project'.
  • moduleName: The name of the module to load.
  • payloadPath: The path to the module's payload, if applicable.

From Different Sources

runtime

Load module from the Global Runtime Registry.

Server calls runtime.loadModule during bootstrapping to inject the module instance into the global runtime registry.

runtime.loadModule('<module-full-path>', module);

registry

Load module from the App Module Specific Registry.

The app module itself preloads module instances into the app's own registry in the module entry file.

export default {
    ...,

    registry: {
        ...,        
    },
};

direct

Load module directly by calling require with esmCheck.

project

Load module from the project's working path.

Jacaranda Data Access Models

Overview

This document outlines the design of the Jacaranda Data Access Models for a Node.js backend system, focusing on the interaction between connectors, entity models, database models, and business logic classes. The architecture ensures a clean, extensible, and maintainable codebase, leveraging modern JavaScript features like async/await and Proxies.

[Architecture diagram]

Base Components

  1. Connector: Manages database connections and connection pools.
  2. DbModel: Base class for database interactions, managing connections and transactions.
  3. EntityModel: Base class for ORM entities, encapsulates data through a proxy.
  4. BusinessLogic: Encapsulates the business logic of the system, interacting with one or more EntityModels and managing transactions.

Specific database features can be implemented in subclasses inheriting from the above base components.

E.g.

  • PostgresConnector: Leverages pg package to manage connections and connection pools to PostgreSQL database.
  • PostgresEntityModel: Supports special query syntax of PostgreSQL, e.g., ANY, ALL.
  • PostgresDbModel: Supports multiple schemas inside a database.

Connector

The Connector class is responsible for managing connections to the database.

Interface

class Connector {
  async connect_(); // Get a connection from the pool or create a new one, depending on the driver
  async disconnect_(); // Release the connection back to the pool or close it
  async end_(); // Close all connections in the pool
  async ping_(); // Ping the database
  async execute_(); // Execute a query
  async beginTransaction_(); // Begin a transaction
  async commit_(); // Commit a transaction
  async rollback_(); // Rollback a transaction

  async create_(model, data, options, connection) -> ({ data, affectedRows })
  async find_(model, options, connection) -> ({ data })
  async update_(model, data, options, connection) -> ({ data, affectedRows })
  async delete_(model, options, connection) -> ({ data, affectedRows })
  async upsert_(model, data, uniqueKeys, dataOnInsert, options, connection) -> ({ data, affectedRows })
}

EntityModel

The EntityModel class serves as a base class for data entities, with a static meta property providing metadata. An EntityModel instance itself does not hold any data, since JavaScript already handles data as plain objects and an ActiveRecord-like class is unnecessary.

Interface

class EntityModel {
  get meta(); // Entity metadata, { name, keyField, associations, features }
  async findOne_(criteria); // Implement find one logic
  async findMany_(criteria); // Implement find many logic
  async findAll_(); // Implement find all logic
  async createOne_(data); // Implement create one logic
  async createMany_(dataArray); // Implement create many logic
  async updateOne_(criteria, data); // Implement update one logic
  async deleteOne_(criteria); // Implement delete one logic
}

DbModel

The DbModel class manages the lifecycle of a connection created from the connector, and all EntityModel instances are created from a DbModel. DbModel uses a Proxy to delegate PascalCase getters to the entity(getterName) method.

Interface

class DbModel {
  get meta(); // Database metadata, { schemaName, Entities }
  entity(name); // Get an entity instance
  async transaction_(async function(anotherDbInstance));
}
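The PascalCase-getter delegation described above can be sketched with a plain Proxy (illustrative; not the framework's actual code):

```javascript
// Illustration of delegating PascalCase property access to entity(name),
// as described for DbModel. Not the framework's actual implementation.
class Db {
    entity(name) {
        return { name }; // stand-in for a real entity model instance
    }
}

function withEntityProxy(db) {
    return new Proxy(db, {
        get(target, prop, receiver) {
            // Delegate PascalCase getters (e.g. db.User) to entity('User')
            if (typeof prop === 'string' && /^[A-Z]/.test(prop) && !(prop in target)) {
                return target.entity(prop);
            }
            return Reflect.get(target, prop, receiver);
        },
    });
}

const db = withEntityProxy(new Db());
console.log(db.User.name); // "User"
```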

BusinessLogic

The BusinessLogic class encapsulates the business logic of the system. It interacts with one or more EntityModel instances and manages transactions to complete a business operation.

Interface

class BusinessLogic {
  constructor(db) {    
    this.db = db;
  }

  async validateUserPassword(username, password) {
    // Example business operation logic
    await this.db.transaction_(async (db) => { // !! the passed-in db instance is different from this.db
      const user = await db.entity('User').findOne_({ username });
      // verify user password with hashed password
    });
  }

  // Other business logic methods
}

Usage

  • A default DbModel instance db can be retrieved from the Jacaranda App instance.
const db = app.db('db name');
const businessLogic = new BusinessLogic(db);
  • For a normal query
const User = db.entity('User');
const user = await User.findOne_({ id: 1837 });

Summary

This architecture provides a robust and flexible foundation for database access and management, supporting multiple database types and schemas, and enabling seamless integration of data operations with transaction management. The introduction of the BusinessLogic layer ensures that business operations are encapsulated, maintainable, and scalable. The use of async interfaces and proxies ensures modern, efficient, and maintainable code.

@kitmi/data

Jacaranda Framework Data Access Layer

Built-in Jacaranda Features

The features below come with @kitmi/data and can be configured in both the @kitmi/jacaranda server config file and app module config files.

  • dataSource - Defines the connector of a database

Note: For postgres, different PostgreSQL schemas are considered different data sources in this framework.

  • dataModel - To be used by @kitmi/xeml for code generation

  • db - Map of schema name to data source defined by the dataSource feature

Note: A schema in this framework is a collection of entities and can be defined as a subset of a database. Different schemas can use the same dataSource connector.

Sample config:

dataSource:
    postgres:
        testConnector:
            connection: 'postgres://postgres:postgres@localhost:5432/jc-testdb'
            adminCredential:
                username: 'postgres'
                password: 'postgres'
            logStatement: true
            logConnection: false
            logTransaction: true

dataModel:
    schemaPath: './xeml'
    schemaSet:
        test:
            dataSource: postgres.testConnector
        test2:
            dataSource: postgres.testConnector
    dependencies:
        commons: '@xgent/xem-commons'

db:
    test:
        dataSource: postgres.testConnector        
    test2:
        dataSource: postgres.testConnector

Basic Usage

Suppose all the entity models are generated by @kitmi/xeml without errors and the database has been migrated to the latest state.

The db feature will inject a db function into the app object.

The db function signature is as below.

app.db([schema-name]) -> DbModel; 

The schema-name is optional and defaults to the value of app.settings.defaultDb. The value of app.settings.defaultDb can be configured in the app's config as below:

settings:
    defaultDb: test

The db feature can also reference a data source defined in a library module.

db:
    test:
        fromLib: '<name of the library that defines the data source>'
        dataSource: postgres.testConnector     

When the db feature is enabled in an app, database operations can be performed as below:

    const db = app.db('db name'); // or app.db() for default db configured in app's settings
    // ctx.db is a shorthand of ctx.module.db, ctx.module is the app instance serving the request

    // or db.entity('user') or db.User
    const User = db.entity('User'); 
    
    // or await User.findOne_({ id: 1011 }); 
    // or await User.findOne_(1011) if 1011 is the value of User's primary key; 
    const user = await User.findOne_({ $where: { id: 1011 } });      

    const user = await User.updateOne_({ name: 'New name' }, { $where: { id: 1011 } });
    // ...

CRUD Operations

  • Retrieval

    • async findOne_(findOptions)
    • async findMany_(findOptions)
    • async findManyByPage_(findOptions, page, rowsPerPage)
  • Creation

    • async create_(data, createOptions)
    • async createFrom_(findOptions, columnMapping)
  • Update

    • async updateOne_(data, updateOptions)
    • async updateMany_(data, updateOptions)

Note: regarding the data argument, please refer to Data to Update

  • Deletion

    • async deleteOne_(deleteOptions)
    • async deleteMany_(deleteOptions)
    • async deleteAll_()
  • Not implemented yet

    • async createMany_(data /* array of object */, createOptions)
    • async createMany_(fieldNames, data /* array of array */, createOptions)
    • async aggregate_(...)
    • async cached_(...)

Hooks

  • async beforeCreate_(context) -> boolean
  • async beforeUpdate_(context) -> boolean
  • async beforeUpdateMany_(context) -> boolean
  • async beforeDelete_(context) -> boolean
  • async beforeDeleteMany_(context) -> boolean
  • async afterCreate_(context) -> boolean
  • async afterUpdate_(context) -> boolean
  • async afterUpdateMany_(context) -> boolean
  • async afterDelete_(context) -> boolean
  • async afterDeleteMany_(context) -> boolean
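The boolean return contract of these hooks can be sketched as follows. The context.latest property is an assumption made for this illustration; the real context shape is defined by the framework:

```javascript
// Hypothetical hook sketch: returning false cancels the operation.
// The real `context` shape is framework-defined; here it is assumed
// to expose the incoming data as `context.latest`.
class UserEntity {
    async beforeCreate_(context) {
        if (!context.latest.email) {
            return false; // veto creation without an email
        }
        return true;
    }
}

const user = new UserEntity();
user.beforeCreate_({ latest: { email: 'a@b.c' } }).then((ok) => {
    console.log(ok); // true
});
```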

Operation Options

Common Options

$ctx

The koa-like ctx object passed in, used to interpolate values into the query condition; it will also be passed on to associated operations.

  • session
  • request
  • header
  • state

$skipUniqueCheck

To skip the unique check on the $where object when performing an xxxOne_ operation.

$key

The key field of the entity.

findOptions

$select

  • Select by dot-separated field name (syntax: [.])
$select: [
    '*',
    'user.* -password -passwordSalt', // with exclusions
    'offices.bookableResources.type'
]
// SELECT A.*, X.`type` ... JOIN `OfficeBookableResource` as X

Note: The output columns may include some automatically added fields, especially the keys of different layers used for building the hierarchy structure

  • Select by function
$select: [ { $xr: 'Function', name: 'MAX', alias: 'max', args: ['order'] } ]
// SELECT MAX(`order`) as max
  • Using the xrXXX helpers.
import { xrCall, xrCol } from '@kitmi/data';
//...
{
    $select: [ xrCall('COUNT', '*'),  xrCall('SUM', xrCol('field1')) ]
    // SELECT COUNT(*), SUM("field1")
}
Helper Functions
  • xrCall: call a SQL function
{
    $select: [ xrCall('SUM', xrCol('someField')) ] // SUM(someField)
}
  • xrCol: reference to a table column

Difference between using xrCol and string:

postgres: 
    xrCol('abc') -> quote_ident('abc') -> "abc"
    direct string "abc" or 'abc' -> 'abc'
  • xrExpr: binary expression
{
    $select: [ xrExpr(xrCol('someField'), '+', 1) ] // someField + 1
}
  • xrRaw: raw expression

Note: Injected directly into the SQL statement; use it carefully.

xrRaw(<raw-statement>, [params])

In the raw statement, you can put a db.paramToken as the placeholder for each param and pass the params array separately as the second argument.

e.g.

xrRaw(
    `("testJson"->>'duration')::INTEGER > ${Book.db.paramToken} AND ("testJson"->>'duration')::INTEGER < ${Book.db.paramToken}`,
    quizCount
)
  • xrGet: get array element or json field
xrGet(<field>, (<1-based index> | <"." separated key path>)[, <alias>])
  • Others

More usage samples can be found in the Data to Update section below.

$relation

Indicates which associations to include in the query.

Note: No trailing (s).

  • Use an anchor name as the relationship
// use an anchor 
$relation: [ 'profile', 'roles' ];
// can use multi-levels
$relation: [ 'profile.address', 'roles.role' ];
  • Use a custom relation that is not defined in xeml
$relation: [{ alias: 'anyNode', entity: 'tagCategoryTree', joinType: 'CROSS JOIN', on: null }], // 
$relation: [
    ...,
    {
        alias: 'course',
        entity: 'course',
        joinType: 'INNER JOIN',
        on: {
            'course.rootDoc': xrCol('rootDoc.knowledges.knowledge.documents.document.id'),
        },
    },
    'course.branches.branch.subject',
    ...
]

$where

The where clause object.

  • Condition with AND
{
    $where: {
        key1: value1,
        key2: value2
    } 
    // key1 == value1 AND key2 == value2
}

{
    $where: {
        $and: [ { key1: value1 }, { key2: value2 } ]
    } 
    // key1 == value1 AND key2 == value2
}
  • Condition with OR
{
    $where: {
        $or: [ { key1: value1 }, { key2: value2 } ],
        $or_2: [ { key3: value3, key4: value4 }, { key5: value5 } ],
    }
    // (key1 == value1 OR key2 == value2) AND ((key3 == value3 AND key4 == value4) OR (key5 == value5))
}
  • Condition with NOT
{
    $where: {
        $not: {
            key1: value1,
            key2: value2
        }
    }
    // NOT (key1 == value1 AND key2 == value2)
}
  • Condition with Expressions
{
    $where: {
        $expr: xrExpr(xrCol('someField'), '==', xrCol('someField2')),
        $expr_2: xrExpr(xrCol('metadata'), '@>', obj)
    }
    // someField == someField2 AND metadata @> $1
    // $1: obj
}
  • Condition with Raw Statement

It is better to put a db.paramToken as the placeholder for each param and pass the params array separately as the second argument.

const duration = [10, 20];
await Book.findMany_({
    $where: {
        $expr: xrRaw(
            `("testJson"->>'duration')::INTEGER > ${Book.db.paramToken} AND ("testJson"->>'duration')::INTEGER < ${Book.db.paramToken}`,
            duration
        ),
    },
});
Condition Operators
  • $gt: >
  • $lt: <
  • $gte: >=
  • $lte: <=
  • $exist
    • { $exist: true } = IS NOT NULL
    • { $exist: false } = IS NULL
  • $eq: ==
  • $neq: <>
  • $in, $notIn
  • $between, $notBetween
  • $startsWith: LIKE %S
  • $endsWith: LIKE S%
  • $like: LIKE %S%
  • $filter: jsonb_field @> filter
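To make the mapping concrete, here is a tiny self-contained sketch of how a flat $where object with a few of these operators might be rendered into a parameterized SQL fragment (an illustration of the idea, not @kitmi/data's actual query builder):

```javascript
// Tiny illustration of rendering a flat $where object into a parameterized
// SQL fragment; not the real @kitmi/data builder.
const OPS = { $gt: '>', $lt: '<', $gte: '>=', $lte: '<=', $eq: '=', $neq: '<>' };

function renderWhere(where) {
    const clauses = [];
    const params = [];
    for (const [field, cond] of Object.entries(where)) {
        if (cond !== null && typeof cond === 'object') {
            // operator form, e.g. { age: { $gte: 18 } }
            for (const [op, value] of Object.entries(cond)) {
                clauses.push(`"${field}" ${OPS[op]} $${params.push(value)}`);
            }
        } else {
            // shorthand equality, e.g. { status: 'active' }
            clauses.push(`"${field}" = $${params.push(cond)}`);
        }
    }
    return { sql: clauses.join(' AND '), params };
}

const { sql, params } = renderWhere({ status: 'active', age: { $gte: 18 } });
console.log(sql);    // "status" = $1 AND "age" >= $2
console.log(params); // [ 'active', 18 ]
```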

$orderBy

  • Order by ascending
{
    ...,
    $orderBy: 'field1' // ORDER BY field1 
}
{
    ...,
    $orderBy: [ 'field1', 'field2' ] // ORDER BY field1, field2
}
{
    ...,
    $orderBy: { 'field1': true } // ORDER BY field1 
}
{
    ...,
    $orderBy: { 'field1': 1 } // ORDER BY field1 
}
  • Order by descending
{
    ...,
    $orderBy: { 'field1': false } // ORDER BY field1 DESC
}
{
    ...,
    $orderBy: { 'field1': -1 } // ORDER BY field1 DESC 
}
  • Mix
{
    ...,
    $orderBy: { 'field1': -1, 'field2': 1 } // ORDER BY field1 DESC, field2 
}
  • Order by alias

When a query joins tables, all column references are automatically prefixed with the table alias, e.g. { $orderBy: 'field1' } will be converted into ORDER BY A."field1".

If you want to order by an alias that is not a column of any table, the above rule will make the final SQL execution fail. A "::" prefix can be used to suppress the table alias, i.e. { $orderBy: '::field1' } produces ORDER BY "field1".

$groupBy

{
    ...,
    $groupBy: 'field1' // GROUP BY field1
}
{
    ...,
    $groupBy: [ 'field1', 'field2' ] // GROUP BY field1, field2
}

$offset

{
    $offset: 10
}

$limit

{
    $limit: 10
}

$countBy

Returns total record count

$includeDeleted -

To include logically deleted records

$skipOrm -

Internal use, to skip mapping the result into a nested object.

$asArray -

Return result as array, i.e. array mode.

$sqlOnly -

Only return the sql.

createOptions

$ignore -

If the record already exists (unique key conflicts), just ignore the operation.

$upsert -

If the record already exists (unique key conflicts), just update the record.

$getCreated -

Return created records.

updateOptions

$where

See $where in findOptions.

$getUpdated -

Return updated records.

createOptions & updateOptions

$bypassReadOnly -

Internal use, to bypass some read-only fields

$skipModifiers -

Skip auto generated modifiers.

$skipValidators -

Skip the specified validators

$dryRun -

Just do the entity pre-processing and skip the actual db call.

$migration -

For migration only.

deleteOptions

$getDeleted -

Return the deleted records

$deleteAll -

To delete all records with deleteMany_

$physical -

To physically delete a record.

Data to Update

  • Plain Object
const data = {
    field1: 'value1',
    field2: 'value2'
};

await Entity.updateOne_(data, { $where, ... });
  • Special Values
    • xrCol - Set the value to another column
    • xrCall - Set the value to the result of a function call
    • xrExpr - Set the value to the result of an expression
    • xrRaw - Set the value to a raw SQL statement
    • doInc(field, n): shorthand for field + n using xrExpr and xrCol
    • doDec(field, n): shorthand for field - n using xrExpr and xrCol
entity.updateOne_({
    version: doInc('version', 1)  // version = version + 1
}, { $where: { id: xxx } })
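As the bullets above describe, doInc and doDec are shorthands built from xrExpr and xrCol. The sketch below shows one plausible expansion; the descriptor shapes ({ $xr: ... }) are assumptions made for illustration, not the actual @kitmi/data internals.

```javascript
// Hypothetical descriptor builders (shapes are illustrative assumptions)
const xrCol = (name) => ({ $xr: 'Column', name });
const xrExpr = (left, op, right) => ({ $xr: 'BinExpr', left, op, right });

// Plausible expansion of the documented shorthands:
const doInc = (field, n) => xrExpr(xrCol(field), '+', n); // field + n
const doDec = (field, n) => xrExpr(xrCol(field), '-', n); // field - n

// doInc('version', 1) describes an update like: "version" = "version" + 1
```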
  • Special Operators
    • $set: { key: value [, key2: value2, ...] } for jsonb field
    • $setAt: { at, value } for array field
    • $setSlice: { begin, end, value } for update partial array field, value is an array
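
For intuition on the jsonb operators above, here is an illustrative-only sketch of how $setAt could translate to PostgreSQL's jsonb_set for an array-typed jsonb column. `renderSetAt` is a hypothetical helper; the library's actual builder uses parameter binding rather than string interpolation.

```javascript
// Hypothetical sketch: $setAt -> jsonb_set(field, '{at}', value)
// Illustrative only; do NOT build SQL by string interpolation in real code.
function renderSetAt(field, { at, value }) {
    return `"${field}" = jsonb_set("${field}", '{${at}}', '${JSON.stringify(value)}')`;
}

console.log(renderSetAt('tags', { at: 0, value: 'new' }));
// "tags" = jsonb_set("tags", '{0}', '"new"')
```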

Transaction

When executing a transaction, DbModel (i.e. entity.db or app.db()) forks a new instance containing a dedicated connection for that transaction.

The business logic of the whole transaction should be wrapped in an async function, and it must use the newly forked db object (passed as _db in the example below).

  // Transaction
  const ret = await this.db.transaction_(async (_db) => {

      const User = _db.entity('user'); // should use _db to get entity
      ...

      return ret;
  });
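
The fork-and-commit flow can be mocked end to end. The class below is a stand-in written for illustration, not the real DbModel API: transaction_ hands the callback a forked instance bound to a dedicated connection, commits when the callback resolves, and rolls back when it throws.

```javascript
// Mock (NOT @kitmi/data code) illustrating the transaction fork semantics.
class MockDbModel {
    constructor(conn) {
        this.conn = conn;
        this.ops = []; // record of simulated operations
    }

    async transaction_(business_) {
        // Fork a new instance with a dedicated connection
        const _db = new MockDbModel('dedicated-connection');
        _db.ops.push('BEGIN');
        try {
            const ret = await business_(_db);
            _db.ops.push('COMMIT');
            return ret;
        } catch (err) {
            _db.ops.push('ROLLBACK');
            throw err;
        }
    }
}

(async () => {
    const db = new MockDbModel('shared-connection');
    const ret = await db.transaction_(async (_db) => {
        _db.ops.push('INSERT ...'); // business logic must use _db, not db
        return 42;
    });
    console.log(ret); // 42
})();
```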

Create with Associations

  • 1:many as array and reference as object
const { op, data, affectedRows, insertId } = await Product.create_({
    'type': 'good',
    'name': 'Demo Product 2',
    'desc': 'Demo Product Description 2',
    'image': 'https://example.com/demo.jpg',
    'thumbnail': 'https://example.com/demo-thumb.jpg',
    'free': true,
    'openToGuest': true,
    'isPackage': false,
    'hasVariants': true,
    'category': 2,
    ':assets': [
        {
            'tag': 'snapshots',
            ':resource': {
                mediaType: 'image',
                url: 'https://example.com/demo-asset.jpg',
            },
        },
        {
            'tag': 'snapshots',
            ':resource': {
                mediaType: 'image',
                url: 'https://example.com/demo-asset2.jpg',
            },
        },
        {
            'tag': 'poster',
            ':resource': {
                mediaType: 'image',
                url: 'https://example.com/demo-poster.jpg',
            },
        },
    ],
    ':attributes': [
        {
            type: 'dimension',
            value: '10x10x10',
        },
        {
            type: 'origin',
            value: 'China',
        },
    ],
    ':variants': [
        {
            type: 'color',
            value: 'red',
        },
        {
            type: 'color',
            value: 'blue',
        },
        {
            type: 'color',
            value: 'green',
        },
        {
            type: 'size',
            value: 'L',
        },
        {
            type: 'size',
            value: 'M',
        },
        {
            type: 'size',
            value: 'S',
        },
    ],
});

Find with Associations

const result = await Product.findOne_({
    id: insertId,   // all key-value pairs whose keys do not start with $ are merged into $where
    $select: ['*'],
    $relation: ['category', 'assets.resource', 'attributes', 'variants'],
});
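
The convention noted in the comment above (keys not starting with '$' are treated as conditions) can be sketched as a small option splitter. `splitFindOptions` is a hypothetical helper, not the library's API.

```javascript
// Hypothetical sketch: merge plain keys into $where, keep $-options as-is.
function splitFindOptions(options) {
    const $where = { ...(options.$where || {}) };
    const rest = {};
    for (const [key, value] of Object.entries(options)) {
        if (key === '$where') continue;
        if (key.startsWith('$')) rest[key] = value; // query option
        else $where[key] = value; // plain key becomes a condition
    }
    return { ...rest, $where };
}

console.log(JSON.stringify(splitFindOptions({ id: 1, $select: ['*'] })));
// {"$select":["*"],"$where":{"id":1}}
```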

License

  • MIT
  • Copyright (c) 2023 KITMI PTY LTD

@kitmi/data How-To

INSERT INTO ... SELECT ...

await ClosureTable.createFrom_(
    {
        $select: [
            'ancestorId',
            'anyNode.descendantId',
            xrAlias(xrExpr(xrExpr(xrCol('depth'), '+', xrCol('anyNode.depth')), '+', 1), 'depth'),
        ],
        $where: { 'descendantId': parentId, 'anyNode.ancestorId': childId },
        $relation: [{ alias: 'anyNode', entity: 'tagCategoryTree', joinType: 'CROSS JOIN', on: null }],
        $upsert: { depth: xrCall('LEAST', xrCol('depth'), xrCol('EXCLUDED.depth')) },
    },
    {
        'ancestorId': 'ancestorId',
        'anyNode.descendantId': 'descendantId',
        '::depth': 'depth',
    }
);
// INSERT INTO "tagCategoryTree" ("ancestorId","descendantId","depth") SELECT A."ancestorId", anyNode."descendantId", ((A."depth" + anyNode."depth") + $1) AS "depth" FROM "tagCategoryTree" A , "tagCategoryTree" anyNode WHERE A."descendantId" = $2 AND anyNode."ancestorId" = $3 ON CONFLICT ("ancestorId","descendantId") DO UPDATE SET "depth" = LEAST("tagCategoryTree"."depth",EXCLUDED."depth")

WHERE xxx IN (SELECT ...)

await TagCategoryTree.deleteMany_({
    $where: {
        descendantId: {
            $in: xrDataSet(TagCategoryTree.meta.name, {
                $select: ['descendantId'],
                $where: { ancestorId: keyValue },
            }),
        },
    }
});
// DELETE FROM "tagCategoryTree" WHERE "descendantId" IN (SELECT "descendantId" FROM "tagCategoryTree" WHERE "ancestorId" = $1)

Custom JOIN and GROUP BY by skipping the ORM mapping

const ret = await Video.findMany_({
    $select: [xrCall('COUNT', xrCol('rootDoc.taggings.tag.id'))],
    $relation: [
        'rootDoc.knowledges.knowledge.documents.document',
        'rootDoc.taggings.tag',
        {
            alias: 'course',
            entity: 'course',
            joinType: 'INNER JOIN',
            on: {
                'course.rootDoc': xrCol('rootDoc.knowledges.knowledge.documents.document.id'),
            },
        },
        'course.branches.branch.subject',
    ],
    $where: {
        'rootDoc.taggings.tag.id': { $in: [7, 8] },
        'course.branches.branch.subject.id': { $in: [1, 2] },
        'course.branches.branch.id': { $in: [1, 2] },
        'course.id': { $in: [1, 2] },
    },
    $groupBy: ['rootDoc.taggings.tag.id'],
    $skipOrm: true,
});
// SELECT COUNT(H."id") FROM "video" A LEFT JOIN "document" B ON A."rootDoc" = B."id" LEFT JOIN "documentKnowledge" C ON B."id" = C."document" LEFT JOIN "knowledgeChip" D ON C."knowledge" = D."id" LEFT JOIN "documentKnowledge" E ON D."id" = E."knowledge" LEFT JOIN "document" F ON E."document" = F."id" LEFT JOIN "documentTagging" G ON B."id" = G."entity" LEFT JOIN "tag" H ON G."tag" = H."id" INNER JOIN "course" course ON A."rootDoc" = F."id" LEFT JOIN "branchCourse" I ON course."id" = I."course" LEFT JOIN "subjectBranch" J ON I."branch" = J."id" LEFT JOIN "subject" K ON J."subject" = K."id" WHERE H."id" = ANY ($1) AND K."id" = ANY ($2) AND J."id" = ANY ($3) AND course."id" = ANY ($4) AND A."isDeleted" <> $5 GROUP BY H."id"