Dynamic NodeJS Schema Validation Rules Generator
Introduction
Are you tired of writing validation rules for your database tables manually? Introducing nodejs-schema-rules, a powerful CLI tool that automates the generation of basic validation rules for popular libraries such as JOI, ValidatorJS, and @vinejs/vine. These rules are based on your database table schema, providing a convenient starting point that you can refine and enhance to suit your specific needs.
In this blog post, we will explore the installation, configuration, and usage of nodejs-schema-rules. Whether you are working with MySQL, PostgreSQL, or SQLite databases, this tool offers a unified solution, making dynamic schema-based validation accessible and efficient. Stay tuned to discover how nodejs-schema-rules can enhance your development workflow and contribute to the overall reliability of your Node.js applications.
Installation
Install nodejs-schema-rules globally to access the ndVr CLI:
npm install nodejs-schema-rules -g
# or
yarn global add nodejs-schema-rules
After installation, initialize the schema.config.js file:
ndVr init
Modify the generated schema.config.js file according to your database configuration:
require("dotenv").config();
const schemaConfig = {
defaultDatabase: 'sqlite',
databases: {
postgres: {
host: 'localhost',
port: 5432,
user: 'postgres',
password: '123456',
database: 'testing'
},
mysql: {
host: 'localhost',
port: 3306,
user: 'root',
password: '123456',
database: 'schema_builder'
},
sqlite: { database: './schema_builder.db' }
},
skipColumns: [ 'created_at', 'updated_at', 'deleted_at' ],
validationSchemaType: 'joi'
};
module.exports = schemaConfig;
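Since the generated config already loads dotenv, you can keep credentials out of source control by reading them from environment variables instead of hard-coding them. A minimal sketch, assuming hypothetical MYSQL_* entries in your .env file:

mysql: {
  host: process.env.MYSQL_HOST || 'localhost',
  port: Number(process.env.MYSQL_PORT) || 3306,
  user: process.env.MYSQL_USER || 'root',
  password: process.env.MYSQL_PASSWORD || '',
  database: process.env.MYSQL_DATABASE || 'schema_builder'
}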
Usage
Use the ndVr CLI to generate validation rules for your database tables. For example:
ndVr joi -t my_table -db mysql -c column1,column2
This command generates validation rules for the specified database table (my_table) and its columns (column1 and column2). You can choose the database type (mysql, postgres, or sqlite) and the validation library (joi, validatorjs, or vine). The generated rules can be used to enforce data integrity and validate incoming requests in your application.
Options:
- -db, --database: Specify the type of database (e.g., mysql, postgres, sqlite).
- -t, --table: Specify the name of the database table for which rules should be generated.
- -c, --columns: Specify the column names of the table to generate rules for.
- -h, --help: Display help for the command.
Examples:
- Generate rules for a MySQL table named users with columns id and name:
ndVr joi -t users -db mysql -c id,name
- Generate rules for a PostgreSQL table named users with the validation library validatorJs:
ndVr validatorJs -t users -db postgres -c id,name
- Generate rules for a SQLite table named users:
ndVr vine -t users -db sqlite -c id,name
If you have a table structure like this:
CREATE TABLE data_types (
  id INTEGER PRIMARY KEY,
  name TEXT,
  age INTEGER,
  height REAL,
  is_student BOOLEAN,
  birthdate DATE,
  registration_timestamp TIMESTAMP,
  description BLOB,
  created_at TIMESTAMP,
  updated_at TIMESTAMP
);
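To try the commands below against the SQLite database from the config, you can create this table with the sqlite3 shell. A quick sketch, assuming the statement above is saved in a file named data_types.sql (a hypothetical name):

sqlite3 ./schema_builder.db < data_types.sql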
Generate rules for a whole table:
ndVr joi -db sqlite -t data_types
Output:
{
  name: Joi.string().required(),
  age: Joi.number().integer().min(-9223372036854775808).max(9223372036854775807).required(),
  height: Joi.number().required(),
  is_student: Joi.required(),
  birthdate: Joi.date().required(),
  registration_timestamp: Joi.required(),
  description: Joi.required(),
}
Generate rules for specific columns:
ndVr joi -db sqlite -t data_types -c name,age
Output:
{
  name: Joi.string().required(),
  age: Joi.number().integer().min(-9223372036854775808).max(9223372036854775807).required(),
}
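Once generated, the rules drop straight into a Joi object schema. Here is a minimal sketch that validates an incoming payload against the rules generated above (the payload values are illustrative):

const Joi = require("joi");

// Wrap the generated rules for name and age in an object schema
const dataTypesSchema = Joi.object({
  name: Joi.string().required(),
  age: Joi.number().integer().required(),
});

// Validate an incoming payload; error is undefined when the payload passes
const { error, value } = dataTypesSchema.validate({ name: "Ada", age: 36 });

if (error) {
  console.error(error.details.map((detail) => detail.message));
} else {
  console.log("Valid payload:", value);
}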
Always skip columns: Add the columns to skip in the schema.config.js file, under the skipColumns attribute:
skipColumns: (process.env.SKIP_COLUMNS || 'created_at,updated_at,deleted_at').split(',')
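The skipped columns can then be adjusted per environment through a .env entry, for example, to also skip an id column:

SKIP_COLUMNS=created_at,updated_at,deleted_at,id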
By incorporating nodejs-schema-rules into your workflow, you not only enhance the integrity of your data but also contribute to a more efficient and maintainable codebase. The generated rules serve as a solid foundation, allowing developers to focus on refining and enhancing validation logic for their specific use cases.