Node SQL API

Over the last few years, through teaching and an increasing full-stack workload, I have found more of my day-to-day devoted to writing JavaScript. More often than not, if I am working on something outside of native mobile, I am using JavaScript. Recently I had cause to create a CRUD API layer to expose a few different SQL stores. I hoped to keep dependencies to a minimum while still providing a capable set of services. I believe I struck that balance well, and I will review some of it in this post. It’s a pickle!

My idea is to create a dynamic CRUD (Create, Read, Update, Delete) layer that exposes operations based on the SQL table schemas. This produces a stateless set of code in which the requesting app has zero data-model knowledge prior to runtime. I need a script that interacts with the database to fetch table schemas and then generates the corresponding Express routes for each CRUD operation. I use Supabase as the backend provider for a few of my apps, so for this example I will use Postgres. A quick npm install pg express is required, but otherwise it’s all code.
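
To make the idea concrete, here is a sketch of the endpoint surface the generator produces for a single table. The books table and id primary key are hypothetical placeholders, not part of the actual implementation below.

```javascript
// Sketch: the five routes generated per table. The names passed in
// ('books', 'id') are hypothetical examples.
function routePaths(tableName, primaryKey) {
  return {
    create:  `POST /api/${tableName}`,
    readAll: `GET /api/${tableName}`,
    readOne: `GET /api/${tableName}/:${primaryKey}`,
    update:  `PUT /api/${tableName}/:${primaryKey}`,
    remove:  `DELETE /api/${tableName}/:${primaryKey}`,
  };
}

console.log(routePaths('books', 'id'));
```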

Step 1: Set Up Database Connection

First, I set up a connection object to the Postgres database. One of the more interesting JavaScript features I have found is object destructuring. It allows me to pull the Pool constructor straight off the pg module and then pass it its expected values for use on the connection.

const { Pool } = require('pg'); // Destructuring
const express = require('express');
const app = express();
app.use(express.json());

const pool = new Pool({
  user: 'your_username',
  host: 'your_host',
  database: 'your_database',
  password: 'your_password',
  port: 5432, // Default Postgres port
});

Step 2: Function to Get Table Schema

So now that I have a theoretical DB connection, I can create a function to get the schema of a table. This portion is fairly straightforward: I retrieve the schema information for a provided SQL table and map the results to a new key-value-pair object. I grab the primary key because that will be the reference for the subsequent -RUD operations.

async function getTableSchema(tableName) {
  try {
    // Query to get columns and their data types
    const columnsResult = await pool.query(
      `SELECT column_name, data_type FROM information_schema.columns WHERE table_name = $1`,
      [tableName]
    );

    // Query to get the primary key
    const pkResult = await pool.query(`
      SELECT kcu.column_name
      FROM information_schema.table_constraints AS tc
      JOIN information_schema.key_column_usage AS kcu
        ON tc.constraint_name = kcu.constraint_name
        AND tc.table_schema = kcu.table_schema
      WHERE tc.table_name = $1
        AND tc.constraint_type = 'PRIMARY KEY'`,
      [tableName]
    );

    const primaryKey = pkResult.rows.length > 0 ? pkResult.rows[0].column_name : null;

    return {
      columns: columnsResult.rows.map(row => ({
        name: row.column_name,
        type: row.data_type
      })),
      primaryKey: primaryKey
    };
  } catch (err) {
    console.error('Error fetching table schema:', err);
    throw err;
  }
}
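
To show the shape of the return value without a live database, here is the same mapping run against stubbed information_schema rows. The id and title columns are hypothetical.

```javascript
// Stubbed rows standing in for the two information_schema query results.
const columnsRows = [
  { column_name: 'id', data_type: 'integer' },
  { column_name: 'title', data_type: 'text' },
];
const pkRows = [{ column_name: 'id' }];

// The same mapping getTableSchema performs on its query results.
const schema = {
  columns: columnsRows.map(row => ({ name: row.column_name, type: row.data_type })),
  primaryKey: pkRows.length > 0 ? pkRows[0].column_name : null,
};

console.log(JSON.stringify(schema));
// {"columns":[{"name":"id","type":"integer"},{"name":"title","type":"text"}],"primaryKey":"id"}
```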

Step 3: Generate CRUD Routes

This is where most of the heavy lifting is done. The implementation uses the primary key to reference individual records. Each operation constructs its query string and an array of parameter values from the request body or URL. SQL magic happens, and then it’s either a 404, a 500, or a 200 with a JSON response.
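
A quick standalone sketch of the query-building pattern the CREATE route uses, with a hypothetical two-column books table:

```javascript
// Hypothetical column names pulled from a table schema.
const columns = ['title', 'author'];
// One numbered placeholder per column: $1, $2, ...
const placeholders = columns.map((_, index) => `$${index + 1}`);
const query = `INSERT INTO books (${columns.join(', ')}) VALUES (${placeholders.join(', ')}) RETURNING *`;

console.log(query);
// INSERT INTO books (title, author) VALUES ($1, $2) RETURNING *
```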

async function generateCRUDRoutes(app, tableName) {
  const schema = await getTableSchema(tableName);
  const primaryKey = schema.primaryKey;

  // CREATE
  app.post(`/api/${tableName}`, async (req, res) => {
    const columns = schema.columns.map(col => col.name);
    const values = columns.map((_, index) => `$${index + 1}`);
    const query = `INSERT INTO ${tableName} (${columns.join(', ')}) VALUES (${values.join(', ')}) RETURNING *`;

    try {
      const result = await pool.query(query, columns.map(name => req.body[name]));
      res.json(result.rows[0]);
    } catch (err) {
      console.error('Error on CREATE operation:', err);
      res.status(500).send('Error on CREATE operation');
    }
  });

  // READ ALL
  app.get(`/api/${tableName}`, async (req, res) => {
    try {
      const result = await pool.query(`SELECT * FROM ${tableName}`);
      res.json(result.rows);
    } catch (err) {
      console.error('Error on READ operation:', err);
      res.status(500).send('Error on READ operation');
    }
  });

  // READ BY PKEY
  app.get(`/api/${tableName}/:${primaryKey}`, async (req, res) => {
    try {
      const query = `SELECT * FROM ${tableName} WHERE ${primaryKey} = $1`;
      const result = await pool.query(query, [req.params[primaryKey]]);
      if (result.rows.length > 0) {
        res.json(result.rows[0]);
      } else {
        res.status(404).send('Entity not found');
      }
    } catch (err) {
      console.error('Error on READ specific entity operation:', err);
      res.status(500).send('Error on READ specific entity operation');
    }
  });

  // UPDATE
  app.put(`/api/${tableName}/:${primaryKey}`, async (req, res) => {
    // Exclude the primary key from the SET list; $1 is reserved for the WHERE clause,
    // so the column placeholders start at $2.
    const updatable = schema.columns.filter(col => col.name !== primaryKey);
    const updates = updatable.map((col, index) => `${col.name} = $${index + 2}`);
    const query = `UPDATE ${tableName} SET ${updates.join(', ')} WHERE ${primaryKey} = $1 RETURNING *`;

    try {
      const result = await pool.query(query, [req.params[primaryKey], ...updatable.map(col => req.body[col.name])]);
      if (result.rows.length > 0) {
        res.json(result.rows[0]);
      } else {
        res.status(404).send('Entity not found');
      }
    } catch (err) {
      console.error('Error on UPDATE operation:', err);
      res.status(500).send('Error on UPDATE operation');
    }
  });

  // DELETE
  app.delete(`/api/${tableName}/:${primaryKey}`, async (req, res) => {
    try {
      await pool.query(`DELETE FROM ${tableName} WHERE ${primaryKey} = $1`, [req.params[primaryKey]]);
      res.send('Record deleted successfully');
    } catch (err) {
      console.error('Error on DELETE operation:', err);
      res.status(500).send('Error on DELETE operation');
    }
  });
}
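
The placeholder numbering in the UPDATE route is the subtle part: $1 carries the primary key value for the WHERE clause, so the SET list must start at $2 and skip the key column. A standalone sketch with hypothetical columns:

```javascript
const primaryKey = 'id';
// Hypothetical schema columns for a 'books' table.
const columns = [{ name: 'id' }, { name: 'title' }, { name: 'author' }];

// Exclude the primary key from SET; its value is bound to $1 in the WHERE clause.
const updatable = columns.filter(col => col.name !== primaryKey);
const setClause = updatable.map((col, index) => `${col.name} = $${index + 2}`).join(', ');
const query = `UPDATE books SET ${setClause} WHERE ${primaryKey} = $1 RETURNING *`;

console.log(query);
// UPDATE books SET title = $2, author = $3 WHERE id = $1 RETURNING *
```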

Step 4: Use the Functions

Finally, I set up the routes from an array of SQL table names by calling generateCRUDRoutes for each table. The only other element of import is the PORT, as I always prefer a nonstandard port like 8090.

const tableNames = ['YourTable1', 'YourTable2']; // Replace with your table names
tableNames.forEach(tableName => generateCRUDRoutes(app, tableName));

const PORT = 8090;
app.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});

Overall this is a pretty basic, single-file structure. As a proof of concept for a potential implementation in projects, I think it checks a lot of boxes. Future enhancements could abstract the database pool further to allow flexibility across SQL flavors, harden the queries against common injection techniques (table and column names are interpolated directly into the SQL, so they must only ever come from trusted schema metadata), allow filterable query parameters on columns beyond the primary key, and introduce some standard Express middleware. Less hardcoding at a minimum.
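
As one example of that hardening, table names could be validated against a known list before any interpolation happens. This is just a sketch; the names are the same placeholders used above, and in practice the known list would come from information_schema at startup.

```javascript
// Reject any table name that isn't in the trusted list before it can
// ever reach a query string.
function assertKnownTable(tableName, knownTables) {
  if (!knownTables.includes(tableName)) {
    throw new Error(`Unknown table: ${tableName}`);
  }
  return tableName;
}

const knownTables = ['YourTable1', 'YourTable2']; // would be fetched from information_schema
console.log(assertKnownTable('YourTable1', knownTables)); // passes through unchanged
```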