Modern, lightweight, minimal serverless client for Prisma Postgres®.
Install this package using your package manager and registry of choice:
- npm: `npm install @prisma/ppg`
- pnpm: `pnpm add @prisma/ppg` (or `pnpm add jsr:@prisma/ppg`)
- Yarn: `yarn add @prisma/ppg`
- Bun: `bun add @prisma/ppg`
- Deno: `deno add jsr:@prisma/ppg`
- esm.sh CDN: `https://esm.sh/@prisma/ppg`
Important: This client requires a Prisma Postgres Direct TCP Connection URL. This URL format is specific to Prisma Postgres and has the following structure:
```
postgres://identifier:[email protected]:5432/postgres?sslmode=require
```
You can find this connection string in the API Keys section of your Prisma Postgres dashboard. Other standard PostgreSQL connection strings will not work with this client.
```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

const ppg = prismaPostgres(
  defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!)
);

// SQL template literals with automatic parameterization
const users = await ppg.sql<User>`SELECT * FROM users WHERE id = ${userId}`.collect();
```

This library provides two APIs:

- `prismaPostgres()` - High-level API with SQL template literals, transactions, and batch operations (recommended for most users)
- `client()` - Low-level untyped API with explicit parameter passing and session management
The `prismaPostgres()` function returns a feature-rich client with SQL template literals, transactions, and batch operations.
```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

// Recommended: use defaultClientConfig to include default parsers and serializers
const ppg = prismaPostgres(
  defaultClientConfig("postgres://identifier:[email protected]:5432/postgres?sslmode=require")
);

// Or configure manually (no default parsers/serializers):
const ppgCustom = prismaPostgres({
  connectionString: "postgres://identifier:[email protected]:5432/postgres?sslmode=require",
  parsers: [/* your custom parsers */],
  serializers: [/* your custom serializers */]
});
```

Important: Use `defaultClientConfig()` to automatically include default parsers and serializers for common PostgreSQL types (Date, JSON, BigInt, etc.). Without it, you must configure parsers and serializers manually.
Execute queries using tagged template literals. Values are automatically parameterized to prevent SQL injection.
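Automatic parameterization is possible because a tagged template receives the literal SQL fragments and the interpolated values as separate arguments. A minimal sketch of the idea (a hypothetical illustration, not the library's actual implementation; `toQuery` is an invented name):

```ts
// Hypothetical: turn a tagged template into a parameterized query,
// replacing each interpolated value with a $1, $2, ... placeholder.
function toQuery(strings: TemplateStringsArray, ...values: unknown[]) {
  const text = strings.reduce(
    (sql, part, i) => sql + (i > 0 ? `$${i}` : "") + part,
    ""
  );
  return { text, parameters: values };
}

const userId = 42;
const q = toQuery`SELECT * FROM users WHERE id = ${userId}`;
console.log(q.text);       // "SELECT * FROM users WHERE id = $1"
console.log(q.parameters); // [42]
```

The interpolated value never gets spliced into the SQL text, which is what rules out injection.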
```ts
type User = { id: string; name: string; email: string };

const users = ppg.sql<User>`SELECT * FROM users`;

// Stream results one by one
for await (const user of users) {
  console.log(user.id, user.name, user.email);
}
```

Collect all results into an array:

```ts
const users = await ppg.sql<User>`SELECT * FROM users WHERE email = ${"[email protected]"}`.collect();
console.log(users[0].name);
```

Use `sql.exec` for INSERT, UPDATE, and DELETE operations that return affected row counts:
```ts
const affected = await ppg.sql.exec`DELETE FROM users WHERE id = ${userId}`;
console.log(`Deleted ${affected} user(s)`);
```

Use the `query()` method with explicit positional parameters:

```ts
const users = await ppg.query<User>("SELECT * FROM users WHERE email = $1", "[email protected]").collect();
```

Use the `exec()` method for write operations:

```ts
const affected = await ppg.exec("INSERT INTO users (name, email) VALUES ($1, $2)", "Alice", "[email protected]");
console.log(`Inserted ${affected} row(s)`);
```

Execute multiple queries within an automatic transaction. The transaction commits on success and rolls back on error:
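That contract (BEGIN, then COMMIT on success or ROLLBACK on error) can be sketched generically before looking at the real API. This is an illustration of the semantics only, not the library's implementation; `Executor` and `withTransaction` are invented names:

```ts
// Hypothetical sketch of the automatic-transaction contract.
type Executor = { run: (sql: string) => Promise<void> };

async function withTransaction<T>(db: Executor, fn: () => Promise<T>): Promise<T> {
  await db.run("BEGIN");
  try {
    const result = await fn();
    await db.run("COMMIT"); // success path
    return result;
  } catch (err) {
    await db.run("ROLLBACK"); // error path
    throw err;
  }
}

// Demo with an executor that just records the statements it receives
const log: string[] = [];
const db: Executor = { run: async (sql) => { log.push(sql); } };

await withTransaction(db, async () => { log.push("INSERT ..."); });
console.log(log); // ["BEGIN", "INSERT ...", "COMMIT"]

await withTransaction(db, async () => { throw new Error("boom"); }).catch(() => {});
console.log(log.slice(3)); // ["BEGIN", "ROLLBACK"]
```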
```ts
const result = await ppg.transaction(async (tx) => {
  // BEGIN is executed automatically
  await tx.sql.exec`INSERT INTO users (name) VALUES ('Alice')`;
  const users = await tx.sql<User>`SELECT * FROM users WHERE name = 'Alice'`.collect();
  // COMMIT is executed automatically
  return users[0].name;
});
console.log(result); // "Alice"
```

Rollback on error:

```ts
try {
  await ppg.transaction(async (tx) => {
    await tx.sql.exec`INSERT INTO users (name) VALUES ('Bob')`;
    throw new Error("Something went wrong");
    // ROLLBACK is executed automatically
  });
} catch (error) {
  console.log("Transaction rolled back");
}
```

Use the `query()` and `exec()` methods within transactions:
```ts
await ppg.transaction(async (tx) => {
  const before = await tx.query<User>("SELECT * FROM users WHERE id = $1", userId).collect();
  await tx.exec("UPDATE users SET name = $1 WHERE id = $2", "New Name", userId);
  const after = await tx.query<User>("SELECT * FROM users WHERE id = $1", userId).collect();
});
```

Execute multiple statements in a single round-trip within an automatic transaction:
```ts
const [users, affected, counts] = await ppg.batch<[User[], number, { count: bigint }[]]>(
  { query: "SELECT * FROM users WHERE id < $1", parameters: [5] },
  { exec: "INSERT INTO users (name) VALUES ($1)", parameters: ["Charlie"] },
  { query: "SELECT COUNT(*)::int8 as count FROM users", parameters: [] }
);
console.log(users);           // User[]
console.log(affected);        // number
console.log(counts[0].count); // bigint
```

Or build the batch fluently:

```ts
const [users, affected, counts] = await ppg.batch()
  .query<User>("SELECT * FROM users WHERE id = $1", userId)
  .exec("UPDATE users SET name = $1 WHERE id = $2", "Updated Name", userId)
  .query<{ count: bigint }>("SELECT COUNT(*)::int8 as count FROM users")
  .run();
```

Batch operations are atomic; they roll back automatically on error:
```ts
try {
  await ppg.batch<[number, User[]]>(
    { exec: "INSERT INTO users (name) VALUES ($1)", parameters: ["Dave"] },
    { query: "SELECT * FROM invalid_table", parameters: [] } // Fails
  );
} catch (error) {
  // First insert is rolled back
}
```

When using `defaultClientConfig()`, the client automatically handles common PostgreSQL types:
```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

const ppg = prismaPostgres(defaultClientConfig(process.env.DATABASE_URL!));

// JSON/JSONB - automatic parsing
const rows = await ppg.sql<{ data: { key: string } }>`
  SELECT '{"key": "value"}'::jsonb as data
`.collect();
console.log(rows[0].data.key); // "value" (already parsed)

// BigInt - parsed to JavaScript BigInt
const bigints = await ppg.sql<{ big: bigint }>`
  SELECT 9007199254740991::int8 as big
`.collect();
console.log(typeof bigints[0].big); // "bigint"

// Date/Timestamp - parsed to Date objects
const dates = await ppg.sql<{ created: Date }>`
  SELECT NOW() as created
`.collect();
console.log(dates[0].created instanceof Date); // true

// Date serialization - automatic conversion
const testDate = new Date("2024-01-15T10:30:00Z");
await ppg.sql.exec`INSERT INTO events (timestamp) VALUES (${testDate})`;

// Null handling
const rows2 = await ppg.sql<{ id: number | null; name: string | null }>`
  SELECT NULL::int as id, 'test'::text as name
`.collect();
```

With `defaultClientConfig()`, the following types are automatically parsed:
| PostgreSQL Type | JavaScript Type | OID |
|---|---|---|
| `boolean` | `boolean` | 16 |
| `int2`, `int4` | `number` | 21, 23 |
| `int8` | `bigint` | 20 |
| `float4`, `float8` | `number` | 700, 701 |
| `text`, `varchar` | `string` | 25, 1043 |
| `json`, `jsonb` | `object` | 114, 3802 |
| `date` | `Date` | 1082 |
| `time` | `string` | 1083 |
| `timestamp` | `Date` | 1114 |
| `timestamptz` | `Date` | 1184 |
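Each row in the table corresponds to a parser registered for that OID. As an illustration, a hypothetical `int8` parser in the `ValueParser` shape used by the custom-parser examples below (the type is re-declared locally here so the sketch is self-contained; the real one comes from `@prisma/ppg`):

```ts
// Minimal local re-declaration of the ValueParser shape, for illustration only
type ValueParser<T> = {
  oid: number;
  parse: (value: string | null) => T | null;
};

// Hypothetical parser matching the int8 -> bigint row of the table (OID 20)
const int8Parser: ValueParser<bigint> = {
  oid: 20,
  parse: (value) => (value === null ? null : BigInt(value)),
};

console.log(int8Parser.parse("9007199254740993")); // 9007199254740993n
console.log(int8Parser.parse(null));               // null
```

Using `bigint` for `int8` avoids silent precision loss, since `int8` values can exceed `Number.MAX_SAFE_INTEGER`.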
The `client()` function provides direct control over query execution and session management.
```ts
import { client, defaultClientConfig } from "@prisma/ppg";

// Recommended: use defaultClientConfig to include default parsers and serializers
const cl = client(
  defaultClientConfig("postgres://identifier:[email protected]:5432/postgres?sslmode=require")
);

// Or configure manually:
const clCustom = client({
  connectionString: "postgres://identifier:[email protected]:5432/postgres?sslmode=require",
  parsers: [/* your custom parsers */],
  serializers: [/* your custom serializers */]
});
```

Important: Use `defaultClientConfig()` to get automatic type parsing (Date, JSON, BigInt, etc.) and serialization (Date, BigInt, Number).
Run a query and collect its rows:

```ts
const result = await cl.query("SELECT * FROM users WHERE id = $1", userId);
console.log(result.columns);              // Column metadata
const rows = await result.rows.collect(); // Collect all rows
console.log(rows[0].values);              // Access row values as an array
```

Run a write with `exec()`:

```ts
const affected = await cl.exec("INSERT INTO users (name, email) VALUES ($1, $2)", "Alice", "[email protected]");
console.log(`Affected ${affected} rows`);
```

Stream rows one at a time:

```ts
const result = await cl.query("SELECT * FROM users");
for await (const row of result.rows) {
  console.log(row.values); // Process one row at a time
}
```

Sessions provide stateful connections for transactions and sequential operations.
```ts
const session = await cl.newSession();

// Execute queries in the session
const result = await session.query("SELECT 1");
const rows = await result.rows.collect();

// Close the session when done
session.close();
```

Sessions are `Disposable`, so a `using` declaration closes them automatically:

```ts
{
  using session = await cl.newSession();
  await session.query("SELECT 1");
  // Session automatically closes when leaving scope
}
```

Manage a transaction manually within a session:

```ts
const session = await cl.newSession();
await session.query("BEGIN");
try {
  await session.query("INSERT INTO users (name) VALUES ($1)", "Alice");
  const result = await session.query("SELECT * FROM users WHERE name = $1", "Alice");
  await session.query("COMMIT");
} catch (error) {
  await session.query("ROLLBACK");
  throw error;
} finally {
  session.close();
}
```

Sessions can handle concurrent queries via WebSocket multiplexing:
```ts
const session = await cl.newSession();
const [res1, res2, res3] = await Promise.all([
  session.query("SELECT 1"),
  session.query("SELECT 2"),
  session.query("SELECT 3")
]);
const rows1 = await res1.rows.collect();
const rows2 = await res2.rows.collect();
const rows3 = await res3.rows.collect();
session.close();
```

Inspect column metadata on a resultset:

```ts
const result = await cl.query("SELECT id, name FROM users");
result.columns.forEach(col => {
  console.log(col.name); // Column name
  console.log(col.oid);  // PostgreSQL type OID
});
```

Row values are positional arrays:

```ts
const result = await cl.query("SELECT id, name FROM users");
const rows = await result.rows.collect();
rows.forEach(row => {
  const [id, name] = row.values;
  console.log(id, name);
});
```

You can override or add parsers on top of the defaults:
```ts
import { client, defaultClientConfig } from "@prisma/ppg";
import type { ValueParser } from "@prisma/ppg";

const uuidParser: ValueParser<string> = {
  oid: 2950, // UUID type OID
  parse: (value: string | null) => value ? value.toUpperCase() : null
};

const config = defaultClientConfig("postgres://identifier:[email protected]:5432/postgres?sslmode=require");
const cl = client({
  ...config,
  parsers: [...config.parsers, uuidParser], // Add to defaults
});

// Or replace the defaults entirely:
const clCustom = client({
  connectionString: "postgres://identifier:[email protected]:5432/postgres?sslmode=require",
  parsers: [uuidParser], // Only your custom parsers
});
```

Add custom serializers on top of the defaults:
```ts
import { client, defaultClientConfig } from "@prisma/ppg";
import type { ValueSerializer } from "@prisma/ppg";

class Point { constructor(public x: number, public y: number) {} }

const pointSerializer: ValueSerializer<Point> = {
  supports: (value: unknown): value is Point => value instanceof Point,
  serialize: (value: Point) => `(${value.x},${value.y})`
};

const config = defaultClientConfig("postgres://identifier:[email protected]:5432/postgres?sslmode=require");
const cl = client({
  ...config,
  serializers: [pointSerializer, ...config.serializers], // Your serializer first
});

await cl.query("INSERT INTO locations (point) VALUES ($1)", new Point(10, 20));
```

Note: Serializers are checked in order, so put custom serializers before the defaults to give them precedence.
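The first-match semantics behind that note can be sketched as a simple scan over the serializer list. This is an illustration of the ordering behavior only, not the library's code; `serializeValue` and its `String(value)` fallback are invented for the sketch, and the `ValueSerializer` shape is re-declared locally so the example is self-contained:

```ts
// Minimal local re-declaration of the ValueSerializer shape, for illustration
type ValueSerializer<T> = {
  supports: (value: unknown) => value is T;
  serialize: (value: T) => string;
};

// Hypothetical first-match lookup: the first serializer that supports the value wins
function serializeValue(serializers: ValueSerializer<any>[], value: unknown): string {
  for (const s of serializers) {
    if (s.supports(value)) return s.serialize(value);
  }
  return String(value); // fallback, assumed for this sketch
}

const upper: ValueSerializer<string> = {
  supports: (v): v is string => typeof v === "string",
  serialize: (v) => v.toUpperCase(),
};
const identity: ValueSerializer<string> = {
  supports: (v): v is string => typeof v === "string",
  serialize: (v) => v,
};

// Both support strings, so list order decides which one runs
console.log(serializeValue([upper, identity], "abc")); // "ABC"
console.log(serializeValue([identity, upper], "abc")); // "abc"
```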
Send binary data efficiently:
```ts
import { byteArrayParameter, boundedByteStreamParameter, BINARY, TEXT } from "@prisma/ppg";

// From a Uint8Array
const bytes = new Uint8Array([1, 2, 3, 4]);
const param = byteArrayParameter(bytes, BINARY);
await cl.query("INSERT INTO files (data) VALUES ($1)", param);

// From a ReadableStream
const stream = getReadableStream();
const streamParam = boundedByteStreamParameter(stream, BINARY, 1024);
await cl.query("INSERT INTO files (data) VALUES ($1)", streamParam);
```

The library provides a structured error hierarchy:
```ts
import {
  GenericError,      // Base error class
  ValidationError,   // Invalid input or configuration
  HttpResponseError, // HTTP transport errors
  WebSocketError,    // WebSocket transport errors
  DatabaseError      // PostgreSQL errors
} from "@prisma/ppg";

try {
  await ppg.sql`SELECT * FROM invalid_table`.collect();
} catch (error) {
  if (error instanceof DatabaseError) {
    console.log(error.code);    // PostgreSQL error code (e.g., "42P01")
    console.log(error.details); // Additional error details
  }
}
```

`defaultClientConfig(connectionString)` creates a client configuration with default parsers and serializers.
Returns:

- `connectionString: string` - The connection URL
- `parsers: ValueParser<unknown>[]` - Default type parsers (Date, JSON, BigInt, etc.)
- `serializers: ValueSerializer<unknown>[]` - Default type serializers (Date, BigInt, Number)

```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

const ppg = prismaPostgres(defaultClientConfig(process.env.DATABASE_URL!));
```

Configuration:
- `connectionString: string` - PostgreSQL connection URL
- `parsers?: ValueParser<unknown>[]` - Custom type parsers (use `defaultClientConfig` for defaults)
- `serializers?: ValueSerializer<unknown>[]` - Custom type serializers (use `defaultClientConfig` for defaults)

Returns:

- `sql` - SQL template literal tag
- `sql.exec` - SQL template literal for exec operations
- `query<R>(sql, ...params): CollectableIterator<R>` - Execute a query with explicit parameters
- `exec(sql, ...params): Promise<number>` - Execute a command with explicit parameters
- `transaction<T>(callback): Promise<T>` - Execute an interactive transaction
- `batch(...statements)` - Execute batch operations
- `batch()` - Start a fluent batch builder
Configuration:

- `connectionString: string` - PostgreSQL connection URL
- `parsers?: ValueParser<unknown>[]` - Custom type parsers (use `defaultClientConfig` for defaults)
- `serializers?: ValueSerializer<unknown>[]` - Custom type serializers (use `defaultClientConfig` for defaults)

Returns:

- `query(sql, ...params): Promise<Resultset>` - Execute a query
- `exec(sql, ...params): Promise<number>` - Execute a command
- `newSession(config?: SessionConfig): Promise<Session>` - Create a new session
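The `collect()` method on the iterator types below simply drains the async iterator into an array, as described in the examples above. A minimal sketch of that semantics over a plain `AsyncIterable` (an assumed equivalent, not the library's source; `collect` here is a free function rather than a method):

```ts
// Hypothetical free-function equivalent of CollectableIterator.collect()
async function collect<T>(iter: AsyncIterable<T>): Promise<T[]> {
  const out: T[] = [];
  for await (const item of iter) out.push(item); // drain the iterator
  return out;
}

// Demo with an async generator standing in for a row stream
async function* numbers() {
  yield 1;
  yield 2;
  yield 3;
}

const rows = await collect(numbers());
console.log(rows); // [1, 2, 3]
```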
```ts
interface CollectableIterator<T> extends AsyncIterableIterator<T> {
  collect(): Promise<T[]>;
}

interface Resultset {
  columns: Column[];
  rows: CollectableIterator<Row>;
}

interface Row {
  values: unknown[];
}

interface Column {
  name: string;
  oid: number;
}

interface Session extends Statements, Disposable {
  readonly active: boolean;
  close(): void;
}
```

Apache-2.0