
Execute

Execute operations run your query and retrieve results. Use execute() for simple queries, paginate() for cursor-based pagination, and stream() for processing large datasets.

Run the query and return all results:

const results = await store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.status.eq("active"))
  .select((ctx) => ctx.p)
  .execute();
// results: readonly Person[]

Returns a readonly array of the selected type:

// TypeScript infers the shape from your selection
const results = await store
  .query()
  .from("Person", "p")
  .select((ctx) => ({
    name: ctx.p.name,
    email: ctx.p.email,
  }))
  .execute();
// results: readonly { name: string; email: string | undefined }[]

Get the first result or undefined:

const alice = await store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.email.eq("alice@example.com"))
  .select((ctx) => ctx.p)
  .first();

if (alice) {
  console.log(alice.name);
}

Count matching results without fetching data:

const activeCount = await store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.status.eq("active"))
  .count();
// activeCount: number

Check if any results exist:

const hasActiveUsers = await store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.status.eq("active"))
  .exists();
// hasActiveUsers: boolean

For large datasets, cursor-based pagination is more efficient than limit/offset. It uses keyset pagination which doesn’t degrade as you go deeper.

const firstPage = await store
  .query()
  .from("Person", "p")
  .select((ctx) => ({
    id: ctx.p.id,
    name: ctx.p.name,
  }))
  .orderBy("p", "name", "asc") // ORDER BY required
  .paginate({ first: 20 });

paginate() resolves to a page object with this shape:

{
  data: readonly T[],             // The actual results
  hasNextPage: boolean,           // More results available forward
  hasPrevPage: boolean,           // More results available backward
  nextCursor: string | undefined, // Opaque cursor for the next page
  prevCursor: string | undefined, // Opaque cursor for the previous page
}
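The "doesn't degrade" claim comes from keyset filtering: rather than skipping `offset` rows, the cursor records the last sort key seen, and the next page starts from a filter on that key. A minimal in-memory sketch (illustrative only; the library's real cursors are opaque and encode tie-breakers as well):

```typescript
// Illustrative keyset pagination over an array sorted by name.
// A real backend turns the cursor into a seek predicate
// (e.g. WHERE name > :lastName), so the database jumps straight
// to the next page instead of scanning past `offset` rows.
interface Person {
  id: string;
  name: string;
}

function keysetPage(
  rows: Person[], // assumed sorted by name ascending
  first: number,
  after?: string, // cursor: last name seen on the previous page
): { data: Person[]; nextCursor?: string } {
  const remaining = after === undefined ? rows : rows.filter((r) => r.name > after);
  const data = remaining.slice(0, first);
  const last = data[data.length - 1];
  return {
    data,
    nextCursor: remaining.length > first ? last?.name : undefined,
  };
}
```

Because each page begins with a seek rather than a scan, page 100 costs about the same as page 1.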

Use first and after to paginate forward:

// Get the first page
const page1 = await query.paginate({ first: 20 });

// Get the next page using the cursor
if (page1.hasNextPage && page1.nextCursor) {
  const page2 = await query.paginate({
    first: 20,
    after: page1.nextCursor,
  });
}

Use last and before to paginate backward:

// Get the last page
const lastPage = await query.paginate({ last: 20 });

// Get the previous page
if (lastPage.hasPrevPage && lastPage.prevCursor) {
  const prevPage = await query.paginate({
    last: 20,
    before: lastPage.prevCursor,
  });
}
paginate() accepts the following parameters:

  • first (number): Number of results from the start
  • after (string): Cursor to start after (forward pagination)
  • last (number): Number of results from the end
  • before (string): Cursor to start before (backward pagination)
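Forward pagination composes naturally into a drain-all helper. The `Page` and `paginateAll` names below are illustrative, not part of the library; the helper relies only on the page shape documented above:

```typescript
// Illustrative helper: drain every page of a cursor-paginated source
// into one array by following nextCursor until it runs out.
interface Page<T> {
  data: readonly T[];
  hasNextPage: boolean;
  nextCursor: string | undefined;
}

type PaginateFn<T> = (opts: { first: number; after?: string }) => Promise<Page<T>>;

async function paginateAll<T>(paginate: PaginateFn<T>, pageSize = 20): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  do {
    const page = await paginate(
      cursor === undefined ? { first: pageSize } : { first: pageSize, after: cursor },
    );
    all.push(...page.data);
    cursor = page.hasNextPage ? page.nextCursor : undefined;
  } while (cursor !== undefined);
  return all;
}
```

For genuinely large result sets, prefer streaming over draining pages into memory.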

Pagination works with graph traversals:

const employeesPage = await store
  .query()
  .from("Company", "c")
  .whereNode("c", (c) => c.name.eq("Acme Corp"))
  .traverse("worksAt", "e", { direction: "in" })
  .to("Person", "p")
  .select((ctx) => ({
    id: ctx.p.id,
    name: ctx.p.name,
    role: ctx.e.role,
  }))
  .orderBy("p", "name", "asc")
  .paginate({ first: 50 });

For very large datasets, use streaming to process results without loading everything into memory.

const stream = store
  .query()
  .from("Event", "e")
  .select((ctx) => ctx.e)
  .orderBy("e", "createdAt", "desc") // ORDER BY required
  .stream({ batchSize: 1000 });

// Process results as they arrive
for await (const event of stream) {
  console.log(event.title);
  await processEvent(event);
}
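Conceptually, stream() behaves like an async generator that fetches one batch at a time and yields rows from it, so at most batchSize records sit in memory at once. A simplified sketch (illustrative, not the library's internals; it pages by offset for brevity, where a real implementation would typically seek by keyset):

```typescript
// Illustrative batched streaming: fetch up to `batchSize` rows per call,
// yield them one at a time, and stop when a batch comes back short.
async function* streamBatched<T>(
  fetchBatch: (offset: number, limit: number) => Promise<T[]>,
  batchSize = 1000,
): AsyncGenerator<T> {
  let offset = 0;
  while (true) {
    const batch = await fetchBatch(offset, batchSize);
    for (const row of batch) {
      yield row;
    }
    if (batch.length < batchSize) return; // last (possibly empty) batch
    offset += batch.length;
  }
}
```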

The batchSize option controls how many records are fetched per database query:

// Smaller batches: lower memory usage, more database queries
.stream({ batchSize: 100 })

// Larger batches: higher memory usage, fewer database queries
.stream({ batchSize: 5000 })

// Default is 1000
.stream()

For example, streaming an export of all active users to an external system:

async function exportAllUsers(): Promise<void> {
  const stream = store
    .query()
    .from("User", "u")
    .whereNode("u", (u) => u.status.eq("active"))
    .select((ctx) => ({
      id: ctx.u.id,
      email: ctx.u.email,
      name: ctx.u.name,
    }))
    .orderBy("u", "id", "asc")
    .stream({ batchSize: 500 });

  let count = 0;
  for await (const user of stream) {
    await exportToExternalSystem(user);
    count++;
    if (count % 1000 === 0) {
      console.log(`Exported ${count} users...`);
    }
  }
  console.log(`Export complete: ${count} users`);
}

Prepared queries let you compile a query once and execute it many times with different parameter values. This eliminates recompilation overhead for repeated query shapes.

Use param() to declare a named placeholder inside any predicate position:

import { param } from "@nicia-ai/typegraph";

Call .prepare() on an executable query to pre-compile the AST and SQL. Returns a PreparedQuery<R> that can be executed with different bindings.

const findByName = store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.name.eq(param("name")))
  .select((ctx) => ctx.p)
  .prepare();

// Execute with different bindings, no recompilation
const alices = await findByName.execute({ name: "Alice" });
const bobs = await findByName.execute({ name: "Bob" });

Parameters work anywhere a scalar value is accepted:

const findByAge = store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.age.between(param("minAge"), param("maxAge")))
  .select((ctx) => ctx.p)
  .prepare();

const youngAdults = await findByAge.execute({ minAge: 18, maxAge: 25 });
const seniors = await findByAge.execute({ minAge: 65, maxAge: 120 });

prepared.execute(bindings) validates bindings strictly: all declared parameters must be provided, and unknown binding keys are rejected.
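That strict check can be pictured as a set comparison between declared parameter names and binding keys (an illustrative sketch, not the library's actual code):

```typescript
// Illustrative strict binding validation: every declared parameter
// must be bound, and no extra keys are tolerated.
function validateBindings(
  declared: ReadonlySet<string>,
  bindings: Record<string, unknown>,
): void {
  for (const name of declared) {
    if (!(name in bindings)) {
      throw new Error(`Missing binding: ${name}`);
    }
  }
  for (const key of Object.keys(bindings)) {
    if (!declared.has(key)) {
      throw new Error(`Unknown binding: ${key}`);
    }
  }
}
```

Failing fast on unknown keys catches typos like `{ minAg: 18 }` before the query ever runs.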

param() works with any scalar predicate:

  • eq / neq: p.name.eq(param("name"))
  • gt / gte / lt / lte: p.age.gt(param("minAge"))
  • between: p.age.between(param("lo"), param("hi"))
  • contains: p.name.contains(param("substr"))
  • startsWith / endsWith: p.name.startsWith(param("prefix"))
  • like / ilike: p.email.like(param("pattern"))

When the backend supports executeRaw (both SQLite and PostgreSQL backends do), the pre-compiled SQL text is sent directly to the database driver with substituted parameter values — zero recompilation overhead. When executeRaw is unavailable, the prepared query substitutes parameters into the AST and recompiles.

Get the query AST for inspection:

const builder = store
  .query()
  .from("Person", "p")
  .whereNode("p", (p) => p.status.eq("active"))
  .select((ctx) => ctx.p);

const ast = builder.toAst();
console.log(JSON.stringify(ast, null, 2));

Compile to SQL without executing:

const compiled = builder.compile();
console.log("SQL:", compiled.sql);
console.log("Parameters:", compiled.params);

Useful for:

  • Debugging query behavior
  • Understanding performance characteristics
  • Building custom query executors

Both paginate() and stream() require an orderBy() clause:

// Required for pagination
.orderBy("p", "name", "asc")
.paginate({ first: 20 });
// Required for streaming
.orderBy("e", "createdAt", "desc")
.stream();

For deterministic pagination, include a unique field in your ordering:

.orderBy("p", "name", "asc")
.orderBy("p", "id", "asc") // Ensures stable ordering
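To see why the tie-breaker matters, consider a plain comparator (illustrative, independent of the library): when two rows share the same name, ordering by name alone leaves their relative position unspecified, and a page boundary that falls between them can drop or duplicate a row. Adding a unique id makes the ordering total:

```typescript
// Illustrative comparator: order by name, then by unique id, so the
// overall ordering is total and page boundaries are deterministic.
interface Row {
  id: number;
  name: string;
}

function byNameThenId(a: Row, b: Row): number {
  if (a.name !== b.name) {
    return a.name < b.name ? -1 : 1;
  }
  return a.id - b.id;
}
```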
Putting these pieces together, a cursor-based list endpoint:

async function listUsers(cursor?: string, limit = 20) {
  const query = store
    .query()
    .from("User", "u")
    .whereNode("u", (u) => u.status.eq("active"))
    .select((ctx) => ({
      id: ctx.u.id,
      name: ctx.u.name,
      email: ctx.u.email,
    }))
    .orderBy("u", "createdAt", "desc")
    .orderBy("u", "id", "desc");

  const result = cursor
    ? await query.paginate({ first: limit, after: cursor })
    : await query.paginate({ first: limit });

  return {
    users: result.data,
    nextCursor: result.nextCursor,
    hasMore: result.hasNextPage,
  };
}
Streaming also suits long-running batch jobs, such as fulfilling pending orders:

async function processAllOrders() {
  const stream = store
    .query()
    .from("Order", "o")
    .whereNode("o", (o) => o.status.eq("pending"))
    .select((ctx) => ctx.o)
    .orderBy("o", "createdAt", "asc")
    .stream({ batchSize: 100 });

  for await (const order of stream) {
    try {
      await fulfillOrder(order);
      await store.update("Order", order.id, { status: "fulfilled" });
    } catch (error) {
      console.error(`Failed to process order ${order.id}:`, error);
    }
  }
}
In a React component, paginate() can back an infinite-scroll hook:

import { useState } from "react";

function useInfiniteUsers() {
  const [users, setUsers] = useState<User[]>([]);
  const [cursor, setCursor] = useState<string | undefined>();
  const [hasMore, setHasMore] = useState(true);

  async function loadMore() {
    const result = await store
      .query()
      .from("User", "u")
      .select((ctx) => ctx.u)
      .orderBy("u", "name", "asc")
      .paginate({ first: 20, after: cursor });
    setUsers((prev) => [...prev, ...result.data]);
    setCursor(result.nextCursor);
    setHasMore(result.hasNextPage);
  }

  return { users, loadMore, hasMore };
}
  • Order - Ordering and limiting results
  • Shape - Output transformation
  • Overview - Query categories reference