SapixDB / Docs
Early Access
User Manual · v1.0

Getting Started with SapixDB

No prior database experience required. This manual walks you through everything from installation to writing your first intelligent, self-auditing record.

What you'll need
  • A computer running macOS, Linux, or Windows
  • Docker Desktop installed (free at docker.com)
  • A terminal / command prompt
  • Basic familiarity with JSON (key-value pairs like {"name": "Alice"})

1. What is SapixDB?

Think of SapixDB as a database that never forgets, never lies, and never needs you to restructure it. Every piece of data you store is:

  • Signed — cryptographically stamped so you always know who created it
  • Linked — chained to the previous record, creating an unbreakable history
  • Permanent — nothing is ever deleted or overwritten; old versions stay queryable
  • Schema-free — add new fields any time; no ALTER TABLE, no migration scripts

The analogy SapixDB uses is biology. Your data is stored as nucleotides (individual records) strung together into a strand (a chain of records for one entity). Just like DNA, the chain can be read from any point in history and cannot be tampered with.

Traditional Database          | SapixDB
------------------------------|---------------------------------
Rows that get overwritten     | Nucleotides that accumulate
DELETE removes data forever   | History stays — always queryable
Schema migrations required    | Fields evolve naturally
Manual audit logs             | Built-in cryptographic trail
You own the table             | AI agents can own the data
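The nucleotide-and-strand idea is easy to see in miniature. The sketch below is purely illustrative (it is not SapixDB's internal format): each record's hash covers both its data and the previous record's hash, so editing any historical record breaks every link after it.

```python
import hashlib
import json

def make_nucleotide(data, prev_hash):
    """Build a record whose hash covers its data AND the previous hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return {"data": data, "prev_hash": prev_hash,
            "hash": hashlib.sha3_256(payload.encode()).hexdigest()}

def verify_strand(strand):
    """Recompute every hash and check every link; True only if untampered."""
    prev = None
    for nuc in strand:
        expected = make_nucleotide(nuc["data"], prev)["hash"]
        if nuc["prev_hash"] != prev or nuc["hash"] != expected:
            return False
        prev = nuc["hash"]
    return True

strand = [make_nucleotide({"name": "Alice", "role": "admin"}, None)]
strand.append(make_nucleotide({"name": "Alice", "role": "superadmin"},
                              strand[-1]["hash"]))

assert verify_strand(strand)
strand[0]["data"]["role"] = "superadmin"   # tamper with history...
assert not verify_strand(strand)           # ...and the chain breaks
```

This is why the manual can say the chain "cannot be tampered with": tampering is not prevented, it is made immediately detectable.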

2. Installation

SapixDB runs as a Docker container. You do not need to install anything special — Docker handles all the dependencies.

Step 1: Install Docker Desktop
Download and install Docker Desktop from docker.com/products/docker-desktop. After installation, open Docker Desktop and make sure it says "Engine running".
Step 2: Create a project folder
Open your terminal and create a folder for SapixDB:
terminal
mkdir my-sapixdb && cd my-sapixdb
Step 3: Create the docker-compose.yml file
Inside that folder, create a file named docker-compose.yml and paste this content:
docker-compose.yml
services:
  sapixdb:
    image: sapixdb/agent:latest
    container_name: sapixdb
    restart: unless-stopped
    ports:
      - "7475:7475"
    volumes:
      - sapixdb_strand:/data/strand
      - sapixdb_graph:/data/graph
      - sapixdb_blobs:/data/blobs
    environment:
      SAPIX_AGENT_ID: my-first-agent
      SAPIX_STRAND_DIR: /data/strand
      SAPIX_GRAPH_DIR: /data/graph
      SAPIX_BLOB_DIR: /data/blobs
      SAPIX_PORT: 7475

volumes:
  sapixdb_strand:
  sapixdb_graph:
  sapixdb_blobs:
Step 4: Start SapixDB
terminal
docker compose up -d
Docker will download the SapixDB image (first time only — takes 1–2 minutes) and start the database in the background.
Step 5: Verify it's running
terminal
curl http://localhost:7475/v1/health
You should see:
response
{"status":"ok","agent":"my-first-agent"}
If you see that, SapixDB is live and ready.
No curl? Use your browser: open http://localhost:7475/v1/health directly and you'll see the same JSON response.

3. First Steps — Understanding the URL structure

Every request you make to SapixDB follows this pattern:

http://localhost:7475/v1/{your-agent-id}/{subsystem}/{...action}
  • 7475 — the port SapixDB listens on
  • /v1/ — API version prefix
  • my-first-agent — the name you gave in SAPIX_AGENT_ID
  • strand / graph / ingest — the subsystem you're addressing; the collection name (like a table name, e.g. users or orders) goes in the request body

There is no schema to define first. You just start writing data and SapixDB figures out the structure from your records.
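As a sanity check, the URL pattern above can be captured in a tiny helper. This is just string assembly (the agent id and path segments are the examples from this manual, not anything special):

```python
BASE = "http://localhost:7475/v1"

def sapix_url(agent_id, *parts):
    """Join the base URL, the agent id, and any path segments."""
    return "/".join([BASE, agent_id, *parts])

print(sapix_url("my-first-agent", "strand", "write"))
# http://localhost:7475/v1/my-first-agent/strand/write
```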

4. Writing Data

Use a POST request to write a record. Let's store a user:

Write a single record

terminal — write a user
curl -X POST http://localhost:7475/v1/my-first-agent/strand/write \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "users",
    "data": {
      "name": "Alice",
      "email": "[email protected]",
      "role": "admin"
    }
  }'
response
{
  "id": "nuc_abc123",
  "hash": "sha3:e7f2a1...",
  "prev_hash": null,
  "timestamp": "2026-05-12T10:00:00Z",
  "collection": "users"
}

SapixDB returns an id (the unique nucleotide ID) and a hash (the cryptographic fingerprint of the record). Save the id — you'll use it to read this record back.

Write another record — SapixDB links them automatically

terminal — update Alice's role
curl -X POST http://localhost:7475/v1/my-first-agent/strand/write \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "users",
    "data": {
      "name": "Alice",
      "email": "[email protected]",
      "role": "superadmin"
    }
  }'

SapixDB does not overwrite the previous record. It appends a new nucleotide that links back to the old one. Alice now has a history: first she was admin, now she's superadmin. Both versions are preserved forever.

No DELETE in SapixDB: SapixDB is an append-only database. You cannot delete records. Instead, write a new record with a "deleted": true field — your application checks that field, but the full history is always preserved for audit purposes.
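The soft-delete pattern is something your application implements on top of query results. A minimal sketch (the record shapes here mirror the manual's examples; this is application code, not part of SapixDB):

```python
def visible(history):
    """Given one entity's history (newest last), hide it if its latest
    version carries "deleted": true."""
    latest = history[-1]
    return [] if latest["data"].get("deleted") else [latest]

history = [
    {"data": {"name": "Alice", "role": "admin"}},
    {"data": {"name": "Alice", "deleted": True}},
]
assert visible(history) == []   # Alice reads as deleted...
assert len(history) == 2        # ...but her full history is still intact
```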

5. Reading Data

Get the latest version of a record

terminal — read by collection
curl "http://localhost:7475/v1/my-first-agent/strand/query" \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "users",
    "filter": { "name": "Alice" },
    "latest": true
  }'
response
{
  "results": [
    {
      "id": "nuc_def456",
      "data": { "name": "Alice", "email": "[email protected]", "role": "superadmin" },
      "timestamp": "2026-05-12T10:05:00Z",
      "hash": "sha3:f8a3b2..."
    }
  ]
}

Get ALL versions (the full history)

terminal — full history
curl "http://localhost:7475/v1/my-first-agent/strand/query" \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "users",
    "filter": { "name": "Alice" },
    "latest": false
  }'

This returns every version ever written for Alice — in chronological order, each linked to the previous by hash. This is the strand.
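The relationship between "latest": true and the full history is simple: the latest view is just the full history collapsed to the newest record per entity. A sketch of that collapse, assuming chronological order as the query above returns it:

```python
def latest_per_entity(history, key):
    """Collapse a chronological history to the newest record per entity,
    keyed by a field inside "data" (e.g. "name")."""
    latest = {}
    for record in history:              # chronological: later records win
        latest[record["data"][key]] = record
    return list(latest.values())

history = [
    {"id": "nuc_abc123", "data": {"name": "Alice", "role": "admin"}},
    {"id": "nuc_def456", "data": {"name": "Alice", "role": "superadmin"}},
]
result = latest_per_entity(history, "name")
assert [r["id"] for r in result] == ["nuc_def456"]
```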

Read a record by its exact ID

terminal
curl http://localhost:7475/v1/my-first-agent/strand/nuc_abc123

6. Time Travel — Query the Past

Because SapixDB never overwrites data, you can ask "what did the database look like at 9 AM yesterday?" This is called time travel and it's built in — no extra configuration needed.

terminal — query at a specific point in time
curl "http://localhost:7475/v1/my-first-agent/strand/query" \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "users",
    "filter": { "name": "Alice" },
    "as_of": "2026-05-12T10:02:00Z"
  }'

The response will show Alice's record exactly as it was at 10:02 AM — the admin version, before the update to superadmin.
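Conceptually, as_of just selects the newest record written at or before the cutoff. Because the timestamps are uniform ISO 8601 strings, this can be sketched with plain string comparison (illustrative only, not SapixDB's implementation):

```python
def as_of(history, cutoff):
    """Return the newest record whose timestamp is <= cutoff.
    Assumes chronological order and uniform ISO 8601 UTC timestamps,
    which compare correctly as strings."""
    eligible = [r for r in history if r["timestamp"] <= cutoff]
    return eligible[-1] if eligible else None

history = [
    {"timestamp": "2026-05-12T10:00:00Z", "data": {"role": "admin"}},
    {"timestamp": "2026-05-12T10:05:00Z", "data": {"role": "superadmin"}},
]
snapshot = as_of(history, "2026-05-12T10:02:00Z")
assert snapshot["data"]["role"] == "admin"
```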

Why this matters: If you're in healthcare, finance, or any regulated industry, proving what your data looked like at a specific past moment is a compliance requirement. SapixDB satisfies this automatically for every record, forever.

7. Agents & Data Ownership

In SapixDB, an agent is a named identity that owns a collection of data. When you set SAPIX_AGENT_ID=my-first-agent, you're declaring that agent's identity.

This matters because:

  • Every nucleotide is signed with the agent's identity
  • You can run multiple agents (e.g., one per microservice or AI process)
  • AI agents like LLMs can own their own data strand — writing directly to SapixDB
  • You always know which agent wrote which record

Ingest data from an AI agent or external process

SapixDB includes a special /ingest endpoint designed for automated pipelines — AI agents, webhooks, cron jobs, etc.:

terminal — ingest from an AI agent
curl -X POST http://localhost:7475/v1/my-first-agent/ingest \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "decisions",
    "data": {
      "agent": "gpt-4o",
      "action": "approved_loan",
      "reason": "Credit score 780, DTI 28%",
      "confidence": 0.94
    }
  }'

Every AI decision is now permanently logged, signed, and auditable. You can always prove what the AI decided, when, and why.
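SapixDB's actual signature scheme isn't documented in this manual, but the idea of agent-signed records can be sketched with an HMAC. Everything here (the key, the sign function) is illustrative, not SapixDB's API:

```python
import hashlib
import hmac
import json

AGENT_KEY = b"my-first-agent-secret"   # hypothetical per-agent signing key

def sign(record):
    """Sign a record's canonical JSON form with the agent's key."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(AGENT_KEY, payload, hashlib.sha3_256).hexdigest()

decision = {"agent": "gpt-4o", "action": "approved_loan", "confidence": 0.94}
signature = sign(decision)

# Later, anyone holding the agent's key can prove the record is unchanged:
assert hmac.compare_digest(sign(decision), signature)
decision["action"] = "denied_loan"
assert not hmac.compare_digest(sign(decision), signature)
```

Combined with the hash chain, this is what makes the log auditable: the chain proves ordering and immutability, and the signature proves authorship.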

8. Graph Relationships

SapixDB includes a built-in graph layer. You can create directed edges between any two records, representing relationships like "Alice manages Bob" or "Order #42 belongs to Customer #7".

Create a relationship

terminal — create edge
curl -X POST http://localhost:7475/v1/my-first-agent/graph/edge \
  -H "Content-Type: application/json" \
  -d '{
    "src": "nuc_abc123",
    "dst": "nuc_xyz789",
    "edge_type": "manages",
    "weight": 1.0
  }'

Traverse relationships

terminal — find everything Alice manages (depth 2)
curl "http://localhost:7475/v1/my-first-agent/graph/traverse/nuc_abc123?depth=2&direction=outbound"

This returns all nodes reachable from Alice's record within 2 hops — useful for org charts, dependency trees, recommendation engines, and access control graphs.
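Under the hood a depth-limited traversal like this is a breadth-first walk over the edge set. A sketch, using toy node names rather than real nucleotide IDs:

```python
from collections import deque

def traverse(edges, start, depth):
    """Breadth-first walk over outbound (src, dst) edges, up to `depth` hops.
    Returns the set of reachable nodes, excluding the start node."""
    reachable, frontier = set(), deque([(start, 0)])
    while frontier:
        node, d = frontier.popleft()
        if d == depth:                 # hop budget exhausted on this path
            continue
        for src, dst in edges:
            if src == node and dst not in reachable:
                reachable.add(dst)
                frontier.append((dst, d + 1))
    return reachable

edges = [("alice", "bob"), ("bob", "carol"), ("carol", "dave")]
assert traverse(edges, "alice", 2) == {"bob", "carol"}   # dave is 3 hops away
```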

9. Compliance: HIPAA, SOX & GDPR

SapixDB's architecture is compliance-by-default. Here's why that matters and what it means in practice:

HIPAA (Healthcare)
Every access to patient data is permanently logged with a cryptographic signature. Who read it, who wrote it, when — immutably. This satisfies the HIPAA audit trail requirement without any extra tooling.
SOX (Financial)
Financial records are append-only and hash-linked. No one can alter a past entry without breaking the chain — which is immediately detectable. SOX requires this kind of tamper-evident record keeping.
GDPR (Right to be Forgotten)
SapixDB stores data in encrypted blobs. To 'delete' under GDPR, you destroy the encryption key — the record remains in the chain but becomes unreadable. The cryptographic chain stays intact.
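This "destroy the key" approach is often called crypto-shredding. The toy below shows the idea only — the cipher is a throwaway XOR stream, NOT production cryptography, and nothing here reflects SapixDB's actual blob format:

```python
import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR data against a keyed hash stream. Symmetric: applying it twice
    with the same key recovers the plaintext. A toy, NOT real crypto."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha3_256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

keys = {"nuc_abc123": b"per-record-key"}   # hypothetical per-record key store
blob = toy_encrypt(b'{"name": "Alice"}', keys["nuc_abc123"])

# GDPR erasure: destroy the key. The blob stays in the chain (so the hash
# links remain verifiable) but its contents can no longer be recovered.
del keys["nuc_abc123"]
assert toy_encrypt(blob, b"wrong-key") != b'{"name": "Alice"}'
```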

10. Troubleshooting

Q: SapixDB won't start / port already in use

Another process is using port 7475. Change the port mapping in docker-compose.yml: "7476:7475" and update your requests to use port 7476.

Q: curl: command not found

Install curl: on macOS run brew install curl, on Ubuntu run sudo apt install curl. Alternatively, use Postman (a free GUI tool) to send requests instead.

Q: Health check returns connection refused

The container may still be starting. Wait 10 seconds and try again. Check container status with docker compose ps. If it says "unhealthy", inspect the logs with docker compose logs sapixdb.

Q: I wrote data but the query returns empty

Make sure the collection name in your read query exactly matches what you used when writing. Collection names are case-sensitive. Also verify your SAPIX_AGENT_ID in the URL path matches the one set in docker-compose.yml.

Q: How do I stop SapixDB?

Run docker compose down in your project folder. Your data is safely stored in Docker volumes and will be there when you start again with docker compose up -d.

Q: How do I back up my data?

Your data lives in the sapixdb_strand, sapixdb_graph, and sapixdb_blobs Docker volumes. Back these up with docker run --rm -v sapixdb_strand:/data -v $(pwd):/backup ubuntu tar czf /backup/strand-backup.tar.gz /data

Ready to go deeper?

Explore the full developer reference for advanced queries, distributed mode, agent graph traversal, and SaQL — our semantic query language.