Uploading knowledge files
Upload files to the chat agent's uploads area so you can promote them into your knowledge base.
This guide builds on Uploading files. Knowledge uploads let your SDK integration stage files that the Lightfield chat agent can see.
How knowledge uploads work
The SDK does not write directly into the knowledge base. Instead, each upload lands in a dedicated uploads area that the chat agent can read. The agent then does the actual work of promoting the file into `files/` or `skills/` and updating `index.md`, under your direction.
This separation is intentional:
- Uploads always land in a predictable staging location
- Nothing is indexed automatically — you decide what becomes knowledge
- Organization happens through chat, where you can inspect, rename, or summarize content before promoting it
SDK upload → uploads area → (you, in chat) → knowledge base

Before you begin
You will need:
- A valid API key whose caller resolves to a specific user
- The `files:create` scope to create and complete uploads
- The `files:read` scope if you want to verify the uploaded file or fetch a signed download URL
- For `knowledge_workspace`: the caller must be an org admin
Choose a purpose
There are two knowledge purposes. Pick the one that matches who the file is for.
| Purpose | Where the file lands | Who can upload |
|---|---|---|
| `knowledge_user` | The authenticated user’s personal uploads area | Any caller with a resolvable user identity |
| `knowledge_workspace` | The organization’s shared uploads area | Org admins only |

Non-admin callers that request `knowledge_workspace` are rejected with 403.
Upload lifecycle
Knowledge uploads use the same three-step lifecycle as any other file:
1. `POST /v1/files`
2. `PUT <uploadUrl>`
3. `POST /v1/files/{id}/complete`
There are no new endpoints. The only knowledge-specific behaviors are the `purpose` value you send, the timestamped filename returned in the response, and the `if-none-match: *` header included in `uploadHeaders`. See Collision handling in the file upload guide for details.
Step 1: Create the upload session
```sh
curl https://api.lightfield.app/v1/files \
  -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Lightfield-Version: 2026-03-01" \
  -H "Content-Type: application/json" \
  -d '{
    "purpose": "knowledge_user",
    "filename": "playbook.md",
    "mimeType": "text/markdown",
    "sizeBytes": 4821
  }'
```

A successful response looks like:
```json
{
  "id": "fil_abc123",
  "filename": "playbook_1713045600000.md",
  "status": "PENDING",
  "uploadUrl": "https://...",
  "uploadHeaders": {
    "content-type": "text/markdown",
    "if-none-match": "*"
  },
  "expiresAt": "2026-04-08T12:00:00.000Z"
}
```

The `filename` in the response is the stored name: a timestamp was appended before the extension so uploads never collide with existing objects. Use this value when you refer to the file in chat so the agent can locate it.
Step 2: Upload the raw bytes
Use every header from `uploadHeaders`, including `if-none-match: *`. Stripping it will invalidate the signature.
```sh
curl -i -X PUT "$UPLOAD_URL" \
  -H "content-type: text/markdown" \
  -H "if-none-match: *" \
  --data-binary @./playbook.md
```

If an object already exists at the target key (astronomically unlikely given the timestamp), S3 rejects the PUT with `412 Precondition Failed`.
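In an SDK integration, the safest pattern is to forward the `uploadHeaders` map verbatim rather than hand-picking headers. A minimal sketch using only the standard library; `build_put_request` is a hypothetical helper, not part of the Lightfield SDK:

```python
import urllib.request

def build_put_request(upload_url: str, upload_headers: dict[str, str],
                      body: bytes) -> urllib.request.Request:
    """Build the presigned PUT, passing every uploadHeaders entry through."""
    req = urllib.request.Request(upload_url, data=body, method="PUT")
    for name, value in upload_headers.items():
        # Forward each header unchanged; dropping any (especially
        # if-none-match: *) invalidates the presigned signature.
        req.add_header(name, value)
    return req
```

Sending the request is then a single `urllib.request.urlopen(req)` call; a 412 response means the precondition failed and you should create a new upload session.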
Step 3: Complete the upload
```sh
curl https://api.lightfield.app/v1/files/$FILE_ID/complete \
  -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Lightfield-Version: 2026-03-01" \
  -H "Content-Type: application/json" \
  -d '{}'
```

After this call, the file status transitions from `PENDING` to `COMPLETED` and the chat agent can see it.
Promote the file in chat
The chat agent mounts the uploads area alongside the knowledge base. After the upload completes, ask the agent to do the work:
- “Move `playbook_1713045600000.md` from uploads into `files/workspace/sales/` as `playbook.md`.”
- “Summarize the file I just uploaded and add an entry for it in `index.md`.”
- “File this under `skills/` as a skill the agent can call, then delete the copy in uploads.”
The agent uses its existing file operations to move, rename, and index the file. No additional API calls are required from the SDK.
File lifecycle
| Stage | Where the file lives | Who moves it |
|---|---|---|
| After `POST /v1/files/{id}/complete` | Uploads area (`uploads/user` or `uploads/workspace`) | — |
| After you ask the agent in chat | Knowledge base (`files/` or `skills/`) | Chat agent |
There is no automatic cleanup of the uploads area. Files stay there until you (through the chat agent) delete or move them.
Errors
| Status | Code | Meaning |
|---|---|---|
| 400 | `purpose_requires_user` | The API key has no resolvable user identity. Use a USER key, or a WORKSPACE key whose creator is still in the org. |
| 403 | `purpose_requires_admin` | `knowledge_workspace` was requested by a non-admin caller. |
| 412 | (S3 response) | The presigned PUT’s `if-none-match: *` precondition failed. Create a new upload session to get a fresh timestamped key. |
See Errors for general error handling.
End-to-end flow
1. `POST /v1/files` with `purpose: "knowledge_user"` or `"knowledge_workspace"`
2. `PUT <uploadUrl>` with every header from `uploadHeaders` (including `if-none-match: *`)
3. `POST /v1/files/{id}/complete`
4. Open the chat agent and ask it to promote the file into knowledge
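The SDK side of this flow (steps 1 to 3) can be sketched in a few lines. The endpoints and headers come from this guide; the function shape and the injected `transport` callable are illustrative assumptions, so you can plug in whichever HTTP client your integration uses:

```python
API = "https://api.lightfield.app/v1"

def upload_knowledge_file(api_key: str, filename: str, data: bytes,
                          transport, purpose: str = "knowledge_user") -> str:
    """Run the three-step upload; transport(method, url, headers, body) -> dict.

    Returns the stored (timestamped) filename to reference in chat.
    """
    common = {
        "Authorization": f"Bearer {api_key}",
        "Lightfield-Version": "2026-03-01",
        "Content-Type": "application/json",
    }
    # 1. Create the upload session
    session = transport("POST", f"{API}/files", common, {
        "purpose": purpose,
        "filename": filename,
        "mimeType": "text/markdown",
        "sizeBytes": len(data),
    })
    # 2. PUT the raw bytes, forwarding every header from uploadHeaders
    transport("PUT", session["uploadUrl"], session["uploadHeaders"], data)
    # 3. Complete the upload so the chat agent can see the file
    transport("POST", f"{API}/files/{session['id']}/complete", common, {})
    return session["filename"]
```

Step 4 happens in chat: hand the returned filename to the agent and ask it to promote the file.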
Next steps
- Uploading files — Reference guide for the underlying file upload lifecycle.
- API keys — Learn how USER vs WORKSPACE keys resolve an actor user.
- API Reference — Full endpoint reference for files and other resources.