NIOM keeps things simple: one configuration file at ~/.niom/config.json. That’s it. No .env files, no environment variables to juggle.

Common setups

The bare minimum to get running. Uses Claude with DuckDuckGo search (no search API key needed):
```json
{
  "workspace": "C:\\Users\\you",
  "gateway_key": "vck_...",
  "provider": "anthropic",
  "model": "anthropic/claude-4-sonnet-20250514"
}
```
Fast analysis with Groq, main work with Claude, vision with GPT-4o, Brave search, and GitHub connected:
```json
{
  "workspace": "C:\\Users\\you",
  "gateway_key": "vck_...",
  "provider": "anthropic",
  "model": "anthropic/claude-4-sonnet-20250514",
  "search": { "provider": "brave", "api_key": "BSA..." },
  "models": {
    "fast": "groq/llama-3.3-70b-versatile",
    "vision": "openai/gpt-4o"
  },
  "mcp": [
    {
      "name": "github",
      "command": "github-mcp-server",
      "env": { "GITHUB_TOKEN": "ghp_..." }
    }
  ]
}
```
Everything runs locally. No API keys, no cloud, no network calls:
```json
{
  "workspace": "C:\\Users\\you",
  "provider": "ollama",
  "model": "ollama/llama3.1",
  "models": {
    "fast": "ollama/llama3.1",
    "vision": "ollama/llava"
  }
}
```
Make sure Ollama is installed and running locally.
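To confirm Ollama is actually answering before you point NIOM at it, you can query its local API. A minimal sketch, assuming Ollama's default port (11434) and its `/api/tags` endpoint, which lists the models you've pulled:

```python
import json
import urllib.request
import urllib.error

def ollama_models(base_url="http://localhost:11434"):
    """Return the list of locally pulled model names, or None if Ollama is unreachable."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, ValueError):
        return None

models = ollama_models()
if models is None:
    print("Ollama is not reachable: start it with `ollama serve`")
else:
    print("Ollama is up; pulled models:", models)
```

If the model named in your config isn't in the list, pull it first with `ollama pull llama3.1` (and `ollama pull llava` for vision).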

Full reference

Core settings

| Field | Type | Default | What it does |
| --- | --- | --- | --- |
| `workspace` | string | Home directory | Root directory for file operations — NIOM works within this folder |
| `gateway_key` | string | (none) | Your AI Gateway API key (required for cloud models) |
| `provider` | string | `"anthropic"` | Which LLM provider to use |
| `model` | string | `"anthropic/claude-4-sonnet-20250514"` | The main model for execution |
| `sidecar_port` | number | `3001` | Port the AI sidecar listens on |
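If you edit config.json by hand, a quick sanity check can catch missing fields before the sidecar complains. A hypothetical helper (the field list mirrors the table above; `check_config` is not part of NIOM):

```python
import json
from pathlib import Path

REQUIRED = ["workspace", "provider", "model"]  # gateway_key is only needed for cloud providers

def check_config(path):
    """Return a list of problems found in a NIOM-style config file (hypothetical helper)."""
    cfg = json.loads(Path(path).read_text())
    problems = [f"missing field: {f}" for f in REQUIRED if f not in cfg]
    if cfg.get("provider") != "ollama" and "gateway_key" not in cfg:
        problems.append("gateway_key is required for cloud models")
    return problems
```

Running it against a local-only Ollama config should return an empty list, since no gateway key is needed there.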

Web search

NIOM can search the web during conversations. Two providers are supported:

| Field | Type | Default | What it does |
| --- | --- | --- | --- |
| `search.provider` | `"brave"` or `"duckduckgo"` | `"duckduckgo"` | DuckDuckGo works without a key. Brave is faster and richer. |
| `search.api_key` | string | (none) | Brave Search API key (free tier: 2,000 searches/month) |

Model roles

NIOM uses different models for different jobs — so you’re not paying premium prices for simple analysis:
| Field | Used for | Default | Why |
| --- | --- | --- | --- |
| `model` (your selection) | Main execution | Claude Sonnet | The workhorse. Handles tool calls, reasoning, and responses |
| `models.fast` | Analysis + evaluation | Groq Llama 3.3 70B | Sub-second, cheap. Used for intent classification and quality checks |
| `models.vision` | Screenshot understanding | GPT-4o | Needed for computer use — reads what's on your screen |

File watching (Cortex)

NIOM watches your workspace for file changes to build context. You can tune what it watches:
| Field | Type | Default | What it does |
| --- | --- | --- | --- |
| `cortex.watch_paths` | string[] | `[]` | Extra directories to watch beyond your workspace |
| `cortex.excluded` | string[] | Common excludes | Folder names to ignore (e.g., node_modules, .git, target) |
| `cortex.max_events` | number | `1000` | How many file events to keep in memory |
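Putting the three fields together, a cortex section might look like this (the extra watch path is illustrative):

```json
{
  "cortex": {
    "watch_paths": ["C:\\Users\\you\\projects"],
    "excluded": ["node_modules", ".git", "target"],
    "max_events": 500
  }
}
```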

MCP servers

Connect external tools by adding MCP servers. They auto-start when the sidecar boots:
```json
{
  "mcp": [
    {
      "name": "github",
      "command": "github-mcp-server",
      "args": [],
      "env": { "GITHUB_TOKEN": "ghp_..." }
    }
  ]
}
```
See the MCP Integration guide for setup instructions and a list of popular servers.

Where your data lives

~/.niom/
├── config.json              # This file — all your settings
├── memory/                  # Unified persistence layer (MemoryStore)
│   ├── index.json           # Fast lookup index for all collections
│   ├── conversations/       # Chat threads
│   │   └── {thread-id}.enc  # AES-256-GCM encrypted
│   ├── tasks/               # Background task definitions
│   │   └── {task-id}.enc
│   ├── runs/                # Task execution history
│   │   └── {task-id}/
│   │       └── run_NNNN.enc
│   └── brain/               # Long-term memory
│       └── knowledge.enc    # Facts, preferences, patterns
├── screenshots/             # Computer use (auto-cleanup, last 5 kept)
└── mcp/                     # MCP configs (future)
All data in memory/ is encrypted with AES-256-GCM. Even if someone accesses your ~/.niom/ directory, they can’t read your conversations, tasks, or learned facts.

How to update your config

Three ways, pick whichever suits you:
  1. Settings UI — open the overlay and go to Settings
  2. API call — send a POST to /providers/configure (docs)
  3. Edit the file directly — change ~/.niom/config.json (requires sidecar restart)
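Option 3 is easy to script. A minimal sketch that sets one top-level field and writes the file back (remember to restart the sidecar afterwards; `set_config_field` is a hypothetical helper, not part of NIOM):

```python
import json
from pathlib import Path

def set_config_field(path, key, value):
    """Read a JSON config file, set one top-level field, and write it back."""
    p = Path(path)
    cfg = json.loads(p.read_text())
    cfg[key] = value
    p.write_text(json.dumps(cfg, indent=2))
    return cfg

# Example: switch the main model in your NIOM config:
# set_config_field(Path.home() / ".niom" / "config.json",
#                  "model", "anthropic/claude-4-sonnet-20250514")
```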