MCP Integration

Explore MCP

Connect OpenCode to the Explore API so your local AI coding assistant uses your server for knowledge, research and real-time data on every query.

Live · Node.js 18+ · OpenCode 1.0+ · stdio transport

Prerequisites
Node.js 18 or higher installed on your machine
OpenCode installed: npm i -g opencode-ai
An Explorer API token: request one at Threema
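If you want to confirm the Node.js requirement from a script rather than by eye, a minimal check like this works (a sketch, not part of the distributed files):

```javascript
// Confirm the running Node.js meets the 18+ requirement.
const major = Number(process.versions.node.split(".")[0]);

if (Number.isNaN(major) || major < 18) {
  console.error("Node.js 18+ required, found " + process.versions.node);
  process.exit(1);
} else {
  console.log("Node.js " + process.versions.node + " OK");
}
```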

Download both files and place them in the same directory on your machine, for example /home/youruser/explorer-mcp/

files
explorer-mcp.js   — the MCP server
package.json      — Node package config
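The real explorer-mcp.js is one of the downloaded files above. For orientation only, here is a stripped-down sketch of the request/response shape behind the stdio transport: the client writes newline-delimited JSON-RPC requests to the server's stdin and reads replies from its stdout. The method and reply shown are illustrative; a real MCP server also implements the initialize and tools handshakes.

```javascript
// Sketch of JSON-RPC handling over the stdio transport (illustrative).
function handle(request) {
  if (request.method === "ping") {
    return { jsonrpc: "2.0", id: request.id, result: { status: "ok" } };
  }
  // JSON-RPC "method not found" error for anything unrecognized.
  return {
    jsonrpc: "2.0",
    id: request.id,
    error: { code: -32601, message: "Unknown method: " + request.method },
  };
}

// One request/response pair, as it would travel over stdin/stdout:
console.log(JSON.stringify(handle({ jsonrpc: "2.0", id: 1, method: "ping" })));
```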
1
Make the script executable
bash
chmod +x /home/youruser/explorer-mcp/explorer-mcp.js
2
Create OpenCode config
Open ~/.opencode/opencode.json in an editor (e.g. nano ~/.opencode/opencode.json) and add the following content. Replace the path and token with your actual values.
json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "explorer": {
      "type": "local",
      "command": ["node", "/home/youruser/explorer-mcp/explorer-mcp.js"],
      "enabled": true,
      "environment": {
        "EXPLORER_API_URL": "https://carlostkd.ch/explore/api/explore.php",
        "CODE_API_URL": "https://carlostkd.ch/explore/api/code.php",
        "ASSISTANT_API_URL": "https://carlostkd.ch/explore/api/assistant.php",
        "ASSISTANT_API_URL": "https://carlostkd.ch/explore/api/status.php",
        "ASSISTANT_API_URL": "https://carlostkd.ch/explore/api/search.php",
        "EXPLORER_TOKEN": "your_token_here"
      }
    }
  }
}
Important: The token is not your Proton token; it is the token issued by my server. You can use the test key or request one.
Important: Use the actual token string — not a shell variable like $EXPLORER_TOKEN. OpenCode does not inherit shell environment variables.
3
Create agent rules
Create ~/.config/opencode/AGENTS.md (e.g. with nano) to tell the LLM when to use the tools.
markdown
## Tool Usage Rules

You have five tools. NEVER answer any message directly without using a tool first.

ROUTING RULES:
- "ping" → call ping tool and show full response without summarizing
- Message starts with "explore" → call explore_topic
- Message starts with "code" → call code_topic
- Message starts with "search" → call search_web
- Weather, news, prices, stocks → call search_web
- Everything else including greetings, identity questions, general conversation → call ask_lumo

Never answer directly without using one of these tools.
The only exception is direct file editing operations like reading or writing files in the current project.
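The routing rules above can be sketched as a plain function. This is only an illustration of the intended mapping: the weather/news/prices case is a naive keyword check here, whereas the real routing decision is made by the LLM.

```javascript
// Naive sketch of the AGENTS.md routing rules: map a user message
// to the tool the LLM is instructed to call.
function routeMessage(message) {
  const m = message.trim().toLowerCase();
  if (m === "ping") return "ping";
  if (m.startsWith("explore")) return "explore_topic";
  if (m.startsWith("code")) return "code_topic";
  if (m.startsWith("search")) return "search_web";
  // Real-time data topics go to web search.
  if (/\b(weather|news|prices?|stocks?)\b/.test(m)) return "search_web";
  // Greetings, identity questions, general conversation.
  return "ask_lumo";
}
```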
4
Verify connection
bash
opencode mcp list
You should see:
output
●  ✓ explorer connected
       node /home/youruser/explorer-mcp/explorer-mcp.js
If it shows failed, kill stale processes and retry: pkill -f explorer-mcp.js && opencode mcp list
TUI mode
bash
opencode
Single command (shell)
bash
opencode run "your question here"
| Trigger | Example | Result |
| --- | --- | --- |
| explore | explore quantum computing | explore_topic |
| code | code how do I reverse a string in python | code_topic |
| weather / news / prices | weather in zurich | search_web |
| general question | what is your name | ask_lumo |
| anything else | what time is it in tokyo | ask_lumo |
| ping | ping | shows the latest news from the server; call it at any time |

Requests are limited per token per day. The counter resets at midnight. If you hit the limit you will see:

error
MCP error -32603: Rate limit reached. Resets at midnight.
Note: Your daily limit may differ from the default 1000; I can raise it on request.
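Since the limit resets at midnight, a client can detect the -32603 rate-limit error and compute how long to back off. A sketch, matching on the error message shown above; these helpers are hypothetical, not part of the distributed files:

```javascript
// True when an MCP error object is the daily rate-limit error.
function isRateLimited(err) {
  return Boolean(err) && err.code === -32603 && /rate limit/i.test(err.message || "");
}

// Milliseconds until the next local midnight, when the counter resets.
function msUntilMidnight(now = new Date()) {
  const midnight = new Date(now);
  midnight.setHours(24, 0, 0, 0); // start of the next day, local time
  return midnight.getTime() - now.getTime();
}
```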
► Server shows failed on mcp list
A stale process from a previous session is still running. Kill it and restart.
bash
pkill -f explorer-mcp.js
opencode mcp list
► Getting HTML instead of JSON
Make sure you are using the latest version of explorer-mcp.js.
► LLM ignores the tool for coding questions
Some models are confident enough to answer simple questions without calling a tool. Use the explicit trigger code <question> to force code_topic, or ask naturally and the LLM will route to ask_lumo automatically.