Context
Model Context Protocol (MCP) servers are coming online because AI agents need a standard way to reach external tools and data
What is it?
It’s a standard that dictates the format for defining:
Resources
Passive data sources, defined like so:
{
  "uri": "nginx://logs/error",
  "name": "Nginx Error Logs",
  "mimeType": "text/plain",
  "description": "Live stream of the latest Nginx error logs"
}
Tools
Executable functions. You are providing actions the agent can take.
{
  "name": "update_nginx_map",
  "description": "Updates the IP allowlist in the Nginx config",
  "inputSchema": {
    "type": "object",
    "properties": {
      "ip_address": { "type": "string" },
      "action": { "type": "integer", "enum": [0, 1] }
    },
    "required": ["ip_address", "action"]
  }
}
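On the server side, a tool call arrives with an arguments object that should be checked against the schema's constraints before acting. A sketch of that dispatch — `apply_allowlist_change` is a hypothetical helper, not part of MCP:

```python
# Validate a tool call's arguments against the inputSchema above, then act.
def call_update_nginx_map(args: dict) -> str:
    for field in ("ip_address", "action"):  # "required" fields
        if field not in args:
            raise ValueError(f"missing required field: {field}")
    if not isinstance(args["ip_address"], str):
        raise ValueError("ip_address must be a string")
    if args["action"] not in (0, 1):  # the schema's enum
        raise ValueError("action must be 0 or 1")
    return apply_allowlist_change(args["ip_address"], args["action"])

def apply_allowlist_change(ip: str, action: int) -> str:
    # Placeholder: a real server would rewrite the Nginx map file and reload.
    return f"{'added' if action else 'removed'} {ip}"
```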
Prompts
Workflows, templates, and instructions. You are providing task lists the agent can follow.
{
  "name": "debug-gateway",
  "description": "Start a step-by-step diagnostic of the Nginx gateway",
  "arguments": [
    {
      "name": "service_name",
      "description": "The Go service to check",
      "required": true
    }
  ]
}
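When the client invokes the prompt with its arguments filled in, the server returns rendered messages. A hedged sketch — the diagnostic steps are illustrative, not defined by MCP; the message shape follows the role + text-content convention from prompt results:

```python
# Expand the "debug-gateway" prompt once the client supplies service_name.
def render_debug_gateway(service_name: str) -> list:
    steps = [
        f"1. Check that the {service_name} upstream is reachable from Nginx.",
        "2. Scan nginx://logs/error for recent 502/504 entries.",
        "3. Verify the IP allowlist map has not blocked the caller.",
    ]
    return [
        {"role": "user",
         "content": {"type": "text", "text": "\n".join(steps)}}
    ]
```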
File/Folder Structure
The files are organized like so:
/nginx-mcp-server
├── server.py <-- The "Brain" (Where decorators live)
├── pyproject.toml <-- Dependencies (mcp[cli], httpx, etc.)
└── .env <-- Secret keys (if your API needs them)
FastMCP
It provides a way to write MCP servers in Python instead of writing the JSON schemas by hand. It includes logging to the client, progress reporting, and hot reloading.
Handshake
- The LLM client (e.g., VS Code or Claude Desktop) reads the local config file, mcp.json:
{
  "mcpServers": {
    "nginx-manager": {
      "command": "python",
      "args": ["/path/to/server.py"]
    }
  }
}
mcpServers maps server names to the commands used to launch them.
- The client then performs a handshake: the JSON-RPC discovery phase. The Python code is converted into the JSON-RPC schema at this time.
- The client receives the schema and processes it for use.
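To illustrate the conversion step (this is a simplified illustration, not the SDK's actual internals): a plain Python signature can be introspected into the kind of tool schema the client receives during discovery.

```python
# Derive a JSON-RPC tool schema from a Python function's signature.
import inspect

_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def to_tool_schema(fn) -> dict:
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default => required
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "inputSchema": {"type": "object", "properties": props,
                        "required": required},
    }

def update_nginx_map(ip_address: str, action: int) -> str:
    """Updates the IP allowlist in the Nginx config"""
    ...
```

Running `to_tool_schema(update_nginx_map)` reproduces the hand-written tool JSON from earlier (minus the `enum` constraint, which a signature alone cannot express).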