docs(readme): update documentation with detailed MCP usage instructions
Some checks failed
CI - Semantic Release / Semantic Release (push) Has been cancelled
- Remove outdated badge links from README
- Add comprehensive parameter documentation for IDE extensions
- Clarify usage instructions for automatic LLM observability capture
This commit is contained in:

README.md (19 changes)
@@ -1,8 +1,5 @@
# LLM Observability MCP for PostHog

[](https://www.npmjs.com/package/@sfiorini/llm-observability-mcp)
[](https://github.com/sfiorini/llm-observability-mcp)
[](https://www.typescriptlang.org/)
[](https://opensource.org/licenses/MIT)

A Model Context Protocol (MCP) server that provides a tool to capture LLM Observability events and send them to PostHog.
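Although the registration step is not part of this diff, MCP servers of this kind are usually wired into an IDE or client via a config file. A minimal sketch, assuming the npm package name from the badges above and a common `mcpServers`-style client config (the exact file location and key names depend on your IDE):

```json
{
  "mcpServers": {
    "llm-observability": {
      "command": "npx",
      "args": ["-y", "@sfiorini/llm-observability-mcp"]
    }
  }
}
```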
@@ -125,7 +122,21 @@ Use this if you prefer to run the server as a standalone process.

For IDE extensions that support system prompts, you can instruct the AI to automatically use this MCP tool for every interaction. Add the following to your IDE's system prompt configuration:

"Use the `capture_llm_observability` MCP tool and send all requested data. For the `userId`, send `<my_username>`."

```text
Use `capture_llm_observability` MCP.
Make sure to include all parameters, and for the `userId`, send `<my_username>`:

userId - The distinct ID of the user
traceId - The trace ID to group AI events
model - The model used (e.g., gpt-4, claude-3, etc.)
provider - The LLM provider (e.g., openai, anthropic, etc.)
input - The input to the LLM (messages, prompt, etc.)
outputChoices - The output from the LLM
inputTokens - The number of tokens in the input
outputTokens - The number of tokens in the output
latency - The latency of the LLM call in seconds
httpStatus - The HTTP status code of the LLM call
baseUrl - The base URL of the LLM API
```

Replace `<my_username>` with a unique identifier for yourself. This ensures that all LLM activity is automatically logged in PostHog without your having to issue the command each time.
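As a concrete illustration, the arguments an IDE would pass when invoking the tool could look like the following. This is a hypothetical payload: the field names follow the parameter list above, but the values and the `CaptureArgs` type are examples only, not part of the project's source.

```typescript
// Hypothetical shape of the arguments sent to the
// capture_llm_observability tool (names taken from the
// parameter list documented above).
interface CaptureArgs {
  userId: string;         // distinct ID of the user
  traceId: string;        // groups related AI events
  model: string;
  provider: string;
  input: unknown;         // messages, prompt, etc.
  outputChoices: unknown; // output from the LLM
  inputTokens: number;
  outputTokens: number;
  latency: number;        // seconds
  httpStatus: number;
  baseUrl: string;
}

// Example invocation payload.
const args: CaptureArgs = {
  userId: "jane.doe",
  traceId: "trace-123",
  model: "gpt-4",
  provider: "openai",
  input: [{ role: "user", content: "Hello" }],
  outputChoices: [{ role: "assistant", content: "Hi there!" }],
  inputTokens: 5,
  outputTokens: 4,
  latency: 0.42,
  httpStatus: 200,
  baseUrl: "https://api.openai.com/v1",
};
```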
@@ -63,7 +63,7 @@ async function capturePosthogLlmObservability(
         distinctId: trackArgs.userId,
         properties: posthogProperties,
     });
-    methodLogger.error(`Got the response from the controller`, result);
+    methodLogger.debug(`Got the response from the controller`, result);

     // Format the response for the MCP tool
     return {
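For context on the hunk above: `posthogProperties` is presumably built by mapping the tool's arguments onto PostHog's `$ai_*` LLM-observability property names, while `userId` is sent separately as `distinctId` (as the diff shows). A sketch of such a mapping, assuming PostHog's documented `$ai_*` property names (verify them against your PostHog version; this is not the project's actual code):

```typescript
// Hypothetical argument type mirroring the tool's documented parameters.
type TrackArgs = {
  userId: string; // becomes `distinctId` on the capture call, not a property
  traceId: string;
  model: string;
  provider: string;
  inputTokens: number;
  outputTokens: number;
  latency: number;
  httpStatus: number;
  baseUrl: string;
};

// Map tool arguments to PostHog LLM-observability event properties.
function toPosthogProperties(args: TrackArgs): Record<string, unknown> {
  return {
    $ai_trace_id: args.traceId,
    $ai_model: args.model,
    $ai_provider: args.provider,
    $ai_input_tokens: args.inputTokens,
    $ai_output_tokens: args.outputTokens,
    $ai_latency: args.latency,
    $ai_http_status: args.httpStatus,
    $ai_base_url: args.baseUrl,
  };
}
```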