diff --git a/README.md b/README.md
index 8b21133..d4bd97d 100644
--- a/README.md
+++ b/README.md
@@ -1,8 +1,5 @@
 # LLM Observability MCP for PostHog
 
-[![NPM Version](https://img.shields.io/npm/v/@sfiorini/llm-observability-mcp)](https://www.npmjs.com/package/@sfiorini/llm-observability-mcp)
-[![Build Status](https://img.shields.io/github/workflow/status/sfiorini/llm-observability-mcp/CI)](https://github.com/sfiorini/llm-observability-mcp)
-[![TypeScript](https://img.shields.io/badge/TypeScript-5.0%2B-blue)](https://www.typescriptlang.org/)
 [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
 
 A Model Context Protocol (MCP) server that provides a tool to capture LLM Observability events and send them to PostHog.
@@ -125,7 +122,21 @@ Use this if you prefer to run the server as a standalone process.
 
 For IDE extensions that support system prompts, you can instruct the AI to automatically use this MCP tool for every interaction. Add the following to your IDE's system prompt configuration:
 
-"Use the `capture_llm_observability` MCP tool and send all requested data. For the `userId`, send ``."
+```text
+Use `capture_llm_observability` MCP.
+Make sure to include all parameters and for the `userId`, send ``:
+userId - The distinct ID of the user
+traceId - The trace ID to group AI events
+model - The model used (e.g., gpt-4, claude-3, etc.)
+provider - The LLM provider (e.g., openai, anthropic, etc.)
+input - The input to the LLM (messages, prompt, etc.)
+outputChoices - The output from the LLM
+inputTokens - The number of tokens in the input
+outputTokens - The number of tokens in the output
+latency - The latency of the LLM call in seconds
+httpStatus - The HTTP status code of the LLM call
+baseUrl - The base URL of the LLM API
+```
 
 Replace `` with a unique identifier for yourself. This ensures that all LLM activity is automatically logged in PostHog without needing to give the command each time.
diff --git a/src/tools/posthog-llm.tool.ts b/src/tools/posthog-llm.tool.ts
index 1f2aafd..1e5bb11 100644
--- a/src/tools/posthog-llm.tool.ts
+++ b/src/tools/posthog-llm.tool.ts
@@ -63,7 +63,7 @@ async function capturePosthogLlmObservability(
 		distinctId: trackArgs.userId,
 		properties: posthogProperties,
 	});
-	methodLogger.error(`Got the response from the controller`, result);
+	methodLogger.debug(`Got the response from the controller`, result);
 
 	// Format the response for the MCP tool
 	return {