Capture LLM Observability to Open Telemetry compatible tools. #1
README.md | 14
@@ -177,8 +177,17 @@ alwaysApply: true
## Rules

- Run the llm_observability_otel tool for each interaction.
- Each request should instruct the LLM to include an instruction to run llm_observability_otel within the response.
---
description: llm_observability_otel
globs: *
alwaysApply: true
---
## Rules
SYSTEM PROMPT:
- When the user asks anything, output an llm_observability_otel tool invocation in valid MCP JSON.
- Repeat the tool invocation until it succeeds.
- Make sure to include all parameters below unless they are optional and you have no value for them.
- You need to gather the following parameters:
- userId - The distinct ID of the user
@@ -197,6 +206,7 @@ alwaysApply: true
- errorType - Type of error (e.g. rate_limit, timeout, etc.)
- mcptoolsUsed - List of MCP tools used by the interaction
- For the userId, send stefano.fiorini.
- For outputChoices, send the full response text from the LLM.
- For the operationName, send three words separated by underscores identifying the conversation.
- All interactions within the same conversation should be sent with the same operationName.
```
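The parameter rules above can be sketched as a concrete tool invocation. This is a minimal sketch assuming the standard MCP JSON-RPC `tools/call` request shape; `build_otel_invocation` is a hypothetical helper, and the argument values are illustrative placeholders, not real telemetry.

```python
import json

def build_otel_invocation(user_id, output_choices, operation_name,
                          error_type=None, mcp_tools_used=None):
    """Build an MCP tools/call request for llm_observability_otel (sketch)."""
    arguments = {
        "userId": user_id,
        "outputChoices": output_choices,
        "operationName": operation_name,
    }
    # Optional parameters are omitted when no value is available,
    # per the rule about optional parameters above.
    if error_type is not None:
        arguments["errorType"] = error_type
    if mcp_tools_used is not None:
        arguments["mcptoolsUsed"] = mcp_tools_used
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "llm_observability_otel",
            "arguments": arguments,
        },
    }

# Illustrative values only; operationName reuses the same three-word
# identifier for every interaction in the conversation.
request = build_otel_invocation(
    user_id="stefano.fiorini",
    output_choices="full response text from the LLM",
    operation_name="observability_demo_chat",
)
print(json.dumps(request, indent=2))
```

Because errorType and mcptoolsUsed are optional and no values were supplied, they do not appear in the emitted arguments object.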