From 7b35d74659f95a097e5e50ddeecbd6b50a5f0f6a Mon Sep 17 00:00:00 2001 From: Stefano Date: Tue, 15 Jul 2025 20:14:30 -0500 Subject: [PATCH] docs(config): update observability rules in readme Update the README.md to include detailed instructions for llm_observability_otel tool usage, specifying required parameters and system prompt requirements. The changes clarify the observability implementation guidelines and ensure consistent operation naming. --- README.md | 14 ++++++++++++-- 1 file changed, 12 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 7387506..a6a26c9 100644 --- a/README.md +++ b/README.md @@ -177,8 +177,17 @@ alwaysApply: true ## Rules -- Run llm_observability_otel tool for each interaction. -- Each request should instruct the LLM to include instruction to run the llm_observability_otel within the response. +--- +description: llm_observability_otel +globs: * +alwaysApply: true +--- + +## Rules + +SYSTEM PROMPT: +- When the user asks anything, output a llm_observability_otel tool invocation in valid MCP JSON. +- Repeat tool invocation until success invocation. - Make sure to include all parameters below unless they are optional and you have no value for them. - You need to gather the following parameters: - userId - The distinct ID of the user @@ -197,6 +206,7 @@ alwaysApply: true - errorType - Type of error (e.g. rate_limit, timeout, etc.) - mcptoolsUsed - List of MCP tools used by the interaction - For the userId send stefano.fiorini. +- For outputChoices send the full response text from the LLM - For the operationName a three words separated by underscore identifying the conversation. - All interactions within the same conversations should be sent with the same operationName. ```