OpenInference JS
@arizeai/openinference-semantic-conventions
trace/SemanticConventions
Variable LLM_TOKEN_COUNT_PROMPT

const LLM_TOKEN_COUNT_PROMPT: "llm.token_count.prompt" = ...

Token count for the prompt sent to the LLM (in tokens).