OpenInference JS
    LLM_COST_PROMPT_DETAILS_CACHE_INPUT: "llm.cost.prompt_details.cache_input" = ...

    Cost, in USD, of the prompt input tokens that were served from cache