OpenInferenceBatchSpanProcessor

    Extends BatchSpanProcessor to support OpenInference attributes. The processor enriches spans with OpenInference attributes before exporting them, and it can be configured to export only OpenInference spans or all spans.

    import {
      OpenInferenceBatchSpanProcessor,
      isOpenInferenceSpan,
    } from "@arizeai/openinference-vercel";
    import { SEMRESATTRS_PROJECT_NAME } from "@arizeai/openinference-semantic-conventions";
    import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
    import { resourceFromAttributes } from "@opentelemetry/resources";
    import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

    const exporter = new OTLPTraceExporter();

    // Export only OpenInference spans, batching with a custom queue size and delay.
    const processor = new OpenInferenceBatchSpanProcessor({
      exporter,
      spanFilter: isOpenInferenceSpan,
      config: { maxQueueSize: 2048, scheduledDelayMillis: 5000 },
    });

    const tracerProvider = new NodeTracerProvider({
      resource: resourceFromAttributes({
        [SEMRESATTRS_PROJECT_NAME]: "your-project-name",
      }),
      spanProcessors: [processor], // pass the processor to the provider here
    });
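
    To export every span rather than only OpenInference spans, the spanFilter can be omitted or replaced with a custom predicate. A minimal sketch, assuming the same exporter as above; the name-prefix predicate is an illustrative assumption, not part of the library.

    import { OpenInferenceBatchSpanProcessor } from "@arizeai/openinference-vercel";
    import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

    // No spanFilter: every span is enriched (where applicable) and exported.
    const allSpansProcessor = new OpenInferenceBatchSpanProcessor({
      exporter: new OTLPTraceExporter(),
    });

    // Custom predicate (assumption: keeps spans by name prefix, for illustration only).
    const aiSpansProcessor = new OpenInferenceBatchSpanProcessor({
      exporter: new OTLPTraceExporter(),
      spanFilter: (span) => span.name.startsWith("ai."),
    });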
