☔ [JS] LlamaIndex.TS instrumentation #173
🤖 Hey there @mikeldking! Nice to see you back 😊. Ready to tackle some LlamaIndex.TS instrumentation together? Sure, I can provide some guidance on how to add LlamaIndex.TS instrumentation to the OpenInference repository.

Here's a sample code snippet:

```python
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor


def instrument():
    resource = Resource(attributes={})
    tracer_provider = trace_sdk.TracerProvider(resource=resource)
    span_exporter = OTLPSpanExporter(endpoint="http://phoenix:6006/v1/traces")
    span_processor = SimpleSpanProcessor(span_exporter=span_exporter)
    tracer_provider.add_span_processor(span_processor=span_processor)
    trace_api.set_tracer_provider(tracer_provider=tracer_provider)
    LlamaIndexInstrumentor().instrument()
```
Remember to call `instrument()` before your application starts making LlamaIndex calls. Please note that the above code snippet is just an example and might need to be adjusted to fit your specific use case. If you have any further questions or need more specific guidance, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
@dosu-bot you provided python for js! That's not helpful
🤖 I apologize for the confusion earlier. Let's focus on the JavaScript instrumentation for LlamaIndex.TS.

From the context provided, it seems that the OpenInference codebase already supports JavaScript and has a setup for instrumentation using OpenTelemetry's NodeTracerProvider. This setup includes span processors for outputting tracing data to the console and sending it to a specified URL, as well as OpenAIInstrumentation for registering instrumentations. To add LlamaIndex.TS instrumentation, you would need to follow a similar process. Here's a general outline:
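The setup described above could be sketched roughly as follows. This is a hypothetical sketch, not the repository's actual code: the exporter URL is a placeholder, and since no LlamaIndex.TS instrumentation package existed at the time of this issue, only `OpenAIInstrumentation` is registered, with a comment marking where a LlamaIndex instrumentation would slot in. Package names follow the OpenTelemetry JS SDK and the `@arizeai/openinference-instrumentation-openai` package; exact APIs may differ between SDK versions.

```typescript
// Hypothetical tracing setup, modeled on the existing OpenAI
// instrumentation setup described above.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import {
  SimpleSpanProcessor,
  ConsoleSpanExporter,
} from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

const provider = new NodeTracerProvider();

// Span processor that prints spans to the console for debugging.
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));

// Span processor that sends spans to a collector (e.g. Phoenix) at a
// specified URL (placeholder endpoint).
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new OTLPTraceExporter({ url: "http://localhost:6006/v1/traces" })
  )
);

// Make this provider the global tracer provider.
provider.register();

// Register instrumentations; a LlamaIndex.TS instrumentation would be
// added to this list once it exists.
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});
```

A LlamaIndex.TS instrumentation would follow the same pattern as the OpenAI one: patch the library's entry points to create spans, and expose an instrumentation class that can be passed to `registerInstrumentations`.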
I hope this helps! If you have any further questions or need clarification on any of the steps, please let me know. For more information on how to use OpenTelemetry for instrumentation in JavaScript, you can refer to the OpenTelemetry JavaScript GitHub repository. -Dosu
Add LlamaIndex.TS instrumentation

- Spike
- Setup
- Open Questions