Mirror of https://github.com/open-telemetry/opentelemetry-python-contrib.git
(synced 2025-07-28 04:34:01 +08:00)

Add OpenAI example (#3006)
@@ -7,7 +7,7 @@ extension-pkg-whitelist=cassandra
 # Add list of files or directories to be excluded. They should be base names, not
 # paths.
-ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs
+ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs,.venv

 # Add files or directories matching the regex patterns to be excluded. The
 # regex matches against base names, not paths.
@@ -15,7 +15,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `opentelemetry-instrumentation-sqlalchemy` Update unit tests to run with SQLALchemy 2
   ([#2976](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2976))
 - Add `opentelemetry-instrumentation-openai-v2` to `opentelemetry-bootstrap`
   ([#2996](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2996))
 - `opentelemetry-instrumentation-sqlalchemy` Add sqlcomment to `db.statement` attribute
   ([#2937](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2937))
@@ -7,6 +7,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 ## Unreleased

+- Add example to `opentelemetry-instrumentation-openai-v2`
+  ([#3006](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3006))
 - Support for `AsyncOpenAI/AsyncCompletions` ([#2984](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2984))

 ## Version 2.0b0 (2024-11-08)
@@ -6,17 +6,21 @@ OpenTelemetry OpenAI Instrumentation
 .. |pypi| image:: https://badge.fury.io/py/opentelemetry-instrumentation-openai-v2.svg
    :target: https://pypi.org/project/opentelemetry-instrumentation-openai-v2/

-Instrumentation with OpenAI that supports the OpenAI library and is
-specified to trace_integration using 'OpenAI'.
+This library allows tracing LLM requests and logging of messages made by the
+`OpenAI Python API library <https://pypi.org/project/openai/>`_.


 Installation
 ------------

+If your application is already instrumented with OpenTelemetry, add this
+package to your requirements.
+
 ::

     pip install opentelemetry-instrumentation-openai-v2

+If you don't have an OpenAI application yet, try our `example <example>`_,
+which only needs a valid OpenAI API key.
+
 References
 ----------
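For applications that are instrumented in code rather than through `opentelemetry-instrument`, the package can presumably also be enabled programmatically. The sketch below assumes the standard instrumentor pattern shared by the `opentelemetry-instrumentation-*` packages (an `OpenAIInstrumentor` class exposing `instrument()`); the exact import path `opentelemetry.instrumentation.openai_v2` is an assumption not confirmed by this diff, so verify it against the package documentation:

::

    # Sketch only: assumes the conventional BaseInstrumentor layout; the module
    # path opentelemetry.instrumentation.openai_v2 is an assumption here.
    from openai import OpenAI
    from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

    # Patch the openai client so chat completion calls emit telemetry.
    OpenAIInstrumentor().instrument()

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)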
@@ -0,0 +1,18 @@
+# Update this with your real OpenAI API key
+OPENAI_API_KEY=sk-YOUR_API_KEY
+
+# Uncomment to use Ollama instead of OpenAI
+# OPENAI_BASE_URL=http://localhost:11434/v1
+# OPENAI_API_KEY=unused
+# CHAT_MODEL=qwen2.5:0.5b
+
+OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
+OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
+OTEL_SERVICE_NAME=opentelemetry-python-openai
+
+# Change to 'false' to disable logging
+OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
+# Change to 'console' if your OTLP endpoint doesn't support logs
+OTEL_LOGS_EXPORTER=otlp_proto_http
+# Change to 'false' to hide prompt and completion content
+OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
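These variables are read by `opentelemetry-instrument` at startup, before `main.py` runs. For orientation only, a rough in-code equivalent of the trace-export settings above (OTLP endpoint, http/protobuf protocol, service name) is sketched below; the `/v1/traces` path is an assumption based on the OTLP/HTTP convention, and this code is not part of the example itself:

::

    # Sketch: approximately what OTEL_EXPORTER_OTLP_ENDPOINT, OTEL_EXPORTER_OTLP_PROTOCOL
    # and OTEL_SERVICE_NAME configure for traces under the zero-code setup.
    from opentelemetry import trace
    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.resources import Resource
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor

    provider = TracerProvider(
        resource=Resource.create({"service.name": "opentelemetry-python-openai"})
    )
    # The OTLP/HTTP span exporter takes the full signal URL, assumed here to be
    # the endpoint from .env plus /v1/traces.
    provider.add_span_processor(
        BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
    )
    trace.set_tracer_provider(provider)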
@@ -0,0 +1,39 @@
+OpenTelemetry OpenAI Instrumentation Example
+============================================
+
+This is an example of how to instrument OpenAI calls with zero code changes,
+using `opentelemetry-instrument`.
+
+When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
+compatible endpoint. Traces include details such as the model used and the
+duration of the chat request. Logs capture the chat request and the generated
+response, providing a comprehensive view of the performance and behavior of
+your OpenAI requests.
+
+Setup
+-----
+
+Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
+OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4318. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
+
+Next, set up a virtual environment like this:
+
+::
+
+    python3 -m venv .venv
+    source .venv/bin/activate
+    pip install "python-dotenv[cli]"
+    pip install -r requirements.txt
+
+Run
+---
+
+Run the example like this:
+
+::
+
+    dotenv run -- opentelemetry-instrument python main.py
+
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
@@ -0,0 +1,21 @@
+import os
+
+from openai import OpenAI
+
+
+def main():
+    client = OpenAI()
+    chat_completion = client.chat.completions.create(
+        model=os.getenv("CHAT_MODEL", "gpt-4o-mini"),
+        messages=[
+            {
+                "role": "user",
+                "content": "Write a short poem on OpenTelemetry.",
+            },
+        ],
+    )
+    print(chat_completion.choices[0].message.content)
+
+
+if __name__ == "__main__":
+    main()
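The instrumentation changelog above lists support for `AsyncOpenAI`/`AsyncCompletions` (#2984), so an asynchronous variant of `main.py` should produce equivalent telemetry. A minimal sketch, assuming the same `.env` and run command; this file is not part of the commit:

::

    # Sketch: async counterpart of main.py using the AsyncOpenAI client.
    import asyncio
    import os

    from openai import AsyncOpenAI


    async def main():
        client = AsyncOpenAI()
        chat_completion = await client.chat.completions.create(
            model=os.getenv("CHAT_MODEL", "gpt-4o-mini"),
            messages=[
                {
                    "role": "user",
                    "content": "Write a short poem on OpenTelemetry.",
                },
            ],
        )
        print(chat_completion.choices[0].message.content)


    if __name__ == "__main__":
        asyncio.run(main())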
@@ -0,0 +1,6 @@
+openai~=1.54.4
+
+opentelemetry-sdk~=1.28.2
+opentelemetry-exporter-otlp-proto-http~=1.28.2
+opentelemetry-distro~=0.49b2
+opentelemetry-instrumentation-openai-v2~=2.0b0