
Track your LLM prompt engineering¶

Developing effective Large Language Model (LLM) applications involves continuously refining the prompts you feed into foundation models and GenAI agents.

CometLLM is your solution for tracking and managing all your LLM prompt engineering workflows.

Example Comet LLM project with an open chain trace

Refer to Comet Experiment Management for your experiment tracking when training/fine-tuning LLM and GenAI models. Also, explore the Panels gallery for LLMOps-specific Comet Experiment visualizations such as the Prompt History and Prompt Playground panels.

Note

🚀 Did you know that CometLLM is open source? We'd love your contributions to our comet-llm repo!

Additionally, you can reach out to our Product Support team for any questions, comments, suggestions, and other feedback.

Prompt engineering logging with CometLLM¶

To log your prompt engineering workflows with CometLLM, follow this simple four-step process:

  1. Install the comet-llm package.
  2. Create an LLM Project.
  3. Log your prompts and chains, together with relevant metrics and metadata, to the LLM Project.
  4. Analyze your LLM Project performance in the Comet UI.

Below, we provide more details and references for each step.
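As a quick preview, the whole workflow fits in a few lines of Python. A minimal sketch, where the project name, prompt, and metadata are placeholder values:

import comet_llm

# Step 2: initialize the SDK against your Comet project
comet_llm.init(project="llm-example")

# Step 3: log a single prompt/response pair with optional metadata
comet_llm.log_prompt(
    prompt="What is the capital of France?",
    output="Paris",
    metadata={"model": "gpt-3.5-turbo"},
)

After running this, the logged prompt appears in the LLM Project's Prompts table in the Comet UI (Step 4).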

1. Install the comet-llm package¶

To install the comet-llm package, simply run:

pip install comet-llm

You can find more information in the comet-llm reference page.
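To verify the installation, you can print the installed version; a quick check, assuming the package exposes the standard __version__ attribute:

import comet_llm

print(comet_llm.__version__)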

2. Create an LLM Project¶

Comet differentiates LLM Projects from standard Comet ML Projects because prompt engineering workflows require different experiment tracking and management functionality than classic ML training and tuning workflows, as introduced in the Analyze your LLM Project from Comet UI section below.

You can create an LLM Project either from the Comet UI or programmatically with the Comet LLM SDK.

To create an LLM Project from the Comet UI:

1. Navigate to your Comet workspace home page.

2. Click on the New Project button in the top-right corner of the screen.

3. Choose Large Language Models as the Project type, and set the other project fields with your desired values.

4. Click Create.

The New Project dialog for creating a Comet LLM Project

Alternatively, you can use comet-llm, the Python SDK offered by Comet to power your LLM prompt engineering management.

To create an LLM project programmatically, simply initialize the SDK with your desired project name and workspace using comet_llm.init(). For example:

import comet_llm

comet_llm.init(project="llm-example")

The specified project is created as soon as the first prompt or chain is logged.
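If your API key is not configured globally, or you work across multiple workspaces, you can pass these values explicitly. A sketch with placeholder credentials, assuming the api_key and workspace parameters of comet_llm.init():

import comet_llm

comet_llm.init(
    api_key="YOUR_COMET_API_KEY",  # placeholder; omit if already configured
    workspace="your-workspace",    # placeholder workspace name
    project="llm-example",
)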

Once the LLM Project has been created, you can access it interactively from the Projects UI page.

Note

Whether for a new or existing LLM Project, you always need to initialize the Comet LLM SDK in your Python script for every prompt engineering workflow you want to log.

Learn more about getting started with comet-llm from the Configure the LLM SDK page.
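If you prefer to keep credentials out of your code, Comet also reads its standard configuration environment variables. A sketch with placeholder values, assuming the COMET_API_KEY, COMET_WORKSPACE, and COMET_PROJECT_NAME variables:

import os

# Set Comet's configuration environment variables before initializing
# the SDK (all values below are placeholders)
os.environ["COMET_API_KEY"] = "YOUR_COMET_API_KEY"
os.environ["COMET_WORKSPACE"] = "your-workspace"
os.environ["COMET_PROJECT_NAME"] = "llm-example"

import comet_llm

comet_llm.init()  # picks up the values set above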

3. Log your prompts and chains to the LLM Project¶

The Comet LLM Project lets you track and manage the prompts that you send to your third-party or proprietary LLM applications, especially when doing prompt engineering.

CometLLM differentiates between prompts (single input/output pairs) and chains (multi-step workflows composed of spans).

You can add any custom metadata to the logged prompts and chains to streamline the analysis of your prompt engineering trials.

The Log your prompts and chains page provides you with all the details, instructions, and examples for you to get started with logging your prompt engineering inside a Comet LLM Project.
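Prompts are logged with a single call (as sketched in the overview above), while a chain is opened, populated with spans, and then closed. A minimal sketch, where the inputs, outputs, and category values are placeholder examples:

import comet_llm

comet_llm.init(project="llm-example")

# Open a chain trace for a multi-step workflow
comet_llm.start_chain(inputs={"user_question": "What is the capital of France?"})

# Each Span records one step of the chain, such as a single LLM call
with comet_llm.Span(category="llm-call", inputs={"prompt": "Answer concisely: what is the capital of France?"}) as span:
    response = "Paris"  # stand-in for a real model call
    span.set_outputs(outputs={"response": response})

# Close the chain with its final output
comet_llm.end_chain(outputs={"answer": response})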

Tip

CometLLM has direct integrations with LangChain, OpenAI, and Hugging Face!

4. Analyze your LLM Project from Comet UI¶

The Comet UI lets you analyze and manage LLM projects, with all associated logged prompts and chains, within user-friendly interfaces that you can easily share with your team members.

Example Comet LLM project with an open prompt run analysis

You can take advantage of Comet UI's user-friendly Prompts table view to seamlessly analyze and manage your LLM prompts and chains. If preferred, you can also search and export your LLM metadata programmatically.

For example, you could group your logged LLM Project entries by LLM model version and by number of prompt tokens to research how to optimize prompt performance for your application.
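To make that kind of grouping possible, log the relevant fields as metadata with each prompt. A sketch, where the metadata keys and values are placeholder conventions:

import comet_llm

comet_llm.init(project="llm-example")

# Metadata fields become available for grouping and filtering
# in the Prompts table view (keys below are placeholders)
comet_llm.log_prompt(
    prompt="Summarize the following article: ...",
    output="A short summary of the article.",
    metadata={
        "model_version": "gpt-4-0613",
        "usage.prompt_tokens": 42,
        "usage.total_tokens": 60,
    },
)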

Example Comet LLM Project¶

Click on the button below to explore an example project and get started with LLM Projects in the Comet UI!

Additionally, you can review an example notebook with Comet-powered prompt engineering by clicking on the tutorial below.

Open In Colab
