
comet_llm

init¶

comet_llm.init(api_key: Optional[str] = None,
    workspace: Optional[str] = None,
    project: Optional[str] = None) -> None

An easy, safe, interactive way to set and save your settings.

Prompts for your API key if it is not already set; the key is not echoed while you type.

Saves the configuration to the .comet.config file. The default location is your home directory ("~/"), or the path given by COMET_CONFIG if set.

Args:

  • api_key: str (optional) Comet API key.
  • workspace: str (optional) Comet workspace to use for logging.
  • project: str (optional) project name to create in the Comet workspace.
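A minimal usage sketch of init; the workspace and project names below are placeholders, and the interactive API-key prompt is skipped when a key is already configured:

```python
import comet_llm

# Interactively configure and save Comet credentials.
# "my-workspace" and "my-llm-project" are hypothetical names.
comet_llm.init(
    workspace="my-workspace",
    project="my-llm-project",
)
```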

log_prompt¶

comet_llm.log_prompt(prompt: str, output: str,
    workspace: Optional[str] = None, project: Optional[str] = None,
    tags: Optional[List[str]] = None, api_key: Optional[str] = None,
    prompt_template: Optional[str] = None,
    prompt_template_variables: Optional[
        Dict[str, Union[str,
    bool, float, None]]
    ] = None, metadata: Optional[Dict[str, Union[str,
    bool, float, None]]] = None, timestamp: Optional[float] = None,
    duration: Optional[float] = None) -> llm_result.LLMResult

Logs a single prompt and its output to the Comet platform.

Args:

  • prompt: str (required) input prompt to the LLM.
  • output: str (required) output from the LLM.
  • workspace: str (optional) Comet workspace to use for logging.
  • project: str (optional) project name to create in the Comet workspace.
  • tags: List[str] (optional) user-defined tags attached to the prompt call.
  • api_key: str (optional) Comet API key.
  • prompt_template: str (optional) user-defined template used for creating the prompt.
  • prompt_template_variables: Dict[str, Union[str, bool, float, None]] (optional) dictionary with data used in prompt_template to build the prompt.
  • metadata: Dict[str, Union[str, bool, float, None]] (optional) user-defined dictionary with additional metadata for the call.
  • timestamp: float (optional) timestamp of the prompt call, in seconds.
  • duration: float (optional) duration of the prompt call.

Example:

import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    metadata={
        "input.type": "completions",
        "input.model": "text-davinci-003",
        "input.provider": "openai",
        "output.index": 0,
        "output.logprobs": None,
        "output.finish_reason": "length",
        "usage.prompt_tokens": 5,
        "usage.completion_tokens": 7,
        "usage.total_tokens": 12,
    },
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    output=" My name is [your name].",
    duration=16.598,
)

Returns: LLMResult.

log_user_feedback¶

comet_llm.log_user_feedback(id: str, score: float,
    api_key: Optional[str] = None) -> None

Note: This method is deprecated. Use API.get_trace_by_name or API.get_trace_by_key to obtain an LLMTraceAPI object, then call LLMTraceAPI.log_user_feedback on it.

Logs user feedback for the provided Prompt or Chain ID. This will overwrite any previously set value.

Args:

  • id: str (required) the ID of the Prompt or Chain.
  • score: float (required) the feedback score; must be 0, 0.0, 1, or 1.0.
  • api_key: str (optional) Comet API key.

start_chain¶

comet_llm.start_chain(inputs: Dict[str, JSONEncodable],
    api_key: Optional[str] = None, workspace: Optional[str] = None,
    project: Optional[str] = None, metadata: Optional[Dict[str, Dict[str,
    JSONEncodable]]] = None, tags: Optional[List[str]] = None) -> None

Creates a global Chain object that tracks created Spans.

Args:

  • inputs: Dict[str, JSONEncodable] (required) chain inputs.
  • api_key: str (optional) Comet API key.
  • workspace: str (optional) Comet workspace to use for logging.
  • project: str (optional) project name to create in the Comet workspace.
  • metadata: Dict[str, Dict[str, JSONEncodable]] (optional) user-defined dictionary with additional metadata for the call.
  • tags: List[str] (optional) user-defined tags attached to the chain.

Span¶

class comet_llm.Span(self, inputs: JSONEncodable, category: str, name: Optional[str] = None, metadata: Optional[Dict[str, JSONEncodable]] = None)

A single unit of a Chain that has its own context. Spans can be nested; an inner Span exists in the context of its outer (parent) Span.

Span.__init__¶

__init__(inputs: JSONEncodable, category: str, name: Optional[str] = None,
    metadata: Optional[Dict[str, JSONEncodable]] = None)

Args:

  • inputs: JSONEncodable (required) span inputs.
  • category: str (required) span category.
  • name: str (optional) span name; if not set, it is generated automatically.
  • metadata: Dict[str, JSONEncodable] (optional) span metadata.

Span.set_outputs¶

set_outputs(outputs: Dict[str, JSONEncodable], metadata: Optional[Dict[str,
    JSONEncodable]] = None) -> None

Sets outputs on the Span object.

Args:

  • outputs: Dict[str, JSONEncodable] (required) span outputs.
  • metadata: Dict[str, JSONEncodable] (optional) span metadata; if a metadata dictionary was passed to __init__, it is updated with these values.

end_chain¶

comet_llm.end_chain(outputs: Dict[str, JSONEncodable],
    metadata: Optional[Dict[str,
    JSONEncodable]] = None) -> llm_result.LLMResult

Commits the global Chain and logs the result to Comet.

Args:

  • outputs: Dict[str, JSONEncodable] (required) chain outputs.
  • metadata: Dict[str, JSONEncodable] (optional) user-defined dictionary with additional metadata for the call. This metadata is deep-merged with the metadata passed to start_chain, if provided.

Returns: LLMResult

is_ready¶

comet_llm.is_ready() -> bool

Returns True if the Comet API key is set.

May 17, 2024