CLI

Automatic Prompt Optimization

One of the main features of the PlanAI CLI is the automatic prompt optimization tool. This feature refines prompts for Large Language Models (LLMs) by using a more capable LLM to evaluate and improve them.

Key aspects of the optimize-prompt tool:

  • Automates the process of iterating and improving prompts

  • Uses real production data from debug logs for optimization

  • Dynamically loads and uses production classes in the workflow

  • Employs an LLM-based scoring mechanism to evaluate prompt effectiveness

  • Adapts to various LLM tasks

To use the optimize-prompt tool, ensure you have generated debug logs with real production data by setting debug_mode=True for your LLMTaskWorker classes. Then, run a command similar to:

planai --llm-provider openai --llm-model gpt-4o-mini --llm-reason-model gpt-4 \
    optimize-prompt \
    --python-file your_app.py \
    --class-name YourLLMTaskWorker \
    --search-path . \
    --debug-log debug/YourLLMTaskWorker.json \
    --goal-prompt "Your optimization goal here"

The tool will generate optimized prompts as text files along with corresponding JSON files containing metadata about the improvements.

For more detailed information on the prompt optimization feature, refer to the PlanAI documentation.

planai command line interface

usage: planai [-h] [--llm-provider LLM_PROVIDER] [--llm-model LLM_MODEL]
              [--llm-reason-model LLM_REASON_MODEL]
              {optimize-prompt,cache} ...

Positional Arguments

command

Possible choices: optimize-prompt, cache

Named Arguments

--llm-provider

LLM provider name

--llm-model

LLM model name for generation

--llm-reason-model

LLM model name for reasoning
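
Taken together, the top-level flags and subcommands above can be reconstructed with the standard library's argparse roughly as follows. This is an illustrative sketch of the documented interface, not PlanAI's actual create_parser implementation:

```python
import argparse

def build_top_level_parser() -> argparse.ArgumentParser:
    # Sketch of the documented top-level interface; the real parser
    # built by planai.cli.create_parser() may differ in details.
    parser = argparse.ArgumentParser(prog="planai")
    parser.add_argument("--llm-provider", help="LLM provider name")
    parser.add_argument("--llm-model", help="LLM model name for generation")
    parser.add_argument("--llm-reason-model", help="LLM model name for reasoning")
    subparsers = parser.add_subparsers(dest="command")
    subparsers.add_parser("optimize-prompt",
                          help="Optimize prompt based on debug logs")
    subparsers.add_parser("cache", help="Inspect and manipulate cache")
    return parser

# Global flags come before the subcommand, as in the usage synopsis.
args = build_top_level_parser().parse_args(
    ["--llm-provider", "openai", "--llm-model", "gpt-4o-mini", "optimize-prompt"]
)
print(args.command)  # optimize-prompt
```
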

Sub-commands

optimize-prompt

Optimize prompt based on debug logs

planai optimize-prompt [-h] [--python-file PYTHON_FILE]
                       [--class-name CLASS_NAME] [--debug-log DEBUG_LOG]
                       [--goal-prompt GOAL_PROMPT]
                       [--output-config OUTPUT_CONFIG]
                       [--search-path SEARCH_PATH] [--config CONFIG]
                       [--num-iterations NUM_ITERATIONS]
                       [--llm-opt-provider LLM_OPT_PROVIDER]
                       [--llm-opt-model LLM_OPT_MODEL]

Named Arguments
--python-file

Path to the Python file

--class-name

Class name in the Python file

--debug-log

Path to the JSON debug log file

--goal-prompt

Goal prompt for optimization

--output-config

Output a configuration file

--search-path

Optional path to include in module search path

--config

Path to a configuration file

--num-iterations

Number of optimization iterations

Default: 3

--llm-opt-provider

LLM provider to use for the prompt being optimized. This should be the same provider as the one used in production.

--llm-opt-model

LLM model to use for the prompt being optimized. This should be the same model as the one used in production.

cache

Inspect and manipulate cache

planai cache [-h] [--output-task-filter OUTPUT_TASK_FILTER] [--delete DELETE]
             [--clear] [--search-dirs SEARCH_DIRS]
             cache_dir

Positional Arguments
cache_dir

Directory of the diskcache to operate on

Named Arguments
--output-task-filter

Filter for output task type

--delete

Delete a specific cache key

--clear

Clear the cache

Default: False

--search-dirs

Comma-separated list of directories to search for Python modules

PlanAI Command Line Interface

This module provides a command-line interface for PlanAI, focusing on automated prompt optimization.

The main functionality includes:

1. Optimizing prompts based on debug logs

2. Configuring LLM providers and models

3. Processing input from Python files, debug logs, and goal prompts

4. Outputting optimized configurations

Usage:

python -m planai.cli --llm-provider <provider> --llm-model <model> --llm-reason-model <reason_model> optimize-prompt [options]

planai.cli.create_parser()
planai.cli.main(args=None)
planai.cli.parse_comma_separated_list(arg: str) → List[str]
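
parse_comma_separated_list backs comma-separated options such as --search-dirs. A minimal implementation consistent with its documented signature might look like this; the splitting and whitespace handling are assumptions, not PlanAI's actual source:

```python
from typing import List

def parse_comma_separated_list(arg: str) -> List[str]:
    # Split on commas and drop surrounding whitespace; a plausible
    # reading of the documented signature, not the real implementation.
    return [item.strip() for item in arg.split(",") if item.strip()]

print(parse_comma_separated_list("src, lib,tests"))  # ['src', 'lib', 'tests']
```
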