LLM Prompt optimization design refactor #56

@ivanmilevtues

Description

At the moment all prompts are stored in the prompts.py file. The prompts are mostly optimized for the Gemini models (2.5-pro and 2.5-flash). However, different models respond better to different prompting styles: https://promptbuilder.cc/blog/claude-vs-chatgpt-vs-gemini-best-prompt-engineering-practices-2025

Goal:

Introduce an abstract factory that provides every prompt string used in the codebase:
tool names, tool descriptions, system prompts, user prompts, and extractor prompts.
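The interface for such a factory product could look roughly like this (a minimal sketch; all method names here are illustrative, the real interface should mirror whatever prompts.py currently exposes):

```python
from abc import ABC, abstractmethod


class PromptProvider(ABC):
    """Supplies every prompt string a component may need.

    Hypothetical interface: one abstract method per prompt category
    listed in the issue (tool names, tool descriptions, system,
    user, and extractor prompts).
    """

    @abstractmethod
    def tool_name(self, tool_id: str) -> str: ...

    @abstractmethod
    def tool_description(self, tool_id: str) -> str: ...

    @abstractmethod
    def system_prompt(self, task: str) -> str: ...

    @abstractmethod
    def user_prompt(self, task: str) -> str: ...

    @abstractmethod
    def extractor_prompt(self, task: str) -> str: ...
```

A concrete subclass per supported model family (Gemini, Claude, GPT, ...) would then implement these methods with model-tuned strings.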

Implementation:

  • Abstract factory pattern with a default factory that serves the current prompts.
  • The factory selects the prompt set matching the chosen LLM, covering all of its strings.
  • The factory itself has to be a singleton, shared by all components across the codebase.
  • Unit tests verifying that the default factory is selected when the instantiated LLM has no prompt implementation yet.
  • A clean prompt storage structure: prompts can live outside the *.py files, for example in a properties file or another format suited to the task.
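The bullets above could be sketched as follows (assumptions: class and model names are placeholders, and the singleton is implemented via `__new__`; a module-level instance or a registry decorator would work equally well):

```python
from typing import Dict, Type


class DefaultPrompts:
    """Fallback prompt set: the current Gemini-tuned strings from prompts.py."""

    def system_prompt(self, task: str) -> str:
        return f"[gemini-style system prompt for {task}]"


class ClaudePrompts(DefaultPrompts):
    """Override only the strings that differ for Claude-family models."""

    def system_prompt(self, task: str) -> str:
        return f"[claude-style system prompt for {task}]"


class PromptFactory:
    """Process-wide singleton mapping an LLM name to its prompt set."""

    _instance = None
    # Hypothetical registry; real model identifiers would go here.
    _registry: Dict[str, Type[DefaultPrompts]] = {
        "claude": ClaudePrompts,
    }

    def __new__(cls):
        # Singleton: every component gets the same factory instance.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def for_model(self, model: str) -> DefaultPrompts:
        # Fall back to the default (Gemini-tuned) prompts when the
        # model has no dedicated prompt implementation yet.
        return self._registry.get(model, DefaultPrompts)()
```

The fallback in `for_model` is exactly what the unit-test bullet asks to cover: `PromptFactory().for_model("some-new-model")` should return the default prompt set rather than raise. The concrete classes could in turn load their strings from an external properties/TOML file instead of hard-coding them.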

Metadata

Assignees: none

Labels: enhancement (New feature or request), hacktoberfest (A good issue for Hacktoberfest)
