Introducing Prompt Management

Deploy and manage LLM applications in production on Qwak with prompt versioning and prompt management, all in one unified AI platform.
Guy Eshet
Senior Product Manager at JFrog ML
June 17, 2024

We are excited to release Qwak's new Prompt Management module, a key feature of our LLM platform. Prompt management centralizes the way AI teams create, deploy, and manage prompts, enabling faster production deployment of LLM applications.

Integrated with our advanced MLOps, feature store, and vector store capabilities, managing prompts on Qwak makes deploying LLM applications faster and more reliable.

Prompt Management on Qwak

Qwak's Prompt Management system is designed to boost AI teams' performance by providing:

  • Centralized Management: Save, edit, and manage prompts alongside model configurations.
  • Prompt Versioning: Easily track and manage prompt versions in your deployment lifecycle.
  • Prompt SDK: Python SDK to manage and deploy prompts across various application environments.
  • Prompt Playground: Experiment with different prompts and models in an interactive environment.

Centralized Prompt Registry

Qwak's platform now includes a centralized prompt registry, ensuring that all your prompts are organized, accessible, and ready for deployment whenever needed. Create, edit, and update prompts, and push changes live to your production deployments.
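Conceptually, a centralized registry maps each prompt name to an append-only history of versions, with the newest save becoming the default. A minimal sketch in plain Python (illustrative only, not the Qwak SDK):

```python
from dataclasses import dataclass, field

@dataclass
class PromptRegistry:
    """Toy centralized registry: prompt name -> append-only list of versions."""
    _store: dict = field(default_factory=dict)

    def save(self, name, template, description=""):
        # Each save appends a new version; the newest becomes the default.
        versions = self._store.setdefault(name, [])
        versions.append({"template": template, "description": description})
        return len(versions)  # version numbers start at 1

    def get(self, name, version=None):
        # Fetch a specific version, or the latest (default) when unspecified.
        versions = self._store[name]
        return versions[-1 if version is None else version - 1]["template"]

registry = PromptRegistry()
registry.save("banker-agent", "You are a helpful banking assistant.")
v2 = registry.save("banker-agent", "Answer the user's question: {question}")
print(v2)                               # 2
print(registry.get("banker-agent", 1))  # You are a helpful banking assistant.
```

Deployments that always call `get` without a version automatically pick up the latest default, which is the basis for the hot-loading behavior described below.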

Experiment with Prompts and Models

Prompt Management introduces a dedicated prompt playground. This space allows you to experiment with models, track versions, and compare results effortlessly. Designed for in-depth experimentation, the playground provides a controlled environment for testing and refining prompts without the constraints of traditional coding processes.

Collaborate with Your Team

With Qwak's new prompt management capabilities, you can create and deploy prompts outside of code, collaborating across your entire team. Whether you're a data scientist, prompt engineer, or developer, our platform supports a cohesive workflow, ensuring everyone is aligned and contributing to the development process.

Manage Prompt Versions

Each prompt in Qwak is assigned a unique ID and an optional version description, enabling precise tracking and management. Every time you save and update a prompt, a new version is created automatically, allowing easy reversion and tracking of changes over time.
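The lifecycle above can be sketched in plain Python (hypothetical classes, not the Qwak SDK): each prompt carries a unique ID, every save appends a new numbered version, and reverting simply re-saves an earlier version as the newest one.

```python
import uuid

class VersionedPrompt:
    """Each prompt gets a unique ID; every save appends a new version."""

    def __init__(self, name):
        self.name = name
        self.prompt_id = str(uuid.uuid4())  # unique ID per prompt
        self.versions = []                  # append-only version history

    def save(self, template, description=""):
        self.versions.append({"template": template, "description": description})
        return len(self.versions)           # the new version number

    def revert_to(self, version):
        # Reverting re-saves an earlier version as the newest one,
        # so the full history is preserved.
        old = self.versions[version - 1]
        return self.save(old["template"], f"revert to v{version}")

p = VersionedPrompt("banker-agent")
p.save("You are a helpful banker.")
p.save("You are a concise banking assistant.")
latest = p.revert_to(1)
print(latest)                      # 3
print(p.versions[-1]["template"])  # You are a helpful banker.
```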

Prompt Hot-Loading

Prompt hot-loading ensures your deployments automatically use the latest default version of your prompts. Qwak checks for updated prompt versions, ensuring that your application dynamically reflects the latest changes.

from qwak.llmops.prompt.manager import PromptManager

# Create an instance of the Qwak PromptManager
prompt_manager = PromptManager()

# Fetch the latest version of the default prompt
my_prompt = prompt_manager.get_prompt(
  name="banker-agent"
)

# This object will automatically be updated with the latest parameters
print(my_prompt)

# Generate the model response using the hydrated prompt
prompt_response = my_prompt.invoke(variables={"question": "What is your name?"})

# Extract the response content, following the OpenAI model response
response = prompt_response.choices[0].message.content
print(response)
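One common way to implement hot-loading like this is a short-lived cache that re-fetches the default prompt version once the cached copy goes stale; a minimal sketch under that assumption (illustrative names, not Qwak's actual implementation):

```python
import time

class HotLoadingPrompt:
    """Re-fetch the default prompt version when the cached copy goes stale."""

    def __init__(self, fetch, ttl_seconds=60.0):
        self._fetch = fetch          # callable returning the latest template
        self._ttl = ttl_seconds
        self._cached = None
        self._fetched_at = 0.0

    @property
    def template(self):
        now = time.monotonic()
        if self._cached is None or now - self._fetched_at >= self._ttl:
            self._cached = self._fetch()  # pull the latest default version
            self._fetched_at = now
        return self._cached

# ttl_seconds=0.0 forces a re-fetch on every access, to show the effect.
versions = ["v1: hello"]
prompt = HotLoadingPrompt(fetch=lambda: versions[-1], ttl_seconds=0.0)
print(prompt.template)  # v1: hello
versions.append("v2: hello {name}")
print(prompt.template)  # v2: hello {name} -- picked up without redeploying
```

In practice the TTL trades freshness against the cost of polling the registry; a running deployment picks up a newly saved default version on the next cache refresh, without a redeploy.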

Use Any AI Provider

With Qwak, you're not limited to a single AI provider. Our platform lets you integrate with your favorite AI providers, such as OpenAI, while also leveraging Qwak's robust model deployment capabilities. This flexibility ensures that you can choose the best tools for your specific needs.

Bring Your Own Models

With Qwak, you can easily integrate self-hosted models with prompts, enabling a seamless AI development experience. The upcoming Qwak Model Library provides a curated collection of optimized LLMs with one-click deployment. This feature allows you to deploy any open-source model, like Llama 3 or Mistral, on our fully-fledged, multi-cloud, enterprise-ready platform.

Get Started Today

Planning to deploy LLM applications, or already running LLM applications in production? It's time to take prompt management to the next level. Talk to us to get started or check out our technical docs to learn more about prompts.

Chat with us to see the platform live and discover how we can help simplify your journey deploying AI in production.
