Writing a prompt

To get the most out of Lampi, it is crucial to communicate with it effectively and give it precise instructions. This is where the art of composing effective prompts plays a significant role.

In this tutorial, you'll discover:

  1. What is a prompt?

  2. Why is it so important?

  3. The best way to write a prompt

  4. Some examples of prompts.


What is a “Prompt”?

A prompt refers to a question or command given to an AI model, instructing it to perform a specific task or provide a certain output. Prompts are the catalysts that drive AI models to interpret, analyze, and generate data in a manner that aligns with the user's intent. Prompts play a pivotal role in the field of Natural Language Processing (NLP), the subfield of AI that enables machines to understand, process, and generate human language.


Why Are Prompts So Important When Using AI?

The quality and clarity of a prompt can greatly influence the output generated by the AI model, making it important to craft prompts that effectively convey the user’s intent and desired outcome.

To better understand the profound impact of a well-constructed prompt, let's consider a practical example. Suppose you provide an AI model with a vague prompt, such as "Can you give me a cake recipe?" The AI model could interpret this in various ways, resulting in an array of outputs, some of which may not be as useful to the user. However, a specific and detailed prompt, such as "Can you provide a recipe for a gluten-free chocolate cake under 300 calories?", would guide the AI model to generate a precise, useful, and practical output.

In a business context, understanding and implementing prompt engineering can be a game-changer.

Consider the difference between a vague prompt, such as "Perform a market analysis of the Luxury sector," and a more detailed prompt, like "Prepare a comprehensive report analyzing the market trends of the Luxury sector in the third quarter of 2023 using the following structure: [structure]". The latter provides a precise direction to the AI model, enabling it to generate a relevant and actionable output.

The potential of well-crafted prompts in transforming AI outputs cannot be overstated.

Precise prompts enable AI systems to deliver accurate, detailed, and relevant results. However, the art of prompt engineering is not devoid of challenges. Creating effective prompts requires an understanding of the AI model's capabilities, the data at hand, and the desired outcome.


How to write a perfect prompt?

Writing a prompt is not difficult. But writing the perfect prompt, one that gets relevant results from the language model, is not as easy as it looks.

The process of creating perfect prompts isn't just about firing off a question or instruction. It involves a strategic approach to get the desired output. It is not about asking a question, but about asking the right question in the right way.

Guidelines for getting better results:

  • Ask the model to adopt a persona: This involves instructing the model to respond as if it were a specific character or individual, with particular characteristics, knowledge, and viewpoints (see the first sketch after this list). With Lampi, you can use specific agents that are designed to act in a specific way, and even create your own agents.

  • Write a clear instruction: Lampi can’t read your mind. Avoid vagueness as it can lead to broad, generalized, and often irrelevant responses. Instead, be as specific as possible in your prompts. The more specific the prompt, the better the AI model can generate precise and relevant responses.

  • Use action-oriented language: Ensure your prompts are action-oriented, guiding the AI model on what exactly it needs to do. The model should be able to understand the task based on the verbs used in your prompts. Use commands to instruct the model what you want to achieve, such as "Write", "Classify", "Summarize", "Translate", "Order", etc. Keep in mind that you need to experiment to see what works best.

  • Emphasize key aspects: Highlight the key aspects of your request in your prompts. These key points will guide the AI model to focus on the aspects you consider most important, enabling it to provide outputs that cater to your specific needs.

  • Provide context: Providing adequate context in your prompts is key to getting the desired output. With Lampi, you can trigger relevant documents to provide context to your queries directly (for more information, see Retrieval Augmented Generation (RAG)). By giving the AI more context, you can guide it to deliver more accurate and meaningful responses.

  • Split complex tasks into simpler sub-tasks: Complex tasks often have a higher likelihood of mistakes than simpler ones. Additionally, it's possible to reorganize a complex task into a series of easier tasks, where the results from the initial tasks are used to inform the next steps (see the second sketch after this list).

  • Use examples: Including one or two examples in your prompt shows the model the format and style you expect. If you want the output to follow a specific example, include that example in your prompt (see the third sketch after this list).

  • Give the model time to "think": Just as you might need a moment, models also benefit from additional time to formulate responses. They tend to make fewer reasoning mistakes when they aren't rushed to provide an answer immediately. Requesting a "chain of thought" approach before arriving at an answer can significantly enhance the model's ability to reach accurate conclusions.

  • Specify response length: AI models don't inherently know whether you want a brief or comprehensive response. It's essential to specify the desired length of the response in your prompt to avoid confusion and excessive regeneration of content.

  • Refine as needed: Creating the perfect prompt might not always be a one-shot process but rather an iterative one. Just like the classic saying, "Rome wasn't built in a day," achieving the ideal prompt requires refinement and adjustment. You’ll find that you may need to modify and fine-tune your prompts to align more closely with the specific outputs you desire. When refining, choosing the right words matters in prompt engineering: expand, explain, simplify, clarify, formalize, reiterate, etc.
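To make the first few guidelines concrete, here is a minimal sketch in Python. It assumes a generic OpenAI-compatible chat client purely for illustration (the client and the model name are placeholders, not Lampi's own interface); the point is the shape of the prompt: a persona in the system message, and a specific, action-oriented instruction with an explicit length in the user message.

```python
from openai import OpenAI

client = OpenAI()  # placeholder: any OpenAI-compatible chat client would do

# Persona: tell the model who it should be.
system_message = (
    "You are an experienced e-commerce growth consultant who gives "
    "practical, step-by-step advice."
)

# Clear, action-oriented instruction with an explicit response length.
user_message = (
    "Outline a strategy to enhance customer satisfaction for an online "
    "bookstore, focusing on post-purchase support and user experience. "
    "Return exactly 5 bullet points, one sentence each."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_message},
    ],
)
print(response.choices[0].message.content)
```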
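The same placeholder client can illustrate how to split a complex task into simpler sub-tasks, where the result of the first call is fed into the second. The ask() helper below is a hypothetical wrapper written for this example, not part of any particular library.

```python
from openai import OpenAI

client = OpenAI()  # placeholder: any OpenAI-compatible chat client would do


def ask(prompt: str) -> str:
    """Send a single prompt and return the model's text reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# Step 1: a narrow, simple sub-task.
key_trends = ask(
    "List the 5 most important market trends in the luxury sector in the "
    "third quarter of 2023, one short bullet each."
)

# Step 2: reuse the first result as context for the next sub-task.
report = ask(
    "Using only the trends below, write a one-page executive summary "
    "structured as: Overview, Trends, Recommendations.\n\n" + key_trends
)
print(report)
```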
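Finally, the "use examples" and "give the model time to think" guidelines can be combined in a single prompt. The sketch below only assembles the prompt text, so it makes no assumptions about which model eventually receives it.

```python
# Few-shot examples show the model the exact format you expect.
examples = [
    ("The delivery was late and the box was damaged.", "Negative"),
    ("Checkout was quick and the book arrived in perfect condition.", "Positive"),
]

new_review = "The website was easy to use but support never answered my email."

prompt_lines = ["Classify each customer review as Positive or Negative.", ""]
for review, label in examples:
    prompt_lines.append(f"Review: {review}")
    prompt_lines.append(f"Sentiment: {label}")
    prompt_lines.append("")

# "Chain of thought": ask for the reasoning before the final answer.
prompt_lines.append(f"Review: {new_review}")
prompt_lines.append("Explain your reasoning in one sentence, then give the sentiment.")

prompt = "\n".join(prompt_lines)
print(prompt)
```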

As you get started with designing prompts, you should keep in mind that it is really an iterative process that requires a lot of experimentation to get optimal results. You can start with simple prompts and keep adding more elements and context as you aim for better results. Iterating your prompt along the way is vital for this reason.

To avoid rewriting your prompts, we recommend saving the prompts that work for you.

For Retrieval Augmented Generation (RAG) applications, the art of prompting slightly differs as RAG combines an information retrieval component with a text generator model.

RAG takes an input and retrieves a set of relevant information from a given source. The retrieved information is concatenated as context with the original input prompt and fed to the text generator, which produces the final output.
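As a rough illustration of that flow (not Lampi's actual retrieval pipeline), the sketch below picks the most relevant snippet from a small in-memory list using naive keyword overlap, concatenates it with the original question, and builds the augmented prompt that would be handed to the text generator.

```python
documents = [
    "Q3 2023 revenue grew 12% year over year, driven by online sales.",
    "The employee handbook allows up to 10 days of remote work per month.",
    "Our refund policy gives customers 30 days to return any product.",
]

question = "How many days do customers have to return a product?"


def retrieve(query: str, docs: list[str]) -> str:
    """Naive retrieval: return the document sharing the most words with the query."""
    query_words = set(query.lower().split())
    return max(docs, key=lambda d: len(query_words & set(d.lower().split())))


context = retrieve(question, documents)

# The retrieved information is concatenated as context with the original prompt
# before being sent to the text generator.
augmented_prompt = (
    "Answer the question using only the context below.\n\n"
    f"Context: {context}\n\n"
    f"Question: {question}"
)
print(augmented_prompt)
```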


Examples

  • Write clear instructions

Initial prompt: Tell me how to improve customer satisfaction.
Enhanced prompt: Outline a strategy to enhance customer satisfaction for an e-commerce platform, focusing on post-purchase support and user experience.

Initial prompt: Increase online sales.
Enhanced prompt: Develop a plan to boost online sales for a bookstore by optimizing the website's user experience and implementing targeted email marketing campaigns.

  • Ask the model to adopt a persona

Initial prompt: How to effectively manage remote teams?
Enhanced prompt: Act as an experienced remote team manager with a track record of leading distributed teams across tech companies. Explain the top strategies for maintaining productivity and strong communication.

Initial prompt: How to manage risk in investment portfolios?
Enhanced prompt: As a risk management expert with extensive experience in hedge funds, explain the strategies for managing risk in investment portfolios.

There are many other tactics you can use to improve Lampi's outputs. For example, you can always ask the model whether it missed anything on previous passes, or instruct it to work out its own solution before rushing to a conclusion.

For more, see 👌 Best Practices.
