Lampi AI Chat combines natural language processing with powerful automation capabilities. By interacting with AI assistants and agents, you can not only instantly analyze and extract insights from lengthy documents, eliminating the need for manual review and information gathering, but also perform and automate tasks and workflows efficiently.
Once you have created a space, you can create a specific chat to interact with Lampi. In the chat, you will find an input bar where you can type your queries (or "prompts").
To make the most of Lampi AI, you need to learn how the chat works and how to leverage the different tools you have access to.
When you chat with Lampi, you can activate or deactivate certain tools.
Tools enable you to enhance an assistant with various capabilities, allowing it to efficiently tackle complex queries.
Tool selection:
- If you select no tools when interacting with Lampi: the AI assistant uses only the model to generate an answer, which can be useful for general queries or when you want to leverage the LLM's general knowledge without using your own context.
- If you select one or several tools: the AI assistant evaluates the user request and uses the selected tools to answer the query.
Note that tools are designed to perform tasks efficiently within larger processes, while AI agents manage complete end-to-end workflows, enhancing productivity and operational efficiency.
What tools can you access?
Tool | What the assistant will do | How to use it |
---|---|---|
Search | Performs a semantic search across the selected data to find the most relevant content and generates a response using this context, with references to the relevant data sources. | Activate the toggle in the knowledge tab on the right of the screen and select the relevant documents. |
Web search & Browse | Performs web searches and browses web pages to gather information and uses it to generate responses. | Activate the button in the chat bar. |
Data visualization | Creates graphs using the data it has gathered (see the example below the table). | Ask in your prompt to generate Mermaid graphs. |
Vision | Analyzes images, including text and visual data. | Click the button to add an image and ask Lampi to analyze it. |
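For example, after Lampi has gathered some figures, you could ask it to "present these results as a Mermaid pie chart". The output may resemble the sketch below; the categories and values are purely illustrative and not produced by Lampi:

```mermaid
pie title Revenue by region (illustrative values)
    "EMEA" : 45
    "Americas" : 35
    "APAC" : 20
```

You can then render the returned Mermaid definition directly in the chat or in any Mermaid-compatible viewer.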
Note: You can also activate or deactivate the chat "History". By activating the chat history, you allow Lampi to follow up on previous answers.
If you ask for specific information about a company, Lampi will understand that you are trying to retrieve information about it; you don't need to repeat the company's name in every query. Without the chat history, each query is independent of the others, so you need to be more precise in each one.
How do you know which tool to use?
Search
This tool is excellent for in-depth research or when you need specific information from a large volume of documents.
When activated, this tool allows Lampi to perform semantic searches within the context you have selected, as indicated in the "Manage context in chat" section.
This mode is based on the Retrieval Augmented Generation (RAG) concept, which is designed to retrieve information in a specific context.
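Conceptually, a retrieval-augmented flow can be sketched as follows. This is a simplified, generic view of how RAG typically works, not a description of Lampi's internal implementation:

```mermaid
flowchart LR
    Q[User query] --> S[Semantic search over selected documents]
    S --> P[Most relevant passages]
    Q --> L[Language model]
    P --> L
    L --> A[Answer with references to the sources]
```

The key point is that the answer is grounded in the passages retrieved from your selected context, which is why choosing the right documents matters.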
Web search and browse
Use this tool when the information you need is not contained in your knowledge base, or when you need general or public knowledge that is more current than what your documents provide. Lampi will browse the web and gather the most up-to-date information available.
Note: If you are uncertain about the data source but have relevant reports in your knowledge, feel free to activate both tools. This way, Lampi can retrieve the information either from the internet or from your documents, ensuring you get the most accurate and comprehensive responses.
History
The use of chat history depends on your specific use cases. If memory is essential for your interactions, enabling chat history will enhance the continuity of conversations, making it easier for Lampi to provide relevant responses based on previous queries.
By understanding how to leverage these tools effectively, you can maximize your interactions with Lampi and enhance your overall experience.
Maximize Tools' Performance
To get the best results from Lampi and its various tools, consider the following strategies:
Craft clear prompts
Providing clear and detailed instructions is essential for effective tool performance. For more, see Writing a prompt.
If you've activated the Search or Web search tools, explicitly instruct the AI assistant to rely solely on the available knowledge to minimize inaccuracies and hallucinations, for example: "Answer using only the selected documents; if the information is not there, say so." Focusing the context encourages the assistant to use only the relevant information at its disposal. For more, see Reducing hallucinations in AI outputs.
Select data sources wisely
Choosing the right data sources for each action is crucial to achieving high-quality, relevant results.
The assistant will perform better when you carefully select specific data that is pertinent to the task at hand, rather than relying on a vast amount of unrelated information.
For instance, if you're working on market intelligence, consider activating the "Search" tool and selecting only the relevant reports from your documents.
Choose the relevant AI Assistant
When engaging with Lampi, always ensure you have an AI assistant selected with instructions that align with your query and the tools you are using.
If you create an AI assistant designed for a specific function, such as translation or rephrasing, it may not be suitable for prompts that require context retrieval. For more, see Create AI assistants.