How does the chat interface work?

Learn how the chat interface works.

Once you have created a space, you can create a specific chat to interact with Lampi.

Chat with Lampi

In the chat, you will find an input bar where you can type your queries (or "prompts"). For more information about the art of prompting, you can refer to our best practices.

Ask Lampi anything about your context. It understands everything in it and provides you with instant answers to your questions, with references to the relevant data sources. This feature is ideal for in-depth research or when you need specific information from a large volume of documents.


Chat without context

The no-context mode allows you to interact with Lampi without any specific document or context in mind. This mode is useful for general queries or when you want to leverage LLMs' general knowledge without using your own context.

Web search and context search modes need to be deactivated.


Activate or deactivate the web search

When enabled, this button allows Lampi to perform searches on the web.



Activate or deactivate the context search

When activated, this button allows Lampi to perform searches in the context you have selected, as indicated in How to manage context.

This mode is based on the RAG (Retrieval-Augmented Generation) concept, which is designed to retrieve information from a specific context. For more information, you can check Retrieval Augmented Generation (RAG).
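To give a rough idea of the retrieve-then-generate pattern behind context search, here is a minimal, illustrative sketch. It is not Lampi's actual implementation: the `vector_store`, `llm` objects and the passage attributes (`source`, `page`, `text`) are hypothetical placeholders.

```python
# Minimal RAG sketch (illustrative only, not Lampi's implementation).
# `vector_store` and `llm` are hypothetical placeholders.

def answer_with_context(question: str, vector_store, llm, top_k: int = 5) -> str:
    # 1. Retrieve: find the passages from your documents that are
    #    semantically closest to the question.
    passages = vector_store.search(question, top_k=top_k)

    # 2. Augment: build a prompt containing both the question and the
    #    retrieved passages, keeping track of their sources.
    context = "\n\n".join(
        f"[{p.source}, p.{p.page}] {p.text}" for p in passages
    )
    prompt = (
        "Answer the question using only the context below, "
        "and cite the sources you used.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generate: let the LLM produce an answer grounded in the context.
    return llm.complete(prompt)
```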


Activate or deactivate agentic workflows

You can decide to activate ReAct (Reasoning and Acting) behavior for the LLM.

When enabled, this button allows the LLM to use reasoning and acting capabilities. It relies on the other buttons (Web Search, Context Search, and History) to provide additional context.

ReAct relies on Web Search and/or Context Search: at least one of them needs to be enabled.

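Conceptually, a ReAct agent alternates between reasoning steps and tool calls such as web search or context search. The sketch below is a simplified illustration under assumed interfaces (`llm.next_step`, the `tools` dictionary, and the step fields are hypothetical), not Lampi's actual agent.

```python
# Simplified ReAct loop (illustrative only).
# `llm` and the callables in `tools` (e.g. web search, context search)
# are hypothetical placeholders.

def react_answer(question: str, llm, tools: dict, max_steps: int = 5) -> str:
    transcript = f"Question: {question}\n"
    for _ in range(max_steps):
        # Reasoning: the LLM decides what to do next, either picking a
        # tool (Web Search / Context Search) or giving a final answer.
        step = llm.next_step(transcript, available_tools=list(tools))

        if step.kind == "final_answer":
            return step.text

        # Acting: run the chosen tool and feed the observation back
        # into the transcript for the next reasoning step.
        observation = tools[step.tool](step.tool_input)
        transcript += (
            f"Thought: {step.thought}\n"
            f"Action: {step.tool}({step.tool_input})\n"
            f"Observation: {observation}\n"
        )
    return "No answer found within the step limit."
```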


Activate or deactivate the chat history

You can also decide to activate the chat "History". When the chat history is activated, Lampi takes your previous questions and answers into account when responding.

For example, if you ask for specific information about a company, Lampi will understand that your follow-up questions refer to it; you do not need to repeat the company's name in each query. Without the chat history, each query is independent of the others, so you need to be more precise every time.

The chat history can impact the quality of Lampi's search capabilities.
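To make the difference concrete, here is a hedged sketch of what is sent with and without history. The message format shown is a common chat-API convention and the company name is a made-up example; neither is necessarily what Lampi uses internally.

```python
# With history: the follow-up "What is its headcount?" can resolve "its",
# because the previous turns are sent along with the new question.
with_history = [
    {"role": "user", "content": "Give me an overview of Acme Corp."},
    {"role": "assistant", "content": "Acme Corp is a ..."},
    {"role": "user", "content": "What is its headcount?"},
]

# Without history: each query is sent on its own, so it must be explicit.
without_history = [
    {"role": "user", "content": "What is the headcount of Acme Corp?"},
]
```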


Verify the sources of Lampi's outputs

Lampi not only provides answers to your questions but also cites the source document, including specific details or page numbers, for your reference.

Please be aware that the field of Artificial Intelligence is still in its early stages of development. As such, we strongly advise users to always review and verify the answers provided by Lampi to ensure accuracy and reliability.

