Prompt

prompt.md

The prompt.md file is where you define the core prompt logic for your AI agent in Sensai. Whenever Sensai detects a prompt.md file in a folder, it treats that folder as an AI agent, along with any tools and other supporting files present in that folder.

An agent is accessible locally via HTTP. The folder path corresponds to the HTTP path of the agent (see parametric-routes).
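For example, a hypothetical layout like the following (the folder names are purely illustrative):

my-project/
└── agents/
    └── support/
        └── prompt.md

With this layout, the support agent would typically be reachable at a path mirroring the folder path, such as /agents/support on your local Sensai server (the exact base URL and path root depend on your setup).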

The text content of the prompt.md file is sent to a large language model (LLM) whenever the agent is called. Markdown is the de facto format for prompting mainstream LLMs, making it ideal for structuring prompts with clarity and readability.

Here's a basic example:

prompt.md
You are a helpful assistant.
 
Answer the following question:
Question: #{question}
 
Provide a concise and accurate response.

Dynamic parameters

In the example above, you’ll notice the #{...} syntax. It inserts values from the incoming HTTP request directly into your prompt, making it easy to control, structure, and secure what is sent to the LLM.
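As a minimal sketch, here is how you might call such an agent using fetch (available in modern Node.js and browsers). The host, port, path, and JSON request shape are all illustrative assumptions, not guaranteed Sensai defaults:

// Call the agent over HTTP; host, port, and path are assumptions for illustration.
const response = await fetch("http://localhost:3000/assistant", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  // The "question" field is what gets substituted into #{question} in prompt.md.
  body: JSON.stringify({ question: "What is the capital of France?" }),
});

console.log(await response.text());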

The syntax supports a subset of JavaScript, so you can use simple expressions such as string concatenation and method calls.

prompt.md
You are a helpful assistant.
I am #{firstName + ' ' + lastName.toUpperCase()}
 
Answer the following question:
Question: #{question}
 
Provide a concise and accurate response.
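For instance, given a request supplying firstName "Ada", lastName "Lovelace", and a question about the capital of France (illustrative values), the text actually sent to the LLM would be:

You are a helpful assistant.
I am Ada LOVELACE

Answer the following question:
Question: What is the capital of France?

Provide a concise and accurate response.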

Using dynamic parameters helps you create focused, single-purpose agents that are harder to break or manipulate. It keeps inputs well structured and makes your prompts more secure and reliable.