Launching Lumen¶
Start Lumen AI from the command line or Python. Choose the method that fits your workflow.
Before launching
Make sure you've installed Lumen and configured an LLM provider. See the Installation guide.
Launch from the command line¶
The simplest way to start is with a single command:
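A minimal sketch, using the `lumen-ai serve` entry point that appears in the combined example later in this section:

```shell
lumen-ai serve
```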
This opens the chat interface at localhost:5006.
Pre-load a dataset¶
Start with data already loaded:
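Following the combined example later in this section, a dataset path can be passed directly to `lumen-ai serve` (here `penguins.csv` stands in for any local CSV file):

```shell
lumen-ai serve penguins.csv
```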
Configure the LLM¶
Configure the LLM at startup using CLI flags:
- `--temperature`: controls randomness; higher = more creative (0.0-2.0)
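For example, setting the provider and temperature at launch (both flags are listed in the table of common CLI flags below):

```shell
lumen-ai serve --provider openai --temperature 0.7
```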
Combine options:

```shell
lumen-ai serve penguins.csv \
  --provider openai \
  --model-kwargs '{"default": {"model": "gpt-4o"}}' \
  --temperature 0.7
```
For a complete list of CLI options:
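Assuming the CLI follows the usual `--help` convention:

```shell
lumen-ai serve --help
```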
Launch from Python¶
For more control, use the Python API:
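A minimal sketch mirroring the configured examples later in this section; the no-argument `ExplorerUI()` constructor is an assumption here (the later examples always pass `data=` and/or `llm=`):

```python
import lumen.ai as lmai

# Create the chat UI with default settings and mark it servable
ui = lmai.ExplorerUI()
ui.servable()
```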
Save as app.py, then launch:
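Since `ui.servable()` marks the app for Panel's server (Lumen is built on Panel), launching typically goes through Panel's CLI:

```shell
panel serve app.py
```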
Pre-load data¶
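As in the LLM-configured example below, pass the dataset to `ExplorerUI` via the `data` argument:

```python
import lumen.ai as lmai

# Pass a dataset path so it is loaded at startup
ui = lmai.ExplorerUI(data='penguins.csv')
ui.servable()
```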
Configure the LLM¶
Configure the LLM in Python:

```python
import lumen.ai as lmai

# Configure your LLM: a cheaper default model, a stronger one for SQL
llm = lmai.llm.OpenAI(
    model_kwargs={
        'default': {'model': 'gpt-4o-mini'},
        'sql': {'model': 'gpt-4o'}
    },
    temperature=0.7
)

ui = lmai.ExplorerUI(data='penguins.csv', llm=llm)
ui.servable()
```
See LLM Providers for advanced LLM configuration.
Add custom components¶
Custom agents and analyses:

```python
import lumen.ai as lmai
from lumen.ai.agents import AnalysisAgent

# MyAnalysis, MyCustomAgent, and my_custom_tool are user-defined components
analysis_agent = AnalysisAgent(analyses=[MyAnalysis])

ui = lmai.ExplorerUI(
    data='penguins.csv',
    agents=[analysis_agent, MyCustomAgent()],
    tools=[my_custom_tool],
    suggestions=[
        ("search", "What data is available?"),
        ("bar_chart", "Show me a visualization"),
    ]
)
ui.servable()
```
Common CLI flags¶
| Flag | Purpose | Example |
|---|---|---|
| `--provider` | Specify LLM provider | `--provider anthropic` |
| `--model-kwargs` | Configure models | `--model-kwargs '{"default": {"model": "claude-sonnet-4-5"}}'` |
| `--temperature` | Control randomness | `--temperature 0.5` |
| `--port` | Custom port | `--port 8080` |
| `--address` | Network address | `--address 0.0.0.0` |
| `--show` | Auto-open browser | `--show` |
| `--log-level` | Debug verbosity | `--log-level DEBUG` |
Next steps¶
- Navigating the UI — Learn how to use the interface
- Using Lumen AI — Start asking questions and exploring data
- LLM Providers — Configure your LLM provider and models