# LLM Chat

Five functions for integrating large language models into BioLang scripts. Supports Anthropic, OpenAI, Ollama, and any OpenAI-compatible API; the provider is auto-detected from environment variables.

## chat

Send a prompt to the configured LLM and return the text response.

```
chat(prompt, opts?) -> string
```

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `model` | string | auto | Model name override |
| `system` | string | nil | System prompt |
| `max_tokens` | int | 4096 | Maximum response tokens |
| `temperature` | float | 0.7 | Sampling temperature |
| `history` | list | `[]` | Conversation history |
```
let answer = chat("What is the function of BRCA1?")
println(answer)

# With system prompt for domain expertise
answer = chat("Explain this VCF line: chr17 43094464 . G A 500 PASS", {
  system: "You are a clinical genomics expert. Be concise.",
  temperature: 0.3
})
```
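The `history` option carries prior turns for multi-turn conversations. A minimal sketch, assuming each history entry is a map with `role` and `content` keys (the entry shape is an assumption, not confirmed by this reference):

```
# Multi-turn conversation via the history option
# NOTE: the {role, content} entry shape is assumed
let turns = [
  {role: "user", content: "What gene is at chr17:43044295-43125364?"},
  {role: "assistant", content: "That region contains BRCA1."}
]
let followup = chat("What diseases is it associated with?", {history: turns})
println(followup)
```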

## chat_code

Ask the LLM to generate BioLang code. Returns only the code block, stripped of markdown fencing.

```
chat_code(description, opts?) -> string
```

```
let code = chat_code("Read a FASTQ file, compute per-base quality stats, and make a violin plot")
println(code)
# lines = read_lines("sample.fastq")
# qualities = ...
# violin(qualities, {title: "Per-base Quality"})
```
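If `chat_code` accepts the same options table as `chat` (an assumption based on the shared `opts?` parameter), you can pin a model and lower the temperature for more deterministic code generation:

```
# Assumes chat_code takes the same opts as chat
let code = chat_code("Parse a GFF3 file and count features per type", {
  model: "claude-sonnet-4-20250514",
  temperature: 0.1
})
println(code)
```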

## llm_models

List available models from the configured provider.

```
llm_models() -> list
```

```
let models = llm_models()
for m in models {
  println(m)
}
# claude-sonnet-4-20250514
# claude-opus-4-20250514
# (or gpt-4o, llama3, etc. depending on provider)
```
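A name returned by `llm_models` can be passed straight to `chat` via the `model` option. A sketch, assuming the list supports indexing with `models[0]`:

```
# Explicitly select the first listed model instead of the auto-detected default
let models = llm_models()
let answer = chat("Summarize the central dogma in one sentence.", {
  model: models[0]
})
println(answer)
```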