RayforceDB/thal

thal

Reactive semantic runtime — a dataflow runtime for building LLM-backed applications as molecules, reactions, and effect actors.

Originally named thalamus; the conceptual project name persists in the design docs and tagline. The crate ships as thal because the longer name is taken on crates.io.

  ████████╗██╗  ██╗ █████╗ ██╗
  ╚══██╔══╝██║  ██║██╔══██╗██║
     ██║   ███████║███████║██║
     ██║   ██╔══██║██╔══██║██║
     ██║   ██║  ██║██║  ██║███████╗
     ╚═╝   ╚═╝  ╚═╝╚═╝  ╚═╝╚══════╝

What it is

A runtime for "everything is a molecule, everything is a reaction." Inspired by Differential Dataflow + reactive streams + chemical abstract machines.

  • Molecules are typed records (schemas declared in a .thal file).
  • Reactions join, filter, fan out, and roll up molecules — declarative dataflow with delta-driven incremental evaluation.
  • Effect actors (LLM calls, subprocesses, terminal I/O, OAuth flows) are themselves first-class molecules with a Pending → Done lifecycle.
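
For orientation, a hypothetical .thal sketch of these three pieces — the syntax below is illustrative only, assembled from the keywords named in this README (molecule, reaction, when, emit); see examples/chat.thal for the real grammar:

```thal
# Illustrative sketch — exact syntax may differ; see examples/chat.thal.
molecule UserMsg { text: String }
molecule Reply   { text: String }

reaction respond {
  when UserMsg(m)
  emit LlmCall { prompt: m.text }   # effect actor: Pending -> Done lifecycle
}
```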

What it does today

  • Full DSL: types, mixins, molecules, reactions (when / rollup / where / emit), list comprehensions, optional types, multi-emit, primary keys with merge clauses.
  • Builtin actors: Timer (source), TerminalWrite, TerminalPrompt (reedline-backed with history + slash commands), Process, LlmCall, Spinner.
  • LLM providers: mock, anthropic (Messages API), openai_compat (OpenAI / OpenRouter / Ollama / etc. via TokenFileProvider), codex_responses (OpenAI Responses API used by Codex via ChatGPT account).
  • Provider configuration as molecules: LlmProvider { name, kind, base_url, token_file, token_jq } declared in .thal (or auto-loaded from ~/.config/thal/auto/).
  • OAuth flows: generic RFC 8628 device flow plus OpenAI Codex's PKCE-flavored custom flow. Run thal setup codex for the full browser dance.
  • Setup wizard: thal setup [<provider>] for codex / openai / openrouter / ollama / github-copilot / oauth-custom / custom.
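
A provider declaration might look like the following — the field names come from the LlmProvider shape above, but the values and exact syntax here are illustrative, not copied from the crate:

```thal
# Illustrative only — field names from the LlmProvider molecule above;
# values and exact syntax may differ.
LlmProvider {
  name: "openrouter",
  kind: openai_compat,
  base_url: "https://openrouter.ai/api/v1",
  token_file: "~/.config/thal/openrouter-token",
  token_jq: ".key",
}
```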

Quick start

# Run the demo with the mock LLM (no setup):
cargo run -- examples/chat.thal

# Wire up a real LLM via browser OAuth:
cargo run -- setup codex     # follow the URL, paste the code in your browser
cargo run -- examples/chat.thal

thal setup codex writes auto-loaded provider config to ~/.config/thal/auto/codex.thal so subsequent runs pick it up automatically — no manual editing of chat.thal.

Architecture

The single .thal file declares the schema and reactions; the runtime manages molecule arrangements (delta-pinning, rollup re-evaluation, idempotent dedup) and dispatches actors. See plans/01-debugging-loop-worked-example.md through plans/26-rich-repl.md for the design history.
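The idempotent-dedup idea can be sketched outside the runtime. This is a minimal Rust illustration, not thal's actual internals: an arrangement keyed by primary key emits a delta only when state actually changes, so re-emitting an identical molecule triggers no downstream re-evaluation.

```rust
use std::collections::HashMap;

// Minimal illustration (not thal's real internals): molecules stored by
// primary key, with idempotent dedup — inserting an identical molecule
// yields no delta, so dependent reactions are not re-run.
#[derive(Clone, PartialEq, Debug)]
struct Molecule {
    key: String,
    payload: String,
}

#[derive(Default)]
struct Arrangement {
    by_key: HashMap<String, Molecule>,
}

impl Arrangement {
    /// Upsert a molecule; return Some(delta) only if state actually changed.
    fn upsert(&mut self, m: Molecule) -> Option<Molecule> {
        match self.by_key.get(&m.key) {
            Some(existing) if *existing == m => None, // idempotent: no delta
            _ => {
                self.by_key.insert(m.key.clone(), m.clone());
                Some(m)
            }
        }
    }
}

fn main() {
    let mut arr = Arrangement::default();
    let msg = Molecule { key: "msg-1".into(), payload: "hello".into() };

    assert!(arr.upsert(msg.clone()).is_some()); // first insert emits a delta
    assert!(arr.upsert(msg).is_none());         // exact duplicate is deduped
    println!("duplicate produced no delta");
}
```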

License

Dual-licensed under MIT or Apache-2.0 at your option.
