Verifiable Results AI
Natural language in. Verified outcomes out.
Avaryn turns a request into a result you can inspect, trace, and export. Not another black-box answer. A structured outcome with verification, evidence, clarifications, and a report you can actually use.
Every run returns a verification verdict, summary, and evidence references.
Export the result as Markdown, JSON, or HTML under the same public contract.
How It Works
Designed to make trust visible.
Avaryn does not stop at generating text. It moves a request toward a verifiable result, asks for clarification when needed, and packages the outcome in a stable public shape.
Interpret the request
Start from natural language and determine what outcome the user is actually trying to produce.
Clarify before acting
When the request is underspecified, Avaryn pauses and asks for the missing field instead of guessing.
Return the proof surface
Every run carries summary, verification, evidence, clarifications, and export readiness in one envelope.
Export for real work
Share the same result as JSON for systems, Markdown for operators, or HTML for polished delivery.
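When a run pauses for clarification, the same envelope carries the pending fields instead of a finished result. The sketch below is illustrative only: the status value, field names, and question text are assumptions modeled on the envelope described above, not a published schema.

```json
{
  "run_id": "avr_20260326_demo",
  "status": "needs_clarification",
  "summary": "Run paused: one required field is missing.",
  "clarifications": [
    {
      "field": "date_range",
      "question": "Which week should the rollup cover?"
    }
  ]
}
```

Once the missing field is supplied, the run resumes from where it paused rather than starting over.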
Result Envelope
A public contract for results people can inspect.
Avaryn centers the output around one stable shape: result, verification, evidence, clarifications, and exports. That makes the experience readable for humans and predictable for systems.
Result
The useful thing the user asked for, preserved as structured output.
Summary
A human-readable explanation of what happened and whether the run completed.
Verification
A verdict, confidence level, and verification summary instead of hand-wavy success language.
Evidence
References, receipts, and derived artifacts that make the result checkable.
Clarifications
Explicit pending fields when the run cannot safely continue without user input.
Exports
One run, many surfaces. JSON for systems. Markdown for operators. HTML for delivery.
Output Formats
One request, multiple delivery surfaces.
Avaryn keeps the contract stable while changing the rendering layer. The same verified run can move between operators, apps, and final presentation.
# Avaryn Result

**Run ID**: `avr_20260326_demo`
**Status**: completed
**Badge**: Verified by Avaryn

## Summary

Weekly operations rollup completed with verification.

## Verification

- Verdict: `verified`
- Confidence: `high`
- Checks passed: `4`
{
"run_id": "avr_20260326_demo",
"status": "completed",
"summary": "Weekly operations rollup completed with verification.",
"result": {
"report_id": "weekly_ops_rollup",
"title": "Weekly Operations Rollup"
},
"verification": {
"verdict": "verified",
"confidence": "high",
"checks_passed": 4,
"checks_failed": 0
}
}
<section class="avaryn-report">
<h1>Avaryn Result</h1>
<p class="badge">Verified by Avaryn</p>
<p>Weekly operations rollup completed with verification.</p>
<ul>
<li>Verdict: verified</li>
<li>Confidence: high</li>
</ul>
</section>
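Requesting a specific rendering of a completed run might look like the `export_result` call sketched below. The argument names here are assumptions based on the tool names Avaryn exposes, not a documented request shape.

```json
{
  "tool": "export_result",
  "arguments": {
    "run_id": "avr_20260326_demo",
    "format": "html"
  }
}
```

The underlying result stays identical across formats; only the rendering layer changes.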
Demo Catalog
Thirty representative prompts from the current 306-app runtime surface.
These examples show the kinds of natural-language requests Avaryn can route toward verifiable results while keeping the public contract clean.
Representative prompts
Choose a category, then inspect individual examples.
Monday operating briefing
A dated operating brief that pulls together meetings, ticket pressure, inbox priorities, and recommended next actions.
How To Use
Use Avaryn in Codex, through local MCP, or as a headless engine.
The orchestration surface can change while the public result contract stays the same. That makes Avaryn usable by operators, Codex workflows, and other agent systems without changing the envelope.
Ask through the plugin
Best for operator-led work inside Codex where the query may need clarifications before execution continues.
Call the public tools directly
Use `run_query`, `resume_run`, `get_result`, and `export_result` when you want a stable local tool boundary.
Feed the envelope into other agents
Use JSON output when Avaryn should become the result-producing backend for another automation or planner.
{
"tool": "run_query",
"arguments": {
"query": "Summarize the last 50 support conversations into product feedback themes.",
"mode": "execute",
"format": "json",
"evidence_level": "strict"
}
}
{
"run_id": "avr_20260327_feedback",
"status": "completed",
"summary": "Support conversations grouped into six product feedback themes.",
"result": {
"report_id": "feedback_theme_rollup",
"themes": 6
},
"verification": {
"verdict": "verified",
"confidence": "high"
},
"evidence": [
{"type": "conversation_batch"},
{"type": "report_artifact"}
],
"exports": {
"json": true,
"markdown": true,
"html": true
}
}
Call `run_query`
Send a natural-language request plus the desired mode and format.
Handle clarification if needed
If the run pauses, supply the missing fields with `resume_run` instead of redoing the whole request.
Export or relay the envelope
Use the structured result as-is, or render Markdown and HTML for operators and final delivery.
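Resuming a paused run with `resume_run` might look like the sketch below. The payload shape mirrors the `run_query` call above, but the run ID, field names, and value are hypothetical assumptions, not a published schema.

```json
{
  "tool": "resume_run",
  "arguments": {
    "run_id": "avr_20260328_demo",
    "fields": {
      "date_range": "2026-03-20/2026-03-26"
    }
  }
}
```

Supplying only the missing fields means the original request and any completed work are preserved rather than re-executed.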
Why It Matters
AI should return outcomes you can trust, not just prose you have to re-audit by hand.
Avaryn is built for workflows where the shape of the result matters as much as the text. That includes operators, builders, researchers, analysts, and teams that want a result to stay inspectable after the run ends.
Built for teams that need:
- clarification before irreversible work
- verification that stays attached to the result
- evidence references instead of vague confidence theater
- a clean public interface without exposing engine internals
Launch
Avaryn starts local, structured, and deliberate.
The current package is a proprietary internal alpha for Codex users who want a local-first path to verifiable results. The packaging is ready for repeated testing, but not yet registered for broad marketplace distribution.