Verifiable Results AI

Natural language in. Verified outcomes out.

Avaryn turns a request into a result you can inspect, trace, and export. Not another black-box answer. A structured outcome with verification, evidence, clarifications, and a report you can actually use.

Ask

Start with plain language. Avaryn clarifies when the request is underspecified.

Verify

Every run returns a verification verdict, summary, and evidence references.

Export

Deliver the result as Markdown, JSON, or HTML with the same public contract.

Category Verifiable Results AI
Trust Mark Verified by Avaryn
Exports Markdown, JSON, HTML
Runtime Surface 306 current apps and bridges
Launch Shape Codex plugin + local MCP runtime

How It Works

Designed to make trust visible.

Avaryn does not stop at generating text. It moves a request toward a verifiable result, asks for clarification when needed, and packages the outcome in a stable public shape.

01

Interpret the request

Start from natural language and determine what outcome the user is actually trying to produce.

02

Clarify before acting

When the request is underspecified, Avaryn pauses and asks for the missing field instead of guessing.

03

Return the proof surface

Every run carries summary, verification, evidence, clarifications, and export readiness in one envelope.

04

Export for real work

Share the same result as JSON for systems, Markdown for operators, or HTML for polished delivery.

Result Envelope

A public contract for results people can inspect.

Avaryn centers the output around one stable shape: result, verification, evidence, clarifications, and exports. That makes the experience readable for humans and predictable for systems.

Summary

A human-readable explanation of what happened and whether the run completed.

Verification

A verdict, confidence level, and verification summary instead of hand-wavy success language.

Evidence

References, receipts, and derived artifacts that make the result checkable.

Clarifications

Explicit pending fields when the run cannot safely continue without user input.

Exports

One run, many surfaces. JSON for systems. Markdown for operators. HTML for delivery.
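As a sketch, the five fields above can be checked mechanically before a system trusts an envelope. The field names mirror the cards on this page; the exact schema and the helper name here are assumptions, not Avaryn's published contract:

```python
# Sketch only: the exact envelope schema is an assumption; the five
# field names mirror the envelope cards on this page.
REQUIRED_FIELDS = {"summary", "verification", "evidence", "clarifications", "exports"}

def is_complete_envelope(envelope: dict) -> bool:
    """True when every public envelope field is present, regardless of contents."""
    return REQUIRED_FIELDS.issubset(envelope)

sample = {
    "summary": "Weekly operations rollup completed with verification.",
    "verification": {"verdict": "verified", "confidence": "high"},
    "evidence": [{"type": "report_artifact"}],
    "clarifications": [],
    "exports": {"json": True, "markdown": True, "html": True},
}
```

A consumer that checks shape first can fail fast on a malformed envelope instead of discovering a missing field mid-pipeline.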

Output Formats

One request, multiple delivery surfaces.

Avaryn keeps the contract stable while changing the rendering layer. The same verified run can move between operators, apps, and final presentation.

# Avaryn Result

**Run ID**: `avr_20260326_demo`
**Status**: completed
**Badge**: Verified by Avaryn

## Summary
Weekly operations rollup completed with verification.

## Verification
- Verdict: `verified`
- Confidence: `high`
- Checks passed: `4`

Demo Catalog

Thirty representative prompts from the current 306-app runtime surface.

These examples show the kinds of natural-language requests Avaryn can route toward verifiable results while keeping the public contract clean.

306 current apps and bridges
30 representative NLQ prompts
6 demo lanes across work categories

Representative prompts

Choose a category, then inspect individual examples.

Operations Verified by Avaryn

Monday operating briefing

A dated operating brief that pulls together meetings, ticket pressure, inbox priorities, and recommended next actions.

How To Use

Use Avaryn in Codex, through local MCP, or as a headless engine.

The orchestration surface can change while the public result contract stays the same. That makes Avaryn usable by operators, Codex workflows, and other agent systems without changing the envelope.

Interactive

Ask through the plugin

Best for operator-led work inside Codex where the query may need clarifications before execution continues.

Local MCP

Call the public tools directly

Use `run_query`, `resume_run`, `get_result`, and `export_result` when you want a stable local tool boundary.

Headless

Feed the envelope into other agents

Use JSON output when Avaryn should become the result-producing backend for another automation or planner.

Public tool call

{
  "tool": "run_query",
  "arguments": {
    "query": "Summarize the last 50 support conversations into product feedback themes.",
    "mode": "execute",
    "format": "json",
    "evidence_level": "strict"
  }
}

Headless result envelope

{
  "run_id": "avr_20260327_feedback",
  "status": "completed",
  "summary": "Support conversations grouped into six product feedback themes.",
  "result": {
    "report_id": "feedback_theme_rollup",
    "themes": 6
  },
  "verification": {
    "verdict": "verified",
    "confidence": "high"
  },
  "evidence": [
    {"type": "conversation_batch"},
    {"type": "report_artifact"}
  ],
  "exports": {
    "json": true,
    "markdown": true,
    "html": true
  }
}
01

Call `run_query`

Send a natural-language request plus the desired mode and format.

02

Handle clarification if needed

If the run pauses, supply the missing fields with `resume_run` instead of redoing the whole request.

03

Export or relay the envelope

Use the structured result as-is, or render Markdown and HTML for operators and final delivery.
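The three steps above can be sketched as a client loop. The tool names `run_query` and `resume_run` come from this page, but the transport, status values, and clarification shape are assumptions, so both tools are stubbed here with plain functions:

```python
# Sketch of the three-step loop. The status value "needs_clarification"
# and the clarification item shape are assumptions, not Avaryn's contract.
def run_query(query: str, mode: str = "execute", format: str = "json") -> dict:
    # Stub: a real client would invoke the Avaryn MCP tool of the same name.
    return {"run_id": "avr_demo", "status": "needs_clarification",
            "clarifications": [{"field": "date_range"}]}

def resume_run(run_id: str, fields: dict) -> dict:
    # Stub: supplies only the missing fields instead of redoing the request.
    return {"run_id": run_id, "status": "completed",
            "verification": {"verdict": "verified", "confidence": "high"}}

def complete(query: str, answers: dict) -> dict:
    envelope = run_query(query)
    if envelope["status"] == "needs_clarification":
        # Step 2: answer only what the run asked for, then resume.
        missing = {c["field"]: answers[c["field"]] for c in envelope["clarifications"]}
        envelope = resume_run(envelope["run_id"], missing)
    return envelope  # Step 3: export or relay this envelope as-is

result = complete("Weekly operations rollup", {"date_range": "last 7 days"})
```

The key design point the loop illustrates: a paused run keeps its `run_id`, so clarification answers attach to the existing run rather than starting a new one.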

Why It Matters

AI should return outcomes you can trust, not just prose you have to re-audit by hand.

Avaryn is built for workflows where the shape of the result matters as much as the text. That includes operators, builders, researchers, analysts, and teams that want a result to stay inspectable after the run ends.

Built for teams that need:

• clarification before irreversible work
• verification that stays attached to the result
• evidence references instead of vague confidence theater
• a clean public interface without exposing engine internals

Launch

Avaryn starts local, structured, and deliberate.

The current package is a proprietary internal alpha for Codex users who want a local-first path to verifiable results. The packaging is ready for repeated testing, but not yet registered for broad marketplace distribution.

Package Shape Plugin plus local MCP runtime
Public Tools `run_query`, `get_result`, `resume_run`, `export_result`
Testing Focus Result contract, demo quality, and local runtime reliability
Registration Status Intentionally unregistered while internal testing continues

Private alpha

Want Avaryn on your machine first?

Contact [email protected]