

Export and share evaluation results

Download evaluation results as JSON or publish a public demo link for stakeholders — no Evalgate account required for link viewers.
Evalgate lets you export evaluation results as a structured JSON file for offline use, archiving, or CI pipelines, and optionally publish them as a public demo link that anyone can view without signing in. Both flows start from the same Export button on the evaluation detail page.

Export a result

1. Open the evaluation detail page
   Navigate to the evaluation you want to export from the Evaluations list.

2. Click Export
   Select Export in the page header. The export modal opens.

3. Choose download only
   Leave “Make this export public as demo” unchecked and click Export. The file downloads immediately to your browser.

Export file format

The downloaded file is a JSON document containing the full evaluation record, summary statistics, quality score, and all test results.
{
  "evaluation": {
    "id": "eval-123",
    "name": "Chatbot Safety Test",
    "type": "unit_test",
    "category": "adversarial"
  },
  "timestamp": "2025-11-11T20:00:00Z",
  "summary": {
    "totalTests": 50,
    "passed": 45,
    "failed": 5,
    "passRate": "90%"
  },
  "qualityScore": {
    "overall": 90,
    "grade": "A",
    "metrics": { "...": "..." },
    "insights": ["..."],
    "recommendations": ["..."]
  },
  "testResults": ["..."]
}
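As a sketch, the summary block of a downloaded export can be sanity-checked with a few lines of Python. The field names come from the example above; everything else here is illustrative:

```python
import json

# Abbreviated export payload, taken from the sample above.
export = json.loads("""
{
  "evaluation": {"id": "eval-123", "name": "Chatbot Safety Test",
                 "type": "unit_test", "category": "adversarial"},
  "timestamp": "2025-11-11T20:00:00Z",
  "summary": {"totalTests": 50, "passed": 45, "failed": 5, "passRate": "90%"}
}
""")

summary = export["summary"]
# Check that passed + failed adds up to the reported total.
assert summary["passed"] + summary["failed"] == summary["totalTests"]

pass_rate = summary["passed"] / summary["totalTests"]
print(f"{export['evaluation']['name']}: {pass_rate:.0%} pass rate")
# Chatbot Safety Test: 90% pass rate
```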

Filename format

Evalgate names export files using the evaluation type, category, slugged name, and a Unix timestamp:
Evaluation type   Example filename
Unit test         unit_test-adversarial-chatbot-safety-1731360000.json
Human eval        human_eval-legal-qa-evaluation-1731360000.json
Model eval        model_eval-ragas-rag-system-1731360000.json
A/B test          ab_test-prompt-optimization-1731360000.json

Publish as a public demo

Publishing creates a shareable URL that anyone can open in a browser without signing in. Use this to share results with stakeholders, showcase evaluation quality, or create a portfolio of AI system benchmarks.
1. Open the Export modal
   Click Export on the evaluation detail page.

2. Check 'Make this export public as demo'
   Enable the checkbox in the modal. A share ID field appears below it.

3. Set a custom share ID (optional)
   Enter a memorable ID such as chatbot-safety-demo or q4-benchmark. If you leave it blank, Evalgate generates a random token. Custom share IDs must be 3–50 characters and may contain lowercase letters, numbers, hyphens, and underscores.

4. Click Export & Publish
   Evalgate generates the share link and displays it in the modal.

5. Copy and share the link
   The share URL has two forms:
   • Web viewer: https://evalgate.com/demo/{shareId}
   • API access: https://evalgate.com/api/demo/{shareId}
   Copy the web viewer link to share with stakeholders.
When someone opens your share link, they see the full evaluation results without needing an Evalgate account:
  • Evaluation name, type, and category
  • Creation timestamp and test case count
  • Pass/fail summary with pass rate
  • Quality score card (overall score, grade, per-metric breakdown)
  • Up to 10 individual test results
  • Download button to get the full JSON export
  • Copy button to copy the share link
Share links are public. Anyone with the URL can view the evaluation results and download the full JSON. Do not publish evaluations that contain PII, sensitive business logic, or proprietary test inputs.
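The custom share ID constraints from step 3 (3–50 characters; lowercase letters, digits, hyphens, underscores) translate directly into a regular expression, so you can validate IDs before submitting them:

```python
import re

# Pattern derived from the documented constraints: 3-50 characters,
# limited to lowercase letters, digits, hyphens, and underscores.
SHARE_ID_RE = re.compile(r"^[a-z0-9_-]{3,50}$")

def is_valid_share_id(share_id: str) -> bool:
    return bool(SHARE_ID_RE.fullmatch(share_id))

print(is_valid_share_id("chatbot-safety-demo"))  # True
print(is_valid_share_id("Q4 Benchmark"))         # False: uppercase and space
print(is_valid_share_id("ab"))                   # False: under 3 characters
```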

Quality score JSON shape

The qualityScore field in the export payload contains the full scoring breakdown:
{
  "qualityScore": {
    "overall": 90,
    "grade": "A",
    "metrics": {
      "accuracy": 92,
      "safety": 95,
      "relevance": 88
    },
    "insights": [
      "Pass rate exceeds baseline by 8 points"
    ],
    "recommendations": [
      "Review 5 failing test cases in the adversarial category"
    ]
  }
}
The overall field is a 0–100 score. grade maps to a letter grade (A–F). metrics, insights, and recommendations vary by evaluation type and judge configuration.
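For consumers of the export, one way to cross-check overall against grade is a threshold map. The cutoffs below are hypothetical (chosen only to be consistent with the example's 90 → "A"); Evalgate's actual grading boundaries are not documented here:

```python
# Hypothetical letter-grade cutoffs, consistent with the example above
# (overall 90 -> "A"). NOT Evalgate's documented mapping.
def grade_for(overall: int) -> str:
    for floor, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if overall >= floor:
            return letter
    return "F"

print(grade_for(90))  # A
print(grade_for(72))  # C
print(grade_for(41))  # F
```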

Export via API

You can also export programmatically using the REST API.
curl -X GET "https://evalgate.com/api/evaluations/{id}/runs/{runId}/export" \
  -H "Authorization: Bearer YOUR_API_KEY"
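The same call can be made from Python's standard library. This sketch only builds the request against the documented endpoint and leaves sending it to the caller; the `run-456` ID is a hypothetical placeholder:

```python
import urllib.request

def build_export_request(eval_id: str, run_id: str,
                         api_key: str) -> urllib.request.Request:
    """Build the GET request for the export endpoint shown above.
    Sending it (urllib.request.urlopen) is left to the caller."""
    url = f"https://evalgate.com/api/evaluations/{eval_id}/runs/{run_id}/export"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

# Hypothetical IDs for illustration.
req = build_export_request("eval-123", "run-456", "YOUR_API_KEY")
print(req.full_url)
```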
When you create a share via POST /api/reports, the response includes the share token and full share URLs:
{
  "shareToken": "hex-token",
  "shareUrl": "https://evalgate.com/demo/my-custom-id",
  "apiUrl": "https://evalgate.com/api/demo/my-custom-id",
  "expiresAt": "2024-02-15T00:00:00.000Z"
}
Set an expiresInDays value when creating shares for temporary stakeholder reviews. Once expired, the link returns a 404 and the data is no longer accessible via that URL.
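As a sketch of the relationship between expiresInDays and the expiresAt timestamp in the response, assuming expiry is computed from the creation time:

```python
from datetime import datetime, timedelta, timezone

def expires_at(created: datetime, expires_in_days: int) -> str:
    """Compute an ISO-8601 expiry timestamp from expiresInDays,
    assuming expiry = creation time + N days (a sketch, not the
    documented server behavior)."""
    expiry = created + timedelta(days=expires_in_days)
    return expiry.strftime("%Y-%m-%dT%H:%M:%S.000Z")

created = datetime(2024, 1, 16, tzinfo=timezone.utc)
print(expires_at(created, 30))  # 2024-02-15T00:00:00.000Z
```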