Hospital Upload Playbook

Step-by-step checklist for running the hospital CSV upload in production or for demo recordings. Follow this guide to avoid adapter failures and frontend timeouts.

1. CSV Requirements

The adapter enforces k-anonymity and requires demographic fields in every row. Use this exact header row (case-sensitive):

Patient ID,Patient Name,Age,Gender,Date of Birth,Address,Country,Lab Test,Result,Test Date,Condition,ICD10 Code

Every record must contain Patient Name plus either Age or Date of Birth. The adapter cannot calculate the age range or generate a UPI without those fields and will abort with a 500 error.

Sample row

PAT-201,Grace Namatovu,42,Female,1983-09-14,"45 Makerere Hill Rd, Kampala",Uganda,HbA1c,6.8,2024-02-10,Diabetes Type 2,E11.9

Tip: reuse the test CSV generated by scripts/test-complete-data-flow.sh for a known-good template (5 rows, all required columns).
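The name/age requirement can be pre-checked from the shell before uploading. This is a sketch, not part of the adapter: it relies on the fact that Patient Name, Age, and Date of Birth (columns 2, 3, and 5) all come before the quoted Address column, so a plain comma split is safe for those fields.

```shell
# check_csv FILE — print any data row missing Patient Name, or missing
# both Age and Date of Birth, and exit non-zero if any are found.
# Columns 1-5 precede the quoted Address field, so splitting on "," is
# safe for $2 (Patient Name), $3 (Age), and $5 (Date of Birth).
check_csv() {
  awk -F',' 'NR > 1 && NF > 1 {
    if ($2 == "" || ($3 == "" && $5 == "")) {
      printf "row %d: missing Patient Name or Age/Date of Birth\n", NR
      bad = 1
    }
  } END { exit bad }' "$1"
}
```

Run `check_csv your-file.csv && echo OK` before uploading; a non-zero exit means the adapter would reject the file with the 500 error covered in Troubleshooting.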

2. Upload Flow

  1. Register or log in to the hospital portal at /hospital/login using the hospital ID and API key from the backend response.
  2. Open /hospital/upload, select your CSV, and click Upload and Process.
  3. Monitor the toast and the processing card. On success you should see the number of records processed, the consent/data proof counts, and the topic IDs.
  4. (Optional) Open /hospital/processing or the HashScan links to show the immutable Hedera proof in the demo.

3. Handling 504 Timeouts

Creating Hedera accounts for large CSVs can exceed Vercel's ~10s API timeout. The backend keeps working, but the browser may show a 504.

  • For demo recordings, limit the CSV to 5–10 rows so processing finishes before the timeout.
  • If you must process a larger file, let the upload run even if the UI times out—data will appear in processing history once the adapter completes.
  • Mention during the demo that long-running jobs continue in the background and refresh the dashboard to show the completed run.
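To stay under the timeout, a larger export can be trimmed to the recommended demo size before uploading. The filenames here are examples, not fixed paths; `head` keeps the header line plus the first 10 data rows:

```shell
if [ -f full-export.csv ]; then
  # Keep the header plus the first 10 data rows: 11 lines total,
  # the recommended size for demo recordings.
  head -n 11 full-export.csv > demo-upload.csv
fi
```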

4. Troubleshooting

500 – "Age is required"

At least one row is missing both Age and Date of Birth. Fix the CSV and re-upload.

504 – Timeout

The adapter is still running. Either wait for the backend to finish or use a smaller CSV for the demo.

HashScan not showing data

Transactions can take a few seconds to index. Keep a previous run's link handy as a backup screenshot.

Need a template fast?

Run ./scripts/test-complete-data-flow.sh locally to generate a CSV in /tmp that already matches the schema.