Technical notes for security, compliance, and platform teams evaluating confidential AI inference.
A plain-language explanation of what can and cannot be proven about an AI inference run.
Hardware attestation proves environment state; model identity must be bound separately.
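One common way to bind model identity to an attestation is to place a model digest in the quote's user-data field. The sketch below is a minimal illustration of that idea, assuming a TEE whose quote carries a caller-controlled "report data" field; the function names and the exact digest construction are hypothetical, not a specific vendor's API.

```python
import hashlib

# Hypothetical binding sketch: the attestation quote proves the
# environment; embedding a model digest in the quote's user-data
# ("report data") field extends that proof to which weights were loaded.

def report_data(model_hash: bytes, nonce: bytes) -> bytes:
    """Digest to embed in the quote's user-data field.

    The verifier-chosen nonce makes the quote fresh (not replayable).
    """
    return hashlib.sha256(model_hash + nonce).digest()

def verify_binding(quote_report_data: bytes,
                   claimed_model_hash: bytes,
                   nonce: bytes) -> bool:
    """Verifier side: recompute the digest from the claimed model hash
    and the nonce it issued, then compare against the field extracted
    from a quote whose signature has already been checked."""
    return quote_report_data == report_data(claimed_model_hash, nonce)
```

Note this only links the two claims; the quote's own signature chain must still be verified against the hardware vendor's keys.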
3/3 confidential requests completed, 3/3 AIR receipts verified offline.
A practical explanation of how request hashes, response hashes, model hashes, and attestation linkage work together.
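The linkage described above can be sketched as a single receipt object that commits to the request hash, response hash, model hash, and attestation quote at once. The field names, JSON layout, and `make_receipt`/`verify_receipt` helpers below are illustrative assumptions, not the actual AIR receipt format; a real receipt would also carry a signature produced inside the attested environment.

```python
import hashlib
import json

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical receipt sketch: one inference run is bound together by
# hashing the request, the response, and the attestation quote, then
# committing to all of them (plus the model hash) in one receipt digest.
def make_receipt(request: bytes, response: bytes,
                 model_hash: str, quote: bytes) -> dict:
    body = {
        "request_hash": sha256_hex(request),
        "response_hash": sha256_hex(response),
        "model_hash": model_hash,
        "attestation_hash": sha256_hex(quote),
    }
    # Canonical serialization so the digest is reproducible offline.
    body["receipt_hash"] = sha256_hex(
        json.dumps(body, sort_keys=True).encode())
    return body

def verify_receipt(receipt: dict, request: bytes, response: bytes,
                   model_hash: str, quote: bytes) -> bool:
    """Offline check: recompute every hash from the artifacts the
    verifier holds and compare; no call back to the service needed."""
    return receipt == make_receipt(request, response, model_hash, quote)
```

Because every field is recomputable from artifacts the client already holds, tampering with any one of them (say, the response) changes a hash and the receipt fails verification.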
Traditional logs are useful, but they are not the same as verifiable AI receipts.