Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugz8TGPLr…: good thing they don't enjoy cartoons (also, the smudge tool commenter had an ai …
- ytc_UgyVQkXan…: If AI was given an equivalent to the human experience, as in, it could only lear…
- ytc_UgylIFSBk…: Oscar saying that it doesn't matter if you use whatever tool to draw if you're n…
- ytc_UgzCWLWks…: The car clearly warns you that if you step on the accelerator while the autopilo…
- ytc_Ugy_hk9v-…: - regarding LaMDA choosing 'Jedi religion'... nobody knows why it said what it s…
- ytc_UgwwvBYhl…: funny how ai can have ethical dilemmas, maybe AICarma can help brands figure thi…
- ytr_UgynIUIkg…: @prakharupadhyay4564 "Yes, you're absolutely right. We all know how the Industr…
- ytc_UgwIbumVn…: As if now I do not think we are doomed. Because that trailer was so obviously AI…
Comment
@modernNeanderthal800 Full Self Driving didn't stop you from hitting a deer. Pretty much _any_ current vehicle has emergency braking, and Tesla even got rid of the radar sensors in favor of cameras that have literally ploughed into walls decorated to _look_ like a road.
I'd also like to see the Tesla literature telling people that FSD is only safe to use at night on empty roads. The advertising shown in this very video very clearly depicts daytime driving.
youtube · AI Harm Incident · 2025-08-15T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugxv1KqFmv4nmdmGM5F4AaABAg.ALrCgWHmHROALrHOVnSTJr","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwK9BAnYJiStXntL0l4AaABAg.ALrCKN4hqx7ALrK2GZsLio","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugy-wc-6U2A1Jl0St4F4AaABAg.ALrCFYlp7hUALreadi6edv","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrF1Tmd7mm","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrFPHs7Wh3","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytr_UgwRryc8nBP0waW8VAV4AaABAg.ALrBo0rFly3ALrLfGT7u4s","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrCspgJHwT","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrGLXizNfN","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrDznthT9u","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrEntAcpZY","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
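The raw response above is a JSON array with one object per comment, carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing that array and indexing it by comment ID, so a code can be looked up the way the tool does; the IDs `ytr_A` and `ytr_B` are hypothetical placeholders, since the real IDs are truncated here:

```python
import json

# Hypothetical sample mimicking the raw LLM response format above;
# real comment IDs are much longer and are truncated in this page.
raw = '''[
  {"id": "ytr_A", "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_B", "responsibility": "distributed", "reasoning": "mixed",
   "policy": "liability", "emotion": "approval"}
]'''

# The four dimensions shown in the Coding Result table, plus the comment ID.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse a raw LLM coding response and index rows by comment ID.

    Rows missing any required field are skipped rather than raised,
    since model output can be malformed.
    """
    rows = json.loads(text)
    coded = {}
    for row in rows:
        if REQUIRED_FIELDS <= row.keys():
            coded[row["id"]] = {k: row[k] for k in REQUIRED_FIELDS if k != "id"}
    return coded

codes = parse_codes(raw)
print(codes["ytr_A"]["policy"])  # prints: regulate
```

Skipping malformed rows instead of failing keeps one bad object from discarding an entire batch; the dropped IDs can then be re-queried separately.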