Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> "in a true emergency it’s a human, thousands of miles away, who is expected to save you" ... This is incorrect. Waymo's remote operators can't intervene in safety critical situations. They can only help with suggestions. The remote controlled car is also an exaggeration. Waymo cars can be driven remotely up to 2 mph and this feature has never been used so far. If a hacker breaks into the system, they won't be able to drive faster than 2 mph. Remote operator's suggestions aren't commands, the vehicle has it's own reasoning and won't hit something only because the remote operator said so.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Harm Incident |
| Posted | 2026-04-24T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxFWlux5VJCcq3ho-d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwRN5knI2t7Q8Wryjd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPLjSxzf5HRgQGOVh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_xFZSEI5qwfQZYbx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwnspVkvb90BZHeEKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgylvneDsJpsJl7NN0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZp6vqoNVt6LIkR5x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyO4GXei0VPLICc1nJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgymvREegTooNdcyWB14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznVWAz4SqyqKRYI3t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
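A batch like the one above has to be validated before the labels are stored, since an LLM can emit malformed rows or labels outside the codebook. The sketch below shows one way to do that in Python. The label sets in `CODEBOOK` are inferred only from the values visible in this batch, not from the project's actual codebook, so treat them as placeholder assumptions.

```python
import json

# Hypothetical codebooks: these sets only contain the labels observed in the
# batch above; the real coding scheme likely defines more values per dimension.
CODEBOOK = {
    "responsibility": {"company", "none", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"fear", "approval", "mixed", "outrage", "indifference"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed, in-codebook rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop anything that is not an object with a comment id.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension carries an allowed label.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid


raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_batch(raw)))  # 1 — the second row uses an unknown label
```

Rows that fail validation are dropped rather than repaired here; a production pipeline might instead queue them for re-prompting or manual coding.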