Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
- "I’m curious if we can do anything to the system to ruin some of the training. Do…" (ytc_UgwwqCpkh…)
- "The ignorance of a typical person is astounding. The advent of AI is not somethi…" (ytc_Ugy8gs1Ah…)
- "I think that ist what AI is pushed so crazy for, it replaces the need to educate…" (ytc_UgxHYbApE…)
- "Im pretty new to software (~16 months self taught/bootcamp). AI and offshoring h…" (ytc_UgyJQJyOw…)
- "two things 1. "Hot robot" uhhhhhhhhhhhhhhhh... no, just no 2. her facial motio…" (ytc_Ugj7aRXt6…)
- "Just a thought but if AI truly hit that level of intelligence do we really think…" (ytc_UgxiPSOSp…)
- "The AI doesn't have to do all that. It just has to be better than us at getting …" (rdc_kqvoxsu)
- "It's one thing to have facial recognition on the original higher resolution ID o…" (ytc_UgzBNIoa_…)
Comment (youtube · AI Harm Incident · 2018-03-13T05:3…)

> I would prioritise the route which has the possibility of not harming anyone, however small. For example, going as close to the motorcycle as possible without hitting it. Then, if you do get hit, you can say we tried.
>
> Another point, who pays for the damage if my self driving car hits your car?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzmTWvkx_16kZxRipB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwU3JKTd7DXea630ch4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5Al8HpdWeQXVA14V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzR-I1zg7Fd24xkO2B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyTRsoOWGCfnwZIeM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLGAxBlRBrU6Q9HzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGRQqzWI3R8ooBq5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzTIXhunuFErqKh5k54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypFlEyseWL_KumX5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMlxjfJsg85RHvnPZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
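A raw response like the one above is a JSON array of per-comment codes, which can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual implementation: the field names come from the sample above, the `raw_response` string is a two-record excerpt for brevity, and the allowed value sets are only those observed in this sample — the full codebook may define more.

```python
import json

# Two records excerpted verbatim from the raw response shown above.
raw_response = """
[
 {"id":"ytc_UgzGRQqzWI3R8ooBq5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzTIXhunuFErqKh5k54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# The four coded dimensions with the values observed in this sample;
# the real codebook may allow additional values (assumption).
OBSERVED_VALUES = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    dropping any record with a missing or unexpected field value."""
    by_id = {}
    for record in json.loads(raw):
        if all(record.get(dim) in vals for dim, vals in OBSERVED_VALUES.items()):
            by_id[record["id"]] = record
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgzGRQqzWI3R8ooBq5t4AaABAg"]["policy"])  # → liability
```

Indexing by ID is what makes per-comment inspection (and the "look up by comment ID" view) cheap: one parse, then constant-time lookups.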