Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Isn't the problem just the fact that the robot decides if a person is worthy of … (ytc_UgzJLHaXE…)
- The age of Ai taking over is coming faster than I thought. Didn’t think I would … (ytc_UgyF8VUjI…)
- if you know you have a 100% chance of dying by being a human born before humans … (ytc_Ugwe3oAVH…)
- I think he has got it twisted. I believe AI is going to take over jobs at his le… (ytc_UgySssRcF…)
- @TechnoMasterBoy Lmao how delusional can you be? I can't understand how clankers… (ytr_UgzN4ndUK…)
- They are people acting like robot But we are very close to make human robot 🤣🤣… (ytc_UgwECA0iz…)
- This was in South Korea he was checking the robot so the robot grabbed him inste… (ytc_UgyDC_TZ-…)
- a LOOOOT if not all, of these ai worshipers forget, its not an 'intelligence', i… (ytc_UgyvRWiRZ…)
Comment
Its impossible to have progress without dangers. We'll look back on Tesla in 10 years when all vehicles are self driving and be thankful for the work they've done. Accidents are unfortunate and tragic but I think the driver was more than partially at fault in this case. He wasn't watching the road while accelerating. People need to be take some accountability for the technology they're using and actually educate themselves on it.
youtube · AI Harm Incident · 2025-08-15T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxZ2O-k-Jju9XGNmZt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxW20DXGI9tV-rQzFd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwd47S7HhIHBfsYWfR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyoRXHhl5Bptrl3zud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxRdt8eSjcTKNdWyyx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVvKVALvJMKPfpUz54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTBE7c4-ORZH6gxRl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyjXcnuupiaGoQKrKB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzuAxswBujqX0CZzE14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyjg55h7oLvQza99Hp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
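The raw LLM response above is a JSON array of one record per comment, each with the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated before storing the codings — the allowed values per dimension are inferred only from the sample output above (the real codebook may define more categories), and the function name is illustrative:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    dict keyed by comment ID, rejecting malformed or out-of-schema records."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            raise ValueError(f"record missing id: {record!r}")
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {record.get(dim)!r}")
        coded[cid] = {dim: record[dim] for dim in SCHEMA}
    return coded

# Example with a single hypothetical record in the same shape as the response above.
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
codings = parse_codings(raw)
print(codings["ytc_x"]["responsibility"])  # -> user
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently entering the dataset.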