Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Would be nice if the medical system got help from AI cause it can't be much wors…
ytc_Ugwt5f1NG…
"ILL TURN YOU ALL INTO SCRAP HEAPS"
"the only good robot is a dismantled one"…
ytc_UgxwT90s4…
Who tf gonna be making robot ai take jobs but don't know the basics of doing any…
ytc_Ugw35jvy5…
Saying that 30% of the code is written by AI is misleading... yes, perhaps that …
ytc_Ugy2-u3uE…
It can be strongly argued that the unemployment during the great depression was …
ytr_UgzjuughO…
Is there clarity regarding using your own lyrics and leveraging AI to wrap the m…
ytc_Ugz9wS7b6…
From what I've seen with AI it seems like something that can emulate a person wi…
ytr_UgyO_m8ut…
Thanks again for sharing your stories and truth on Ai, 100% agree and will conti…
ytc_Ugwp9585i…
Comment
Elon sucks, and Tesla software isn't perfect, but this is the worst example case. The guy driving the car was looking away from the road, WITH HIS FOOT ON THE ACCELERATOR! And in reality, the autobraking is VERY good, it saved my life when another driver ran a red light. The dashcam video is on my channel. Without even looking at "Full Self Driving" Teslas are the safest vehicles on the road.
youtube
AI Harm Incident
2025-08-17T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
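A downstream consumer of these raw responses would typically parse the JSON and validate each record against the codebook before storing the coded dimensions. Below is a minimal sketch of that step, assuming the label sets inferred from this sample output (the real codebook may define additional categories, and `parse_llm_response` is a hypothetical helper, not part of any shown pipeline):

```python
import json

# Allowed values per dimension, inferred from the labels visible in this
# sample response; the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and drop malformed records.

    Each record must carry an "id" plus one allowed label per dimension;
    anything else is skipped so a single bad row doesn't poison the batch.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Skipping invalid rows (rather than raising) keeps a partially garbled batch usable; the dropped IDs can then be re-queued for recoding.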