Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think I should get compensated for the amusing my personal data from the inter…" (ytc_Ugy_TPbJ1…)
- "i definitely agree with this, but in the rare cases i do use ai, i use it for in…" (ytc_UgxLMMb0d…)
- "AI doesn't need plumbers,humans do ,so it'll be more eficient for AI to get rid …" (ytc_UgwlrkSke…)
- "Tesla's Model 3 will be fully automated at the end of 2017. So whilst you're at…" (ytc_Ugh7XEir0…)
- "Trolly on an eternal loop sounds like earth orbiting the sun... Sounds like AI a…" (ytc_UgzSngscP…)
- "Option D is correct as there is no Ai project which has its own emotions…" (ytc_Ugzoy8wum…)
- "I think he should get denied his paycheck for using ai to do his grading…" (ytc_UgzmyljjZ…)
- "Sorry Rick but humans will inherently become obsolete! Some AI generated 😅models…" (ytc_Ugz1Jp3qy…)
Comment
To all the bots & musk fluffers comments saying the cars are not FSD but autopilot are missing the point. As is said in this video ‘Tesla leadership encouraged drivers to treat their cars as if they were actual self driving vehicles’. Musk said many times you could be driven to work without doing a thing. Then there’s his robotaxis which reinforce that belief, if they can drive without anyone in the drivers seat then their autopilot cars should do the same. Musk always plays down or omits any dangers or bad points with not only his cars but everything he does.
Source: youtube · Incident: AI Harm Incident · Posted: 2025-08-15T21:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwLkAbxotBibd7cxp54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRHf6ENyLqEP_dRet4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzy4ocmWdhh9fFKdxR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxBnLjIlH9s9FPB78t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx-LI2OIE09ehGIb3Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyd9Glq6Bt3SkTR8e94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbQJn5bPEhDv9gFzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzIodfh6WzQyQkUJ-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuGm2txsj8064zUPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzNKyX1kYaY-Epe-FB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
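The lookup-by-ID step described at the top can be sketched in Python. This is a minimal illustration, not the tool's actual implementation: the field names come from the JSON array shown above, while the `index_by_id` helper and the two-row sample payload are hypothetical.

```python
import json

# Hypothetical raw batch response, shaped like the array shown above
# (two rows copied from the example output).
raw_response = """
[
  {"id": "ytc_UgwLkAbxotBibd7cxp54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzNKyX1kYaY-Epe-FB4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgwLkAbxotBibd7cxp54AaABAg"]["emotion"])  # -> outrage
print(codes["ytc_UgzNKyX1kYaY-Epe-FB4AaABAg"]["policy"])   # -> none
```

Because each row carries its comment ID, the same index also supports joining the coded dimensions back onto the original comment records.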