Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwbEqJLm… : "You think there's inequality now?! Wait until AI is given personhood and bank ac…"
- ytc_Ugw7iI2ei… : "I was a long term substitute teacher in the same class, for 6th graders, from Oc…"
- rdc_dv0ghg8 : "After spending 2 years in Korea I agree that the situation there is weird. Its t…"
- ytr_Ugzauk4EV… : "I've seen a similar argument made for AI art outside of game development. Their …"
- ytc_UgyLXPCgG… : "Simple logic, why don't you say it plainly, why are you beating around the bush 😊 Radiology business ne…" [translated from Hindi]
- ytc_Ugz9OzUyg… : "seeing this made me rethink how i approach my essays, try with olovka ai next ti…"
- ytc_Ugzt3CEDt… : "At 10:20 you mention that Copyright is needed, because otherwise large corporati…"
- rdc_jp5y25o : "Why would we 'rethink the definition of sentience'? Just come up with a new word…"
Comment
I can't believe FSD is allowed anywhere in the world. The reason Teslas use only cameras is profit. I think Elon Musk should be in prison for corporate manslaughter. It's like the Firestone scandal. I wonder how many thousands will have to die before this technology is banned.
Do you think he lets his family use this faulty-by-design technology?
I am also very worried that they have now been AI integrated.
An old ethical tech question was: a self-driving car has to choose between crashing into an old person, a child, or a wall (possibly injuring the driver). AI will choose the child, as that protects itself and is the cheapest lawsuit option. (imo)
youtube · AI Harm Incident · 2025-10-19T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxVSRevNfpEsXy-Kah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygwhEpahYKg8OZLLB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCtcsEgGjHLJT5PpF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpfcXZ1cVY8T8WN7V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaEv9AjrNXYNE10NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyj5zO0aDUmpAxCdEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZVLm5WJguL8LuB9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxiM8DVfMbWsCvF79h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvFkngF8l8gB30U4p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6gOyk_7dZLjcExOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
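A raw response like the one above can be parsed, validated, and indexed by comment ID before it is shown in this view. The sketch below is a minimal example of that step; the sets of allowed codes are inferred only from the sample output shown here, and the real codebook may include values not listed (assumption).

```python
import json

# Allowed codes per dimension, inferred from the sample LLM output above.
# ASSUMPTION: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "fear", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) and index valid codings by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical one-record response, for illustration only.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_x"]["policy"])  # liability
```

Indexing by ID is what makes a lookup like the one in this view cheap: a single dictionary access per comment rather than a scan of the whole response.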