Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Prisoner's Dilemma. The best outcome is for others to cooperate but for you to r…" (ytc_Ugwu_OjJI…)
- "'It's copyright to an extent almost, like these artists can't sue for copyright …" (ytr_UgwCyVgXJ…)
- "Haha, I'll choose AI over a Doctor, Lawyer or any other lying bastard, who has a…" (ytc_Ugw9oQQzT…)
- "I found a ai saying they hate emojis and I sended this: "🤦🥺😖😕😡😕🧐🤯😕😤😤🥺😕😒😠😒😕" exce…" (ytc_Ugwxm22pr…)
- "From the experiencing consciousness perspective, what do you think could be desc…" (rdc_iomssh1)
- "So, maybe I'm just not getting it, but how do we know AI isn't conscious?…" (ytc_UgyNhDJpr…)
- "And we can see that AI is not the problem. The problem is the people and instit…" (ytc_Ugy-OO3is…)
- "saying that people will work less and have a better life quality/better payment …" (ytc_UgyPgAPjN…)
Comment (youtube · AI Harm Incident · 2026-01-13T01:0… · ♥ 1)

> shouldnt driverless cars have the technology to sense people and animals under the car too? since a child could also be ij the same position. Waymo deserves to be sued for the lack of safety their cars have, imagine not having proper sensors. Embarrassing
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe8mh5yAlVTCaXOyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdSoQ-O-5kpzs_mu14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUtXh20mR5CG8zzg94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypSixbkmyE51HU9Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzDOs6670zkriCvyAN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzcbvNCmQqduil8X6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy6uyWIDx8d09oTTuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzldzy3zbQGx6lCfEV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz__WTHihozScBSphl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmESAAaT1GSNyvzcR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
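The raw response is a JSON array of per-comment codes, one object per comment ID, with the four dimensions from the coding table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and looking a record up by comment ID — assuming Python tooling, and treating each dimension's allowed values as only those observed in the responses shown here (the project's full codebook may define more):

```python
import json

# Two records in the same shape as the raw response above.
raw = '''[
 {"id":"ytc_Ugwe8mh5yAlVTCaXOyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzDOs6670zkriCvyAN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

# Value sets observed in this page's responses; an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_response(text):
    """Parse a raw LLM coding response and index the rows by comment ID,
    rejecting any value outside the observed codebook."""
    coded = {}
    for row in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

coded = parse_response(raw)
print(coded["ytc_UgzDOs6670zkriCvyAN4AaABAg"]["policy"])  # liability
```

Keying the parsed rows by comment ID mirrors the "Look up by comment ID" feature above: once indexed, retrieving the codes for any comment is a single dictionary lookup.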