Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzIbmUJH… : woah dude look at that picture "yeah i cant believe ai is this good!" "and it d…
- ytc_UgxyLMCkI… : Reality is this, all jobs will become automated and guess what you will get fire…
- ytr_Ugxdlfm66… : Thank you for sharing your observation! It's interesting how different people an…
- ytc_UgwtnKE4s… : I like your reasoning. But, don’t attack until the ai generated stuff gets too c…
- ytc_UgyF2Xvtu… : Those humanoid robots are ridiculous. So you're going to purchase a million doll…
- ytc_UgxGCNIiQ… : I think it was so that the regulations don’t become so high that only large busi…
- ytc_UgxgvASmk… : 4:45 they also prompted that AI to go to any lengths necessary to complete its g…
- ytc_Ugw6U0P65… : The simple solution is to stop buying nonessential items from companies utilizin…
Comment

> Not a good time to blame the victim. The city of Tempe is laying the groundwork in case the family of the victim decides to sue the city. It is possible the victim's senses were failing. But also Uber's AI sensors failed big time. The question is was the public aware of Uber's self driving cars being tested? Had the victim known that Uber was testing the autonomous cars, could she have been more careful?

Source: youtube · 2018-03-20T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyKdUGZOndQw1MyFZ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxifudyIYMPcGtewXl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztJbYq4fH6VepwXvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyp1kPe1vCMGMh0GpJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6ZeglMCzQeQduNAp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwV8xqaS0fjaV076_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwxgPdhbN_8-yBVVwZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzdWGg9Ow5NK4X-7nF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyFBknqQ1EYwuo2Ud94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxDlyE5Y3a55JJ5ExJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
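The raw response is a JSON array of per-comment codes, one record per comment ID, with four coding dimensions. A minimal sketch of how such a response could be parsed, validated, and looked up by comment ID — assuming the category values seen in these samples are the allowed sets (the actual codebook may define more), and using a hypothetical `ytc_example` ID for illustration:

```python
import json

# Allowed values per coding dimension. These are inferred from the sampled
# records above; the project's real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# Hypothetical single-record response, mirroring the format shown above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"liability","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # liability
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each inspected comment maps directly to its coded record without rescanning the batch.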