Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:

- ytc_Ugx1Z0HJe… — When Ai takes your job will you be happy you no longer need to do it, or sad you…
- ytc_UgyymuapC… — When Ai began to become really good I had an idea. What if Ai actually brought a…
- ytc_UgyjMz1pk… — I will never understand humans....AI bad bc it makes other humans feel bad and l…
- rdc_glkxnry — Another industry automation guy here (software side). I've personally written co…
- ytc_UgzGpeM0R… — 0:00 lol nice warning. I didn't know a video with the subtitle "See the messages…
- ytc_UgyrzXf25… — Just imagine something so much more smarter than humans, that the gap between AI…
- ytc_UgwUpwzlt… — Humanity, across eons, has reached the technological singularity not once, but c…
- ytc_UgyUVGiri… — You use your phone. No questions asked. You take big pharmaceutical products wit…
Comment
the article says the woman was struck *immediately* after stepping off the sidewalk to cross the 4-lane road. they might as well have said the woman jumped in front of a moving car... autonomous or not, you can't baby-proof everything. there's always those humans that can't follow common sense and end up causing trouble for others.
Source: youtube · Posted: 2018-03-20T08:1… · ♥ 21
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyKdUGZOndQw1MyFZ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxifudyIYMPcGtewXl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztJbYq4fH6VepwXvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp1kPe1vCMGMh0GpJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6ZeglMCzQeQduNAp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwV8xqaS0fjaV076_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwxgPdhbN_8-yBVVwZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzdWGg9Ow5NK4X-7nF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFBknqQ1EYwuo2Ud94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxDlyE5Y3a55JJ5ExJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
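The raw response above is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and turned into an ID-to-coding lookup is shown below; the field names come from the sample output, while the function name and the shortened two-entry response string are illustrative assumptions, not part of the tool itself.

```python
import json

# Illustrative raw LLM response, matching the shape shown above
# (two entries copied from the sample, trimmed for brevity).
raw_response = '''
[
{"id":"ytc_UgyKdUGZOndQw1MyFZ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugyp1kPe1vCMGMh0GpJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
'''

# Fields every coding object is expected to carry, per the sample output.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and build an id -> coding lookup,
    skipping any entry that is missing a required field."""
    codings = {}
    for entry in json.loads(raw):
        if REQUIRED_FIELDS.issubset(entry):
            codings[entry["id"]] = {k: entry[k] for k in REQUIRED_FIELDS - {"id"}}
    return codings

lookup = index_by_id(raw_response)
print(lookup["ytc_Ugyp1kPe1vCMGMh0GpJ4AaABAg"]["responsibility"])  # -> user
```

Indexing by ID rather than list position makes the "look up by comment ID" step a single dictionary access, and the field check guards against malformed entries in the model output.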