Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Yep, in an effort to remove white people from searches (you literally could not …" (ytc_Ugzpk9UeX…)
- "I am an artist, and I support AI art, because you can always just say that your …" (ytc_UgyYG39mx…)
- "Honestly, I think telegram could be banned, yes it's helpful in warzones but it'…" (ytc_UgzSEIA9Q…)
- "While the conversation in the video does not directly address whether robots dri…" (ytr_UgzMKgz4g…)
- "It's weird how people are so attached to the premise that AI is... a person. A p…" (ytc_UgyMqVwpW…)
- "So what we need is a fat incel internet troll teasing Google A.I, for sure it wo…" (ytc_Ugx1S8k1Y…)
- "The thing about superintelligence is, by definition it needs to be smarter than …" (ytc_UgwqEcV4Q…)
- "If AI wasn't essentially shooting a flaming arrow into the amazon or dumping a c…" (ytc_UgwHJPO35…)
Comment
Stupid is stupid does. Who the heck do you think wrote the code for AI platforms? Cutting AI loose to just go is like letting a toddler drive a car. Executives are like crows, except crows are smarter. They like sparkly things and are consumed with making change for the sake of change without fully understanding things. As a retired IT Director, I saw it all the time. People come up with “awesome” ideas, move to implement, get promoted and transitioned into a new role, and the next guy has to clean up the mess. AI will be a mess unless people slow down and think it through.
Source: youtube · AI Jobs · 2026-02-06T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOAGYJqQJZNXiOqoZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzXVKeBSdHAzqhF0LJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwK7_ixZnc95ZBySAF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzL2OsFjkgsgyWLlMV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxTso8uwltMTMJE8Ct4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw86jtv3GeyQ6cLah54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxs2wBy4SwHgETWNhF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgygJMC0qAmr_mwoajZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygVW9ZjADTypr5Dv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwrasYhq_7QB2Uq7zV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
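Responses in this shape can be consumed downstream by parsing the JSON array and checking each row against the codebook before storing it. The sketch below is a minimal example of that step; the `SCHEMA` sets are inferred only from the category values visible in the sample above and in the coding-result table, so the real codebook may allow more values, and `parse_coding_response` is a hypothetical helper name, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption: the full codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it is a dict with an "id" field and every
    dimension in SCHEMA holds one of that dimension's allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Example with one valid row (hypothetical comment ID).
sample = ('[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",'
          '"policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(sample))
```

Rows that fail validation are silently dropped here; a production coder would more likely log them and re-queue the comment IDs for another coding pass.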