Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> We've already reached peak AI optimization, stop trying to glorify an auto-correct engine as if it was an actual adaptive organism, AI has struggled to get better for a while now, the reason being that AI is trying to Train AI and data is become increasingly less reliable as Hallucinations feed Hallucinations, Models can still be improved with some manual oversight, but not by much more. AI is already an incredible tool, but Executives at big companies keep jacking off at the thought of infinite growth through an autocorrect engine telling them what to do.

Platform: youtube · Video: Viral AI Reaction · Posted: 2025-11-23T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy4CxAbpuvVFO03H3d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxhTjph16BvofA8jJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwyO3JrqVqu9fHDnR54AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzlVilpbIDBMwwnZG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfioCEoLC7MDKa1KF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwfX8-K7hPf-O8ce2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzAbc2hEbdEsby_eAJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy3vu0GauOiXZ-q04Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxCzb1tIU70HnxL1e94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZbHxzPMRGPsoURQh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
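The raw response above is a JSON array mapping each comment ID to the four coding dimensions shown in the result table. Before storing such output, it is worth validating that each row parses and uses only known category values. The sketch below illustrates one way to do that; the allowed values are inferred from the sample output on this page and are an assumption, not the full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "government", "company",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "resignation", "indifference", "outrage", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows missing the comment ID
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical example row in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(len(validate_codes(raw)))  # 1
```

Rows that fail validation can then be queued for re-coding rather than silently written to the results table.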