Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "AI art is controversial but we can all agree that it's better than a banana tape…" (ytc_UgzWqIjI6…)
- "Not going to provide AGI, but to automate a significant amount of jobs you don''…" (ytc_UgyxD_SOU…)
- "As long as we don’t develop sentience for AI we should be fine, if we do somehow…" (ytc_UgxYeEyZm…)
- "Another brilliant interview. I do wish you would have followed up with him on h…" (ytc_Ugz1dBoxU…)
- "Heres a solution use ai it just for entertainment and boycott all ai exclusive r…" (ytc_UgwO2c-Oo…)
- "@Speaker-Beater You're exaggerating. Nobody wants AI? Seriously? You do realize…" (ytr_Ugyq-_kW7…)
- "I don’t care about the divide lol. There is a reason 20% of the jobs have a dec…" (rdc_nnudm2a)
- "👎AI dangerous for many Reason / It's not time for Risk / It's time to be secured / Wit…" (ytc_Ugx1PpLGO…)
Comment

> the thing that bothers me the most is this sh*t isn't even actual artificial intelligence, there just algorithms made by an automated peramiter adjusting system using vast quantities of information to fine tune results.
> none of this BS actually thinks in any capacity, it's just made to try and look that way. nothing but a pale imitation of real intelligence, or what actual artificial intelligence looks like.

Source: youtube · Viral AI Reaction · 2025-10-27T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
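The coded record above can be checked against the dimension vocabularies this page uses. A minimal validation sketch follows; the allowed value sets are only those observed in this section (an assumption — the actual codebook may define additional categories):

```python
# Allowed values per coding dimension, inferred from the values visible on this
# page only (assumption: the real codebook may include more categories).
CODEBOOK = {
    "responsibility": {"none", "user", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "mixed", "outrage", "fear"},
}

def invalid_fields(record: dict) -> list:
    """Return the coding dimensions whose value is not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The record from the "Coding Result" table above.
coded = {"responsibility": "developer", "reasoning": "deontological",
         "policy": "none", "emotion": "outrage"}
print(invalid_fields(coded))  # [] -> every dimension holds a known value
```

An empty list means the record passes; any unexpected or missing value surfaces by dimension name.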
Raw LLM Response
[
{"id":"ytc_Ugz-jIyY2MiCiCRt9RN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwz6oHPs16WFNlPEMd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_G1NN8fniguNjbct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDHdSv8hiVrYoGGAV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9giWbdm_3dJSU4LJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbcp7BHO7sRFX2rQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDArQGo6qiB-tpphJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMQoI2szjdIuTkcZ94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSbT0-yJqz74pMHGd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwd3Q5neYLkQ1ME3od4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
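The raw response above is a JSON array of per-comment coding records, which makes the "look up by comment ID" workflow straightforward to reproduce. A minimal sketch, assuming only the record shape shown above (the two records here are copied from the array; everything else is illustrative):

```python
import json

# Two records copied from the raw LLM response above; a real raw_response
# string would contain the full array.
raw_response = """
[
  {"id": "ytc_UgyDArQGo6qiB-tpphJ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwSbT0-yJqz74pMHGd4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index records by comment ID so any coded comment can be looked up directly.
codings = {record["id"]: record for record in json.loads(raw_response)}

record = codings["ytc_UgyDArQGo6qiB-tpphJ4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer outrage
```

Keying the parsed array by `id` is what lets the page resolve a comment ID to its coded dimensions in constant time.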