Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples

- ai doesnt need clean water it just needs water. this is probably why its allowed… (`ytc_UgwpmCxY0…`)
- Honestly, a person is slowly starting to feel that soon everyone will point out … (`ytc_Ugy5mgWy6…`)
- I would rather grab a phone from 2016 and learn how to draw shit, than to use ai… (`ytc_UgzbEbMh7…`)
- This happens every time AI models are used and only respond with "societally acc… (`ytc_Ugy2eAaWk…`)
- AI be like " here is how we can reduce crime by 58% and overpopulation by 13%… (`ytc_Ugz7LsxKq…`)
- Soon will come the day if not already the design intent for AI where simply givi… (`ytc_UgxRVhWEU…`)
- Omg this is so true. Can’t show gratitude and compassion to other humans but cha… (`ytc_UgyyVmxRA…`)
- Now talk about what you would do vs what the car is doing. That’s the flaw with … (`ytc_UgwzXUjD_…`)
Comment
I know in the beginning on comment 1 you mentioned a fundamental misunderstanding of the technology of "AI". I would like to expand on this. EVERYONE IS WRONG! What you call "AGI" (artificial general intelligence) is actually an LLM (large language model) that compiles & steals data from every corner of the internet and then combining it into images, words, videos. If we had real AI in the modern era, then Alexander The Great had an F1 fighter jet. We are not even close, and that's good, but still won't stop the slop.
Source: youtube · Viral AI Reaction · 2025-10-23T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwQ-mTQTMs7SDogeB94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZTw0_W8s89U73KtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzC3IgTJJfQTfY4uf14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxUt_ikFE186ckxQWN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxD5d87zgYp_MDe68t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxpQ1ONlv5IJr-4nDV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxN0GxGdrmUofw0GOR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxNRQDPXqmMve2yofx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwIEGsTwjsmRN5dHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwBNk2NU5A8U0acx-d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
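A minimal sketch of how a raw response like the one above could be parsed and indexed to support the by-ID lookup this page offers. This is illustrative, not the tool's actual implementation: the `index_codes` helper and the required-key check are assumptions; the field names and sample values are taken from the response shown above.

```python
import json

# Two rows copied from the raw LLM response above; in practice this string
# would be the full model output.
raw_response = """
[
  {"id": "ytc_UgwQ-mTQTMs7SDogeB94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzC3IgTJJfQTfY4uf14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw model response and index the codes by comment ID."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing {sorted(missing)}")
        index[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return index

codes = index_codes(raw_response)
print(codes["ytc_UgzC3IgTJJfQTfY4uf14AaABAg"]["emotion"])  # → outrage
```

Indexing by ID up front means each "look up by comment ID" request is a dictionary access rather than a scan of the raw response, and malformed rows fail loudly at parse time instead of surfacing as blank dimensions later.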