Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
- "Bro you're livin' in 2025 How r u saying this? there's soo much Ai art everywher…" (`ytr_UgympXp9d…`)
- "My question to bing, "can AI survive without electricity?" ~~ Its answer, "AI re…" (`ytc_Ugw5eTYiG…`)
- "I'm not convinced. I don't see how licensing the training only is enough. One ti…" (`ytc_UgyEKXU5Y…`)
- "I'd be most worried about anyone who is gullible enough to buy into this bull! S…" (`ytc_UgwWrbh28…`)
- "as a layman....if AI autopilot of a plane does not allow human over ride....this…" (`ytc_UgwH66oJM…`)
- "@techpriestalex8730 even so, art is ever evolving, there is not a single artist …" (`ytr_UgxGXAJsQ…`)
- "Robot: "This is my rifle, there are many like it but this one is mine"…" (`ytc_UgxxGpen3…`)
- "Chicago's educational system sucks anyway…not that the students' parents are any…" (`ytc_UgxiJwUG5…`)
Comment

> jeez these individuals creating this are so detched from humanity its just rediculous that being humane is not a criteria for something dangerous. this chap knows exactly what AI is caable of and when which was prob along time ago where it surpassed the human ability. the absolute best way to regulate all this crap is to stop when it becomes bad for humanity. there is nothing smart about creating something that can destory all of humanity its clearly perilous. these tech turds have shiz for brains and so clearly the very reason they are in these positions as the puppeters plan is to extinguish everyone eles that is not them. once the humans become uncontrollable they need to reset

youtube · Cross-Cultural · 2026-02-11T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyeo0kCtlPBTG6XdFR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBcizrkEknTMM2RZ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwwwq9Ehajp5rIBO6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4MIEYggDhXUN2F_V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwuda5GaNwSKZ2agHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzrbK6gIoQ_PW9uSKJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxEP4o8z8dtx6TsFEt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxqlxMEmKIi9q8Q5S14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyyq65D10QBtonyn0h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxtcWxfebi9nQICwIx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
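A batch response in this shape is straightforward to load and sanity-check before merging it into the coded dataset. The sketch below is a minimal Python example, not the tool's actual code: the required keys come from the Coding Result table above, but the helper name `parse_llm_response`, the short sample IDs (`ytc_A`, `ytc_B`), and the choice to reject rows with missing dimensions are all assumptions for illustration.

```python
import json

# The four coding dimensions plus the comment ID, as listed in the
# Coding Result table above. (The full codebook of allowed values is
# not shown on this page, so only key presence is checked here.)
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_llm_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID, rejecting malformed rows."""
    records = json.loads(raw)
    coded = {}
    for row in records:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} missing {missing}")
        # Store the four dimensions under the comment's ID.
        coded[row["id"]] = {k: row[k] for k in REQUIRED_KEYS if k != "id"}
    return coded


# Hypothetical two-row response in the same shape as the batch above.
raw = """[
  {"id":"ytc_A","responsibility":"developer","reasoning":"deontological",
   "policy":"regulate","emotion":"outrage"},
  {"id":"ytc_B","responsibility":"none","reasoning":"consequentialist",
   "policy":"none","emotion":"fear"}
]"""
coded = parse_llm_response(raw)
```

Keying by comment ID is what makes the "look up by comment ID" view above cheap: `coded["ytc_A"]["policy"]` retrieves a single coded dimension without rescanning the batch.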