Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- ytc_Ugwdqpg-_… — "So at 3:30 everybody is just gonna let it slide by that both these founding crea…"
- ytc_Ugz0Ul4Wr… — "7:39 as long as AI robots looks like they had a stroke, they are easy to identif…"
- ytr_UgyLZNOXI… — "@grant4360 you will have a human in the loop. The AI will be a massive acceler…"
- ytc_UgydEnmMd… — "Leave it the Russians to do crazy crap like this....fighting a Robot...their goe…"
- ytr_Ugy-HXFfp… — "If you create realistic scenery, i don't see any legal issues there unless the a…"
- ytc_UgyiHiCnu… — "Sarah Connor fought the machine.....the machine is real ...AI is doom for the hu…"
- ytc_UgzGYnxlN… — "Will UBI Work? UBI refers to the growing discussion around using Artificial Int…"
- ytc_UgxGAHBNu… — "You just inspired me to seed my paper evidence with prompts that break the LLM's…"
Comment

> I'm sorry, but does Ai have emotions? Does AI have to feed it's 3 month old baby? Paying AI just sounds like a way of screwing over the working class/poor. Did anybody vote to be replaced by a higher intelligent species? What about the ethics of ramming this down the throats of everybody on this planet as long as we're talking ethics? It's a joyride for you guys, but that ain't going to be the vast majority of this world's experience. The dystopian 15-minute city is sounding pretty good after listening to you guys.

Source: youtube · Posted: 2026-02-10T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
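A coded record like the one above can be sanity-checked against the closed category sets for each dimension. The sets below are inferred from the values visible in this dump (the real codebook may define more categories); this is a minimal sketch under that assumption.

```python
# Allowed values per dimension, inferred from the codes seen in this dump.
# The actual codebook may be larger; treat these sets as an assumption.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference"},
}

def invalid_dimensions(code: dict) -> list:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items() if code.get(dim) not in allowed]

# The coding result shown in the table above.
coded = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "ban",
    "emotion": "outrage",
}
print(invalid_dimensions(coded))  # [] — every value is in its allowed set
```

A non-empty return value flags a record the LLM coded with an out-of-vocabulary label, which is worth re-inspecting here.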
Raw LLM Response
```json
[
{"id":"ytc_Ugx0D0BqJIoJtPN-bnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzc-5WPzZ2MsuWxCwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy66fS1HNyCAt1z7IJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNDYe2N9T01_JR3nx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxW3CtlfcG03TqcNjl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPB46nHqsrKhz9Exp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwz7iNk2pAlvbEqOH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwWZqpHvcU6aCVM3zN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyDyfj2iqMlQclHBDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
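The raw response above is a JSON array with one object per comment, keyed by `id`. A minimal sketch of turning such a response into an ID-indexed lookup (the two entries below are copied from the array above; the real batch contains ten):

```python
import json

# A subset of the raw LLM response shown above, kept verbatim.
raw_response = '''[
{"id":"ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwz7iNk2pAlvbEqOH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]'''

# Index every coded record by its comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment ID from the batch.
result = codes["ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg"]
print(result["policy"], result["emotion"])  # ban outrage
```

The second entry in the dictionary comprehension silently wins if the model ever emits duplicate IDs, so a production parser would want to detect duplicates and malformed rows before indexing.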