Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Asimov's Three Laws of Robotics:
1. A robot may not injure a human being or, thr…
ytc_UgzsqBmGP…
I've been programming professionally since 1982. Every year since there have bee…
ytc_Ugx2A7E-d…
My thoughts: We’re not doomed. The trailer with the monsters was visibly messy, …
ytc_UgxsfEBTE…
I've counseled (in a religious role) families where a suicide occurred. Humans o…
ytc_Ugx60U4qL…
In the first if real life people didn’t create art and photographs then AI would…
ytc_UgyYruMzf…
Flock is currently(2026) deploying facial recognition cameras as well.
IR block…
ytc_Ugx9dp5F8…
Here’s what I heard. All of the Hollywood sci-fi robot movies like The Terminato…
ytc_Ugy9myUk7…
scary. imagine if someone is dead and a AI is made of him/her. will be so creepy…
ytc_UgwcmhRZi…
Comment
This is a very dumb and unrealistic take. First of all, if AI can improve past human intelligence, which it cannot, you are just making assumptions, people would migrate and oversaturate jobs such as dropshipping, online stores, they would start digital marketing companies, using AI of course, but most importantly, they would in lack of something else, oversaturate jobs like cleaner, carer, social worker, etc. Once these are filled, they would reconvert to nurses, psyhcologists, etc. A lot of them would transition to Youtube, etc. But AI has a ceiling which it is rapidly approaching, which is data. AI is trained on a TON of data. And it has a LOT because we have been hoarding it for a long time. But it is almost done, which means, at some point the AI will just become a part of society. We will integrate it.
youtube
Viral AI Reaction
2025-11-22T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxcrIvXEk1yOdgo-U14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwv1OXSDP6H8B7TPop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgxwWC3jMRA5szL_l3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxjENX6Rh3x7ehj2u94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3CQg8NEmIV654p8N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyhq_NBSp5LvQS3DsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_ApDXYs7hCrwaSt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1Cxwn2KKBtoMXOhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCGy87yPcJvS24Jfd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWORiv6Kr25Gs_2fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
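Downstream, a response like the one above can be parsed and checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from this one sample response, so the real codebook may define more categories, and the function name is illustrative rather than part of the tool shown here.

```python
import json

# Allowed values inferred from the sample response above; the full
# codebook may define additional categories (assumption, not confirmed).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"consequentialist", "contractualist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "skepticism", "resignation",
                "outrage", "fear", "approval"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        # Comment IDs in this dataset appear to share the "ytc_" prefix.
        if not (isinstance(cid, str) and cid.startswith("ytc_")):
            raise ValueError(f"bad comment id: {cid!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_UgxcrIvXEk1yOdgo-U14AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = validate_coding(raw)
print(coded["ytc_UgxcrIvXEk1yOdgo-U14AaABAg"]["emotion"])  # indifference
```

Failing fast on an unexpected category is deliberate: it surfaces schema drift in the model output (a new or misspelled label) at ingestion time instead of silently contaminating the coded dataset.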