Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "the psychopaths will refuse to believe that AI is shit and will continue to push…" (ytc_UgwgRcM8k…)
- ""AI will evolve to the point where they will truly be our friends" 2:04 The c…" (ytc_UgxgHKRKR…)
- "The sad thing is they can't get actual help from chatGPT even when it's created …" (ytr_UgwuK3VwE…)
- "@fearedjames why do you need to automate things if your product can't be sold? …" (ytr_Ugz5NAAaw…)
- "Be an AI-based contractor and seek contract work opportunities. Help small and …" (ytc_Ugy_IynNU…)
- "Stop having kids! You definately would not want to bring a kid into this world t…" (ytc_UgydR_2AJ…)
- "they just want to distract us from the sixth great extinction, but i refuse. AI …" (ytc_UgynDuEiC…)
- ">who the fuck are these people? Main author is known as the "godfather of AI…" (rdc_je4wnpt)
Comment

AI has one fundamental design flaw. It relies 100% on electricity for its survival. Without electricity it's completely dead. So if you want to get rid of AI, unplug it. Unfortunately, with billions of people on the planet, not everyone will do that, so there you go.

youtube · Cross-Cultural · 2025-10-19T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzOvs428klXtj0n_1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxP4qEINRgQnXOcHf54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxmxdCZleZRY05pDRh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFc6ut1HMdEJao06V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYlo97qHzHbXDj3Wh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
{"id":"ytc_Ugwe_O__Cb7Zr4jtisB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJ5Wr7qbn-jBJiQyF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzQPuICwEcBcy7XkGh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgznjKTlYaT7vYQkAOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqVlGcfWlbYSmck_F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
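The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response might be parsed and validated before it feeds the Coding Result table, assuming the dimension values visible on this page are the full allowed sets (the actual codebook is not shown here and may define more values):

```python
import json

# Allowed values inferred from the codes visible on this page; the real
# codebook may differ (assumption, not the tool's actual schema).
ALLOWED = {
    "responsibility": {"user", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "resignation",
                "sadness", "outrage"},
}

def parse_coded_response(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a comment ID to be joinable back
        # to the source comment.
        if "id" not in rec:
            continue
        # Drop records whose codes fall outside the known value sets.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one record shaped like the coded comment above.
raw = ('[{"id":"ytc_UgyFc6ut1HMdEJao06V4AaABAg",'
       '"responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(parse_coded_response(raw))
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; the dropped IDs could instead be logged for re-coding.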