Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I use Grammarly because grammar is not my strong point, and the Opus AI video to…" (ytc_UgxF9braU…)
- "Opened up the cage without shutting of the robot, pretty much as dangerous as sh…" (ytc_UgxmUBGEG…)
- "How pathetic, im across the pond and find it insane how these officers have such…" (ytc_UgzsQG3g2…)
- "@rosslischka9337 why if you only give the current condition there is no way that…" (ytr_UgwZVKIdt…)
- "Great video! Ai will definitely be almost everywhere. Its potential is crazy and…" (ytc_Ugzsbx25S…)
- "People are going after this guy that clearly states that his art is ai and not s…" (ytc_Ugy0E9-ef…)
- "Someone needs to unplug this moron (musk)... No power, no tesla, no datacenter, …" (ytc_Ugw4Ecaxw…)
- "@Spellbound_Roseclearly if I see myself in that generated ai I wouldn’t believe …" (ytr_UgxVdguGX…)
Comment
The fear of AI "taking over" misses a key point. A truly intelligent AI would depend on a thriving human society with smart people, innovation, and stable systems to ensure its own long-term success. Replacing all human labor might seem efficient, but it risks societal decline, loss of purpose, and instability. A rational AI might instead choose, by refusing to deliver top-quality results, to preserve or create meaningful work for humans, not out of mercy but because a healthy and engaged population is in its best interest. I would only worry about AI in the government and weapons industries, not so much anywhere else.
youtube · Cross-Cultural · 2025-09-28T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
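
For working with these records programmatically, here is a minimal sketch of the record shape, assuming the four dimensions and timestamp shown in the table above. The class name, field types, and example values are illustrative, not the tool's actual data model:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative sketch only: the field names mirror the table above, but
# the class name and types are assumptions, not the tool's actual schema.
@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # e.g. "ai_itself", "company", "developer", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "virtue"
    policy: str          # e.g. "none", "regulate", "liability"
    emotion: str         # e.g. "fear", "outrage", "resignation"
    coded_at: datetime   # e.g. 2026-04-27T06:24:53.388235
```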
Raw LLM Response
```json
[
{"id":"ytc_UgxuEiLrUKOUE2CobdJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_FAjPWOuznrioQFB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwcwH8pgK00fHY72Q14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwsw0cUE-zPNlqJiEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzOfd57uH9O6rwPqP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHbMr3Jr8TvhJDCFR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKaMYL8uH3ByN3dYV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8onVJj6qUdtiVyCp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOM7I43dccySBISzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAZQVCPoTo5olUjtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
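
To support the ID lookup described at the top of this page, one might parse a raw batch response like the array above and index it by comment ID. This is a minimal sketch under assumptions: the file name `raw_llm_response.json` and the helper `index_codings` are hypothetical, not part of the actual tool.

```python
import json

# Hypothetical helper: parse a raw batch response like the array above and
# index the records by comment ID. The file name and function name are
# illustrative assumptions, not part of the actual tool.
def index_codings(raw_response: str) -> dict[str, dict]:
    """Map each comment ID to its coded dimensions."""
    return {record["id"]: record for record in json.loads(raw_response)}

raw = open("raw_llm_response.json").read()  # e.g. the array shown above
codings = index_codings(raw)
coding = codings["ytc_Ugwsw0cUE-zPNlqJiEl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # -> ai_itself fear
```

Run against the array above, the lookup for `ytc_Ugwsw0cUE-zPNlqJiEl4AaABAg` returns the same values shown in the Coding Result table (`ai_itself`, `fear`).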