Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Why do we humans always go to the worst case scenario? Why can't we live in bala…" (ytc_Ugzgrq-7O…)
- "Shows and movies are so badly written nowadays that AI might as well take over, …" (ytr_UgwsylluL…)
- "The second girl was iconic for fighting with chat bot years that is so real…" (ytc_Ugw7Fmfnh…)
- "I think I know the blah blah blah blah blahh for the reason AI and Blah…" (ytc_UgzPxMu7b…)
- "If people continue to rely on AI like this for the police force, more of these c…" (ytr_UgyQFqZmZ…)
- "The AI responses in this film don't seem to have been recorded live (clean sound…" (ytc_UgxstZBEW…)
- "I think that AI will develop a new language that only AI can understand. This sa…" (ytc_UgyRNM7S-…)
- @TotallyNOTWordGirl: "inspiration is when you see little details in someone else…" (ytr_UgxBOyhVh…)
Comment
Guys, just watch scifi. The future of AI has long been predicted and its not guess work. You can NOT control something that knows MORE than you and DOES MORE than you. Pantheon, the animatrix, the terminator prequels, the dune prequels etc. Just sit and watch scifi...Pantheon does an amazing job of just explaining everything we are seeing today..AI escaping boxes..including governments fighting to own their own AIs etc. The only main constraint preventing AI from becoming a major issue today is continuous self learning. The moment a model can self learn continuously...NO SANDBOX WILL BE ABLE TO CONTAIN IT INDEFINTELY.
youtube · AI Moral Status · 2026-02-28T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyLhoiOhgroV7u43s94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0EX4_OR31PUEk3Pt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyfFXX0LFQekUjYkOd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyl2QqbcZ2zdvoi-wZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8DjDYdy1OdX5nWuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwK_A8wx7TFptnXRM94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZgDmRlFlaC0C3MF14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzDCYoIL3QMOlj6TlJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxbPzinpOXrdHmcYKZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqbTcKIcrWwqATJeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```