Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Coming back to this in 2023 in the world of actual AI doing amazing things.…" — ytc_UgxuJm9TB…
- "You can think whatever you want about AI, but that this debate is even happening…" — ytc_Ugx5cKySE…
- ""They all were bad people, trust me" - is the same "trust me bro" Facial recogni…" — ytc_UgxwDfv1K…
- "I'm not sure AI super intelligence would want to clone itself, they could end up…" — ytc_UgwRhX2ys…
- "The first problem is that the people designing and building AI are dramatically …" — ytc_UgxndZmf1…
- "14:27 I make art with multiple different movement/vision/etc. disorders, I've ha…" — ytc_Ugxtkzhxl…
- "In a few words, if a world with almost any physical -or even mental- work replac…" — ytc_UgwXxzXXw…
- "Is it a good idea. Yes. But self driving isnt perfect. You're still responsibke …" — ytc_UgyvNSaB6…
Comment

> With all that scientists are playing around with now - CERN, bacteria and virus mutations, A.I., digital I.D., digital money, data banks, and more, I don't see how we get out of this alive. Letting scientists do whatever they can think of, is not in humanity's best interest. The odds of us surviving the mad scientist train is pretty slim, and no one is stopping any of them.

youtube · AI Moral Status · 2025-12-11T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMVuUkC29JOj-hYPF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxxapv-_7_knGqv1NJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8r__RXmoLWr4OKMB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYXf55A3Z67xecnG14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvFGL35Nofs0RuVQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfPvaO4ndDNulEswF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxPQpzr-IvoLdzmn94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxh28s8Utgy7qQ4ygl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx2apf9ZMyt-qy7iNt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy26fyQ7CQ1yJqSii94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
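The raw LLM response is a JSON array with one record per comment, coded on the four dimensions shown in the table above. A minimal sketch of how such a response could be parsed and validated follows; the allowed value sets are assumptions inferred from the values visible on this page, not an authoritative schema.

```python
import json

# Allowed values per dimension -- assumed from the codes observed in the
# raw response above, not an official codebook.
ALLOWED = {
    "responsibility": {"company", "user", "developer", "government",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose values
    all fall inside the allowed sets for each dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: the record matching the coding-result table above.
raw = ('[{"id":"ytc_Ugz8r__RXmoLWr4OKMB4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(parse_codes(raw)[0]["policy"])  # -> regulate
```

Records with any out-of-vocabulary value are dropped rather than repaired, so a malformed model output surfaces as a shorter result list instead of silently corrupting the coded data.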