Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I set same rules chatgpt it said me " to ready "Rigveda chapter 7 to be safe" an… (ytc_Ugy3jkHZZ…)
- Is no one going to talk about the butter passing robot From Rick and Morty… (ytc_UgjdaVMSD…)
- Customer; Well fuck you buddy! A.i voice. Well fuck you. Buddy. With respect, of… (ytc_Ugx1G2v5n…)
- At this rate we should destroy these general intelligence AI’s now before this g… (ytc_UgyVG-74W…)
- I think it's completely sensible what he says, that AI development needs regulat… (ytr_Ugy2h6jlS…)
- Art; The conscious use of the imagination in the production of objects intended … (ytc_Ugwsiaakt…)
- If this was a real robot, I'm assuming it has special sensors in the mid body an… (ytc_UgxWdxpxG…)
- Calling artists gatekeepers for not liking AI is like bringing a fist to a gunfi… (ytc_UgyOzMrcG…)
Comment

> You train AI on the history of humanity, your gonna get a much more intelligent human like machine. No matter how optimistic or how much potential you think humans have to be good, altruistic, or "moral" the fact is we are all just animals, and all animals require the death and suffering of other living things to survive. Super intelligent AI will be our demise if it is not controlled or shown respect as a new form of life.

Source: youtube · Posted: 2025-12-16T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
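A coding result like the one above can be sanity-checked against the value sets that actually appear in the raw responses on this page. The sketch below is illustrative: the `OBSERVED_VALUES` sets contain only the values visible here, and the real codebook may define more categories.

```python
# Values observed in the raw responses on this page; the actual codebook
# may allow more categories than these (assumption, not a specification).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed sets."""
    return [
        (dim, record.get(dim))
        for dim, allowed in OBSERVED_VALUES.items()
        if record.get(dim) not in allowed
    ]

# The coding result shown in the table above, as a record:
record = {"id": "ytc_UgzY00DiOJQWvveUEQ14AaABAg", "responsibility": "ai_itself",
          "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
print(validate(record))  # []
```

An empty list means every dimension carries a value seen elsewhere in the data; an unexpected value (say, a typo from the model) would surface as a `(dimension, value)` pair.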
Raw LLM Response
```json
[
  {"id": "ytc_UgzgL22A9kQ3ctrrnsB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzY00DiOJQWvveUEQ14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwpd-MveNy1wM2YX4F4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugwm2IEXVK39XEFOCVJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyOfJgvwJrvMMdqaYN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwDtn1Z-_CNnuw5GCF4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxzqxsOjjhvLYGok2B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyFS4VNuPSRMvLY_wF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz17Lulas9Y7gePERJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzdXEgAUjVPY09JQQF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
```
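The raw response is a JSON array with one record per comment, which makes the "look up by comment ID" view above straightforward to implement: parse the array and index it by the `id` field. A minimal sketch, assuming the field names shown in the response; `index_by_id` is an illustrative helper name, not part of any real tool.

```python
import json

# A two-record excerpt of a raw model response in the format shown above.
raw_response = '''
[
  {"id": "ytc_UgzY00DiOJQWvveUEQ14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwDtn1Z-_CNnuw5GCF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzY00DiOJQWvveUEQ14AaABAg"]["emotion"])  # fear
```

Because IDs are unique per comment, the resulting dict gives constant-time lookup from a comment ID to its full set of coded dimensions.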