Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't understand how people aren't able to work out that these aren't being bu…" — ytc_UgxQwfhxt…
- "I think a thing you're kind of glossing over here is the difference between the …" — ytc_UgxG7iGnZ…
- "the argument does make sence. but if ur an AI advocate and user... probobly not …" — ytc_UgwW3IQFd…
- "I don't hate AI in art. I hate those fucking companies laid off teams of employ…" — ytc_UgyBvzdKX…
- "Laugh out loud on the truth seeking idea how far has he pushed his AI to the rig…" — ytc_UgwVKgaw0…
- "Well... ehhh... I have heard the same narrative as centuries before - "the new t…" — ytc_Ugza8zGRk…
- "The algorithm brought me here from the Anti-AI trend, but I unironically love th…" — ytc_UgyYB4A8R…
- "Muslim-sounding, I would say, not foreign. First names can be classified …" — ytc_UgyliGmxL…
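The lookup-by-comment-ID flow above can be sketched as a simple index over the coded records. This is a minimal illustration, not the tool's actual implementation: the record shape mirrors the raw LLM response format shown on this page, but the IDs and values here are made up.

```python
# Minimal sketch of "look up by comment ID": index coded records by their
# "id" field. Record fields mirror the raw LLM response format on this
# page; the records themselves are illustrative, not real data.
import json

raw_response = """[
  {"id": "ytc_example1", "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}  # O(1) lookup by comment ID


def lookup(comment_id):
    """Return the coded record for a comment ID, or None if unknown."""
    return by_id.get(comment_id)


print(lookup("ytc_example2")["emotion"])  # fear
```

A dict keyed on the `id` field is enough at this scale; a real deployment with many batches would more likely keep the same index in a database table.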
Comment
We don’t need robots to give us the morals we already know we should be practicing and instead trying to make this planet a true heaven, instead for most we have created something more a kin to hell. That should be our primary goal, make a beautiful planet. There is also a reason why the phrase the devil makes use of idle hands, because it’s true. Yes it will be great having use of technology particularly in the medical field and other industries. But we need to ensure we can control it. Once it is self aware, we already know mathematically as a species we are not a positive upon this planet. Any choice AI makes won’t be using morals, or ethics it will calculate mathematically.
Even with humans in charge or it, there are bad ones just as there are good ones.
youtube · AI Moral Status · 2021-02-25T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
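A "Coding Result" table like the one above can be rendered directly from a single coded record. A minimal sketch, assuming the field names from the raw LLM responses on this page (the record values here are copied from the table, the function name is invented):

```python
# Sketch: render one coded record as the markdown "Coding Result" table
# used on this page. Field names follow the raw LLM response format.
record = {
    "responsibility": "distributed",
    "reasoning": "virtue",
    "policy": "none",
    "emotion": "mixed",
}


def to_markdown(rec, coded_at):
    """Build the two-column Dimension/Value markdown table for one record."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)


print(to_markdown(record, "2026-04-27T06:24:53"))
```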
Raw LLM Response
[
{"id":"ytc_UgzUrm5yHx2lHPMqEhp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoAzZAv7FLCz5pcV54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1gSdRJ-FnCGaswS14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6tDiQnXPjOgifGJR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyz2e_T8Jp85qdGpDh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWsPmNtCCHTrea6Rh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyyA7OA0Dp4ZuMNHGp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy2SOHR2kAWEoc92194AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyMgG0A_L3UPUYglT94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzPCWW3mizKjyda6WN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"}
]
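Since the raw response is model-generated JSON, it is worth validating before coding results are stored. The sketch below checks a batch against the value sets that actually appear on this page; these allowed values are inferred from the samples above and are almost certainly incomplete (for instance, only `"none"` is ever observed for `policy`), so treat them as assumptions, not the tool's real schema.

```python
# Hedged sketch: validate a raw LLM batch response against the coding
# dimensions shown on this page. Allowed values are inferred from the
# displayed samples and may be incomplete.
import json

ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed"},
    "reasoning": {"virtue", "deontological", "consequentialist",
                  "mixed", "unclear"},
    "policy": {"none"},  # only value observed here; likely incomplete
    "emotion": {"indifference", "mixed", "fear", "outrage"},
}


def validate(raw):
    """Return a list of problems; empty means the batch parses and every
    field holds a value observed in this page's samples."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for field, allowed in ALLOWED.items():
            value = rec.get(field)
            if value not in allowed:
                problems.append(f"record {i}: {field}={value!r} not recognised")
    return problems


good = ('[{"id": "ytc_x", "responsibility": "user", "reasoning": "virtue",'
        ' "policy": "none", "emotion": "outrage"}]')
print(validate(good))  # []
print(validate('[{"responsibility": "alien"}]'))  # lists every bad field
```

Returning a list of problems rather than raising keeps the check usable in a batch pipeline: one malformed record can be flagged and skipped without discarding the rest of the response.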