Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click an ID to inspect):

- ytc_UgxYyjjV1… — "I wonder where all the energy that this AI will need to survive is supposed to c…"
- ytc_Ugz2lRJ7K… — "The idea that AI can transmit armless Picasso's thoughts shows kind of how these…"
- ytc_Ugx9g1XGv… — "Detroit become human takes place in 2038 and I, Robot in 2035. These time frames…"
- ytc_UgxxluwU9… — "This was a fun episode and scary but as a software engineer I can tell you that …"
- ytr_UgwdBx8v7… — "@minenotyours212 You missed my whole point. First of all i meant the "if you have …"
- ytc_UgwJNXBNz… — "Edit: Everyone please look into degoogling your life. Please. Karen Hao is incre…"
- ytc_Ugxj4xWPx… — "The biggest AI impact will be on the battlefield. An army of disposable drones, …"
- ytr_Ugw_4fMhi… — "@nimrodery If you draw a tree you are copying. Even if you add changes? Same arg…"
Comment
There saying humans arent tha most ethical creatures how would they know that its gonna be a twisted sense of justice to the robots. We want wanna build them to help and solve things but its not the nature of the beast u can program anything to have feelings, if a person doesn't like life u cant program them to like it and if u did its only for so long... Naturally that person/ robot wasnt made to like life or help out
youtube · AI Moral Status · 2021-11-23T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwSO1dApmf6CEjiV314AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx6XvT2IHYS7zFwVpp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz2nVKSsO5I9hP4NnN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgylOlu0Q-ZebaH_Qwp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyNAksLalgpM0rUBOd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw6vABj1FYMygFueJJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyB74RSSHlge31wkEJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgySw_fFeSQEK7iz7s94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyF6eq_cR_siyurNl54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwF4Ry4dWzvvIsdJ414AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
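Because the model codes comments in batches, the entry for a given comment has to be matched to its comment ID. The sketch below shows one minimal way to parse such a raw response and index the codings by ID; the `index_codings` helper, the `DIMENSIONS` tuple, and the two-record sample are illustrative, not part of the tool itself, and the full set of allowed values per dimension is not shown here (only the values observed in this response).

```python
import json

# A small excerpt of a raw batch response like the one above
# (two records only, for illustration).
raw_response = '''[
  {"id": "ytc_UgwSO1dApmf6CEjiV314AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx6XvT2IHYS7zFwVpp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]'''

# The four coding dimensions that appear in every record.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(response_text):
    """Parse a raw LLM batch response and index the codings by comment ID."""
    records = json.loads(response_text)
    by_id = {}
    for rec in records:
        # Guard against malformed model output: skip any record that is
        # missing its ID or a value for one of the coding dimensions.
        if "id" not in rec or any(dim not in rec for dim in DIMENSIONS):
            continue
        by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugx6XvT2IHYS7zFwVpp4AaABAg"])
# {'responsibility': 'developer', 'reasoning': 'deontological',
#  'policy': 'unclear', 'emotion': 'fear'}
```

The ID lookup is what lets the table above ("Coding Result") be rendered for a single comment out of a ten-record batch.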