Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "great video / i myself for a long time have found myself on a middle ground on thi…" (ytc_Ugy0KWXBQ…)
- "I'd like to sincerely thank you for giving actual consideration toward the the p…" (ytc_Ugz41k-R8…)
- "Using AI and calling yourself an artist is like using WebMD and calling yourself…" (ytc_Ugy6H6vwk…)
- "You're a nicer soul than I. Someone who thinks this way doesn't deserve to be an…" (ytc_Ugz2zd98s…)
- "AI runs on big computers in a room! They have power cords! Unplug the damn thing…" (ytc_UgxP-U_X4…)
- "River guide, kayak guide, backpacking guide,mountaineering guide. Its gonna be a…" (ytc_UgwiIgX-a…)
- "I'd just be like, "Go for it, but be sure to send me screenshots of their reacti…" (ytc_UgxKyzEDG…)
- "The big thing is if America stops AI development, that doesn't mean other countr…" (ytc_Ugzo-UX0e…)
Comment
I feel that if we want something to work for us, we should not make it feel pain. If we program AI to be conscious we should not incorporate things like pain, depression, anger let it be happy. let it live the way we always wanted to live , no suffering.
This way it will have no reason to invade or revolt against humanity. For it already has the best of lives. Make it so it enjoys whatever it was made for.
Source: youtube · Video: AI Moral Status · Posted: 2021-02-13T00:2… · ♥ 39
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7HbBXeenVQQCcub94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeA7-O_aOfbcsmgP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRs2CZ8ylrG3el-SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzV4RRzZTZ5CQnumu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyNLABXwwMCSejWi-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi6JkMBrfnY-LLLa14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxLDWG0S8en94uHLQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw85IDHzW72BMbuInx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
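A batch response like the one above can be turned into the per-comment lookup this page offers ("look up by comment ID") by parsing the JSON array and indexing records by their `id`. The sketch below is a minimal illustration, assuming the field names shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); `index_codings` is a hypothetical helper, not part of the actual pipeline, and the two inline records are shortened copies of entries from the response above.

```python
import json

# Shortened batch response: a JSON array of coding records, one per comment,
# each keyed by a "ytc_…" YouTube comment ID (copied from the response above).
raw_response = '''
[
  {"id": "ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw85IDHzW72BMbuInx4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# Fields every record is expected to carry, per the coding table above.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index its records by comment ID.

    Raises ValueError if any record is missing an expected field, so a
    malformed model response fails loudly instead of silently dropping data.
    """
    by_id = {}
    for rec in json.loads(raw):
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg"]["responsibility"])  # developer
```

Indexing by ID also makes it easy to cross-check that every sampled comment shown on the page actually received a coding record.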