Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I agree most AI art is low-effort, but ignoring prompt engineering as a tool and…" — ytc_UgyRaU1Gj…
- "Until ai learns how to sell Crack my job is safe from being taking over…" — ytc_UgxOqGfGt…
- "Want to excel as an AI Governance and Data Integrity professional?🚀Access https:…" — ytc_UgzAhrSdO…
- "I’m in art school rn and as Sam said, sometimes it does feel hopeless when I thi…" — ytc_UgwE578JD…
- "so the A.I isn't been untrue about itself .so you ask the AI program a question …" — ytc_UgydoWLL9…
- "Naw, the robot was following its pre-assigned path. It's in an area enclosed wi…" — ytr_UgyB_4KKA…
- "AI fear narratives are preemptive moral laundering mechanisms designed to absorb…" — ytc_UgymgpNm-…
- "I believe if this is what the future will be, it should be MANDATORY to place AI…" — ytc_UgxhS1zeR…
Comment
> Hi everyone, I understand that the robot bring us several benefits to us and I am sure that the robots going to help us but we have to see the two coin's face the another side it's that robots going destroy us, imagine that the robot develop all the abilities in the future the robot is able to govern the world and destroy us, I think that we have to descontinue making robots, we humans beings face to destroy the world we are in time to not make more robots, it will be better create team without intelligence.
youtube · AI Moral Status · 2021-10-16T01:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxR6EEj2WorNgr_LcR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAr2OQHDC27ifIKvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzZoa20Czo7oqXKROt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz0kPodQ4zBqzUIf0h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwT10LuuyrA-iNhrkp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugye3s5pegHi5qsm2ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwVzybAmuocOlnf0Qx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgybKxYu5WCKcvS8q3h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzsXDEAgHgB28i1Aa94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy1ozF2zSK9nLkf18Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
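A response like the one above is only usable if every entry parses and every dimension carries a known code. Below is a minimal validation sketch; the allowed value sets are inferred from the codes visible in this page's samples, not from the project's actual codebook, and `validate_codings` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the visible samples.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "mixed",
                "indifference", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    valid = []
    for entry in json.loads(raw):
        # Drop anything that is not an object with a comment ID.
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # Keep the entry only if every dimension holds a known code.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Entries that fail validation can then be queued for a retry prompt or manual coding rather than silently entering the dataset.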