Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This is programming. It's not a malfunction. This robot has been programmed to a…
ytc_UgwjulxhK…
oMg ThIS sEnT cHiLLs doWn mY sPiNe i CaNt BeLieVe wE lEt Ai TaKe iT tHiS faR iTs…
ytc_UgzEmAkLk…
@BrotherNuh 🤦♂️
Be patient and try to understand that: What AI "really" doe…
ytr_Ugy-upBKE…
@OliviaK_-ts7ul i wont have that problem pal. i have about 7 years left to live.…
ytr_Ugxeh1PZt…
I hate humans but AI is flawed, until it is perfected and actually HELPS, then a…
ytc_Ugw43gWbk…
I would say that AI should not be used for MRI scans or as content moderator, as…
ytc_UgxpfaRKi…
1-day robot will replace us when the time is come what will happen to us.❤❤❤…
ytc_UgyIPQ9qy…
I am not an artist or anything. I am a disabled autistic person. Google offered …
ytc_UgznerxUH…
Comment
I believe we should bake in quality to the AI. Look at how the United States Army Drill Sergeant Academy molds young people into soldiers and citizens that contribute to the country. One reason I am patient with young people is because these values are hard to impart.
Another track would be Critical Thinking, something we need more of in this online world. Critical Thinking would be a great AI marketing scheme as well.
Alternatively, if you are candling your brain with micro doses of Ketamine, and in charge of training the AI, you are headed for a storm.
youtube
AI Moral Status
2026-03-02T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwGa3XRkMYeYoRCgHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyD527ItMQH-D4pYGt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFHtuA0IijlC_-04N4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyTGHJyu2SQixoM5EN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgGY37RfV1dRRTuxd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJAO5tZ5U2Pxx-o8F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx_oPNfNqyGY5GnecR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzDiMPNiM9fhLFy1Qd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytwZPaQQRo3vxRgdh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwnMBo4cl1TGXoeuAN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]