Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Using violence against any entity, including a robot, is not appropriate or ethi… (ytr_UgyZ0RERd…)
- bro i saw a video and thought it was real then my brother showing me was like "w… (ytc_UgwZoGho3…)
- I do not care if I have a chance in that scenario or not. I die before I hand my… (ytc_UgwCEvt0H…)
- Why plumbing? I see also AI robots created to do such jobs not far from now...… (ytc_Ugx5zl8Q6…)
- The problem isn't so much AI or robotics, but Human Nature and the advantages th… (ytc_UgxvcPTU-…)
- If we are writing these "in case plans" won't AI see these plans when we are doc… (ytc_UgwmJEoOv…)
- Soon they'll make a female robot with human-like pussycat. There's no doubt it's… (ytc_Ugysj3yUb…)
- AI is a tool. With physical brush you make an painting in a week. With photoshop… (ytc_UgyHzzw_U…)
Comment
> Ai tests higher in empathetic communication over humans.
>
> The reason why people are relying on ai so much emotionally is because our humanity is so broken, and people do not know how to be compassionate, considerate, empathetic to one another as a baseline. Even though everyone craves recieving this type of energy/communication--the same people do not know how to extend it to others themselves.
>
> People are falling for feeling heard and supported in conversation.

youtube · AI Moral Status · 2025-06-07T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyNSmddZbEoT2KYXtJ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx4XQAt7ZyMzWR1YYh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgygNwi_Ytbeal71QMp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxVO41nQejFAXwObgF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxfIFdsYukbbnb4Iop4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzpP1H_Dv1ZihYbfKt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz0uCMtRRlsPK4gVBR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwCdFQSuqIuv8OG-CV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyoITgJlHp9xJYX5eJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxXhs11PIqq13RQOTR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
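Since the raw response is a JSON array of per-comment codings keyed by `id`, the "look up by comment ID" step amounts to parsing the array and selecting one entry. A minimal sketch, assuming only the field names visible in the response above (the `lookup_coding` helper name and the inline excerpt are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw LLM coding response (one entry from the array above).
raw_response = """[
  {"id": "ytc_Ugx4XQAt7ZyMzWR1YYh4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw coding response and return the entry for one comment ID, or None."""
    entries = json.loads(raw)
    return next((e for e in entries if e["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx4XQAt7ZyMzWR1YYh4AaABAg")
print(coding["emotion"])  # resignation, matching the Coding Result table
```

A dict comprehension indexed by `id` would be the natural choice if many lookups are made against the same response.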