Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "She are so wisdown that she still waithing for human give her more power of know…" (ytc_UgyfLwwdY…)
- "This isn't a problem with AI, it's a problem with people. All decisions made by …" (ytc_UgwmEOPzN…)
- "Why do I never hear anyone talking about collage. It's the most comperable thing…" (ytc_Ugw3rgpoN…)
- "Even daycare workers will be affected by AI because they won’t have as many litt…" (ytc_UgwM41YdD…)
- "90% of concept art is made with photo bashing, and 90% of anime drawing on a 3D …" (ytc_UgzQTwBwF…)
- "An agi will be agentic and stateful ab initio. The dualism between llm as "groun…" (ytc_UgxM-tGCf…)
- "It this the AI that Sam Altman says will one day come up with a cure for cancer?…" (ytc_UgwrhEoZG…)
- "Well yeah, art is about self expression. When we look at someone like Van Gogh…" (ytc_Ugx01_F3k…)
Comment

> This is pretty dumb. Its unnecessary to humanize robots. We wont have to worry about giving robots rights if we dont give them emotion. The robot wont be sad if we send them to the mine if they're just programed to mine, instead of being a true AI robot with feelings.

Source: youtube · AI Moral Status · 2017-02-24T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ughl6WSLm9wCB3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgjpW_cqqeU343gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgiYhlUpCB2i23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi0N_B54KvacngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjGkGMrvCMT_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj8xpx1PUjL6XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghP6IRxjakkx3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugi2WXL0T1TMH3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjCRASqFFZCF3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiPBTwclustlXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
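A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those visible in the responses above (the real codebook may define more categories), and that every comment ID in this dataset carries the `ytc_` prefix:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop rows whose ID does not look like a YouTube comment ID.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        # Drop rows where any dimension holds an out-of-codebook value.
        if all(row.get(dim) in ok for dim, ok in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ughl6WSLm9wCB3gCoAEC","responsibility":"unclear",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Rejecting rather than repairing malformed rows keeps the stored codes trustworthy; rejected rows can simply be re-queued for another coding pass.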