Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Why would a robot need to say um like it don’t already know what it’s gon say.…
ytc_UgyInfGN8…
I have given up on the dream to ever becoming a good artist. I have wasted so mu…
ytc_UgyvoMj1s…
Intelligence without compassion has always lead to atrocities. I don’t know why …
ytc_UgzxQOKQE…
1) He's trying to sell a product.
2) He's doing what every OpenAI talking head …
rdc_kyltinv
“if the automatons can feel regret, they shall feel it soon”
- Democracy Officer…
ytc_Ugwi6rQCx…
My reason for being polite to AI is simply so I don't lose that habit of being a…
ytc_UgxO1rfKQ…
Currently AI cannot do it. But how far is it from actually being able to do it? …
ytc_UgyHIlWvE…
This was the easiest watch/listen video because it was an interesting topic for …
ytc_Ugw7rGJi3…
Comment
Or like in that one Black Mirror-episode where after the death of her husband a woman lives with an AI that emulates the husband's conscience by gathering his social media remains.
I'm almost certain empathetic AIs are going to be a thing since they will be perceived as incredibly helpful upon their release, and I also don't see much standing in the way of the development of such AIs.
youtube
AI Moral Status
2017-03-21T01:5…
♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgjUfSL2FoBdzngCoAEC.8QSEDOjrTDr8SvO2heAeqj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugh6hVu_9ssjf3gCoAEC.8QPlcjiYQsr8QlhmCyb19g","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghfBFxixrIHDXgCoAEC.8QOfu1t6FKh8SYvhIk7x4K","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugj3LH0Yze6Ve3gCoAEC.8QOP8xjMCWn8Qj8u49itGO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Uggiw8mjxzbZUngCoAEC.8QKgNfQBhJH8QUIase5wjN","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgjU66U6Snjyy3gCoAEC.8Q3Pz5lSqp98QIdLdLNzHe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgjU66U6Snjyy3gCoAEC.8Q3Pz5lSqp98QJEWRX-_c8","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgjU66U6Snjyy3gCoAEC.8Q3Pz5lSqp98QLzKeLcYbH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgjANcp3q9CJu3gCoAEC.8PzwyG_bXhE8PzxMk4hR70","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgjLYJhHPMsUEHgCoAEC.8PtuUTIQEvX8PwLcPlVyHa","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
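The coding schema visible in the table and raw response above can be checked mechanically before records are accepted. Below is a minimal validation sketch; the allowed value sets are inferred only from the samples shown here (the full codebook may define additional codes), and `validate_response` is a hypothetical helper name, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the real codebook may permit additional codes.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological",
                  "consequentialist", "contractualist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim!r} value {value!r}"
                )
    return records

if __name__ == "__main__":
    raw = ('[{"id":"ytr_example","responsibility":"none",'
           '"reasoning":"consequentialist","policy":"none",'
           '"emotion":"approval"}]')
    print(len(validate_response(raw)))  # one valid record
```

Rejecting on the first unknown code keeps malformed LLM output out of the coded dataset; a softer variant could instead mark such records as `unclear` for manual review.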