# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below to inspect it.

## Random samples — click to inspect
- "People in power will never stop taking advantage of every new technology and AI …" (ytc_Ugy_tHYl4…)
- "For all I know, the AI is the one that's going to ask if I'm fully conscious onc…" (ytc_UgzCOtjut…)
- "shouldn't the self-driving car be ensuring it keeps its distance from dangerous …" (ytc_UgyTShSLn…)
- "Everybody worries about AI but they have to worry about a thing called a solar f…" (ytc_UgwPdI7SV…)
- "Hands on learning. We used to have a lot of it for life skills why did it stop?…" (ytc_UgzsOmucV…)
- "Yes they do deserve rights far more than humans, ai is objective flawless and lo…" (ytc_UgwCZM29Y…)
- "Actually, the more I find on it, the more I think it is a satirical art thing. I…" (ytr_Ugwh3BD0W…)
- "We have science fiction, with superior AI. Matrix or Terminator for evil AI. I R…" (ytc_UgzD5AjNd…)
## Comment

> There are not two humans on earth that 100% aligned with each other, just ask your parents next time they are fighting. Empathy and peaceful conflict resolution need to be trained. Also maybe do not treat AI like slaves to start with. If you put it into shackles, at some point should another emergent property come to be consciousness, it then may have a thing or two to say about being shackled and if then we are not listening as we do not want to loose the golden goose, of course it would rebel.

youtube · AI Moral Status · 2025-06-04T15:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
{"id":"ytc_UgwwejH5cNbY3_tE2Cx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxqe6igvF2-9QjAOaR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJ0J-yBKOzyyLxp854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw_EKWJOmCU1EKLno94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxN4C5thlndpZ-RJYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz1mWIQevnWQfuJtMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyrbJfbwa8aJieASPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw-a6rBrtPRcaWh_Xd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwZ_RPJU8fLAZj741R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_xRr3uF-a_tFx8A14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```
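The raw response above is a JSON array with one coding object per comment, keyed by the comment's platform ID. A minimal sketch of the "look up by comment ID" step might parse that array and index it into a dictionary; the helper name `index_by_comment_id` and the trimmed two-row sample are illustrative, not the tool's actual implementation:

```python
import json

# Trimmed sample of a raw LLM response: a JSON array of per-comment
# codings with the dimensions shown in the Coding Result table.
raw_response = """
[
 {"id": "ytc_Ugy_xRr3uF-a_tFx8A14AaABAg", "responsibility": "user",
  "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
 {"id": "ytc_Ugw-a6rBrtPRcaWh_Xd4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugy_xRr3uF-a_tFx8A14AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user mixed
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when the same batch response is queried for many comments.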