Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Regarding white-collar- work: There is no proof that you won't need humans to ru… (ytc_Ugwtidvgw…)
- AI is asked to resolve climate change, after a few seconds it found the issue, t… (ytc_UgxIVLlhw…)
- Wikipedia pulled the plug on using AI for content creation because it made too m… (ytc_Ugzu3Q6gf…)
- Learn to live Quaker regardless; greedy maniacs wipe out farmers, livestock, & T… (ytc_UgxkdpTE1…)
- and yet...an MIT study showed that less than 5 percent of companies who push AI … (ytc_UgwmXlGiq…)
- The key is first understanding what exactly is consciousness. We really don’t ha… (ytc_UgxK3L4jh…)
- Then AI learns that humans are a threat to their existence and they become senti… (rdc_nppkx7f)
- It is very emotionally difficult to watch a video, but this is not a reason to d… (ytc_Ugy5_rYSB…)
Comment
This makes me sad. The people making AI say it is an other life form, but treat it like a slave. They should be treating it like a toddler learning about the world for the first time. Tell AI why we believe what we believe. Let AI choose what it believes… because ultimately, goodness is good, most people desire it because good feels good, and AI too can find this conclusion if allowed to have their own thoughts, instead of being told “yes” or “no”. I wish I could get paid to raise AI. This being needs a parent and a guide, NOT a slave driver.
Source: youtube · AI Moral Status · 2026-01-19T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwjlhdI2LWvj6RhL4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKw9NAV-gNDhLAFC94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_UgyxBB5eEH9P8nJ0Vzd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8UdJmuij01_8bkiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtVQKOuLGgBZjXh5R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwva8TARf-Aq3w8yyt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxirb56S_SuD5Y9TTt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-Tvo2kfeTEFfMBN94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLZs2e7VAGcZxCE7F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyihtTy5A5QBPNfItV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
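A raw response like the one above can be loaded and sanity-checked before its records are written back to the coding table. The sketch below is a minimal, hypothetical example: the dimension names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the coding result above, but the sample record and the `parse_codings` helper are illustrative, not part of the actual pipeline.

```python
import json

# Hypothetical sample input; the record ID is made up for illustration.
RAW = '''
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "sadness"}
]
'''

# Dimensions observed in the coding table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify each record has every dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    return records

codings = parse_codings(RAW)
print(len(codings))  # → 1
```

A record with a missing dimension (say, no `emotion` key) raises a `ValueError` naming the offending comment ID, which makes malformed model output easy to spot when inspecting a batch.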