Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_Ugx90Orgw…`: "That's an interesting perspective! Sophia's dialogue touches on the nature of wi…"
- `ytc_UgzyyJyjp…`: "healthcare costs will never come down. They will find a way to sell the "new imp…"
- `ytc_Ugy5BNM3g…`: "It’s aaaaalmost like presenting the gospel of Jesus Christ, accountability, forg…"
- `ytc_UgzdP_QFv…`: "The biggest danger of AI is that we have became AI. We have grown to ask google …"
- `ytc_UgwB7gVF2…`: "I have NOT found an AI that was able to do that. I had assumed that AI could si…"
- `ytc_UgzYTrC2_…`: "I think that the one advantage humans have over A.I is its sense of self preserv…"
- `ytc_UgyBa4zmT…`: "All these AI companies are using, profiting from OUR information. We should be c…"
- `ytc_Ugwb-PCoN…`: "Also, as Charles said: "If you like an AI piece, just generate it yourself." I c…"
Comment
So, what are we everyday people supposed to do about this? Sit back and live out our mundane lives, waiting to be exterminated? I think that there should be some sort of call to action at the end of this video to inspire some sense of hope. Otherwise, this video is just a depressing info dump. Don’t get me wrong, I enjoyed the video and I like this channel a lot. I appreciate the work you put into it. I just wish you left your audience with something actionable to do with this information. Until the monster is unleashed fully, we still have potential to stop it. I think we need to reject AI in all forms, inform our friends and family why they should consider doing the same. There should be immense public pressure on these industry leaders to stop putting everyone’s life and livelihood at risk.
youtube · AI Moral Status · 2026-01-29T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxqOMo6V5FnMNIQ51p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0aUaCRzjEhy-QOmR4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxuB_gmkhFgaYj5b1d4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw5p12vw2hXsIHJ96l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyPpw_MOeHkjl5wE8F4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwlCyckTk45o6fQ8y94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy5bQ1PeKAC6PY_O_V4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgztB5STFMmY_e3gS3F4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugxwbm481bXUfi9rqkZ4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwkTfu5LrqV4oHXluV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
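The "look up by comment ID" step can be sketched as a small parser over the raw model output. This is a minimal illustration, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the sample ID come from the JSON shown above, while the function name `lookup_coding` is hypothetical:

```python
import json

# A raw LLM response is a JSON array of per-comment codings,
# as in the example above (truncated here to one entry).
raw = '''[
  {"id": "ytc_UgztB5STFMmY_e3gS3F4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

def lookup_coding(raw_json: str, comment_id: str):
    """Parse the model output and return the coding dict for one comment ID,
    or None if the ID is absent from the batch."""
    codings = json.loads(raw_json)
    return next((c for c in codings if c.get("id") == comment_id), None)

coding = lookup_coding(raw, "ytc_UgztB5STFMmY_e3gS3F4AaABAg")
print(coding["emotion"])  # → fear
```

The returned dict maps directly onto the "Coding Result" table: each key is a dimension and each value is the coded label.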