Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The majority of AI researchers (academics and engineers) say there is a signific…
ytr_UgwdHxOW1…
Idunno how much of this is real or not, but if you think about predictive text a…
ytr_Ugx3UbfEt…
23:25 AI development would need to be regulated the same way drug development is…
ytc_UgzWqGaAS…
oh i know ! any robot ai that gets within 6 feet of me will be considered a thr…
ytc_UgxRYBgF1…
Sure, it's important to understand that he's only talking about areas where he w…
ytc_UgxxeRhg8…
Proof that AI does not make you a doctor. Sounds crazy, but I have patients who…
ytc_UgzCO37Zj…
Hey guys food for thought: 1 railroad container car can carry 4 fully loaded 53'…
ytc_UgxCz69FW…
Thanks for a great video. Please update when you get a Tesla invite to their ro…
ytc_UgzoO5yUs…
Comment
The way things are going with politics I don't think we even need a super intelligence or even a real intelligence at all to end humanity. Social media algorithms were already tearing apart the fabric of society when they were just algorithms hand coded by people. Kicking that up just a few more notches seems like it could easily lead to worldwide nuclear war or something even if the "AI" is literally just a recommendation algorithm that is just trying to increase watch time.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-02T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySjw3HUbNfgUPHoo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxbjWjDSEm4eWtkIUt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwTSUZO3MOmecGIYI14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwyuLJ9LfUm5FJ10v54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwD7DtAACh07ZQG7TR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy4QWkWYAhuENknySt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyeB0f8JDA-7a4_EW94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwLNMQxSFcaMU9y06V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzI0FSrTlVZXfcim5x4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgySU7nxn2Fy84EqAjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
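As a sketch of how a raw response like the one above can be consumed downstream (assuming the model returns a JSON array of per-comment records with the fields shown; the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw = '''
[
  {"id": "ytc_UgySjw3HUbNfgUPHoo54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgySU7nxn2Fy84EqAjF4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(payload: str) -> dict:
    """Parse the model output and key each coding record by its comment ID,
    so a single comment's codes can be looked up directly."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgySU7nxn2Fy84EqAjF4AaABAg"]["emotion"])  # -> fear
```

In practice a parse step like this would also want to tolerate malformed model output (a `json.JSONDecodeError` handler), since the whole point of this view is inspecting the exact, unvalidated response.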