Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@Sealsaregoobers If you arent a certified engineer; you can't design in most pla…" (ytr_Ugw6n4aRu…)
- "I understand why people dislike AI but bashing this dude for simply making somet…" (ytc_UgxVxEV0m…)
- "Her eye is so wonky that I thought ai couldn’t possibly fail that bad so it must…" (ytc_Ugz2dZb_j…)
- "hello there, hope you are having a good day. I noticed that in your video you de…" (ytc_UgwZKWHCz…)
- "I've already seen what sort of problems ai art can create: Given how I remembere…" (ytc_UgwjdMTLc…)
- "If its real it have to look at the camera and robot they dont look So robot…" (ytc_UgzovPQ_-…)
- "I read an article yesterday about an AI that deleted a production database and c…" (rdc_n4g85av)
- "This guy looks untrustworthy. I don't trust anyone who speaks with their eyebrow…" (ytc_UgzIVIbRr…)
Comment
Been an admirer of Yudkowsky since spring 2023. But I try to believe that AI isn’t something to fear, it’s something to understand. Maybe I'm a little blind. True that doom talk dramatizes possible risks, but real safety comes from literacy. I'm a professor doing research on AI use for students. We must learn how to live, write, and think responsibly with intelligent tools we’ve built ourselves. Of course, I'm Catholic, so that ameliorates my fears. And I'm an eternal optimist.
youtube
AI Governance
2025-10-15T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzWiIXAjDg0tjbF4tt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHViP3OLpuQuFpVLd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfzezCQNqU3fyS2E14AaABAg","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx7UkPa6VVfn63d0Hh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwrB660yzL3h86JoLt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyaxYhHVSAwI_ufYIV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyBts37g2KceghGV5h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwzzcNxydKDvw_X1ON4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyyn2jtDUoFS0ZsUyF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyyqDp3_-Mllx8n--t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]