Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific coding by comment ID.
Random samples (previews truncated; click an ID to inspect the full record):

- `ytc_Ugy2xbJi_…`: "I'm not against new technology. I'm against unethical use of new technology. I w…"
- `ytr_Ugz3-czWI…`: "every action and situation has context you can't completely call using ai bad wh…"
- `rdc_jcbxllr`: "#tl;dr Microsoft's ethics and society team has been laid off, which raises conc…"
- `ytc_UgzM6ZVAS…`: "Can you try talking to Claude Instead? I think it will give better answers. GPT …"
- `ytc_UgyqmM8kw…`: "It's probably some guy in India remotely driving these Waymo cars, and they driv…"
- `ytc_UgwSn2kym…`: "1:27:13 Mo Gawdat thinks if we're kind to eachother then AI will learn from us a…"
- `ytc_Ugx9eeB3s…`: "I have to disagree with your point about using AI as a reference. \"Why would yo…"
- `ytc_UgyvhYQk-…`: "To people who still defend this, imagine if someone made deepfake of your mom ge…"
Comment

> AI will take a lot of jobs away but it’s not killing off humanity unless we allow it to. End of the day there is billions of humans and these things are held on giant physical servers. A few bombs and it’s all gone.

Source: youtube · AI Moral Status · 2025-12-13T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyaKl8IZO5D7w3Rkk14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzF8qUgdttemRw4Z7x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy2ie-upxxtvBilFHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGDTMkJK5_ZCeDfOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiLfSl74aDyNpP-Ol4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzqV6T3IpnFeda6mEh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw1Fv7PllyCUNbDlbh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz2dkk_YHseodvIA654AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxFZtc2qXOq8F2Ep8l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyKS4U7Wzj8WzRH1dt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
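A response in this shape can be consumed programmatically. As a minimal sketch (assuming each raw LLM response is a JSON array of records keyed by `id`, as in the example above; the helper name `index_by_id` is hypothetical, not part of the tool), the "look up by comment ID" step might work like this:

```python
import json

# A raw LLM response: a JSON array of coding records, one per comment.
# Shape and field names follow the example above (two records shown).
raw_response = """
[
  {"id": "ytc_UgzqV6T3IpnFeda6mEh4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxFZtc2qXOq8F2Ep8l4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgzqV6T3IpnFeda6mEh4AaABAg"]
print(coding["emotion"])  # resignation
```

Indexing by ID keeps the lookup O(1) per comment, which matters when one response file covers a whole batch of coded comments.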