Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Thank you for sharing your observation! Sophia's responses are indeed programmed…" (ytr_UgyDB4MPQ…)
- "And here we are. Robotaxis driving around with cameras and an AI brain similarly…" (ytr_Ugz_CfRHn…)
- "That's sad but I'm glad that brother was still alive to see another day and to f…" (ytc_Ugxa6i6BT…)
- "@Eleyrica and what's exactly the problem there? AI still won't be replacing huma…" (ytr_Ugyt7WRju…)
- "The bigger ethical dilemma of self driving cars is what happens to the economy w…" (ytc_UghRftAaj…)
- "Robot is always think destroy human. Human is always break all rule from God. …" (ytc_Ugz6yFKhd…)
- "Completely need to shut down all software especially AI and get back like way ba…" (ytc_UgwxVOkcA…)
- "What if teachers actually taught kids rather than dumping them in front of scree…" (ytc_Ugwe3xMwJ…)
Comment
There's literally no reason to think companies care about public health/security in regards to AI when they never have about anything else.
youtube
AI Governance
2025-11-24T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
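A coding result like the one above can be checked against the coding scheme before it is stored. The sketch below is a hypothetical validator: the allowed values per dimension are inferred from the sample responses shown on this page, and the real codebook may include additional categories.

```python
# Allowed values per dimension, inferred from the raw responses on this
# page (assumption — the actual codebook may define more categories).
CODEBOOK = {
    "responsibility": {"none", "company", "distributed", "government", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "resignation", "fear", "approval"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is valid."""
    problems = []
    for dimension, allowed in CODEBOOK.items():
        value = coding.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unexpected value for {dimension}: {value!r}")
    return problems

# The coding shown in the table above passes cleanly.
print(validate({"responsibility": "company", "reasoning": "deontological",
                "policy": "liability", "emotion": "outrage"}))  # → []
```

Rejecting out-of-codebook values at ingest time keeps downstream aggregation (counts per dimension) from silently absorbing malformed model output.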
Raw LLM Response
[
{"id":"ytc_UgyT2Mzs3_o48fVR8B54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwQ2nx8xeAY5g2NKd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6RO6YSnxVMWVdrUZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZK-V7ZKSsD6bUoZ94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxLamLi3D2p3mSVHEV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMsb1X0JWwx-yG7f94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk7IlFQlAlTpLDFPF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZke9gmioT-6tb7V14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw0aKR7VWx-UfHLYot4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy4ZEqP02Dqo1vorc94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
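The "look up by comment ID" step can be sketched as parsing the raw batch response and indexing it by `id`. This is a minimal illustration, assuming the model always returns a JSON array of objects shaped like the one above; the two rows are copied from that response.

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = '''[
  {"id": "ytc_UgyT2Mzs3_o48fVR8B54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxMsb1X0JWwx-yG7f94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the batch by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(RAW_RESPONSE)}

coding = by_id["ytc_UgxMsb1X0JWwx-yG7f94AaABAg"]
print(coding["emotion"])  # → outrage
```

A real ingest pipeline would also want to handle `json.JSONDecodeError` and duplicate IDs, since the dict comprehension silently keeps only the last row for a repeated `id`.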