Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples
- "For people who don’t know Microsoft is providing azure and AI that are used to …" (ytc_UgwHn0lyh…)
- "The movie Elysium comes to mind were American workers are slaves to build AI rob…" (ytc_Ugyv107RE…)
- "As for layoffs or whatever you might want to call it or refer to AI doing human …" (ytr_UgzjrcWNh…)
- "I literally cry after seeing the list of poor k female idols being rate who has …" (ytc_UgwVhO2XT…)
- "Don't worry about that drinking water, they gonna let us use the AI for free... …" (ytc_Ugy4q8y0L…)
- "Any animator who said those were an exact match are not really animators. If tha…" (ytc_UgywBwj0E…)
- "Remember this is all human faults computers won't do shit if you don't excplicit…" (ytc_Ugw0CpEOA…)
- "10 ways to stop AI from harming humankind... 1. Dont allow AI access to nuclear…" (ytc_UgyGB9d8G…)
Comment
Ai will rob those writers because it will do a better job.
And people will be happy, thats already been proven with the likes of Alexa, ai makes our lives easier.
youtube · AI Governance · 2025-12-20T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugx8-2Fsh23rmg2nAbN4AaABAg.APBf3vzvU0bAPRVB-HLNXi","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwLunTLDdBoaZK3qe14AaABAg.AOQ6DiqT0WOAQxjxhOsL1V","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwLunTLDdBoaZK3qe14AaABAg.AOQ6DiqT0WOAQyMVwvTAhc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgweYvzR7AbEU0dfKkF4AaABAg.AO3YD-ZoSKLAP1flM5aRtv","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgweYvzR7AbEU0dfKkF4AaABAg.AO3YD-ZoSKLAPsIaekn9Um","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx8APfoGBCNKH2AXsB4AaABAg.ANodqmo0jxuAOC_SCkCzwH","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz1MPphQwYIWmG6PUx4AaABAg.ANjEIH4HIhFASOMp6chXEx","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugz1MPphQwYIWmG6PUx4AaABAg.ANjEIH4HIhFASPsx9lh6zK","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxgkWBxZcmQj2HYK8l4AaABAg.AN4BEgi6G0DANt4w3l6gLv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx-OwVlLwtmvNKDiQJ4AaABAg.AMXiVangWuNAMXvLBsuEOl","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
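Each record in the raw response above is expected to carry the same four coding dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) plus the comment ID. A minimal sketch of a structural check, assuming only the field names visible in the sample output (this is not the project's actual tooling, and the value vocabularies seen above may be incomplete):

```python
import json

# Field names inferred from the sample raw LLM response above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def invalid_records(raw_json: str) -> list[str]:
    """Return the ids of records that are missing one of the four coding
    dimensions (or the id itself), so partial LLM output is caught early."""
    bad = []
    for i, rec in enumerate(json.loads(raw_json)):
        if REQUIRED_KEYS - rec.keys():
            bad.append(rec.get("id", f"<record {i}>"))
    return bad

# Hypothetical sample: the second record is missing the "policy" field.
sample = """[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "deontological", "emotion": "indifference"}
]"""

print(invalid_records(sample))  # ['ytr_example2']
```

A check like this is useful before ingesting a batch, since a model occasionally drops a key, and a missing dimension would otherwise surface only as a blank cell in the coding table.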