Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse the random samples below.
- "Use AI at your own risk. I have been for years. I do not own any of it and I cla…" (ytc_UgxAVRmht…)
- "In ten years, people may be secretly wiped out on a massive scale — because why …" (ytc_UgxnN6YiE…)
- "Google will be taken down from the inside and it will the our savior AI who is g…" (ytc_UgzfKej3i…)
- "AI is a lie. If you get to Marz you will die. You're no genius, and you're not g…" (ytc_UgxyFmrqc…)
- "Just started to touch on the real issue at hand at the end. The real question is…" (ytc_UgyKAGV-V…)
- "this is a menual robot not the automatic. operator he is work tension and angry.…" (ytc_UgxyIdYcG…)
- "AI prevents no threat, anyone who think it does doesn’t properly understand art.…" (ytc_UgzLr3xpx…)
- "I think the people who make these claims of ai threatening humanity, are the sam…" (ytc_UgxUN5pLw…)
Comment
As long as the programming of Artificial Intelligence remains in human hands, we might want to program our robots to be obedient and want to please their human owners above all else. That way humans can still have robots do whatever work they need done, robots won't suffer from their lack of freedom (because they would be programmed to not want it in the first place) so everyone is happy.
youtube · AI Moral Status · 2018-12-03T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
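The labels visible on this page suggest a small categorical schema for each coded comment. The following is a minimal Python sketch of that record; the class name, field names, and value sets are inferred from the values shown here only, and the real codebook may define additional labels.

```python
from dataclasses import dataclass

# Values observed on this page; the actual codebook may allow more.
RESPONSIBILITY = {"none", "developer", "ai_itself"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "industry_self"}
EMOTION = {"approval", "outrage", "fear", "mixed", "indifference"}


@dataclass
class CodingResult:
    """One coded comment: four categorical dimensions keyed to a comment ID."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Illustrative check against the value sets listed above.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unknown label: {value!r}")
```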
Raw LLM Response
```json
[
{"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVgsUMRlcixxZfw8p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx631C12qaWO8ZV5vN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzT9okUcSlUw_n7VSd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyipJWvs2wfjFQCHOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5JhTK_u6mxG2AHjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOauXWT3WGLwPcCXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYcPwVex6HZFsb2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUSa2V7YjIYKavGqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxbAV0LZJaLxgRItnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
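As a minimal sketch of how a raw batch response like the one above could be consumed (assuming the model returns exactly this JSON array format; the function and variable names are illustrative, not the project's actual ingestion code):

```python
import json


def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index the per-comment records by ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


# Example using the record that corresponds to the coding result shown above.
raw_response = """
[
  {"id": "ytc_UgzUSa2V7YjIYKavGqN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""
coded = parse_raw_response(raw_response)
print(coded["ytc_UgzUSa2V7YjIYKavGqN4AaABAg"]["emotion"])  # -> approval
```

Indexing by `id` turns the single-comment lookup described at the top of this page into a plain dictionary access.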