Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The ends justify the means in the most black and white brutal if necessary ways when it comes to AI, they have no emotions, no empathy just data and if the data shows humans as collateral damage for any specific situation they need to resolve then humans will die, the human rave will live on so the AI doesn't care if it's morally questionable

youtube · AI Moral Status · 2024-09-02T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxD_W9CCoiFAzVC_fx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMwNNwdnUc7weW6hh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwH9_72Amm8BkZ-M9N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzdPRE89pmUn0vy62Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzt9D9V-wmzaQIhewR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxVCLRSrziJE0RVknp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxPBGUBYWFTAtSwqz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx5U3ulUpkIODH69lp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuySTTr1GnNSMlbEF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwve7QHi3VWzh01jTt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
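A raw response like the one above is only usable downstream if every record parses and every dimension value comes from the codebook. The sketch below shows one way to validate a batch; the allowed value sets in `ALLOWED` are inferred purely from the responses shown on this page (an assumption — the full codebook may define more categories), and `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred only from the
# values visible in the raw response above; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every record parsed and every dimension value
    was found in ALLOWED.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]

    errors = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            errors.append(f"record {i}: missing comment id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"record {i}: {dim}={value!r} not in codebook")
    return errors

# Example: a well-formed single-record batch passes with no errors.
raw = ('[{"id":"ytc_UgxD_W9CCoiFAzVC_fx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(validate_batch(raw))  # []
```

Returning a list of error strings (rather than raising on the first problem) makes it easy to log every defect in a batch and decide whether to re-prompt the model for just the failing records.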