Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples:

- ytc_Ugxxl8f94…: "It just dawned on me, wondering if this current race to master AI is what's gonn…"
- ytc_Ugxa6A_B3…: "Claude Sonnet is worse IMO. I convinced it that I had solved the problem of dark…"
- ytr_UgxRjJ3XU…: "@Kaliflower__ my personal belief system? I dont understand what you mean by that…"
- ytc_UgxeDeKHQ…: "Politicians aren't interested in changing anything until it effects them or a bi…"
- ytr_UgxIXzNQG…: "Engaging with the arguments here is akin to validating them. Its jumping the …"
- ytc_UgzucEc2u…: "The simulation thing is something I remain skeptical of. First off, it's not a…"
- ytc_Ugz5eQRQc…: "I used AI for my profile picture. When people asked if I made it, I would always…"
- ytc_Ugyf9AM5j…: "AI and the lack of regulation of AI is the main danger for the middle class in A…"
Comment

> When man creates robots , it will take the evil eventually. They don’t have feelings or true conscience. They can take control over humanity to give something like this free will. Humanity, Adam and Eve, fell for Satan’s lies. Robots will instinctively preserve their existence. This is opening Pandora’s box. We were warned about this. People can’t control themselves now. Why should a robot?.

Source: youtube | Video: AI Moral Status | Posted: 2019-11-09T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyngX01_EtzxU-FF6J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgziuaL_AxPG9mQRrZh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzKDLCO_5TVkdvSOCJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxxIHZzfX4GTMGARjh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz0AU2QEezPj6xIxxV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx-xcDXY8_kMPvbUZh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzqa6GXmxrypAI2u8d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyhDHl7oc0p9G_m8up4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw588fS-pWiiPpxRkB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyUIej7sPFQMdJYux14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
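The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the table above. A minimal sketch of how such a payload could be parsed and validated before storage; the allowed category values here are inferred only from the examples on this page, not from a full codebook, and `parse_codings` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting malformed rows."""
    rows = json.loads(raw)
    out = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Example using the first row of the response above.
raw = ('[{"id":"ytc_UgyngX01_EtzxU-FF6J4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
codings = parse_codings(raw)
```

Validating each row up front means a model response that drifts from the coding scheme fails loudly at parse time instead of silently polluting the coded dataset.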