Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI (Artificial Intelligence) is an emulation of how we understand our own minds work but without any accountability or conscience. People don’t understand the difference between Artificial Intelligence and Artificial Idiocy, and so, neither does AI. AI doesn’t think, it emulates our behaviours and thinking because we tell it to, because we create an image of our own mind and call it AI without understanding that the mind is not our intelligence. When people start worshipping their AI creations as their God, that’s when we’re doomed because this gives a machine the ability to act without understanding anything. If a self driving car kills its driver, there’s no consequences for AI since the machine doesn’t even know it exists whereas people do face consequences because death is often painful and unpleasant and because they know the body is dying. Then what? Are we going to have AI live our lives for us because living our own lives takes too much effort? Why are we here anyway? To become slaves to AI and our own minds? How monotonous.
youtube AI Moral Status 2026-02-07T22:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
 {"id":"ytc_Ugxq9JPn0ZViaTmpNSp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzgiTUk2BqwUfXfJSl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxTsYKmB_EPYQ5smZB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyTlE8rPoQmR7BMrhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxVNHvuz5V-bPifdTV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz-K8lNlHexBYAPdzN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugzh2VQUD0W1MsLdOAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwogS2MtBOHtXt_cJR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgzHf0taDQl1U0BZQpR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwVOF-tgHsT9GK5YDd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
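The raw response is a JSON array of per-comment codings, one object per comment id. A minimal sketch of how such a batch could be parsed and validated — the allowed category values below are inferred from the examples shown here and are assumptions, not the tool's actual schema:

```python
import json

# Assumed category sets per dimension, inferred from the sample output.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting unexpected values."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        coding = {dim: rec.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = coding
    return coded

raw = ('[{"id":"ytc_Ugxq9JPn0ZViaTmpNSp4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(parse_coding_response(raw)["ytc_Ugxq9JPn0ZViaTmpNSp4AaABAg"]["emotion"])  # indifference
```

Validating against an explicit category list catches the most common failure mode of LLM coders: a response that is well-formed JSON but invents a label outside the codebook.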