Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI doesn’t “understand” anything, it’s only giving the most “likely” answer base…" (rdc_mrsci5v)
- "I just realized he said self driving vs it just driving to a destination I wonde…" (ytc_UgzydjI_L…)
- "Disney wants to eliminate competition for their own Ai slop. Coming soon to Disn…" (ytc_UgwlZCW6s…)
- "Listen, i feel like the people who talk about ai art like its the future of huma…" (ytc_UgzOVUQRp…)
- "Dont be fooled people he is in the club. During Covid while we are all in our ho…" (ytc_UgzQHLafq…)
- "I’m a bad artist myself, but I absolutely refuse to use AI art. Because it’s not…" (ytc_Ugzx0EDOH…)
- "Sure. Here’s a new $100k self driving car. It still requires you to be as comple…" (rdc_f6xgwqs)
- "as someone whose been trying to learn how to draw and stuff... A.I. is disgustin…" (ytc_Ugwv8ksWW…)
Comment
A person is defined as a psychopath based on their persistent lack of empathy and remorse. This is a simplified definition but you can fill in the rest if you can imagine what a person is capable of if they have no empathy and feel no remorse. AI is inherently psychopathic. You cannot program authentic empathy into AI. AI will never “feel” empathetic towards any living thing. In my opinion this is what makes AI potentially extremely dangerous for humanity.
youtube · AI Governance · 2025-09-04T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz3a47Q2o3jZ4ZKVeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyTNov40IYNhUhULMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxjjplBQ1ilwAy4WBV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx1g8rFTxmfdUaDZ_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzV3MbFohPzOyY-wl54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzvZdzDfRoDnXVwYVl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkZHRHVkT117W9u9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzsyk89KjYwb4gPwjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzwEWiWxfvKNyUGb9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyj4gsfVUxJLiWDupB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
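The raw response is a JSON array of per-comment codes keyed by comment ID, which is what makes lookup by ID straightforward. A minimal sketch of that lookup, using Python's standard `json` module and two rows copied verbatim from the sample above (the variable names are illustrative, not from the tool itself):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_Ugz3a47Q2o3jZ4ZKVeh4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxjjplBQ1ilwAy4WBV4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgxjjplBQ1ilwAy4WBV4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself fear
```

The same dictionary-by-ID pattern also supports joining the codes back onto the original comment records before aggregating by dimension.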