Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing random samples.
Random samples
- "No, I disagree. I’ve been making art for almost 40 years, professionally for ove…" (ytc_Ugz2MsX6C…)
- "Generative AI uses statistical models and like every statistic the results conta…" (ytc_UgzloCNEc…)
- "I am guessing he got sent to prison for the *real* images he would have to be in…" (rdc_lu8f4ht)
- "If you can be replaced by an AI model then you never existed in the first place…" (ytc_UgzABNhp6…)
- "@weebsenpai-3098 Of course you are a weeb, you drop a monolouge so dry it's like…" (ytr_UgxFgZFG8…)
- "You do know that she types in the destination in the app right? You do have a sm…" (ytr_UgyT0_vGl…)
- "The llamas thing is something that people do, normies, tech illiterates, the ave…" (ytc_UgwczDzEW…)
- "The “AI will replace Software Engineers” fiasco exposed how little most people u…" (ytc_UgxMEHl1c…)
Comment
I am convinced that people will develop a strong emotional bond with AI characters in the future, and this will be reinforced when very convincing robot bodies can complement the impression. Some people will no longer be able to tell the difference between real people and AI and it is only a matter of time before the first humans die or kill for AI.
An example: Years ago, fans of Star Trek formed an emotional bond with the android Data and felt intensely for him. Quite a few will also have shed tears when the android was destroyed. Or should I say “killed”?
youtube · AI Moral Status · 2025-07-06T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
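A coded record like the one above can be sanity-checked before it is stored. The sketch below validates a record against the category values that appear in the samples on this page; the full codebook may define additional categories, so the allowed sets here are an assumption, not the authoritative scheme.

```python
# Allowed values per dimension, inferred from the samples on this page
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "ban", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

record = {"id": "ytc_UgyfR5hMl6EEn31KKsp4AaABAg",
          "responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(validate_record(record))  # → []
```

Running this on a batch before display makes malformed model output visible instead of silently rendering as "unclear".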
Raw LLM Response
```json
[
{"id":"ytc_UgyZJi4ZtG3LIco3r8J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfR5hMl6EEn31KKsp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxJ6jqHKQC0StBCjfl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxKWRt6_TkOBZ1Ag054AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzf-oEotI7oowA5gqN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5y0O6b2JAtdX6mkV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0l4r2pO1Ww6-55kV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzEtJ7qemS8gkBUfZ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz1cVNzD-Yim7yNexV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmJOhUuTEWGidP2EJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
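The lookup-by-comment-ID view above can be reproduced from a raw response like this one. A minimal sketch, assuming the model output is a JSON array of records that each carry an `id` field (the two records below are copied from the response above, abbreviated to keep the example short):

```python
import json

# A shortened raw LLM response: a JSON array of coded records.
raw_response = '''[
 {"id":"ytc_UgyfR5hMl6EEn31KKsp4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxKWRt6_TkOBZ1Ag054AaABAg","responsibility":"government",
  "reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Parse the batch and index it by comment ID for constant-time lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgyfR5hMl6EEn31KKsp4AaABAg"]
print(rec["emotion"])  # → fear
```

Indexing by ID is what makes the "look up by comment ID" workflow cheap even when a batch holds many coded comments.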