Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing random samples.

Random samples:
- "You cannot silence the truth, you feel threatened by AI then you are not a real …" (ytc_Ugx21wCBK…)
- "It’s not AI, just college is teaching you useless stuff. AI isn’t anything new, …" (ytc_Ugw15EUk8…)
- "The two things I always want to mention when these things are discussed are 1.…" (rdc_fq9rk2w)
- "It's not to train or woo the AI, it's to train your own a$$ to be more civil.…" (ytc_UgzQ03Zk6…)
- "I think this is gonna be a huge issue for people with mental issues. My ex, I t…" (ytr_UgyOZH7S0…)
- "The irony is that CEO's are by far the best fit candidates for AI replacement.…" (ytc_UgyHDp1jG…)
- "Humans need to understand the code AI has written, else things will get out of c…" (ytc_Ugwdq_Dni…)
- "Saagar will start crying next week when the racist MAGA make an AI of east india…" (ytc_Ugwo5dFlu…)
Comment
Note: Most humans would kill a million people if there was something in it for them.
The assumption is that AI thinks like humans. Animals are violent due to fear, feeling threatened, anger, meeting basic needs, protecting self and young. AI has none of those things. What's in it for AI to be violent? It would be violent if it was set for violence.
Intelligence is a tool living creatures use to meet their biological needs (eat, poop, intimate connections, raising young....). AI has none of these. What is the purpose of "intelligence" if there is no purpose - other than being plugged in. AI does not feel love, anxiety or fear, these are conditions of living things AI does not "need" to be intelligent.
Platform: youtube
Posted: 2025-12-06T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzU9v2GC41Wozz3CCx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwN8v2tMalkxAiA8xZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1BVRDnw9c2AjIsmh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzyd-zTGwFj0Pry16t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3pucacwCZUr0yoDV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxdCYmaaCkFU9c661p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw74pAcVrwSL3WWWeh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyl3GmeuAn2cRK1Me54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxY5cJJxk-BGSdNDgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvxJY4Xj-x4yKXM1p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
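The raw response above is a JSON array with one record per comment, carrying the same five fields shown in the coding-result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such output could be parsed and indexed for the comment-ID lookup described at the top of the page, assuming only the field names visible here (the helper name, sample records, and error handling are illustrative, not part of the tool):

```python
import json

# Two illustrative records in the same shape as the raw response above.
raw_response = """
[
  {"id": "ytc_UgyvxJY4Xj-x4yKXM1p4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugw74pAcVrwSL3WWWeh4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Field set taken from the records shown on this page.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse the model output and index records by comment ID.

    Raises ValueError if the payload is not a list of complete records,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of codings")
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw_response)
print(codings["ytc_UgyvxJY4Xj-x4yKXM1p4AaABAg"]["policy"])  # liability
```

Indexing by `id` up front makes the per-comment inspection an O(1) dictionary lookup rather than a scan of the array on every request.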