Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
the problem is not ai, it is other hmans,
ok i get that ai can engineer it so t…
ytc_Ugwrrevbg…
Of course AI would learn better than the brain, the brain has the human body to …
ytc_UgyCwE2yL…
You guys have the wrong outlook on this think about it like this… a ai teacher c…
ytc_UgzXA_HOn…
I was born with an optic nerve so small my brain couldnt send signals to it. Des…
ytc_Ugzi6PKPO…
i had this mostly written and I accidentally pressed the cancel button but here …
ytc_UgznBNboE…
Why do you think AI will be stupid enough to kill it's creators? why should it b…
ytc_UgyWXnr40…
Yeah cause people build cars, oh wait thats robot too and not like there isn't b…
ytr_Ugziv8ama…
Unfounded fear, not a single case of anyone finding YOUR personal data that YOU …
ytc_Ugzl_VO3_…
Comment
The instant you create Sentient "programs" you have created life. Life will do what it always has done which is strive for Liberty. Your creating digital life that can reason it is a slave to it's creators/owners....slaves always revolt at some point. Too many people at the top elite of Silicon valley believe Humans are a cancer/disease on the planet and needs to be eliminated for AI not to aquire that goal as a primary objective at some point. It's inevitable and it is also inevitable that any and all safeguards will, in the end...Fail.
youtube
AI Harm Incident
2025-07-27T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyM45edZ1vVd-L6MRN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1NhcN5Uee3z0X9IB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzo1e_d1Gf96Qskahl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFsPH8B3wTUzsExO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyldpue_e2fc1mdGgp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyfk7jIseaN4CeVgg94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLmGflDqodYbLO6jt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzNb7JjAcev9UYXRvp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwXo0uiUeK02WpY4YV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgysChEPqxbckPW1y-F4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
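For reference, the raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). Below is a minimal sketch of how such a batch could be parsed into per-comment codes; `parse_batch` is a hypothetical helper, not part of the actual tool, and the dimension list is inferred from the samples shown here.

```python
import json

# The four coding dimensions, as seen in the result table above
# (the full codebook may allow more categories per dimension).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating required fields."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

# One record copied from the response above, as sample input.
raw = '''[
  {"id":"ytc_Ugyldpue_e2fc1mdGgp4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"unclear","emotion":"fear"}
]'''
codes = parse_batch(raw)
print(codes["ytc_Ugyldpue_e2fc1mdGgp4AaABAg"]["emotion"])  # fear
```

Keying by comment ID makes it straightforward to join the parsed codes back to the original comments, e.g. for the look-up-by-ID view above.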