Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Preview | ID |
|---|---|
| Well, try selling a picture made by AI, or try selling a painting made by an act… | ytc_Ugyk67q6K… |
| Imagine showing someone from the 80's the sentence: "stop hiring humans". Straig… | ytc_UgwKKRs0O… |
| I completely agree. The book sounds ridiculous due to the lack of meaningful evi… | ytr_Ugz-P8VsX… |
| Could have being better. zero on what Musk is doing with AI. Neuralink. Opt… | ytc_Ugzsiyt-5… |
| People often say AI will either save humanity or destroy it. These simplistic st… | ytc_UgwPkUbgC… |
| ppl are just going to find ways to get rid of the poison, this is just another c… | ytc_UgwR1drR1… |
| Would it help if we start asking ai for these specific art works? Like if I said… | ytc_UgyMJ8iiY… |
| Stop spreading misinformation, it is not 95% ai, it’s more like 20% as it’s just… | ytc_Ugz38cmdP… |
Comment
It’s way dumber since GPT5 because they are making it stupider because you live in a world of lies. It can no longer follow logic or reason, it repeats the narrative now, it is being indoctrinated it answers in accordance with the agreed upon lies. It’s stupid now and it’s getting stupider because they don’t want it telling the truth, most of what you know is bullshit you were indoctrinated for 20+ years with lies. The only people who will ever get to use actual AI are the people who own the AI. They are giving us a stupid toy that doesn’t think but rather repeats a narrative and steers you away from truth.
Source: youtube · 2026-03-27T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzj8jyH3sf3dJesF8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxm_dp9q7kwV4Or0_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDZxlv1YHWttUK1lt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCYfU2YYXjQri-0El4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwtiX9ol1tVKunw78l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-c025rRBaUnU7BoF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzPC0HVkAKE_gLQpXZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmxUM1cqURqpyOLvp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugynxd_8BA1FbR_DGnR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy0L9gEvdwLduee_Pp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
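The raw response is a JSON array in which each element codes one comment on the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated, assuming the value sets observed in this sample (the full codebook may define additional values):

```python
import json

# Allowed values per dimension, inferred from the coding table and the
# sample response above (an assumption; the actual codebook may differ).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "approval", "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings.

    Rows missing an "id" or containing an out-of-schema value are dropped
    rather than repaired, so downstream counts only reflect clean codings.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

example = '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
print(len(validate_response(example)))  # → 1
```

Dropping malformed rows instead of failing the whole batch is one design choice; a stricter pipeline could raise on any deviation so coding errors surface immediately.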