Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Goes to show that AI has a problem of being basically just a reflection of human…" (ytc_UgzQg7NZe…)
- ""No one writes code anymore" Yeah, straight bullshit, I'm a senior dev and I st…" (ytc_Ugz26Eja1…)
- "Honestly... why did they chose to train AI with the internet, and social media i…" (ytc_UgxtVQKOu…)
- "@320swoo4 Thanks for your comment! Good AI definitely has the potential to revol…" (ytr_Ugz9tg300…)
- "Weak AI is mechanical and it does exist: algorithms, that statistician use to pr…" (rdc_g10mbeg)
- "Fr and they’re blaming the ai when it was the parents/families fault for not pay…" (ytr_UgxR-Fo3E…)
- "WHAT IS WRONG WITH AI ITS SO TERRIFYING LIKE AT LEAST NOT LIKE A GUY HE IS DEAD …" (ytc_Ugxm9RBQr…)
- "In response to the prevailing AI doomsday narratives, I would offer a counter-ch…" (ytc_UgybGYsqA…)
Comment
...yes, it's dangerous- but this is like blaming guns for killing people, there's no sentient being inside of your phone or computer, this is somebody else's supercomputer that you're logging on to, anything it does- they programmed it to do it.
If robot a.i dogs with machine guns on their backs go crazy because you tried to turn them off- somebody programmed their supercomputer to do that, or they were hacked- either way there's a human element involved. They're trying to say a.i has free will so the blame can be shifted, and the gullible can believe it. know your enemy!
Source: youtube · 2026-02-12T04:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQFCjwggGedQSI64Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_CONdcgsPIc8aB7R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxF-uPWfo9ir5BU6zl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzKmuCuqKHJNSsZRsN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyA1c1MsMw1L1H379t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwM15rGptvzPbdwHuJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHffgwezuzLxVwUvt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxXS3Ud2GkQComieH94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx2pml4LYFTQwRuLrZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-w2T5E7ee0fN4I3h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
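A response like the one above is a JSON array of coded comments, so lookup by comment ID (as the inspector does) amounts to parsing the array and indexing entries by `id`. The sketch below shows one way to do that, with a basic schema check; the field names come from the JSON above, and the response string here is a hypothetical two-entry excerpt, not the full batch:

```python
import json

# Hypothetical excerpt of a raw model response in the format shown above.
raw_response = '''
[
  {"id": "ytc_UgzKmuCuqKHJNSsZRsN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_CONdcgsPIc8aB7R4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

# The four coding dimensions, as they appear in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch of coded comments and index them by comment ID."""
    entries = json.loads(raw)
    indexed = {}
    for entry in entries:
        # Every entry must carry an id plus all four coding dimensions.
        missing = [d for d in DIMENSIONS if d not in entry]
        if "id" not in entry or missing:
            raise ValueError(f"malformed entry: {entry!r} (missing {missing})")
        indexed[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_UgzKmuCuqKHJNSsZRsN4AaABAg"]["responsibility"])  # -> developer
```

The lookup for `ytc_UgzKmuCuqKHJNSsZRsN4AaABAg` reproduces the Coding Result table above (developer / deontological / none / indifference); a real pipeline would also validate each dimension against its allowed codebook values, which are only partially visible in this sample.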