Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Oh shit u over in Kearny near me 😂. But they definitely doing way too much with…" (ytc_Ugyx2u2pT…)
- "He’s been gentle. He knows we’re heading for disaster if they don’t get a grip o…" (ytc_Ugwlf-T4k…)
- "Thank you for sharing your perspective. If you're interested in exploring AI fur…" (ytr_UgwcVtBDF…)
- "@1minus2 modern art actually is in some sort of way the death of old art: photo…" (ytr_UgxDFw3mT…)
- "AI is just adding noise to everybodeis workday ... takes me allot of time to rea…" (ytc_UgzEcbi6E…)
- "When the whole AIs trained specifically on your style originally happened, someo…" (ytc_UgzYWeFUZ…)
- "Kinda missing my point. Muscles attached to a brain with eyes and knowledge have…" (ytr_UgwgK_LFc…)
- "So sick of these tech dudes. We don't have to participate in the insanity. They …" (ytc_UgynHaLYx…)
Comment
The "tech" is not the problem. The people who design "tech" are the problem
After over 40 years in "Tech" and multiple other areas, THIS IS A BULLSHIT PROMOTION OF CONSPIRACY THEORIES !!!
No doubt "A.I." has the potential for horrific THEORIES, BUT THERE IS NOT ONE SINGLE 'AI' or 'BOT' that has the capabilities to PHYSICALLY ACT IN SUCH MANNERS.
By the time the "tech" could advance far enough to be capable of performing "horrendous acts of violence and deception" our societies will have long since crumbled.
youtube · AI Harm Incident · 2025-09-13T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxLcoNzo7hd13gLPId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBDxZywmmr6Kz6IDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwW4aHoaSW3VqmsL5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy08Y_B4fJZxGpBtQt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxJs9sQ6Az3J6plIE14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHXHSgVIJs9Fg7FVV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7phEdH8Z7Onh5ffh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgZbYC2iItb4cQKnV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymquY-Iub_K0eoXex4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn0pSjSf6n1pds3Uh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
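A minimal sketch of how a raw response like the one above could be parsed and indexed by comment ID. This is an illustration, not the pipeline's actual code; the `raw_response` string below reuses two rows from the response above, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself.

```python
import json

# Hypothetical example: parse a raw LLM coding response (a JSON array of
# per-comment codes) and build a dict keyed by comment ID for fast lookup.
raw_response = """
[
  {"id": "ytc_UgxHXHSgVIJs9Fg7FVV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzBDxZywmmr6Kz6IDF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index every coded row by its comment ID.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
code = codes_by_id["ytc_UgxHXHSgVIJs9Fg7FVV4AaABAg"]
print(code["responsibility"], code["emotion"])  # → developer outrage
```

The dict comprehension assumes comment IDs are unique within one response; a duplicate ID would silently keep only the last row.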