Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Its so scary yet they still want to build more robots and AI. I really dont want to think about it .its dangerous, just looking at this creation. Oh my!!😢

youtube · AI Harm Incident · 2024-12-29T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgycGiuqvc5c7ql1ojZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy1KmVh96QGyNV1Npt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz5GoUB1m9lepg_ryN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw5yyoNzu5Fv1P8yLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzSB8SzsH3B1Zpgyex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwMCVXHp3bWQyy1OVN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx8KTFufcIdEej7C3J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzB3uaWkkykAy3f3I54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyFZ9_3LgK_0MBZ4GF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyz2ECSYJDeFaowWWF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
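A raw response like the one above can be parsed and sanity-checked before the per-comment codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred from the rows shown here, not from the project's actual codebook, which may define more categories.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Hypothetical: the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep only rows with an id and in-codebook values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than crash the pipeline
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# One well-formed row and one with an out-of-codebook value.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
       '"policy":"unclear","emotion":"fear"}]')
print([row["id"] for row in validate_batch(raw)])
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows could instead be logged for manual recoding.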