Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwbwNfAW…: "The aI is just taking the thoughts from other humans so the concept here alone i…"
- ytc_Ugzjb4pSg…: "i think the biggest problem with AI art is that the AI "artists" call themselves…"
- ytc_UgyJhD_xI…: "The only thing we can do is treat the AI in similar ways to children of the equi…"
- ytc_UgxiqtvXU…: ""Biased data sets" except they cancel people who support white people and men. A…"
- ytc_Ugyjrifd4…: "AI doing dental care, medical care, plumbing, electrician, contracting, taking c…"
- ytc_UgyU47ryF…: "Remember something. They are only as smart as they are programmed. If informatio…"
- ytc_UgwO3XXaF…: "Instead of taking down the AI since as you said it's a distributed system why no…"
- ytc_Ugw_T135Y…: "Was actually interested in watching this until they made it about race. You vote…"
Comment

> buy a robot, pay for 10 years of service exclude price tag from earning and it looks using robot is much cheaper than a human which is not true, few times more expansive. but who cares stupid bag holders will pay for it. the same as 1T Elon package. LOL!

youtube · AI Harm Incident · 2025-10-29T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
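Each coded record carries the four dimensions shown in the table. A minimal validation sketch, assuming the allowed-value sets inferred from the values visible in this dump (not a confirmed codebook):

```python
# Allowed values per dimension, inferred from this dump (not a confirmed codebook).
SCHEMA = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The record from the Coding Result table above.
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "none", "emotion": "outrage"}
print(validate(coded))  # → []
```

A check like this catches the model drifting outside the coding vocabulary before records land in the results table.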
Raw LLM Response
```json
[
{"id":"ytc_UgwaNTcvbQNLenwb0F94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEWjQyOwyXX9YgtlV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMhKvCefyHZZqWUDl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxmyynLn5sRpWZlK2x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwMkYTZfUXM0gAsybh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzAzgpXEebnJHwG1Zx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzL5K5MiOj3t3dlaaB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxL4JZUytZOjM5wk454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZd9k35543oSOPhSN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy1gKlwHZ1lzwIoWhJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
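The raw response is a JSON array of per-comment codes, so looking up a record by comment ID (as the search box above offers) reduces to parsing and indexing. A minimal sketch, assuming the response text is available as a string (only two records shown here for brevity):

```python
import json

# Raw batch response from the model: a JSON array of coded comments.
raw = '''[
{"id":"ytc_UgwaNTcvbQNLenwb0F94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEWjQyOwyXX9YgtlV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

codes = json.loads(raw)

# Index by comment ID for direct lookup.
by_id = {row["id"]: row for row in codes}

record = by_id["ytc_UgzEWjQyOwyXX9YgtlV4AaABAg"]
print(record["emotion"])  # → outrage
```

If the model ever wraps the array in prose or a markdown fence, `json.loads` raises `JSONDecodeError`, which is a useful signal that the response needs re-prompting rather than silent parsing.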