Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples

- I think this whole "A.I. is a threat" argument is such a classic example of how … (`ytc_UgxgUX5vw…`)
- I'm seeing people struggling to address the same problem with AI as we do with o… (`ytc_Ugx8_uHO4…`)
- It was fun till the robot aims the gun to him instead of the car… (`ytc_UgyDYwpF0…`)
- Our government cannot even balance the budget and you expect them to prove that … (`ytc_UgxhWQKsz…`)
- This is pretty dumb. Its unnecessary to humanize robots. We wont have to worry a… (`ytc_Ugi2WXL0T…`)
- She's perfect never ever let the women of the neighborhood get around around thi… (`ytc_UgzVi9jkj…`)
- Still originates as humans being the main problem! Even if robots can distinguis… (`ytc_Ugy_79lRe…`)
- @ 1:20 - correction - "Waymos are much better drivers than humans" should be re… (`ytc_UgzVixgAO…`)
Comment

> xD Have we just, idk, considered not giving AI terrifyingly poorly considered jobs or giving any entity we can communicate with the equivalent death threats to correct 'bad behavior'? Just a thought xD

youtube · AI Harm Incident · 2025-07-26T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
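The coded record above could be written down as a typed structure. A minimal sketch, assuming a Python codebase; the field names mirror the table's dimensions, the example values in comments are only those observed on this page, not a confirmed codebook:

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, matching the Coding Result table above."""
    responsibility: str  # observed: "user", "company", "ai_itself"
    reasoning: str       # observed: "deontological", "consequentialist", "virtue", "unclear"
    policy: str          # observed: "industry_self", "regulate", "liability", "ban", "none"
    emotion: str         # observed: "mixed", "outrage", "fear", "approval", "resignation"
    coded_at: str        # ISO-8601 timestamp of when the coding ran

# The record shown in the table above:
result = CodingResult(
    responsibility="user",
    reasoning="deontological",
    policy="industry_self",
    emotion="mixed",
    coded_at="2026-04-27T06:26:44.938723",
)
```

A dataclass like this makes downstream aggregation (e.g. counting emotions per video) explicit about which fields exist, instead of passing raw dicts around.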
Raw LLM Response
```json
[
  {"id": "ytc_UgxdKN7tPqeOcCgWOQB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz0FRgiztMXJdx9oel4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw-pxjtXBRFUcSe0Hl4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgznvASYOdV5JFmuc-Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx3eQUbCCKu50d7N8V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxa8vsvr3Jm1-qEoCZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugw4MbrCcHjhENEk70p4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzTuFqHy9H595BSMnp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxV8byjcZpxEjcWXoV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzIhujvSBaV-z8xyOp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
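Since the raw response is a JSON array keyed by comment ID, the "look up by comment ID" step can be a plain parse-and-index. A minimal sketch, with two entries copied from the response above; the variable names are illustrative, not this tool's actual code:

```python
import json

# Raw LLM response: a JSON array of coded comments, each carrying the
# comment ID plus the four coded dimensions. Two entries from the
# response above stand in for the full batch here.
raw_response = """
[
  {"id": "ytc_Ugxa8vsvr3Jm1-qEoCZ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugw4MbrCcHjhENEk70p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]
"""

codes = json.loads(raw_response)
by_id = {row["id"]: row for row in codes}  # index once for O(1) lookup by ID

record = by_id["ytc_Ugxa8vsvr3Jm1-qEoCZ4AaABAg"]
print(record["responsibility"], record["emotion"])  # → user mixed
```

If the model occasionally returns malformed JSON, wrapping `json.loads` in a `try`/`except json.JSONDecodeError` and logging the offending batch is a cheap safeguard.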