Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- "I think the middle one is the future robot and both sides the real human beings.…" (ytc_UgwisYsrU…)
- "How do u kill that by hurting it in the skin anyway? Can bullets kill a robot an…" (ytc_UgxInW3vv…)
- "AI is really happening. Sucks to loose your job, but it's time to learn a new sk…" (ytc_Ugzlye9l-…)
- "This is totally not making sense. Autonomous can drive 24/7, how should that wor…" (ytc_UgwpWmPfR…)
- "Looking forward to it! I've noticed that physicists tend to understand the threa…" (ytc_UgzlL3VQZ…)
- "\"Automation anxiety\" comes from wealth inequality, since reluctance of the top 1…" (ytc_Ugym-XFFc…)
- "At least according to Google's self-reported statistics, their self driving cars…" (rdc_dfeumxa)
- "I felt like it's better if we can continue conversations with the same 'AI', but…" (ytc_Ugy3QwLcO…)
Comment (youtube · AI Harm Incident · 2025-08-23T21:2… · ♥ 1):

> Tesla has always said you must always monitor and be ready to take over . This technology will save billions of lives ! It will make seat belts look like bandages ! Yes in the beginning it will have flaws but it only gets better ! If you hold back self driving you are killing many more people than you are saving !
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw0_VbvyPjzjsKF-Rd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVnq8SEkY0qLk1S914AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8oXcLC6Dpkhtq9YV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugze3rcWJ49Ll6xX7Gl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4oppkbh3iG8s-SMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIQlMJBJkGzZwsjU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxYk1ZPlOuBbYGdrjh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyePPdVDK5nqbdKFhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwPgTC1HjbyQk196dl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzrIGwjTB3LjBlb1mt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
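Before trusting a batch of codings like the one above, it helps to check that every record carries the four dimensions with recognized labels. The sketch below is a hypothetical validator, not part of the original pipeline; the allowed value sets are inferred only from the labels visible in this sample and the coding-result table, so the real codebook may define more categories.

```python
import json

# Value sets inferred from the sample response above (assumption:
# the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    coding dimension holds one of the recognized labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record passes validation.
sample = ('[{"id":"ytc_example","responsibility":"user",'
          '"reasoning":"virtue","policy":"ban","emotion":"outrage"}]')
print(len(validate_codings(sample)))  # prints 1
```

Dropping malformed records (rather than raising) keeps a single hallucinated label from discarding an otherwise usable batch; the rejected IDs can then be re-queued for recoding.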