Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgxeFeGck…`: It's funny that it's 2025 and people are still having the "Digital art is not re…
- `ytc_UgzbGFwTr…`: Yeah, real paid Therapist will share private stuff with their friends, family or…
- `ytc_UgyT_GlOt…`: Im not a fan of ai "art" but look in my opinion its inevitable that it's gonna …
- `ytc_UgzBU4iAp…`: One truth that Alarmist Hinton knows but is not telling is that FOR AI to becom…
- `ytc_UgxEir4Xb…`: In response to 4:05 In my view, AI invalidates that mentality. Rather than famil…
- `ytr_UgxMI7pj2…`: According to AI writing detectors the Constitution of The United States is 92% A…
- `ytc_UgzuTFIEm…`: Thank you for making this video Sam, Any time I see someone defending AI pictur…
- `ytc_Ugw6KtI4N…`: Ok so will have AI replacing my roof, rewire my house. Get it re ' some ' knowle…
Comment
Tesla Full Self Driving does not exist. All crashes have occurred with a human in charge of the vehicle. There is no ambiguity on this, the human is at fault.
This awful accident happened on Autopilot, not (supervised) FSD. It happened because the human was looking for his 'phone and not paying attention. For no other reason.
Humans drive with just one sensor. Two eyes. A Tesla has seven. Additional sensor types are unnecessary.
Data proves that with (supervised)FSD you will drive seven times more safely that you do now. If you want a better chance of avoiding a crash, just get a Tesla.
Source: youtube
Incident tag: AI Harm Incident
Timestamp: 2025-11-29T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCRQyA8IAOvPm4rKZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxmtXeWka1Uiz7reSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx5r1VWmFXxVFttx5F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxq0hbVHrGijYJygzh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwE3NZEXcL27YighJB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAJCcv3yJzsNn7xiN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzzACJsiLYJFPqj7h4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzfKqQq1Ddn4gfODM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwR8VCE27lyOF9hcEt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgySNoqKRYn99j5dc0d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
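Each record in the raw response codes one comment along four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch is shown below; note that the allowed vocabularies are an assumption inferred only from the values visible on this page, not the tool's actual schema, and `validate_coding` is a hypothetical helper.

```python
import json

# Assumed vocabularies: only the values observed in this dashboard are listed.
ALLOWED = {
    "responsibility": {"user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "fear",
                "approval", "mixed"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and check every record against the scheme."""
    records = json.loads(raw)
    for rec in records:
        # IDs seen on this page use the ytc_/ytr_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgxCRQyA8IAOvPm4rKZ4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
coded = validate_coding(raw)
print(len(coded))  # 1
```

A malformed record (for example, an unknown emotion value) raises a `ValueError` naming the offending comment ID and dimension, which makes batch re-coding failures easy to triage.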