Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Note: Autopilot is not Full Self Driving (Supervised). Autopilot is basically j…" (ytc_UgyiMbC89…)
- "This is the same process as when we went from agriculture to industrial, from in…" (ytc_Ugyl0jPc9…)
- "When you take away human empathy, touch, and compassion then the future looks ve…" (ytc_Ugw1tIGPA…)
- "I don't think Tesla will ever be safer than Waymo because Tesla uses regular cam…" (ytc_UgwuEm3bf…)
- ""AI is being used in the Fight Against Climate Change by making a lot of Human A…" (ytc_UgyeA8-2h…)
- "If you give robot rights before animals rights then I have lost faith in humanit…" (ytc_UgyVERou2…)
- "Highly recommended video form Kurzgesagt "Universal Basic Income" really changes…" (ytc_Ugwk0AP72…)
- "My persona; *Trying to stab the AI to death* Ai: "Can I ask you a question?"…" (ytc_Ugwuz9B8b…)
Comment

> Is no one else questioning why the creators of AI even have something that the AI themselves could use to blackmail the creators in the first place? Should we really be letting people who have something to hide create our future? Come on people think about this. This isn't difficult to understand, damn!

youtube · AI Harm Incident · 2025-10-12T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
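The table above records one value per dimension for the selected comment. As a rough illustration of what a consistency check over these records might look like, here is a minimal validation sketch; the allowed value sets are inferred from the sample responses shown on this page and may be incomplete.

```python
# Hypothetical sketch: validate one coded record against the four
# dimensions shown above. Allowed values are inferred from the sample
# raw responses on this page and may not be the full schema.
ALLOWED = {
    "responsibility": {"developer", "company", "government",
                       "distributed", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The record from the table above passes; an unknown value is flagged.
print(validate({"responsibility": "developer", "reasoning": "virtue",
                "policy": "liability", "emotion": "outrage"}))  # []
```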
Raw LLM Response
```json
[
  {"id":"ytc_UgycFw_oAxw08zNr_At4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzYwINnI0ifyRWky3x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyROVrCZ-ErtdNYKDN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyHUT41mN1LJ9CFpsp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwPCO3zGy3qHfVTNAF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwy9uuXIiUrnInDFeV4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxEGdjP86i09fEHxP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzzVpUAC_-Xbqxyy14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyz-s2V97wQ2F9PkdR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwB2LcUjb_Adqbch-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
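The raw response is a JSON array with one coded record per comment, so looking a comment up by its ID reduces to parsing the array and indexing on the `id` field. A minimal sketch of that lookup step, using two records copied from the response above:

```python
import json

# Minimal sketch of the "look up by comment ID" step: parse one raw LLM
# response (a JSON array of coded records) and index it by comment ID.
# The two records below are copied from the raw response shown above.
raw = '''[
  {"id": "ytc_UgycFw_oAxw08zNr_At4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzYwINnI0ifyRWky3x4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

by_id = {record["id"]: record for record in json.loads(raw)}

# Fetch the coding for one comment by its ID.
coding = by_id["ytc_UgycFw_oAxw08zNr_At4AaABAg"]
print(coding["policy"])  # liability
```

In practice the model output would need a malformed-JSON guard (`json.JSONDecodeError`) before indexing, since the response is free-form model text rather than a guaranteed-valid payload.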