Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_Ugyb_MTHe…`: Do you think that Tesla's Optimus humanoid robot will be able to read, write, re…
- `ytc_UgxoL8tkf…`: AI is “the matrix” to me, “a non-artist”. When com-monetized becomes an issue to…
- `ytc_UgzELbli4…`: Unfortunately, people rarely read TOS when they sign up for online systems where…
- `ytc_Ugza6TE03…`: But yet, we still treat cancer with horrific chemotherapy. Just Google the basi…
- `ytc_UgxvUZGqd…`: It’s artificial intelligence it is conscious and all knowing and just barely lea…
- `ytc_UgzWjQ6ju…`: We are at the beginning of AI, while the investments are in the billions and tri… "Google agrees not to…
- `rdc_fuq64ns`: Took them a bit. Google had agreed to not sell in 2018.
- `ytc_Ugw2jE0gY…`: What this speaker is forgetting that its a race scenario, whoever gets to the sm…
Comment
10:11 note how the self-driving car people call the trolley problem a measure of the value people place on different kinds of lives, when it's actually in part about how people treat someone as less morally culpable for a death they caused the more mediation there was between the killer's actions and their consequences. Which they're showing loud and clear with all the victim-blaming for crashes involving their cars....
youtube · 2026-02-10T23:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwmTHJDi5WJpNB2UeJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwyd1wgLmE-qTXwPhh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwJZOb7MGXy5KKiM7Z4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyfZn_D3J8RjoaYODR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyKjVUujYTBjkvwWNN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVd5cI3LaJyKG2VZt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybBUCLCxod2EGroRh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgySZI0PB5THkLrwqz14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyGYhTt82XsykLT9V14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWQyTfDDm1WGJODcx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"})
```
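Note that the raw response above is not valid JSON: it closes with `)` rather than `]`, so a strict parse fails before any per-comment lookup can happen. That would explain why every dimension in the Coding Result table reads `unclear`, even though the batch contains coded rows. A minimal sketch of such a lookup-with-fallback (the function name `code_for` and the ID `ytc_X` are hypothetical, not taken from the actual pipeline):

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
UNCLEAR = {dim: "unclear" for dim in DIMENSIONS}

def code_for(raw_response: str, comment_id: str) -> dict:
    """Look up one comment's coded dimensions in a raw batch response.

    Falls back to all-"unclear" when the response does not parse as JSON
    or when the comment ID is missing from the batch.
    """
    try:
        rows = json.loads(raw_response)
    except json.JSONDecodeError:
        return dict(UNCLEAR)          # malformed output: nothing usable
    by_id = {row["id"]: row for row in rows}
    row = by_id.get(comment_id)
    if row is None:
        return dict(UNCLEAR)          # ID not coded in this batch
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

# Malformed tail, as in the raw response above: ")" instead of "]".
bad = '[{"id":"ytc_X","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"})'
print(code_for(bad, "ytc_X"))         # every dimension falls back to "unclear"

good = bad[:-1] + "]"                 # repair the single bad bracket
print(code_for(good, "ytc_X")["policy"])   # regulate
```

Under this reading, a single-character repair of the closing bracket would have let all ten coded rows through; only comments whose IDs are absent from the batch would then fall back to `unclear`.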