Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The difference between an artist taking inspiration or doing a study on another …" (ytc_Ugzt4fvms…)
- "All these things: Retail Checkout, IT Support, Waiter/Waitress, Sports Announcer…" (ytc_UgzceWyTf…)
- "I feel like it's easy to read too much into the images that ChatGPT generates fo…" (rdc_ktqthfd)
- "Dude I'm someone who uses AI for a lot of stuff and this video is hilarious. You…" (ytc_Ugxen7dp9…)
- "I've become old fashioned minded before my time with this stuff. Machines and th…" (ytc_UgxhRAKsx…)
- "Autopilot is NOT full self driving (FSD). Autopilot is just a weak, free produ…" (ytc_UgwzS1VZj…)
- "Relax, this ai is 0.1 alpha version... When we get to beta and first 1.0 AI then…" (ytc_UgyrTzXLD…)
- "Inventor: \"After many years of painstaking work, we have finally reached perfect…" (ytc_UgyviMeco…)
Comment
As far as I understand it, LLMs like ChatGPT use statistical analysis of words to arrive at their answers. I think that probably explains why ChatGPT can discuss things that are well established like the trolley problem and even the fat man variation but not something novel like the gun option.
Source: youtube · 2025-12-15T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzLgHnxE5ScoXPMbOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxIRtybodLaXXIEKfN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz1xpVd3IfTwQ-bkDx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxn5hmtMts_EUQmhqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzGjzLThRXLMX1XFuV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9c9bUvMlkX6fGeKN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgywdmA2-UwJhRvKJ6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvISPCLiz0CjrEnHF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy5IHrx2jehtBIXqbF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw-3dxs-Cmlg-p4is94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
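The raw response above is a JSON array of per-comment codes, so looking up a comment by ID amounts to parsing the array and indexing on the `id` field. A minimal sketch of that lookup, using the dimension names from this dump (`responsibility`, `reasoning`, `policy`, `emotion`) and two records copied from the response; the function name `index_by_id` is illustrative, not part of the tool:

```python
import json

# Raw LLM response as shown in the panel above (abbreviated to two records).
raw_response = """
[
  {"id": "ytc_UgzLgHnxE5ScoXPMbOx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw-3dxs-Cmlg-p4is94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant 'id' key."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugw-3dxs-Cmlg-p4is94AaABAg"]["emotion"])  # outrage
```

Indexing once and reusing the dict keeps repeated lookups O(1), which matters when cross-referencing many coded comments against the sample list.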