Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This is an example for one of many reasons why I think people are incredibly way…" (ytc_Ugw_oeku_…)
- "Well that was depressing. As a lifelong visual artist who lost the ability to se…" (ytc_UgywKZPNz…)
- "We knew this before it was even launched, it is exactly the same with social med…" (ytc_UgzOghtN7…)
- "Just make AI search for how to make itself safe... and spend all of it's resourc…" (ytc_UgxZNAYWP…)
- "Based on your description of the case, I was wondering if McGee would have any r…" (ytc_UgxZ2O-k-…)
- "It seems ChatGPT has been updated and that it doesn't give the same answers anym…" (ytc_UgwO_sNS8…)
- "He stupid for even thinking he was gonna win against a dam hard ass robot🤷🤦…" (ytc_Ugw2o7868…)
- "Whatever it is (or isn't), just the name, artificial intelligence, should be en…" (ytc_Ugw_qvCbU…)
Comment
Hypothetical or not, it’s about intention. Pulling the lever is just as much as an “action” as is choosing to not pull the lever. There are no “right” solutions, but some are more wrong than others.
Not pulling the lever when you have the capability to means you are causing the death of 4 additional people than you would have if you had pulled it.
ChatGPTs answers and logic were honestly sound and reasonable here.
Platform: youtube · Posted: 2025-12-14T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
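A coding result like the one in the table above could be held in a small typed record. This is only a sketch: the class name is ours, and the example values in the comments are the ones observed on this page, not an exhaustive code book.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CodingResult:
    """One coded comment across the four dimensions, plus the coding timestamp."""
    responsibility: str  # observed: "ai_itself", "company", "developer", "unclear"
    reasoning: str       # observed: "consequentialist", "deontological", "unclear"
    policy: str          # observed: "liability", "unclear"
    emotion: str         # observed: "fear", "outrage", "indifference", "mixed", "approval"
    coded_at: datetime


# The row from the table above as a record.
result = CodingResult(
    responsibility="unclear",
    reasoning="consequentialist",
    policy="unclear",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
print(result.emotion)  # → indifference
```

Keeping the timestamp as a `datetime` rather than a string makes it easy to sort or filter codings by when they were produced.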
Raw LLM Response
[
{"id":"ytc_UgzLgHnxE5ScoXPMbOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxIRtybodLaXXIEKfN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz1xpVd3IfTwQ-bkDx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxn5hmtMts_EUQmhqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGjzLThRXLMX1XFuV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9c9bUvMlkX6fGeKN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgywdmA2-UwJhRvKJ6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzvISPCLiz0CjrEnHF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy5IHrx2jehtBIXqbF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-3dxs-Cmlg-p4is94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
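The raw response above is a JSON array with one object per comment, so the "look up by comment ID" view can be built by indexing the parsed rows on their `id` field. A minimal sketch (the constant holds a two-row excerpt of the real response, and the function name is ours):

```python
import json

# Excerpt of the raw LLM response above (two of the ten rows).
RAW_RESPONSE = """[
  {"id":"ytc_UgzLgHnxE5ScoXPMbOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-3dxs-Cmlg-p4is94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""


def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}


codings = index_by_comment_id(RAW_RESPONSE)
row = codings["ytc_Ugw-3dxs-Cmlg-p4is94AaABAg"]
print(row["responsibility"], row["emotion"])  # → developer outrage
```

With the rows indexed this way, a lookup for any coded comment is a single dictionary access rather than a scan of the array.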