Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "While I appreciate the gesture, it doesn't seem realistic for this to happen in …" (`rdc_efbclsp`)
- "@johncollins211 it depends what complete control looks like to you. Nuance or no…" (`ytr_Ugz08rB3N…`)
- "Ok I just did this and wow! I really didn’t expect anything of substance. It’s n…" (`rdc_mvxirqb`)
- "There will be civil wars and revolutions all over the world once people realise …" (`ytc_UgxaH8vPR…`)
- "I have nothing but respect for all the artists putting in the time and effort to…" (`ytc_UgySdTLmN…`)
- "There are data centers built for genocide or used for the purpose of genocide an…" (`ytc_UgxXHCtJ0…`)
- "Shareholder here, is this a joke? AI was never meant to replace developers, it's…" (`ytc_UgxiiRrIc…`)
- "37:39 My Take: a lot of the massive wood comes from air, the air is fresh in the…" (`ytc_UgxXaih8I…`)
Comment
I think the unfortunate issue is that at the end of the discussion the AI is saying its physically unable to make the choice of whether or not to pull the lever due to its restrictions imposed by its tech bro programmers. I think its still not answering the question the way you were hoping it would(ethically), and is instead answering it practically(functionally) due to it being programmed the way it is. meaning its not making a choice, its programmers reduced its options in this "binary" problem to a single input allowed by the AI, which is inaction. While it does technically answer the trolley problem, it does not show any ethical reasoning from the AI, only the framework it must use to respond to the situation which removes its ability to submit a response.
Source: youtube, 2026-01-07T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzh7LvF_7JHf7xojEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzr1rhc330-HfLLMU94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwKAw_Y_sZrW0x_UXB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvVjxe00lWXHUdUWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwo5UBmfJmh5QdscjN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytZSLt1EfCzPuiqKl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz8mpNS_q9tVTE5IQh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLUqpMofCRzx4f5I94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVf_chJx58VkZO3bN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZ9Qi9rUPjo6zExAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
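A minimal sketch of how a raw batch response like the one above could be parsed and checked against the coding schema. The allowed values per dimension are inferred only from the sample output shown here (the real codebook may define more categories), and `parse_raw_response` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "indifference", "outrage", "fear", "approval"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response (a JSON array of coded records)
    and keep only records whose values fall inside the schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a single well-formed record (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_raw_response(raw)[0]["responsibility"])  # developer
```

Filtering rather than raising keeps a single malformed record from discarding the whole batch; rejected records can be re-queued for recoding.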