Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples (click to inspect):

- ">built a web page to support the program / So ... how can you be sure that the…" (rdc_ohz871q)
- "People remaking the AI image by hand in their own style reminded me of the times…" (ytc_UgyiidJ01…)
- "What hes missing is. The fact that AI can replicate any human to look like its t…" (ytc_UgwwkOrKX…)
- "AI is just like AIDS Diabetes and crack cocaine used as a weapon against black p…" (ytc_Ugw0INaHg…)
- "The amount of people who don't know what AI actually means, and what LLM means, …" (ytc_UgyxTm0Jj…)
- "The one thing people should do is just block and report them immediately because…" (ytc_UgyrcI4Sh…)
- "Ai art lowers the level of entry for art. Its less skillful, but can be just as …" (ytc_Ugw69mF_0…)
- "Stable Diffusion doesn't contain the images. They are not distributing or displa…" (ytr_Ugy8Jh5-H…)
Comment (youtube, posted 2025-10-10T11:1…)

> Well Alex no, I don't think you broke its ethical guidelines. The answer it chose at the end isn't really an answer but a basic AI respond when it isn't capable of making a selection in a given circumstance. But since it is the trolley problem, AI not taking action counts as an answer for your argument when it actully isn't.
>
> That's like making a meal for someone who doesn't have a mouth and saying "if you eat this you love my food and love me but if you don't eat it you don't love me" Even though if the person loves the other he can't eat since he has no mouth but him not having a mouth results in the same way if he had one but still hadn't eaten it. So would it be fair to say that person doesn't love the other because he didn't eat it?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyD64jDAopjyOgwkA14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzvtCoOZH3D8sF2vfN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJe9W9gZDRIWS5X6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxUFxxlJ9hXKRhRW0J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz-GoSZ1VGNWogi7hp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgywokC1pwHPpTr9Pmx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxc8cCbo-xIaRa7AZ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugza1Fh2lnWuS5J-4fV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugywd_0s6xGUXUiJzoV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
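Each raw response is a JSON array with one object per comment in the coded batch: a comment ID plus the four coding dimensions reported in the table above. As a rough sketch of how such a response could be parsed and indexed by comment ID (the helper name `index_by_comment_id` is hypothetical, and the embedded response text is truncated to two of the ten entries shown above), consider:

```python
import json

# Raw LLM response for one coding batch, copied from the output above and
# truncated to two entries for brevity.
raw_response = """
[
  {"id": "ytc_UgyD64jDAopjyOgwkA14AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
"""

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_comment_id(response_text: str) -> dict:
    """Parse one raw response and index the per-comment codings by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
            for row in rows}


codings = index_by_comment_id(raw_response)
print(codings["ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg"])
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist',
#  'policy': 'unclear', 'emotion': 'indifference'}
```

Looking up a comment ID in the resulting mapping returns the same dimension values displayed in the coding-result table for that comment.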