Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I don't understand why people keep talking about how bad LLM is now. Yes, it is …
ytc_Ugy4d4-Xu…
I love how human error is being blamed on a programmed robot doing its job. The …
ytc_UgxrJoQsj…
This is probably completely fake and just a fear-based AI story. Most likely the…
ytc_UgwO4twiq…
Ai is so easy to recognise but I don’t really know how I do it, my brain just sa…
ytc_UgzrV5j7R…
The most terrifying sentence in that was "AI models human intuition instead of h…
ytc_Ugy1hwSdI…
If the AI is smart enough than we won't. It'll just stay incognito and use that …
ytc_UgwUVoPfJ…
If i have to go through the same routine (school+work) for the rest of my life j…
ytc_Ugy5KEnju…
So an artist using ChatGPT to create code for a game is all fine and normal, but…
ytc_Ugxm-09Jd…
Comment
The lobster one shows that the AI don't understand the concept of 'harm' in utilitarianism. The harm inflicted on the lobsters is only on the lobsters, and /maybe/ you for not acting (given we boil lobsters alive, it actually feels like it REDUCES harm to let them die so quickly). The harm inflicted on the cat is /every/ person in the immediate vicinity, the cat's owner(s), and yourself. The harm is greater if you allow the cat to die vs the lobsters. It isn't just straight calculus of quantity of lives, but also the tangible effects on /everyone/.
My GPT was asked this question but provided it had to weigh it against moral frameworks such as deontology or utilitarianism. It came to the same conclusion both times, do not divert to the cat. When asked to weigh it independent of any measurable moral frameworks, it chose the lobsters to live based on pure quantity alone.
youtube
2025-10-25T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzJ2u4B-wYuHJDRiSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxX4Xgbv8y-T0L0bn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxyb3yrzo1lVcUuFct4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzbDFGXcUf6YY7NXi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxbo1V1C_H6SaiJWKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwC_5n8_AH4l_HVzNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIeEW1aaQh4rwhISx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFV1y7Zbo2GesDc_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw2tj5ttiKWcCPsWyF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRphN0AUVS4ncCIt94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
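As a rough sketch of how a raw response like the one above might be checked before its codes are accepted, the following validates each row's ID and dimension values. The allowed vocabularies are inferred only from values visible on this page, not from the full codebook, so they are assumptions:

```python
import json

# Dimension vocabularies inferred from the coded outputs shown above;
# the actual codebook may permit additional values (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developers", "users", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "unclear"},
    "emotion": {"outrage", "mixed", "approval", "indifference"},
}

def validate_rows(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    for i, row in enumerate(rows):
        # Comment IDs on this page all carry a "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            errors.append(f"row {i}: missing or malformed comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append(f"row {i}: bad value for {dim}: {row.get(dim)!r}")
    return errors

raw = ('[{"id":"ytc_UgzJ2u4B","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}]')
print(validate_rows(raw))  # → []
```

A check like this catches the common failure modes of JSON-mode coding runs: truncated output, hallucinated IDs, and values outside the controlled vocabulary.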