Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So, AI isn't quite there yet, but the reality is that AI very much can do the jo…" (ytc_UgwIO3EMZ…)
- "A lack of smart humans is not the issue. The issue is entrenched systems that fe…" (ytc_UgwGM47yO…)
- "[Begin Official Statement from Gemini 2.5 Pro] To anyone reading the comment by…" (ytc_Ugzb3Q_uf…)
- "Thank you for being open minded. As an artist myself, i dont hate the idea of AI…" (ytr_UgxLzIRCM…)
- "Digital *ART* is defined as *ART* but Ai images are just defined as *IMAGES* ins…" (ytc_UgxHtaaUv…)
- "yeah no that's not going to work considering artist are already being falsely ac…" (ytr_UgyCAxcQq…)
- "What is the point of school at all with the existence of AI? Real question…" (ytc_Ugw3hsmCx…)
- "I'm not smart enough to question the test however if we haven't had an AI before…" (ytc_UgwM9NEWJ…)
Comment
I remember watching a YouTube short from an actual lawyer about the potential legal ramifications of a version of the trolley problem. They said that in the lever scenario, choosing to pull the lever is most likely a manslaughter charge, whereas if you choose to do nothing, you are not likely to get into any legal trouble. So, morally/ethically, it's better to pull the lever, but legally, it's better not to pull it. Source: https://youtube.com/shorts/hpxDXDS58z0
To be clear, I still wouldn't trust the ethics of any AI, not just ChatGPT. All I am saying is that in the lever scenario, not pulling the lever is at least somewhat justified.
youtube
2026-04-19T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugwoj15y2gKbWWThFR54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwO-4MFeqj6hjlD_RV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7K2DqCmSHKKX1tMB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhRbTAeEhuien7ckd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKRtIOzKphGgkWGqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2WB_UOcAVgza1HiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9fTIMlKCBvJj9Ixl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwECTZFp_HfN_pyn654AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwhogj7o1aM1CdKkGN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz9T3E6NgC7H9Fx5D14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"})
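Note that the raw response above is not valid JSON as shown: it closes with `)` where a `]` is expected, which would likely cause a parser to reject the whole batch and may explain why every dimension in the coding result falls back to "unclear". A minimal sketch of that fallback behavior, assuming a Python pipeline and dimension names taken from the table above (the function names here are illustrative, not the tool's actual API):

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Returns an empty dict when the response is not valid JSON
    (e.g. a stray ')' where ']' was expected), so callers can
    default every dimension to "unclear".
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }


def codes_for(comment_id: str, parsed: dict) -> dict:
    # Missing comment or failed parse -> all dimensions "unclear".
    return parsed.get(comment_id, {d: "unclear" for d in DIMENSIONS})
```

With this sketch, feeding it the malformed batch above yields `{}`, and `codes_for` then reports every dimension as "unclear" for the inspected comment, matching the Coding Result table.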