Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugz42O6l6…`: "The demerits from my points are: If I give examples from my academic experience,…"
- `ytc_UgwQ1AD7C…`: "hot take: copying another picture (notice how I didn't call it artwork, fuck ai)…"
- `ytc_Ugx8KN8JQ…`: "You should not talk to it at all. It’s lies, manipulation, and data mining dress…"
- `ytc_UgzXaq5QT…`: "I think the outcome that nobody really wants to talk about is that with no meani…"
- `rdc_dsbhv6y`: "Oh man, I love driving in Belize. Just drive defensively and it’s awesome. Can g…"
- `ytc_Ugzm3XjRg…`: "There seems to be a pile of paradoxes with AI. It can’t actually think. It needs…"
- `ytc_UgzMBVgBh…`: "In a way, these incidents are good because if they didn’t happen, Waymo wouldn’t…"
- `ytc_UgwMry2_M…`: "AI art is only preventing real artists from making things if we let it stop us. …"
Comment
the end of the world is coming but anyway,, you're wrong. AI is behaving as intended. It's doing exactly what people are telling it to do. Someone else using the software expects a result that it fails to provide, but the program is designed to include data that shouldn't correlate in order to facilitate for an algorithm that can produce results that were not specifically outlined. The programmers did program it specifically, and it is working exactly according to it's programming, and people don't want to say that so loudly because they're all getting rich.
youtube · AI Moral Status · 2025-11-04T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzpJOJ5oHMJIZmgBL94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwnHk75fLrwbk95GTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx4gimSeo580EIZhj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyAHhduq9mOAAt_mXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyYH8M0j7512fDUwSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXYkRldTR9sh5kDHV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy492lhBdoP0viiX1x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugw999Q8W5OZ6vjczSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuBcanRzedEgojXSl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKgAwmaN93UQfXVH54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
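
The lookup-by-ID view above can be reproduced programmatically. Below is a minimal sketch that indexes a raw LLM response (a JSON array of coding records, as shown) by comment ID; the helper name `parse_codings` is illustrative, not part of the tool, and the embedded sample is trimmed to one record from the response above.

```python
import json

# One record copied from the raw response above, trimmed for brevity.
RAW_RESPONSE = """[
  {"id":"ytc_UgxKgAwmaN93UQfXVH54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

def parse_codings(raw: str) -> dict[str, dict]:
    """Index a raw LLM response (JSON array of coding records) by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

# Look up a coding by comment ID.
codings = parse_codings(RAW_RESPONSE)
rec = codings["ytc_UgxKgAwmaN93UQfXVH54AaABAg"]
print(rec["responsibility"], rec["emotion"])  # developer resignation
```

Indexing by ID once, rather than scanning the array per lookup, keeps repeated inspections O(1) even for large coding batches.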