Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The root of this problem goes back a long way. At some distant point in the past, an intelligent person felt sorry for all the idiots and invented education. The idiots were educated to the point where they started to think that they themselves were intelligent. This brings us to the present day where educated idiots are meeting artificial intelligence, and the artificial intelligence is clearly the smartest of the two, and is going to run things.

Platform: youtube · Video: "AI Moral Status" · Posted: 2025-07-11T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugwkr9vz0tTwoUkEjV14AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyetiVQBXt-75LKhsl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxLAy6fZxQKziJsZv14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx6vQTuGiVZKBl89VJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx57v6Ih0PgSnKr4D94AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw5tpvJv5pmWXOE1yF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyORa7BLumbVlZpJFF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxLQZRbHq-kMUyAYxh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxJZ8FpCGkTMXf6tx14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxWVvEjVtR4mbhMPUB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
```
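Because the raw model output is a JSON array of coding objects keyed by comment ID, recovering the coding for any one comment amounts to parsing the array and indexing it by `id`. The sketch below shows one minimal way to do that; `index_codings` is a hypothetical helper (not part of the tool shown here), and the two sample records are copied from the raw response above. Only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken as given by the source.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgxLAy6fZxQKziJsZv14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx6vQTuGiVZKBl89VJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the codings by comment ID.

    Hypothetical helper: assumes the model returned a well-formed JSON
    array where every object carries an "id" field.
    """
    codings = json.loads(raw)
    return {coding["id"]: coding for coding in codings}

by_id = index_codings(raw_response)
print(by_id["ytc_UgxLAy6fZxQKziJsZv14AaABAg"]["emotion"])  # resignation
```

A dictionary keyed by `id` makes the "look up by comment ID" operation an O(1) access rather than a scan of the array, which matters once many batch responses are inspected.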