Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "What did HAL do in the end? It’s funny how we are promoting compassion yet creat…" (ytc_Ugw7gN9Zj…)
- "AI will end up harvesting humans for power. We will become to AI what chickens …" (ytc_UgymOQWyN…)
- "A paradox emerges, as mentioned before, if automation replaces most forms of lab…" (ytc_UgwkzBhgU…)
- "A primary care physician seeing 60-70 patients per day is not the goal. That is …" (ytc_UgyLvKKdI…)
- "lola bigcups ur thing is confusing...what r u trying to say????...and if AI beco…" (ytr_UgidRoXHq…)
- "it cannot be reconstructed AI video because AI is not wise enough to know what g…" (ytc_UgyWp-miu…)
- "it’s not even just polar bears. the WORLD is dying. thanks to ai, humans will dr…" (ytc_UgyJjN4yA…)
- "How ironic. \"Good jobs\". When AI brings a socialist to defend capitalism. This i…" (ytc_UgyZwO1Ug…)
Comment

> The ability to feel pain and be aware of it is why we have rights. So if AI could put us in a state where we always felt happy and pain free but we were actually being used and drained slowly of all energy and life should we still have rights? By basing rights on the ability to feel and acknowledge suffering this example would be perfectly moral.

Source: youtube · Video: AI Moral Status · Posted: 2021-11-28T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzzIQUhlqpmZWxR5r94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLbcAZCmnPK48jcl14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwxoX6CErICy53sEsV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx0Cwz5BFyeVerbH3d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzq4Ua5zWAIesZvxrl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwYnzEHozWs7Lliuvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzGcqqMO8vgYVQNlQd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJYiZ-MhIuaurLqnN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzr5tFR0Vcm6EpLAA94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNUacGL0h1P_jvH114AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
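A raw response like the one above can be consumed programmatically: parse the JSON array, sanity-check each record's dimension values, and index the records by comment ID for lookup. The sketch below is a minimal, hypothetical example; the allowed-value sets are only those observed in this sample batch, not necessarily the full coding scheme.

```python
import json

# Dimension values observed in the sample batch above. This is an
# assumption about the schema, not the definitive coding scheme.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Flag any value outside the sets observed so far.
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
        by_id[rec["id"]] = rec
    return by_id

# Lookup by comment ID, using the first record from the response above.
raw = (
    '[{"id":"ytc_UgzzIQUhlqpmZWxR5r94AaABAg","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
coded = parse_raw_response(raw)
print(coded["ytc_UgzzIQUhlqpmZWxR5r94AaABAg"]["reasoning"])  # deontological
```

Indexing by ID mirrors the "Look up by comment ID" workflow: once parsed, fetching any coded comment is a single dictionary access.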