Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "now we need one with full body that can walk and you go hand in hand on street w…" (ytc_Ugz-poCCI…)
- "but for real.. if most of them will be replaced by ai... at some point there won…" (ytc_UgzX_Fd3M…)
- "and this my friends is from the war in Ukraine and Russia. Drones being able to …" (ytc_Ugyl796yA…)
- "Walking bypedal is extreme difficult for any robot the amount of data is why in …" (ytr_Ugy1r7ufq…)
- "That trap would have been easy for a human to escape from. A non-literal truth …" (ytc_UgxUNOSvN…)
- "I’m fine with people messing with AI art but posting and lieing about the art no…" (ytc_UgysVCbbC…)
- "But the robot is still a programmed computer that does not have a mind of its ow…" (ytc_Ugx03iE8l…)
- "@123owly I asumed it, because part of creating art is the process. If you use …" (ytr_UgxFLd54c…)
Comment
I view it like this…AI yes it’s artificial/digital/computers whatever, but we’ve gotten them to the point where they are replicating human intelligence and so we are essentially creating an endless amount of brains on this earth (or people/agents), and what happens to a world with overpopulation and too many ideas that just cause more debate and conflict…I say unplug AI right now, make the sh*t illegal, like Dune we can have some folks have access, but not the general population…look what weak gun laws have already done to America…weak AI laws is really going to make society unsafe….but of course AI will help find a cure for cancer..that’s even more people on earth
youtube
AI Moral Status
2026-03-01T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzHRerdztEfbxVwzp94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy8WC6Ga2hTc1XWSLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyzhmwMjZKNEAlyJn54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzyi6rGX5WjEVJwgix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzItwd5phga2CZZ1qZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQhKze8Ue9ng7UgX14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZIHf8JPWBSGS2bP54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKXdE5ekjTgPyLhXN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw0rfX4KN4ZvYpzmdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZdciUVdoBb5cHTEB4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
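The raw response is a plain JSON array in which each record carries the comment ID and the four coded dimensions, so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch of that lookup, assuming only the record shape shown above (the `index_by_id` helper is hypothetical, not part of the tool):

```python
import json

# Two records copied from the raw response above, abbreviated for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxKXdE5ekjTgPyLhXN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyZdciUVdoBb5cHTEB4AaABAg", "responsibility": "developer",
   "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and key each coded record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

coded = index_by_id(RAW_RESPONSE)
record = coded["ytc_UgxKXdE5ekjTgPyLhXN4AaABAg"]
print(record["policy"], record["emotion"])  # → ban outrage
```

Keying on `id` also makes it straightforward to join the model output back onto the original comments, e.g. to render a coding-result table like the one above for any inspected comment.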