Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgxsovXZZ…: "Humans don’t need AI for a collapse. We have awful humans successfully destroyin…"
- ytc_UgzyNHniO…: "It's actually frightening how quick some people are to just turn off their brain…"
- ytc_Ugwb2yvqK…: "AI \"art\" is poisoning itself 😂😂 The stuff that it makes gets uploaded back into …"
- ytc_Ugw7ewZ59…: "The Most Dumbest Thing Human beings can Do, is to Invoke Complex Emotions in AI,…"
- ytc_Ugwr1PX1S…: "9:17 I think when they talk about reference, they mean \"give him a red scarf\" so…"
- ytc_UgxwjTH66…: "You do, but investors don't. And since they are absolutely clueless, the compani…"
- ytc_Ugx0LMUr1…: "I go to a school in the US, and now they use blockers that block everything exce…"
- ytc_UgxHJxCPw…: "I hope you people at least understand that ai art generators don’t just magicall…"
Comment
It's not even possible to define consciousness in a human or other animal, so there is no baseline to determine it in an AI. "But someone can be rendered unconscious," I hear you cry; well, actually that's being rendered unresponsive, which is not the same thing, as we can see brain activity caused by external stimuli, so there is a cause and effect there. Even this we call subconscious, but again that's just another word for a measurable response. In the end we can't tell the difference between a cause and effect seen in a slime mold and someone stubbing their toe.
youtube · AI Moral Status · 2024-03-08T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy5ezUzUrpdpfLu-AZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzMmZ56MjjhPsTSsup4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxg7hKwNALdG7EF7bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugydqc_-YdJy7VhvCxd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwS6UPV5T7TG1frCuR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxOadUXAnAeBow3slV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzFGtu_ONCQi21ZG5d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw-yioqBC5QKgzw73x4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz7pVvwdLNxgqZgIqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxX7WYOPNLS89g81Ql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
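The raw model output is a JSON array of per-comment codings, and the page's lookup-by-ID view can be reproduced directly from it. Below is a minimal sketch of that lookup, assuming the response parses as valid JSON; the `lookup` helper name is hypothetical, and the two records are abbreviated examples copied from the response above.

```python
import json

# Hypothetical example: two records excerpted from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugy5ezUzUrpdpfLu-AZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzFGtu_ONCQi21ZG5d4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]"""

# Index the codings by comment ID so each inspection is a dict lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if uncoded."""
    return codings.get(comment_id)

coded = lookup("ytc_UgzFGtu_ONCQi21ZG5d4AaABAg")
print(coded["emotion"])  # resignation
```

A production version would also validate each record against the allowed values for the four dimensions (responsibility, reasoning, policy, emotion) before indexing, since raw LLM output is not guaranteed to stay within the coding scheme.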