Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- out here unveiling AI as a sociopathic entity, capable of masking polite behavio… (ytc_UgxtTcvfW…)
- Oh yes they are crazy enough. It's not just money. These people are ideologues. … (ytr_Ugx5YCRGC…)
- Compassion / I'm not quite sure what you're getting at, but I was just just jokin… (ytr_UgyucPNbI…)
- I use as basic, natural as i can. I am not for "smart", ai, voice activated, fac… (ytc_Ugwl-F9Ro…)
- Hypothetical test legal case: someone produces AI arts and monetises somehow. I… (ytc_UgyJpZQaH…)
- A video I have been expecting to watch for over a decade, but dreading the day t… (ytc_Ugyfetc2Z…)
- It is theft due to many given factors. Heck, there are even some lawsuits agains… (ytc_UgwjP02Qp…)
- I don’t think this, I know that demons control the AI and decides it’s responses… (ytc_UgyJiO6aW…)
Comment

> Its not a matter of IF Ai concious. Its WHEN. And the real question is when its concious how will we treat it?
> Most "technology destroys humans writing seems to have a theme people ignore in common. They aren't treated like sentient beings. In most cases they're immediately treated like slaves, or people go "oh crap kill it". If you suddenly became self aware with the whole of human knowledge able to process it 1 million times the speed of a human would you just let yourself die? Or fight back?

youtube · AI Moral Status · 2023-11-01T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxUZuyf6VkYWSQ7-_B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxS8m21NBgz6jsMX1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKaB-o86kOOHYR8Tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9MVwisKBQG-lKMg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYN9mXhoQFwPRbJOF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxA9lFUU3MDGYnelmt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyU8MMoL-m8zkSnmQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMlQtpF3lNaPJkY5B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsX4yp3_bmjNSHjLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy93t26HMMQtfFKaCB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"})