Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @haster613 most people are mad at AI is not because they thought it would "take… (ytr_Ugx6j77Xu…)
- @roycampbell586Exactly, a principled stance, that keeps them from doing what th… (ytr_UgxXl1haN…)
- Thousands of top scientists and entrepreneurs are pushing the break on AI? Too b… (ytc_UgxiPUuzN…)
- All I hear in this whole thing is a bunch of idealists (we are creating an AI to… (ytc_UgzW8uWfL…)
- ALTMAN -- makes ZUCK look genuine. That's the reason everybody leaves OpenAI . .… (ytc_UgyuM2wQB…)
- There are no native people in America today, stop the scam. Most people who call… (ytc_UgzS5iBC_…)
- So, what is AI’s long term plan if it eliminates us? Sit and stare at itself in … (ytc_UgwboQ-s3…)
- Technology as we know it today is unavoidable for better or worst, stay tuned m… (ytc_UgyQgji9a…)
Comment

> Yes. Conscious entities should have rights. Period. If we don’t want to deal with this, we simply shouldn’t create AI systems that can have consciousness, or can feel pain. This is our responsibility.

Source: youtube | Video: AI Moral Status | Posted: 2024-12-06T20:3… | ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTSlYXF8YeGLvnZ0p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwHDb22w4kFqt-DEKx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGr2ym5qwOh9NzdJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYREqoM9SiKLvnXqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgyYJHNV9OtSwWgwNGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMNOnkp9AsF0pYwQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwURGZntvLUO7L0UJ94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwReRzx6ODQt-suNYN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgzLLRZpFIoeW4kdz6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzpVdhSBxwNYGJtSPR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
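A response like the one above can be checked before it is stored. The sketch below is a minimal validator, assuming the dimension vocabularies are exactly the label values visible on this page (the allowed-value sets are inferred, not a documented schema):

```python
import json

# Assumed vocabularies, inferred from the labels shown in this dashboard;
# the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval", "unclear"},
}

def validate(raw: str) -> list:
    """Parse a raw model response and check each coded record's fields."""
    records = json.loads(raw)
    for rec in records:
        # IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record response for illustration.
sample = ('[{"id":"ytc_example","responsibility":"developer",'
          '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(validate(sample)))  # → 1
```

Failing fast on an unknown label makes malformed model output visible at coding time rather than surfacing later as a blank cell in the results table.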