Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I mean, artists ARE better than others. At arting. It's what makes art valuable.…" — ytc_UgxZukvYM…
- "Hitch Hiker's Guide to the Galaxy Robot- "your plastic pal that is fun to be wit…" — ytc_Ugx5F-ZyR…
- "He sound stupid. His knowledge is just limited to programming and maybe some tha…" — ytc_UgyHSHPwi…
- "Hi Charlie how can one obtain a non-premium phone number to continue registerin…" — ytc_UgyIatQuc…
- "AI is very stupid; it doesn't think, it just answers questions statistically. Th…" — ytc_UgzIyIBhr…
- "I have ASD and programming is my special interest. I took a lot of extra math an…" — ytc_UgyIyUJf0…
- "Make no mistake, the #1 priority of EVERY company using AI will always be how to…" — ytc_Ugxnp3IvH…
- "The globalist psychopath's technology is decades ahead of what we are allowed to…" — ytc_UgzoZQXlV…
Comment

> At what point will we stop pushing for rights, in this instance we'd be legitimately giving metal and code the equivalence to that of us humans. Even if you believe consciousness can be artificially created it would not be the same as our own and they will not be experiencing our condition. AI is completely unnecessary and they do not reserve rights, we SHOULD have complete control over these creations and we should not look apon them as equal. Keeping them advanced but not self aware would be ideal.

youtube · AI Moral Status · 2017-02-25T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh_V3vu2DuvengCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgietbweVEt0NHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UghyeUksCRmVYHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugg1cdAYAiAmpHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghvGb_0icgToXgCoAEC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgjgBssLGskAt3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj2OLPFihnkBXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgjHqU5fojdo-3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UggDe-aW7XmtPXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggCn9WTTgjXRngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
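The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how the "look up by comment ID" view could parse such a batch into an ID-keyed index — the function name and the two-record sample payload are illustrative, not part of the tool itself:

```python
import json

# Sample batch response: two records copied from the raw output above.
raw_response = """
[
  {"id": "ytc_Ugh_V3vu2DuvengCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgjgBssLGskAt3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions every valid record must contain.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a batch response and key each record by its comment ID,
    skipping malformed records that lack any coding dimension."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec[dim] for dim in DIMENSIONS}
        for rec in records
        if all(dim in rec for dim in DIMENSIONS)
    }

codings = index_by_id(raw_response)
print(codings["ytc_Ugh_V3vu2DuvengCoAEC"]["policy"])  # ban
```

Keying the records by comment ID makes the inspector's lookup a constant-time dictionary access, and the dimension check quietly drops records the model failed to code completely.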