Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It kinda looks like the robot that sings I feel fantastic and bro my name is in …" (ytc_UgxcAYYMY…)
- "The problem isn't AI, nor human and humanity but HUMANS! And i don't mean the mo…" (ytc_UgzzC7a-_…)
- "This is man's DUMBEST idea yet. Why on God's green Earth would you create someth…" (ytc_UgzP-EA5X…)
- "In this semester in college I'm in a poetic analysis class. There are so many fa…" (ytc_UgyEHZyU6…)
- "13:52 Unfortunately this did not age well. Twitters AI allows you to edit photos…" (ytc_Ugyby8nf4…)
- "It's worse with poly ai because there is NO FILTER💀💀 if someone blackmailed me w…" (ytc_Ugy9nE5gs…)
- "It can if we make it understand, it may have already understood the joke that it…" (rdc_jk88u0t)
- "Could have being better. zero on what Musk is doing with AI. Neuralink. Opt…" (ytc_Ugzsiyt-5…)
Comment
AI having consciousness is the worst thing that could happen. a non living object cannot have rights they should be treated like animals. and that is not the thing that being treated like an animal is bad dogs and other domesticated animals live just fine but non living matter is unable to have consciousness. robots are only able to do what we program them to do and none of which is, think. yeah siri is programmed to look on the internet for answers it's not that hard it is text to speech combined with the internet
youtube · AI Moral Status · 2017-03-08T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghKkbKM7RfTSngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjI9lR0B-QpLngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghPZpawqsXIxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg3YIAoHWeF73gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghZs-vx_DY4WngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj4TBYHcuy8QHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Uggj6wVem7oUqXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghZ2Kej7Awjx3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghHQZM9DEXzg3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGAv21OsCOaHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
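The lookup-by-comment-ID step above can be sketched as follows — a minimal example, assuming each raw LLM response is a JSON array of records shaped like the one shown (`id` plus the four coding dimensions). The `index_codings` helper and the two-record `RAW` sample (reused from the response above) are illustrative, not the tool's actual implementation.

```python
import json

# Two records copied verbatim from the raw LLM response shown above,
# standing in for a full response payload.
RAW = '''[
{"id":"ytc_UghKkbKM7RfTSngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZs-vx_DY4WngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

def index_codings(raw_response: str) -> dict:
    """Parse one raw LLM response and index its coding records by
    comment ID; a later duplicate ID would overwrite an earlier one."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

codings = index_codings(RAW)
print(codings["ytc_UghZs-vx_DY4WngCoAEC"]["emotion"])  # outrage
```

Indexing by ID this way is what makes the "look up by comment ID" view cheap: one parse per response, then constant-time dictionary lookups.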