Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by browsing the random samples below.

Random samples
- One of the main pop up sellers at my local luxury mall sells these ai framed “ar… (ytc_Ugy-dVEIS…)
- Ai “artists” could be turned around, I used to be like them, tracing and stealin… (ytc_UgynkxXaU…)
- As a teacher.. Automation of teaching make it easier. But.. Teaching is much mor… (ytc_UgzMJW2mf…)
- The threat from AI will never even come close to the threat government is to the… (ytc_Ugyh7lRLJ…)
- Alright so I'm seeing a lot of the comments saying the driver is at fault, sure … (ytc_UgyjDUPgQ…)
- My questions is, when one AI start competing with another AI for power and contr… (ytc_Ugx8EDh61…)
- It actually really annoys me people act like they have power because they’re bei… (ytc_UgxILBF3K…)
- Also AI is only good if it has a lot of training data. If you are using a fringe… (ytc_UgwFfwV2a…)
Comment
I don't think forcing AI to work for us would necessarily mean they're suffering if they're conscious. Suffering and joy are about aligning or being at odds with terminal goals. If the AI is given a goal to work in a factory, that wouldn't just be their job, but their source of joy. They won't have all the terminal goals humans have from millions of years of evolution. Humans mostly use work as an instrumental goal to reach our real desires, so we'd cut it out of we could.
Source: youtube · AI Moral Status · 2026-04-04T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwZfeNec5dW7JsWVAt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzSDA9WY2ELqppjTvN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxDvkkJwJkWfM4zjlp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyyLOzP-ro91Lr1x2J4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyCfwIfv042_R72w6F4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzH3alZT4Ieux7IUNR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyPzCPqrMAhNygxONp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwr7qI4whbF3XatgRB4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugwiy2_YXLyv4f_zVql4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxc_XC7u4BmGhbR8Vt4AaABAg", "responsibility": "user", "reasoning": "nvirtue", "policy": "unclear", "emotion": "fear"}
]
```
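The lookup-by-comment-ID view above can be implemented by parsing the raw batch response (a JSON array of per-comment coding objects) and indexing it by `id`. A minimal sketch, assuming the raw response is valid JSON in exactly the shape shown; the function name `index_codings` is illustrative, not part of the tool:

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coding
    objects, each with an "id" field) and return a dict mapping
    comment ID -> coding object for O(1) lookup."""
    codings = json.loads(raw_response)
    return {entry["id"]: entry for entry in codings}

# Two entries taken verbatim from the raw response above.
raw = '''[
  {"id": "ytc_UgzH3alZT4Ieux7IUNR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwiy2_YXLyv4f_zVql4AaABAg", "responsibility": "government",
   "reasoning": "unclear", "policy": "regulate", "emotion": "fear"}
]'''

by_id = index_codings(raw)
coding = by_id["ytc_UgzH3alZT4Ieux7IUNR4AaABAg"]
print(coding["emotion"])  # → approval, matching the Coding Result table
```

Note that the indexed entry for `ytc_UgzH3alZT4Ieux7IUNR4AaABAg` reproduces the Coding Result table above (responsibility: none, reasoning: consequentialist, policy: unclear, emotion: approval), which is how a per-comment view can be rendered from the batch output.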