Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- “It’s taking jobs now. When some of these people are asked they always say “it wi…” (ytc_UgxT6FkFX…)
- “My company is experimenting with this as well. Microsoft claims copilot increas…” (rdc_jptr1wg)
- “Hey man. There are many false assumptions made here. The reasoning is then false…” (ytc_UgxGuo7FJ…)
- “Oh no AI is coming and is going to get rid of the the working class. We aren’t g…” (ytc_Ugx9bGclr…)
- “I really don’t like that’s it’s called art in the first place. It’s an image, th…” (ytc_UgwWrhC_5…)
- “Good on you for not letting your family post your kids. Even without deep fakes,…” (ytr_Ugyitunny…)
- “In Tempe, AZ a self driving car killed someone. 😅 Not too sure on the date but y…” (ytc_UgwQpoVWq…)
- “Honestly with the way things are going this whole AI fear push is hilarious to m…” (ytc_UgyX8I14Y…)
Comment

> There's another big issue I see. If artificial intelligences have all the same rights as people, it would be possible to abuse systems meant to protect the rights of people which were never intended to cope with the strain of the rich and powerful being able to order custom people. The most obvious way to abuse this would be to mass manufacture robo-people programmed to trust one political party or the other and release them into disputed districts shortly before an election.

Source: youtube — “AI Moral Status”, 2017-02-23T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgjfVoL_clccOHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgiPI-YOPMt3eXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghKTXEJdE2k03gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjzKBW0d4zvsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggzWaALjepZ8HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugi1-8Q9o8b7SHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghZpuKPn1eld3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgizDdmtVR9s7HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugh9tM2DGn-Y5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggry-BHMQAuF3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
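The raw response is a JSON array with one record per comment, each carrying an `id` plus the four coded dimensions. A minimal sketch of parsing such a response and looking up a comment's codes by ID — the allowed value sets below are assumptions inferred only from the values visible in this response, not the full code book:

```python
import json

# Assumed value sets, inferred from the response shown above; the
# real code book for this pipeline may allow additional values.
DIMENSIONS = {
    "responsibility": {"none", "unclear", "ai_itself", "developer"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"approval", "mixed", "fear", "indifference", "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}, rejecting bad values."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# One record copied from the response above, matching the Coding Result table.
raw = ('[{"id":"ytc_UghZpuKPn1eld3gCoAEC","responsibility":"none",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UghZpuKPn1eld3gCoAEC"]["policy"])  # regulate
```

Validating every dimension at parse time catches the common failure mode where the model invents a label outside the code book, rather than letting it silently enter the coded dataset.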