Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "This will be the end of 304’s, all the simps will just buy this and forget about…" (ytc_Ugyz7uNay…)
- "I think the reason for chatgpt saying 'I' and 'chatgpt' as different people beca…" (ytc_UgxAg9afT…)
- "If Ukraine and Palestine are completely destroyed by on going war..... can you A…" (ytc_Ugxsysz4d…)
- "People who have mental health issues to the point of not remembering that ai is …" (ytc_UgyX0O6qJ…)
- "For a start, get rid of capitalism, where the rich and powerful exploit the poor…" (ytc_UgyhQRwrK…)
- "This feels like one of the most empty of content articles I've seen. It kept loo…" (rdc_jie6n24)
- "We as a society don't owe anybody driving jobs. If you're working in an obsolete…" (ytc_UgiBF9knl…)
- "I get this is bad but it’s not like he purposely tried to show everyone. Everyon…" (ytc_Ugx0O2Jx2…)
Comment
This is ridiculous that something like this has been made. It is designed to take people jobs away, and eventually they will. These things should not be made. And when he asked the question he said ''Do you want to'' not just destroy humans but ''Do you want to''. That is asking a question, and the robot gave an answer.
- Source: youtube
- Topic: AI Moral Status
- Posted: 2017-05-02T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgivtIdgVocyFXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiYKx3M8o_e-XgCoAEC","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiMsYLdAJDJn3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgibSee-lUws8HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UghV74iRtFuu9XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiZm51uxnz_23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjpMHMadcrd5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugi45nUFMM_AvXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghOeUpr1CRDF3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiJ0QWK2DdTGHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
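The raw response is a JSON array with one object per coded comment, keyed by comment ID and the four coding dimensions. A minimal sketch of parsing and validating such a batch before loading it into the inspector; the allowed value sets below are inferred from this one sample, so the real codebook may contain more categories:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the full codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting off-schema values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}  # KeyError if a dimension is missing
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_UgivtIdgVocyFXgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_UgivtIdgVocyFXgCoAEC"]["policy"])  # ban
```

Validating at ingest time keeps a single malformed or hallucinated label from silently entering the coded dataset; the lookup-by-ID view above is then just a dictionary access on the parsed result.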