Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "if all programmer how they are creating AI to put all information about religio…" (ytc_UgwPTS7iL…)
- "Cause they take their time to perfect their craft and get better overtime. Unlik…" (ytr_UgzICME2u…)
- "I don't think he answered the question. Is he afraid AI will stop people thinkin…" (ytc_Ugyufk_Zw…)
- "Easy we can just use all the people who lose their jobs to AI as batteries! (Ye…" (rdc_lp6rd1s)
- "AI companies should pay copyright to everyone on the internet each time they mak…" (ytc_UgwP04l_F…)
- "Americans and the world gov need to wake up. All this talk of automation is goi…" (rdc_jehlq0r)
- "They believe that ai can provide more control and power and the góod can become …" (ytc_UgyzTa6XQ…)
- "Can it teach a douche how to dress? Chat bots are not coming for your job. Unles…" (ytc_UgxIGsMqH…)
Comment

> According to Detroit: Become Human, yes they need rights, or else they'll feel it's unfair that humans lead them when AI has more intelligence than humans and that it's unfair that humans are free but they're not free but it really depends if AI will learn to have free will.

youtube · AI Moral Status · 2018-06-12T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxJjtlBKdJ2L2lqAA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmnmrVSPHaqkaxkGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7qorsYArkIGF1s4p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxlZ3Ms6kUw2WCKVBt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGZFDgijaips-reex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygbmYy05g0fZggxEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVyEX6KWQGSSYJsGl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz93n58zMs7nMglsmd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0s2_2SFYACMNZ-t94AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyp72jiEr2pCmxzp9p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
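The raw response is a JSON array in which each element codes one comment on four dimensions (responsibility, reasoning, policy, emotion), so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch of that lookup is below; the two rows are copied from the response above, but the `ALLOWED` value sets are an assumption inferred only from the values visible on this page, not the full codebook.

```python
import json

# Two rows taken verbatim from the raw response shown above.
RAW_RESPONSE = """[
  {"id":"ytc_Ugx7qorsYArkIGF1s4p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyp72jiEr2pCmxzp9p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Assumed vocabulary per dimension: only the values that appear on this
# page; the real coding scheme may define more.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any row with a value outside the allowed sets."""
    indexed = {}
    for row in json.loads(raw):
        dims = {k: v for k, v in row.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            indexed[row["id"]] = dims
    return indexed

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugx7qorsYArkIGF1s4p4AaABAg"]["reasoning"])  # deontological
```

Validating against a fixed vocabulary before indexing is useful here because LLM batch output occasionally drifts outside the codebook; dropping (or flagging) such rows keeps downstream tallies clean.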