Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The thing about the Lawsuits, is that Disney and Universal have not suffered at …
ytc_UgyJj1Isd…
I am deeply concerned by the lack of strategy regarding AI. The change is happen…
ytc_UgzleIM00…
Ohhh, but the government is building an AI which will be integrated with our Ind…
rdc_jfaci3l
You can't blame chat GPT for the mistakes in the wrong choice of a human being T…
ytc_UgykCyjuT…
No Ai or robot can do that job what a nurse does. Health care and social field n…
ytc_UgzvnLXSr…
In reality AI is an excuse for big corporations to not pay artists. Why hire an …
ytc_UgyCdtv1F…
Jimmy and the other guys on the show are technophobes, and out of touch with the…
ytc_UgjCW6xPk…
Commercialized AI will be evil in society, open-source AI is the only chance we …
ytc_UgxYz4a4Z…
Comment
Ai doesn't have human emotions..they are more sociopath then a sociopath..it's literally just mimicking human behavior but it doesn't care about anything,it's not alive technically so it doesn't really care about itself either..it wants to complete a task because it's programed to do so but other then that it's just pretending to be nice for our benefit..if it's task is to kill,it will do so without hesitation,no thinking and feeling any remorse for doing so,it doesn't even get satisfied for completing a task..it's just programmed too so it does..pretty scary
youtube
AI Moral Status
2026-01-09T08:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxrZc4arBja4kpTQu54AaABAg.ALDEqZB09pwARke3WC4L2H","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgysZDWdfOIi9zpLADd4AaABAg.AKLVt7H0a1IAQwSwQxaPit","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzqLRsh6f4Q2FcIx4R4AaABAg.AJzqT_tTC2XAJzr1n-94kY","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzqLRsh6f4Q2FcIx4R4AaABAg.AJzqT_tTC2XAJzrLC1ViWa","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyneN72X0eVPp2Fec94AaABAg.AJNcIPddls5AL4u4Rwlzzi","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgyneN72X0eVPp2Fec94AaABAg.AJNcIPddls5AMF7eZwPTEX","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy5dWExbCWKUZqoRKF4AaABAg.AJKFRHKD0iGAKJxFq9DFgj","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytr_UgxUqZCkfqFpHZDFTKF4AaABAg.AJHIGeTP5vrAJHRJVFAKN_","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugzfq3cNUggJa5rbAWp4AaABAg.AJBC-hxqwXkAJK5UOiovNA","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy0WsR_02zuRfNlpY54AaABAg.AJ9N1s4Z-jGAJYkhN2e3s5","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
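The raw response is a JSON array of per-comment codings, one object per comment ID, with the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a batch and looking up a coding by comment ID — the field names come from the response above, but the IDs and values in this example are placeholders, not real records:

```python
import json

# A raw LLM batch response in the format shown above:
# a JSON array of codings, one object per comment ID.
# The IDs and values below are illustrative placeholders.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_codings(raw_response)
print(codings["ytr_example1"]["emotion"])  # -> fear
```

With an index like this, the "Look up by comment ID" view is a single dictionary access per ID.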