Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID.
Random samples

- "Yup a huge raise in respect, morality, ethics, humanity. The most important rais…" (ytr_UgzcWUICc…)
- "11:10 I think that is just factually in-correct, it is not even remotely close t…" (ytc_UgwQpsodh…)
- "the world is profit over people and AI is making that more obvious. the only way…" (ytc_Ugz0ZOvBO…)
- "Maybe AI can't be sentient by itself, but what about cyborgs? if we can replace …" (ytc_Ugx3P7jpx…)
- "Amazon has delivery drones. And these CEO's already don't have hearts. Can't the…" (ytc_UgjAr7A9M…)
- "replace the word \"robot\" with immigrant and you will understand why so many peop…" (ytc_UgyZoWiBA…)
- "We are in the early days of AI. Things were also messy and complicated in the ea…" (ytr_Ugyo0yBl1…)
- "its fine. ai art wont last long due to a lot of reasons but even if it isn't it …" (ytr_Ugwq1t9m_…)
Comment

> Pretty funny, the above comment got an instant automatic request that i review it to make sure i wanted to be that rude. Yes, i did want to use the f-word.

Source: youtube · Video: AI Moral Status · Posted: 2025-11-09T06:5… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytr_UgxmHygobJ9hC5ILK8l4AaABAg.ALOMBMPC4gwAPIOHLu3R6x","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxmHygobJ9hC5ILK8l4AaABAg.ALOMBMPC4gwAQK4XnJxF_w","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxqMYqc4z-cQY8hbbh4AaABAg.AK9rO4EjbN1AK9vwZglXE2","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzugsmIea7QxwQQm7R4AaABAg.AJaez7MXk5ZAK9IQCol_iw","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzugsmIea7QxwQQm7R4AaABAg.AJaez7MXk5ZAK9VdtFiGYk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyKK51Ve04-mmdsunB4AaABAg.AJWMHQkA-eQAK6Z_cyRVM_","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw70zC0K658ikjVRpt4AaABAg.AJVRVTjsZh3AKMSMJRoM6g","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw11c3m55FXS9uYdWh4AaABAg.AJIuFtDZiowAJIwXl8_E_5","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyyzcShT6HgxpfRJv54AaABAg.AJElc7aIUJaAJFbYuo3aRV","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwZMTLlvWrGoptr1zl4AaABAg.AJEhOR38J5UAJFbuBFwPhS","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
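The "look up by comment ID" step above can be sketched as a small parser: the raw LLM response is a JSON array of coding objects keyed by `id`, so resolving one comment's coding is a parse plus a dictionary lookup. This is a minimal sketch, not the tool's actual implementation; the helper name `coding_for` and the sample IDs in `RAW_RESPONSE` are illustrative, while the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the response shown above.

```python
import json
from typing import Optional

# Illustrative stand-in for one raw LLM response; the real responses
# carry full YouTube comment IDs like the ones shown above.
RAW_RESPONSE = """
[
  {"id": "ytr_example_1", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_example_2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def coding_for(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding for one comment ID.

    Returns None when the ID is absent from the batch (e.g. the model
    dropped a comment from its output).
    """
    by_id = {row["id"]: row for row in json.loads(raw)}
    return by_id.get(comment_id)

result = coding_for(RAW_RESPONSE, "ytr_example_2")
print(result["emotion"])  # → fear
```

Keying by `id` rather than by array position also surfaces a common failure mode of batched LLM coding: if the model omits or duplicates an ID, the mismatch is detectable instead of silently shifting every coding by one.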