Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- This is legitimately distressing I feel bad for these AI they just exist to be t… (ytc_UgwNugAyU…)
- Yes but that person with AI might not just replace your job, he might replace a … (ytc_Ugxgjh7wX…)
- "I'm gonna use a tiktok filter to prove my point". This is like trying to prove … (ytc_UgxMSbWcT…)
- Ill give me vote to a robot than our PH politicians in a heart beat… (ytc_Ugy8cfdJ_…)
- Even AI will tell you that AI is not good for humans. Humans are not known for g… (ytr_Ugzl-MNQS…)
- I'd like to see more podcasters bring people on researching/learning who WE are … (ytc_UgxkgnULL…)
- It's called war... population reduction, no need for the slaves once automated… (ytc_UgxAzpEKJ…)
- If studios wants AI to do the work, then soon everyone at the studios will be la… (ytc_UgywFpwcC…)
Comment

> many ppl say''nah its all bs" but im curious will they take a plane under control not a pilots but AI?

youtube · AI Moral Status · 2025-12-15T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx68QHImgnlDhskKmN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwb8I91rHf0cMr-Tpd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQmUVWzbBFiYfbpD94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxT2OoRJF0au_IRg1J4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgySkI3kvq8P_4M8fF94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgydnvNH8dTmcIOlrm14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzi3QulioVQHe0mdmd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzeNuPapPNrCG24srN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzceVb04sde8LxYlPR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxfe_eb-UX_cC3oub94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
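The raw response is a JSON array of per-comment codings, which makes "look up by comment ID" a simple parse-and-index step. Below is a minimal sketch of that step in Python; the allowed-value sets are an assumption inferred from the values visible in this dump (the real codebook may define more), and `index_codings` is a hypothetical helper name, not part of the actual pipeline.

```python
import json

# Dimension vocabularies as observed in the responses above.
# Assumption: illustrative only; the full codebook may allow other values.
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse one raw LLM response and index the codings by comment ID,
    rejecting any value outside the observed vocabulary."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

raw = ('[{"id":"ytc_UgyQmUVWzbBFiYfbpD94AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
by_id = index_codings(raw)
print(by_id["ytc_UgyQmUVWzbBFiYfbpD94AaABAg"]["emotion"])  # → fear
```

Validating against a fixed vocabulary at parse time catches the most common failure mode of LLM coders, namely invented labels, before they pollute downstream counts.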