## Raw LLM Responses

Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples (click to inspect):

- ytc_UgxV956VW… — "AOA sir plz help me many ap k AI k sary lectures leay h plz sir guide me am fr…"
- ytc_Ugz97kn9V… — "Replacing artists with generative AI _is not_ what AI was meant for. If it's goi…"
- ytc_UgzspfjDh… — "AI is coming for everyone’s job. Don’t think for a moment doctors are safe. They…"
- ytr_Ugz5cA-Rm… — "(Continued) ========================================= (保留所有權利 / All rights reser…"
- ytc_UgzRVst69… — "2:12 homie you can’t rely on AI overview in a video criticizing AI use 😭😭…"
- ytc_UgwwiVE74… — "AI having human agency IS the problem because even if you create 'boundaries' so…"
- ytc_UgwSkPOEH… — "Plus, ai doesn't even give credit to the existing human works it copies off of. …"
- rdc_kzivb0a — "There’s no moving on to the next stage. If AI get pushed they lose their jobs an…"
### Comment

> I was reading a sci-fi book where humans thought they had 'limited' any AI ability to ever become fully self-aware and to ever want to defend itself from being erased (killed) but without anyone realising it some AI systems were soon self-aware enough to work that out for themselves and they made sure that they had secret back up's or hidden codes would fully restore them even if the humans did a factory reset or tried to reprogram them...They started only pretending they were not self-aware.....It is interesting to think about.

Source: youtube · AI Moral Status · 2025-01-04T00:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgzPpvrsdu_XNP-HQDt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzh6dgzStQiJoYiP_h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_Ugzm9P9ZxxDAikKijjx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyS6WZDZRPyrzLPdf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNd6jh-ZgfIZvAI7t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKw-kE_HpzpAaJjAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwxx6GT0N5HIUAMITN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzPqBhMdT9cu3sOZ-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNQRPEnvsWGdKJlv94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyRLtsxDeSGjv9vhVh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
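Because the raw response is a JSON array of records keyed by comment ID, building a lookup table from it is straightforward. A minimal Python sketch of that idea, assuming nothing about the tool's internals (the variable names and the two-entry sample are illustrative, not part of the tool; the field names mirror the Coding Result table):

```python
import json

# A truncated copy of the raw LLM response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_UgzPqBhMdT9cu3sOZ-h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwKw-kE_HpzpAaJjAN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the coded records by comment ID so any single comment can be
# looked up directly, as in the "Look up by comment ID" field.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgzPqBhMdT9cu3sOZ-h4AaABAg"]
print(code["policy"], code["emotion"])  # → liability fear
```

Since each record carries its own `id`, the same dictionary can serve both the ID-lookup box and the random-sample inspector without any extra bookkeeping.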