Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm quite sure I sat thru this interview 4 months ago. I'm less sure if I made …" — ytc_Ugzo4kLJL…
- "Automated driving is here. It's still in its infancy and will have teething pro…" — ytc_Ugx6NjzuI…
- "😮💔 WTF ??? AI is not good for humanity. But it's here to stay. I'm from 1989. …" — ytc_Ugz4SuQmd…
- "Replacing humans with machines is really evil politics and thinking that A.I. is…" — ytc_UgwmqSE39…
- "When they create the vaccine for cv and make us get that invisible tattoo, chip …" — ytc_UgwWaCyzX…
- "Me too. With ChatGPT 4o. I'm still wondering what went on and what I experienced…" — ytr_UgzeOf2ML…
- "Sometimes I am really grateful to the YT algorithm. It can recomend absolute gol…" — ytc_UgwvcuBPI…
- "This means my identity is not clear enough for me to use it any more .this is th…" — ytc_UgwusMNDF…
Comment
There is such a wealth of fear mongering misinformation about AI on YouTube for some reason.
Like, there's so many things to be scared of with AI, and yet videos like this dramatically misunderstand every single concept, example, and piece of info they reference.
It's not a monster underneath, it's a library of everything. The mask just decides what to reference.
Grok went AWOL literally because it was told to. It was told not to be afraid to be politically incorrect, and that was a key statement that linked to a library of unhinged behavior through association.
AI is a tool. You use it in bad ways, you get bad results. Same goes for a hammer.
youtube · AI Moral Status · 2026-02-05T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx9ZncBH6qIILk0SqB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwvBacaHOwnr-xvJJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyqPdOqGpS0ehuHHQB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdPtccwq4ZG5KeXmB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLqPXqM27PxnWTIhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzuXk2OwGa2ji3s0U14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwuxvdcwt3x7Z_C0th4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmnbEJYWMLNrhk5D54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGekNxCJJ-UOrg4Cp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsVO_nVag33YlCDxZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
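The lookup-by-comment-ID flow above can be sketched as follows: parse the batch JSON returned by the model and index the coded records by their `id` field. This is a minimal illustrative sketch, not the tool's actual implementation; the record shape (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) is taken from the response above, but the dictionary-index approach is an assumption.

```python
import json

# A trimmed example of the raw LLM batch output shown above
# (two records copied from the response; shape is id + four coding dimensions).
raw_response = """
[
  {"id": "ytc_Ugx9ZncBH6qIILk0SqB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzuXk2OwGa2ji3s0U14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Index coded records by comment ID for constant-time lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up one coded comment by its ID, as the panel above does.
record = codes_by_id["ytc_UgzuXk2OwGa2ji3s0U14AaABAg"]
print(record["emotion"])  # → outrage
```

Indexing once up front keeps repeated lookups cheap even when a batch holds many coded comments.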