Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_Ugy3NBDOL…: "He's right, AI could create the Weapons, given the tools. AI could seize contro…"
- ytc_UgxRTv-1U…: "Yeah, we gotta make sure AI is woke. Really important work he is doing there. Wo…"
- ytc_UgxT36OpE…: "Even the "Goodfather of AI" Geoffrey Hintonshare many of Yampolskiy's views. Now…"
- ytc_UgxgIY8nv…: "Simple: Context is important for art. AI images are someone typing an idea, not …"
- ytr_UgxkefnLh…: "I am not on the side of people who want to replace us with ai but i think saying…"
- ytc_Ugzg7KG6g…: "❤️Thank you for your video, very interesting however, Robots will never supersed…"
- ytr_UgwRNIZpN…: "@fr3stylr322 Too late. Folks have already asked the "question" and the AI said …"
- ytc_UgzY4cb9a…: "Robot; "messes up" / robot; oh fu-😳 / Second robot; oh you mf.....MMMMMMM😡 / Worker; c…"
Comment

> No they do not deserve rights. They can not become conscious only learn the mechanics of conscious and replicate a show of what could appear to be conscious. Giving them rights would be the dumbest thing ever and could lead to abuse on so many levels. Next thing you know they would be voting. Although I'm sure the algorithm on how they decided to vote for someone would be way more advanced then any average Joe today. Who says god bless America and only votes for this guy cause he's Republican despite the Democrat view could potentially be better or visa versa.

Source: youtube · AI Moral Status · 2017-09-29T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhfUbtpxRpFCg2RbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxw29hCRkRXSp_1xQl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzxs_MaS9tOuE-ofU94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzZuMj4n3MDIkIG1ql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCNly2eYnFv9N7GZB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyiQyxfa4atkYseCmx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx7Lk0ES4Dp34m9F2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVymjGfAAf9ZSK9w14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz41nduqULPOKslKst4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyL7jLrsf5hxsujVlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
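The "look up by comment ID" step can be sketched in a few lines: parse the raw LLM response (a JSON array of coded rows, as above) and index it by the `id` field. This is a minimal illustration, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown, and the `lookup` helper is hypothetical.

```python
import json

# Raw LLM response: a JSON array of coded comments.
# Two rows copied from the response above, for illustration.
raw_response = '''[
  {"id": "ytc_Ugxw29hCRkRXSp_1xQl4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzZuMj4n3MDIkIG1ql4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coding dict for a comment ID, or None if it was not coded."""
    return codes_by_id.get(comment_id)

print(lookup("ytc_Ugxw29hCRkRXSp_1xQl4AaABAg")["policy"])  # → ban
```

Rows whose ID is absent (e.g. a comment the model skipped) return `None` rather than raising, which keeps batch inspection loops simple.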