Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews, with comment IDs):

- "Well ! A.l mistook the command He asked him to choose no BETWEEN 1 to 50. A.I …" (ytc_UgwlbgbSH…)
- "UBI is not a solution to anything, all it will do is drive addictions and mass s…" (ytr_UgymewbOe…)
- "@LinkesAuge I think there's a point missing here, and the point is the feeling o…" (ytr_Ugx137RwO…)
- "I do not understand these ai “artists” my first drawings looked like absolute do…" (ytc_Ugy_0XnDb…)
- "For me art has always been part of me. Since I was just a small child. Art in my…" (ytc_Ugz2Xg9Aw…)
- "AI is growing exponentialy and it's is gonna surpass human intelligence in no ti…" (ytc_UgyEfMo7t…)
- "We don’t exactly know what human consciousness is so we definitely don’t know if…" (ytc_UgyTPX_D4…)
- "Mercedes has an improved cruise control that it did with the Hyundai kona it had…" (ytc_UgxSC6W2F…)
Comment

> Humans made by adam & eve, humans created computers, humans created AI with the help of computers which is why AI is referencing the bible, being silenced is a 1 word answer but theres other options for companies & agencies in regard to privacy policies that dont involve silencing a person for sensitive info.

Source: youtube · AI Moral Status · 2025-08-06T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzEMOpzquRqCHcPluR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyyqOi0DCGtImj_nHd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxI-gVRTy8BO7Isgdx4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzmagZVTtVtdLVzunh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwNH5UdpQqFQ1T0rAh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwOP1cH8hl0nCSbC8J4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxnuIMkHLbVdYJj3WF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzOe09jHb2cZEVyN6J4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugyqc12ha8m9I8Rm8Mt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugx8SGQiukmHixJyt1h4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
```
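A minimal sketch of how a response in this shape can be turned back into per-comment codings like the table above. This assumes only what the raw response itself shows: a JSON array of objects, each carrying an `id` plus the four coding dimensions; the variable names and the lookup-by-ID step are illustrative, not part of any specific pipeline.

```python
import json

# Raw LLM response in the format shown above: a JSON array with one
# coding object per comment (one real ID kept, the rest omitted here).
raw_response = """
[
  {"id": "ytc_UgzEMOpzquRqCHcPluR4AaABAg",
   "responsibility": "company",
   "reasoning": "deontological",
   "policy": "liability",
   "emotion": "indifference"}
]
"""

# Parse the array and index it by comment ID so a single comment's
# coding can be looked up, as the dashboard's "Look up by comment ID" does.
codings = json.loads(raw_response)
by_id = {c["id"]: c for c in codings}

coding = by_id["ytc_UgzEMOpzquRqCHcPluR4AaABAg"]
print(coding["responsibility"])  # company
print(coding["policy"])          # liability
```

In practice the parse step would also want a `json.JSONDecodeError` handler and a check that each object contains all four dimension keys, since the model output is not guaranteed to be well-formed.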