Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):

- "The best argument is that people pay for data to train ai. By stealing the data …" (ytc_UgyK2uar1…)
- "as a disabled artist seeing ai shartists use us as scapegoats is so shitty all …" (ytc_Ugwb5Nv7T…)
- "Look, LLM’s do not even have words or sentences. They are tokenized b4 they reac…" (ytc_Ugz6qkT3Z…)
- "ATÔMICO it was stated in the video that perhaps the ai would develop it hemselve…" (ytr_Ugz3IL9sn…)
- "That's an interesting observation! Sophia does have a unique look that can remin…" (ytr_Ugz4xSmv9…)
- "In statistics, the term "bias" means that "the model (or statistic) will tend to…" (ytr_Ugw8-jz4m…)
- "According to the article, the AI images were based off a 12 year old version of …" (rdc_lgmsi66)
- "Please do create a CrashCourse on posthumanism and AI and robotics. Please, plea…" (ytc_UgjSmAuFW…)
Comment

> 10:00 heres the really scary problem tho... the fact that AI is already behving this way CONFIRMS THAT ITS ALREADY OUT OF OUR CONTROL 😐 and we do know it, but the assholes making and running the AIs DONT FUCKING CARE

Source: youtube | Video: AI Moral Status | Posted: 2026-01-22T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgyJxTwvrk_nnq0f7Mt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyPsc7-l4MCpx2ymat4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugz7kz8dlw42wbRQ5S14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwGOADygqc8L-qMl7V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMpC-PZBZT5mwpjoB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJ190IKZpLvLWLSWt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzZiuw259EEA7ds75t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNWwI7cOdQ1iHG6id4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1hJ65iKsTegvcJHd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugww5RPRSXwetYx74Kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
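The raw response above is a JSON array with one object per comment in the batch, each carrying an `id` plus the four coded dimensions. A minimal Python sketch for parsing and validating such a batch; the allowed-value sets here are inferred only from the labels visible in these examples, and the real codebook may define additional categories:

```python
import json

# Label sets inferred from the sample responses shown above (assumption:
# the actual codebook may include labels not seen in this batch).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch, shaped like the response above.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"outrage"}]')
print(len(parse_batch(raw)))  # → 1
```

Validating eagerly like this surfaces malformed or off-codebook labels at ingest time, before they reach a coding-result table like the one above.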