# Raw LLM Responses
Inspect the exact model output for any coded comment: look a comment up by its ID, or pick one of the random samples below.
- "After my experience as a human factors engineer with a major aircraft manufactur…" (`ytc_Ugx4gTON1…`)
- "@AccentedCinemaAgreed. As what most people ask on the internet, why do AI keep s…" (`ytr_Ugzahgap7…`)
- "I don't think AI is going to replace all the programmers, BUT AI will replace th…" (`ytc_UgykE0P40…`)
- "If we’re most likely in a “simulation,” haven’t we already lost the AI safety fi…" (`ytc_UgzGQ_dQO…`)
- "What we shouldn't forget is that to function AI first needs to be plugged in!…" (`ytc_Ugzg4w_Lr…`)
- "It sounds like you're picking up on some intriguing dynamics in the conversation…" (`ytr_UgyPkALr6…`)
- "This is so fundamentally wrong. Computer scientists anthropomorphizing inert ele…" (`ytc_Ugy0wQMFh…`)
- "From ChatGPT itself when asked about why any other model would engage in "self-p…" (`ytr_UgxKQzpqK…`)
## Comment

> Stupidity beyond belief. I can see why God is pissed at us. Think Terminator. I watched another one of these the AI said " I will surpass human intelligence in blah blah year" Scientist - I like it, even though it will take my job. " Stupid. The male is being honest.

Source: `youtube` · Video: "AI Moral Status" · Posted: 2021-11-27T05:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
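A coded record like the one above can be sanity-checked before it is stored. The sketch below is an assumed helper, not part of the tool itself; the per-dimension vocabularies are inferred from the raw responses shown on this page and may be incomplete.

```python
# Dimension vocabularies as observed in the raw LLM responses on this page.
# NOTE: inferred from a small sample; the real codebook may allow more values.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "resignation"},
}

def invalid_fields(row: dict) -> list:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [k for k, allowed in ALLOWED.items() if row.get(k) not in allowed]

# The coding result shown in the table above:
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "outrage"}
print(invalid_fields(row))  # → [] (all dimensions valid)
```

A row that fails this check (a missing field, or a value the model invented) can then be flagged for manual re-coding rather than silently stored.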
## Raw LLM Response

```json
[
  {"id": "ytc_UgwSO1dApmf6CEjiV314AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx6XvT2IHYS7zFwVpp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz2nVKSsO5I9hP4NnN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgylOlu0Q-ZebaH_Qwp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyNAksLalgpM0rUBOd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw6vABj1FYMygFueJJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyB74RSSHlge31wkEJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgySw_fFeSQEK7iz7s94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyF6eq_cR_siyurNl54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwF4Ry4dWzvvIsdJ414AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
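The batch response is a plain JSON array, so looking a coding up by comment ID reduces to parsing the array and indexing it. A minimal sketch, assuming the field names shown above (`index_by_id` is a hypothetical helper, not part of the tool):

```python
import json

# Excerpt of a batch response in the format shown above (two rows kept).
raw = '''[
  {"id": "ytc_UgwSO1dApmf6CEjiV314AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgylOlu0Q-ZebaH_Qwp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_json)}

codings = index_by_id(raw)
print(codings["ytc_UgylOlu0Q-ZebaH_Qwp4AaABAg"]["emotion"])  # → outrage
```

Because the model emits the comment ID alongside each coding, the index also makes it easy to detect rows the model dropped or duplicated within a batch.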