Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "you can tell he doesnt understand it he can only describe adjectives of it but n…" (ytc_UgySdJbPd…)
- "me and you know very well if ai companies made their own art pieces to train the…" (ytr_UgwK7-t5o…)
- "You mean how ChatGPT is extremely left Meyer because I’m biased programming and …" (ytc_Ugwwk3XWW…)
- "if an AI actually advanced to yhe point where it saw a need to program human lik…" (ytc_UgwpCqPJK…)
- "If he doesnt stop them they will carry on about ending the human race. There is …" (ytr_UgwznJetO…)
- "Oh yeah! When AI was blowing up, people were saying how in a few years you can j…" (ytr_UgwDXGuw_…)
- "I think the rapid job loss that we're seeing in 2025 that is attributed to AI is…" (ytr_UgyXrr6Ft…)
- "America is becoming a joke, im sorry i like america. But this sh*t isnt even sat…" (ytc_UgwcDm_9h…)
Comment

> What if a day comes when we create AI better than us and who can create Better AI in every field required on earth. That day the question will be asked to humans "What is your purpose of life " and will have no purpose or Meaning if the take on the world

Platform: youtube · Video: AI Moral Status · Posted: 2019-05-23T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzaERbUY6aNv0bDWn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKYdbdZLIQtogqrGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzScYccHO5Bt5h9B714AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyesJQ9EnB4XZnPZyF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw6S4JWM68GaEcAKa94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwsK3U1Js6lsqygvYZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNmFP6KjDf5Rwnv6Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwCFn4HpjcCAWJbW214AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyc49PobndzcEhmq7t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPhNMxQ5gyc3xTg8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
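The raw response is a JSON array with one coding object per comment, keyed by comment ID and carrying the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for ID lookup, assuming standard `json` parsing; the variable names and the two-row sample are illustrative, not part of the tool:

```python
import json

# Two sample rows copied from the raw response above; the real batch has ten.
raw_response = '''
[
  {"id": "ytc_UgyKYdbdZLIQtogqrGR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwsK3U1Js6lsqygvYZ4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
'''

# Parse the model output into a list of per-comment coding dicts.
rows = json.loads(raw_response)

# Index the codings by comment ID so a single comment's result can be fetched.
by_id = {row["id"]: row for row in rows}

coding = by_id["ytc_UgyKYdbdZLIQtogqrGR4AaABAg"]
print(coding["emotion"])  # fear
```

This matches the coding-result view above: fetching one ID yields the dimension/value pairs rendered in the table.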