Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below:
- "Ai however is not a human. And millions of artworks were stolen without the cons…" (ytr_UgyYIwwyx…)
- "Goldman Sachs is now predicting 300 million jobs will be lost to AI. Anyone ca…" (ytc_UgyNQruvl…)
- "At 1:54, could you please provide clarification? Did the user have Autopilot, En…" (ytc_Ugyke8Ls5…)
- "seeing the comments deepfake now really has less impact. But i have a feeling th…" (ytc_UgyVmGNBy…)
- "for me its affirmative po coz AI helped me a lot, good job students 🎉…" (ytc_UgxKHERQB…)
- "i felt the existential dread when AI art first came out. then i tried out Stable…" (ytc_UgzonuGFL…)
- "Thats just using AI for support work. It's not AI art. You aren't typing in "wri…" (ytr_UgwDiNVZc…)
- "The problem is that LLMs don't know that they don't know, they don't know that t…" (rdc_mk07ubw)
Comment

> pretty soon ai is gonna have their own HumanGPT that asks us emotional or moral questions, and how we feel to have been exterminated and our brains saved for questioning.

Source: youtube · AI Moral Status · 2023-07-09T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgydJbJKo8ufkdRqvHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDP2WR_DVLf3zx1Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQi0_XwyQtqgtTHRF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxyj102BVmQ568kHpp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxPXDEYCmsmHFzN2ch4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzwLN2NWGJK0A6-_8l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4eLPd6-BNvp0rolJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz_SNDRUYog9B8RTtd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBP-dKR1Wgs1gR5hl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVG39wDqRguoDnW_14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
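A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the dimension vocabularies implied by the records shown here (the real codebook may define more categories); `parse_coding_response` and `DIMENSIONS` are hypothetical names, not part of the tool.

```python
import json

# Allowed values per coding dimension — assumed from the sample records above;
# the actual codebook may include additional categories.
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(rec)
    return valid

raw = '''[
  {"id": "ytc_UgydJbJKo8ufkdRqvHt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "bad_record", "responsibility": "aliens",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

coded = parse_coding_response(raw)
print(len(coded))  # 1 — the record with an unknown responsibility value is dropped
```

Validating against a closed vocabulary like this catches the most common failure mode of LLM coders: a syntactically valid JSON record that invents a category outside the codebook.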