Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "@SnakelnTheGrass also the issue with automating art is not people losing jobs bu…" (ytr_UgxPUXO4I…)
- "They are just talking possibilities, in a way AI makes the barrier to entry very…" (ytc_Ugzr_ytdy…)
- "The only reason AI clever enough to kill humanity would kill humanity, is that t…" (ytc_UghzZ-K8T…)
- "Yup! That and fish farming are our main sources of income. Both are amongst the …" (rdc_ckqlldk)
- "AI is a technology misused like any other technology that was invented. It is re…" (ytc_UgwTNc53f…)
- "idk i really really hate ai but something abt using the designs that ai made rll…" (ytc_Ugxa7DqsR…)
- "@elishockey SMH you're not understanding my comments. I agree that ai is ruinin…" (ytr_Ugy6pyX4d…)
- "Parents need to take responsibility for their children's actions. Holding ChatGP…" (ytc_Ugzw--WFQ…)
Comment
> We are not headed toward superintelligence. We are BARRELING HEADLONG into it. Musk, Zuckerberg, Altman, and others are investing BILLIONS into strong ai, with no breaks/regulation whatsoever. The promise of a huge payoff is incentivizing humanity's sprint towards extinction.

Source: youtube · AI Moral Status · 2025-10-31T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyZZEpDQ4Fol_rRz3d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnhVMdx4H5KG97R914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgoTu7UFS3CUEDwlF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8w9Zsyzc24y2przp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxtR4Pt8nUMCs_ZJ3x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyC5Gw2e__-OdtBDZF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgydqfQICatDtEr9AZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyrDlVgZczTRreG_al4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXw9i7ZA1Aq7C_Q0F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwmnECZLmYxsytfsqR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
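The raw response above is a JSON array of per-comment codings, which is what makes look-up by comment ID possible. The sketch below shows one way this could be parsed and validated in Python; it is a minimal illustration, assuming the allowed category values are exactly those seen in the codings above (the real codebook may include others), and using two of the rows from the response as sample data.

```python
import json

# Sample rows taken from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgwgoTu7UFS3CUEDwlF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugz8w9Zsyzc24y2przp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

# Assumed codebook: only the values observed in the response above.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Parse the model output and index codings by comment ID,
    rejecting any value outside the expected categories."""
    by_id = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgwgoTu7UFS3CUEDwlF4AaABAg"]["policy"])  # → regulate
```

Indexing by ID rather than keeping the list order mirrors the "Look up by comment ID" affordance above, and failing fast on an unexpected category surfaces malformed model output before it reaches the result table.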