Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- ytc_UgwJPYbjU…: These so-called "AI ethicists" are just attention whores. This guy clearly doesn…
- ytc_Ugz9dpeVd…: I'm fine with AI taking away jobs. Just means these companies will have to accep…
- ytr_Ugwhvzjzp…: That's a fascinating connection! Wisdom has always been a profound topic, and it…
- ytc_UgzsaDO-J…: I think AI has already taken over most of everything, and the big corporations a…
- ytr_Ugx9PfdvB…: Ah, but you're missing the part where AI companies counted for the majority of t…
- ytc_UgwaHPao5…: Noem is outright lying. AI GOOGLE: [In September 2025, the U.S. Supreme Court is…
- ytc_Ugx3TWc8M…: *PLOT TWIST:* the interview guy himself was a robot and is making plans with Sop…
- ytr_UgyufjiCX…: @justasklimas9572 I mean tied to the a biological core function and feedback. N…
Comment
I think they thought it out or planned it out very intelligently, but without any empathy or sympathy for the humanities futures, I think that they only themselves want to only benefit and live in their utopic world (utopic paradise) but without the rest of the world's humanity because we aren't part of their utopic AI group of tecchies.
So what wasn't achieved previously successfully in the past can we say or include WW1, WW2 and all the etceteras in humanity history.
They will finally accomplish it, or AI will accomplish it for them, but at the end, when Superintelligent AI will take over or rule they will themselves (the ONLY humanity's survivors left) will become only a hindrance in the AI evolution, so will then AI have any empathy for their survival.
youtube · AI Jobs · 2025-11-27T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
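Each coded comment carries one categorical value per dimension. The full codebook is not shown on this page, so the allowed sets in the sketch below are only the values that actually appear in this batch's raw response (an assumption, not the complete scheme); a minimal validation pass in Python might look like:

```python
# Values observed in this batch only; the real codebook may define more
# (assumption -- the dashboard does not show the full value sets).
ALLOWED = {
    "responsibility": {"developer", "company", "none", "distributed", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the allowed set.

    A missing key counts as invalid, since every dimension must be coded.
    """
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The row from the table above passes cleanly:
row = {"responsibility": "developer", "reasoning": "virtue",
       "policy": "regulate", "emotion": "outrage"}
print(invalid_dimensions(row))  # []
```

A check like this is useful because batch LLM outputs occasionally drift outside the requested label set, and catching that before the codes enter analysis is cheaper than auditing afterward.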
Raw LLM Response
```json
[
  {"id":"ytc_UgzUCRKakdFT0MI28Mx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxRkrqSE6FABuNR_dJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy7oMxXE7D8G59lggR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwKr16dxxGrJ6OuhNB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyfcxEY7w6FI_9snLt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy6i7ocRioX6ft4DEZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxL-vMgd6B6ddZSOQh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHPnjngeI0_UhdUtJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_XJd2S08lc_moSnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz-Et3ontBfsPn_Anp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
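Looking a comment up by its ID amounts to parsing this JSON array and indexing the records by their `id` field. A minimal sketch, using two records copied from the batch above (the function name `index_by_comment_id` is illustrative, not part of the pipeline):

```python
import json

# Two records taken verbatim from the raw response above, abbreviated to
# keep the example short.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzUCRKakdFT0MI28Mx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyfcxEY7w6FI_9snLt4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and map each comment ID to its code record."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(RAW_RESPONSE)
print(codes["ytc_UgyfcxEY7w6FI_9snLt4AaABAg"]["emotion"])  # outrage
```

In practice the raw string may also need a guard for malformed model output (e.g. a `json.JSONDecodeError` handler), since the response is generated text rather than a guaranteed-valid payload.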