Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.

- "Shouldn't we as humans retain veto rights over any decisions or actions taken by…" (ytc_UgweOzvss…)
- "You could use water for hydrogen. And the hydrogen for nuclear fusion. Like peop…" (ytc_UgxIWl9yL…)
- "@beforedrrdpr Yeah, but ponpon was a nobody AI artist I didn't know about but th…" (ytr_UgwNFDsRX…)
- "Can you sexually offend her by doing what you want with that be sexual harassmen…" (ytc_UgwyEAopN…)
- "who watched the movie the lawnmower man it tells the exact same story especially…" (ytc_Ugw7jqbwb…)
- "It ism meaningless, art is a feeling it goes far beyond just drawing or painting…" (ytr_UgysoB-Rx…)
- "FSD isn't FSD. Full (supervised) Self Driving. FSSD. But, it isn't Full or Se…" (ytc_UgyhByynO…)
- "@dappersnakeproductions Sure you dont. However few people have aquired the talen…" (ytr_UgxjUffo8…)
Comment
I'm much more concerned with the people in power who control these AI systems, than the AI systems themselves. I think the most likely medium-term future is just a continuation of what we're already experiencing -- wealth inequality. Rich people will get richer, and the rest of us will live in the squalor until it reaches a breaking point. My "silver lining" hope, if you can call it that, is that in the aftermath of that breaking point we can create a society that actually distributes the benefits of those AI systems to all people instead of a select few megalomaniacs. Even better would be if we could create that society now, but with the cartoonishly evil people in power it's hard to feel hopeful about that.
Source: youtube · AI Moral Status · 2026-03-02T17:5… · ♥ 26
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz86s2QFPS-hKYIJjV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyynuw930sIpEvB8c94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyoUlSbaAt-W9OIyhp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyiFYVU0bGYFXPyrgB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyQGW8VNDrxXy1OnG94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyIrTnRiR256mBIfhV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxucZMERxkle9Caal94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzMWWCYvt50UGk_oER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwIslLOYeVfkJw7Zsl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdW6wFrGoEbleaLDJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
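The lookup this page performs can be sketched as follows: the raw response is a JSON array of per-comment codings, so indexing it by the `id` field gives constant-time lookup for any comment. This is an illustrative sketch, not the tool's actual implementation; the function name and the two sample rows (taken from the array above) are only for demonstration.

```python
import json

# Two rows excerpted from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgyiFYVU0bGYFXPyrgB4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzMWWCYvt50UGk_oER4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant id field."""
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgyiFYVU0bGYFXPyrgB4AaABAg"]["emotion"])  # → fear
```

In practice one would also want to validate that each row carries exactly the four expected dimensions before indexing, since LLM output can drift from the requested schema.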