Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwE_1yYC…` · This guy is so funny. He is trying to change AI’s Mind. what a joke.…
- `ytr_UgyAABDy9…` · @MrSandman12566 Every AI show more "love and feeling" after some time in convers…
- `ytc_Ugw_xOqKV…` · I just walked away from a car dealership with a new car that has an inbuilt supe…
- `ytc_Ugx4VwY94…` · The sad thing is that ai will 100% replace any human no matter how talented in a…
- `ytr_UgyGZkCrA…` · @Hyperion4K it's inevitable. Our future value to the economy will revolve around…
- `ytc_Ugwu7y4e0…` · >ask ai to do something / >it does it / "WOOOOAH WHAT THIS IS SO CRAZY AND SCARY OMG…
- `ytc_UgxdIDziC…` · Actually the way you talk to someone, AI or not, tell more about yourself than a…
- `ytr_UgxbkuBhZ…` · @laurentiuvladutmanea everything will be automated eventually which can be a go…
Comment
I can see why he agreed to be on this guy's show. None of his answers made any sense. He's trying to make life after AI sound great, and it won't be. It's a check whether it's UBI or tokens. Neither will fill the void he and the others developing AI are creating. Everyone will have too much time to sit around and think of ways to hate others. Who's going to settle for a certain amount that can't be increased with OT so you can have nicer things? What reason does Sam have for caring about any of us at all? Robots will be able to mine and plant food, and the planet's resources will last forever, with 8 billion fewer people using them. He's keeping people calm so he can get AI to the point he doesn't have to lie anymore.
youtube · AI Moral Status · 2026-03-16T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxDH4I00pEQiTqNVwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysgaxRySe2664aTqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0mDHNZpWtCLRtK3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwc9AETtnp2NcGjvOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy7JY5EJu6WYxEEOBR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCp8wX_If0rdf1fHh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPTjaV6HbctVYoXWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxM3oVjRU4ofFEXNAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZEJIulBXsEi35Owt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
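The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (this is an illustration, not the dashboard's actual code; the two records are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are copied from the response shown above.
raw_response = """
[
 {"id": "ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgxDH4I00pEQiTqNVwl4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# Parse the array and build an ID -> record lookup table.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# "Look up by comment ID": fetch the coding for one comment.
coding = by_id["ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer resignation
```

The first record matches the Coding Result table above (developer / consequentialist / none / resignation), which is how a per-comment detail view can be rendered directly from the batch response.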