Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytr_Ugw5Z2i3d… : "Yeah sure and Tesla stole the beeping sound coming from the cameras pissing them…"
- ytc_Ugxq-WDHM… : "I tried pressing AI like this but failed, Alex did an amazing job. The 'lag' and…"
- ytr_UgwJDfez5… : "exactly! and quite honestly, the customer service humans i generally deal with w…"
- ytc_UgxjDwwpL… : "Can you send me topics that I will follow with your teaching lead me to down loa…"
- ytc_UgyFUv6lE… : "Listen. There is one type of AI that is good. It can create magical places, draw…"
- ytc_UgzXhVu1D… : "Do you see how the data centers are being built in the U.S? It's already being d…"
- ytc_UgznRjOOi… : "Whiel undoubtedly AI will dominate one day, the scale of its capability today is…"
- ytc_UgxSJW10a… : "Just today I got my second false parking fee within 4 days thanks to a shitty al…"
Comment
What a great conversation.
I was listening to a podcast yesterday about Universal Basic Income.
The gist of it was that it would be as low as the people/governments/companies in charge could make it.
Enough to live on (kinda) and just enough to avoid revolution and keep those in power, in power.
They talked about what Sam Altman said about UBI and its relation to A.I. taking jobs.
Yes, the jobs will disappear, but the huge increase in efficiency and productivity would mean that the things you need to live would be much cheaper.
He said "We'll make sure that you can afford to live".
Someone said that sounds more like a threat than a promise.
If they can make sure you can live, if they feel like it, they can also make sure you don't.
Source: youtube · AI Governance · 2025-06-27T13:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxL-mhhqf9gevy1GcR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIBT6YSj4r3gh9aBt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9rIy6Owjqa_GIVLN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzkKKJ7I0wPAOsyu4h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw3H496ImRpYQ7H8MJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgzYGXKg3Tj-lbsVu-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5PXGuu2e163spRBp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzt8EpWihiOvgckQvx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwC0y5W3g9IDPHpgOZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyadFtvDGk3mXt3Rnt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
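The raw response above is a JSON array of per-comment codings, one object per comment ID with the four dimensions shown in the Coding Result table. A minimal sketch of the "look up by comment ID" step, assuming the model output parses as valid JSON (the helper name `lookup_coding` and the single-element `raw_response` string are illustrative, not the app's actual code):

```python
import json

# One coding object in the same shape as the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgyadFtvDGk3mXt3Rnt4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "indifference"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the coding dict for one comment ID."""
    codings = json.loads(raw)
    by_id = {c["id"]: c for c in codings}
    return by_id.get(comment_id)  # None if the model skipped this comment

coding = lookup_coding(raw_response, "ytc_UgyadFtvDGk3mXt3Rnt4AaABAg")
print(coding["emotion"])  # prints "indifference"
```

Indexing the array into a dict keyed by `id` makes repeated look-ups O(1) and makes it easy to detect comments the model dropped from its batch (a `None` return).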