Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Deepseek is the number one contender for an agentic model for people who are usi…" (rdc_m9gg0oq)
- "I have a friend who uses AI a lot for her D&D campaign to generate tokens for NP…" (ytc_Ugx-SaLne…)
- "You can add math where the human can't see but the ai can and it f*cks up the ai…" (ytc_Ugw1D2QA1…)
- "People in the comments are delusional and one discussion ahead from being the ne…" (ytc_UgxfTy2y-…)
- "We people are lazy and thats normal. I hate to say it but we already are terribl…" (ytc_UgyiqngKq…)
- "Oh F OFF! Blaming it on AI algorithms as if we didn’t just see what the US and I…" (ytc_UgzzZd91F…)
- "most of the issues of self driving cars are the surrounding bad car drivers anyw…" (ytr_UgxYi8GCw…)
- "Wait I could see AI being a problem as a CEO... we can't eat an AI :(…" (ytc_UgzoG5A8f…)
Comment

> UBI is about creating a "living standard" income similar to minimum wage in a world where automation is making more and more of us obsolete. it does not require increasing the money supply at all. It does not imply phasing out work and the data suggests that standardized income only slightly affects people's willingness to work according to most of our data.
>
> This is an edge-case scenario of a preversion of UBI, not a real critique.

Source: youtube · 2021-01-14T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx4O358ymLHKmZckCV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgySWFm4rrsiZp8xwcR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyh_LW67dhbyH8D8lx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxxks-_QVsluw5TrQp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyNDh2tPB3nGYo_0-p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJBhA4hWGm0i5Jas54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwvq381eKfRbgirbaV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnwziVGerShf7-xsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwPcs85BHNU7x_08yN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyahPlmtnomR02J72d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
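Because the raw response is plain JSON, each batch can be checked before its codings are stored. Below is a minimal validation sketch; the allowed category values are only those observed in the sample output above (the project's full codebook may define more), so treat `SCHEMA` as an assumption rather than the definitive coding scheme.

```python
import json

# Allowed values per dimension, inferred from the sample raw response above.
# Hypothetical: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on a missing 'id' or an unknown category value,
    otherwise returns the parsed records unchanged.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(len(validate_codings(raw)))  # prints 1
```

A record that drifts outside the schema (for example an emotion the codebook does not define) fails loudly here instead of silently polluting the coded dataset.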