Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- As someone who has written a deep learning library in C++ and CUDA, I want to sa… (ytc_UgxoCtXxV…)
- we all think A.I. is going to kill us but what if it just wants to make us happy… (ytc_UgxJ0j2rc…)
- You call that "hyper-realistic"? It's just *eerie* because it *tries* (in vain) … (ytc_UgzEYwj_B…)
- Nicd. Your essay isn't the worst case scenario tho. The worst case scenario is t… (ytc_UgwcE55bO…)
- No one cares about road safety in USA. If they did, they would make self-driving… (ytc_UgyD9kk2o…)
- > I don't think there will be many human Uber drivers in 5 years. People hav… (rdc_ecyrg3n)
- Anthropic’s goal is to create tools with ai that assists people with their work … (ytr_UgycOOR8X…)
- ai is not going to kill us, bilionaires with their greed will. what do you expec… (ytc_UgzI7_CMW…)
Comment
Ubi doesn't necessarily mean small amount, it could be high, basic income could mean the starting base (which like I said could be high). The Idea is to do ai robotic tax, use that to fund ubi, it will be a lot because these companies will be making gazillions in profits due to almost zero labor costs and low liabilities. Now you can say billionaires won't agree to that tax, it's possible but are they willing to see 90% of population unemployed? I highly doubt, no matter how greedy they are having that number of people unemployed is a big crisis so I think they will agree to that tax. Also if that many people aren't working who will buy anything from those companies?? More reasons for them to comply with the tax, they can't get a better deal superior to that
Source: youtube · AI Jobs · 2025-12-26T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzokjBYPNg8X3exr4V4AaABAg.ARAkUYsbMgyARHG-9R5JHV","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxIBSfSRuVkiauHB-94AaABAg.ARAaqyO87ovARBiSmdxl6M","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxoU8fbwLxRSnEkHYd4AaABAg.ARA_QZClCQ_ARC7DWRpRc4","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwXN9_ESmyjcjuMYRF4AaABAg.ARAV9XBr0evARGMHS6af8E","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwz9XGUqCGE5nXZSMp4AaABAg.ARARZMBD01GARC1QuTxhXL","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_Ugwz9XGUqCGE5nXZSMp4AaABAg.ARARZMBD01GARC6-9tdxqT","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzbrc6JbEf2z3UJ-Oh4AaABAg.ARAJCapgxYLARCnYlMeqP-","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugzbrc6JbEf2z3UJ-Oh4AaABAg.ARAJCapgxYLARHXtXYgSsR","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyNiJZvXPdYN8xnn354AaABAg.ARAFQAjNG2wARAtvIVQ0Xq","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytr_UgxI4TzR0xqjb7gyDJV4AaABAg.ARA3NTlby-9ARC1OZR-eDM","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
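The raw response above is a JSON array of per-comment records, each carrying the comment ID plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID; the IDs and the helper name here are placeholders for illustration, not part of the tool itself:

```python
import json

# Hypothetical raw LLM response: a JSON array of coded-comment records,
# mirroring the structure shown above (IDs shortened for the example).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and index the coded records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)
print(codes["ytr_example1"]["policy"])   # liability
print(codes["ytr_example2"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse of the response, then constant-time dictionary lookups per comment.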