Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
does the godfather of AI realize that UBI isn't possible? There isn't enough money, AGI doesn't have enough electrical power supply in the grid. Even if it did, one natural disaster could knock out the data center or power generation powering it (you're going to need 10 nuclear reactor facilities minimal)...ending it all. That's not counting a potential terrorist attack by people have had their culture and way of life displaced by AI. Either way, if you took the entire GDP of America, and divide it over 300 million Americans you get about $90k per year per person. he problem here is that AI can't really do EVERYTHING. Even when it can, there's the electrical power supply problem. Then on top of that you have ... well if products people would normally buy or work to make is what provides value to a currency... if AI is doing all the work... and robots can work 24/7... your workers (the AI and robots) have no real monetary value. The value of currency becomes zero.
youtube AI Jobs 2025-09-09T02:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzOku4LwvWjC4vnatp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBIn594z-Vtx36nkB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYna2wvWvqihMi-bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_Ugzg-SqwFnU-6VaDKbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw2ak0wbgr12CLDLtV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgycMqPA4zZdqlv8xJp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxtXbYlvLjgTEweftd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzVKR5Yw_m_fpJaRQh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgyIXUkH9b7CqF75tJB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxF7CPeDy6TGuvVWah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
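To inspect a single comment's coded dimensions, the raw batch response can be parsed and indexed by comment id. This is a minimal sketch, not the pipeline's actual code: the variable names are illustrative, and the `raw` string below is a two-record excerpt of the response shown above (the first record matches the Coding Result for the displayed comment).

```python
import json

# Excerpt of the raw LLM response above (two of the ten records, verbatim).
raw = '''[
 {"id":"ytc_UgzOku4LwvWjC4vnatp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugw2ak0wbgr12CLDLtV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]'''

# Build an id -> record lookup so any coded comment can be inspected directly.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Look up the comment displayed on this page.
coded = records["ytc_UgzOku4LwvWjC4vnatp4AaABAg"]
print(coded["reasoning"], coded["emotion"])  # consequentialist fear
```

The lookup reproduces the Coding Result table: responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `fear`.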