Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
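The lookup is just a match of a coded comment's ID against the stored model output. Below is a minimal sketch of that flow in Python, assuming the raw batch responses are kept as one JSON array per line in a local `raw_llm_responses.jsonl` file; the file name and record layout are assumptions, not the tool's actual storage format.

```python
# Minimal lookup-by-comment-ID sketch. Assumes each line of the file holds one
# raw LLM batch response (a JSON array of coded records); this storage layout
# is an assumption, not the tool's documented format.
import json

def find_raw_coding(comment_id: str, path: str = "raw_llm_responses.jsonl") -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for record in json.loads(line):  # one batch = one JSON array
                if record.get("id") == comment_id:
                    return record
    return None

# Example: look up the comment inspected further down this page.
print(find_raw_coding("ytc_UgyKplABVh1qQGgFRBN4AaABAg"))
```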
Random samples — click to inspect

- Wouldn't you think pushing this AI is just going to make humanity even stuipeder… (ytc_Ugw9p7WG6…)
- My thing with that is that ai gives untalented people the ability to produce stu… (ytr_UgwhaBzJW…)
- Intuitively it's quite clear the AI inteliģence is not human intelligence. The q… (ytr_Ugzs4aCZm…)
- Any argument about how we should "regulate" AI has but one purpose. They have no… (ytc_UgzsFHCCF…)
- AI will never be better ngl, it lacks soul and personality like all the people d… (ytc_Ugwag4SyH…)
- When computer ask if I’m a robot: I know these computers are not asking if we’re… (ytc_UgxE1rbX1…)
- I thought that was just a typical plastic woman and not a robot untill i read th… (ytc_UgwdkUImq…)
- @deeplearningpartnership her irrelevant politics rant aside, the points she rais… (ytr_UgxzfxMwL…)
Comment
The premise of the argument is all wrong here. You don’t increase taxes on people. You make companies who eliminate jobs because of automation pay into a fund.
How much should they pay? A percentage of the replaced salary up to 100% (exact number TBD). The company will still generate more money than this through AI, and the UBI will be funded.
This would help encourage some companies to not eliminate jobs, easing the transition. Companies could opt to keep their workforce and augment it with AI, and not have to pay the fund. In turn the UBI would only pay out what the fund makes, so if there’s a shortfall because jobs weren’t eliminated, then great. It’s a win win.
Platform: youtube
Posted: 2024-02-24T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
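In code, a coded result like the one above maps naturally onto a small record type. The sketch below is only an assumed schema built from the dimension names in the table; the `CodingResult` class and its field names are hypothetical, not taken from the project.

```python
# Assumed record type mirroring the four coding dimensions shown in the table.
# Field names and example values come from the table above; the class itself
# is hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str   # e.g. "company", "distributed", "none"
    reasoning: str        # e.g. "deontological", "consequentialist"
    policy: str           # e.g. "liability", "regulate", "industry_self", "none"
    emotion: str          # e.g. "approval", "fear", "resignation", "mixed"
    coded_at: datetime

result = CodingResult(
    comment_id="ytc_UgyKplABVh1qQGgFRBN4AaABAg",
    responsibility="company",
    reasoning="deontological",
    policy="liability",
    emotion="approval",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```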
Raw LLM Response
[
{"id":"ytc_Ugz1gdG3BA5rzWpcA1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwYai7iJvy-xjYe9_t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxtWXriB-362DAVsPx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxq1HvupjTIeuqFuX54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj_UZj2yotKAdhRPJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzNLU_9hC5QB7IZeO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXxKk6uEwvPW6AMPR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVcMMTUkuY5U0-Tkl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyKplABVh1qQGgFRBN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxPIDHIkzsYtU1ykzd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
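Since each model call returns one JSON array covering a batch of comments, a quick sanity check is to parse the array and confirm every record carries the expected fields. The sketch below does that; the key set is inferred from the response shown above and is an assumption, not an official codebook.

```python
# Parse a raw batch response and verify each record has the expected fields.
# EXPECTED_KEYS is inferred from the response shown above; it is an assumption
# about the codebook, not a documented schema.
import json

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate_batch(raw_text: str) -> list[dict]:
    records = json.loads(raw_text)
    for i, record in enumerate(records):
        missing = EXPECTED_KEYS - record.keys()
        if missing:
            raise ValueError(f"record {i} is missing fields: {sorted(missing)}")
    return records

# Example: index a validated batch by comment ID for quick lookup.
# batch = validate_batch(raw_response_text)
# by_id = {r["id"]: r for r in batch}
```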