Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I don't understand the necessity of an artificial human. AI is very useful, but …
ytc_UgjOXbY6D…
22:07 This thing still looks like trash. The head is situated oddly and seems a…
ytc_Ugy8sjnoQ…
This is your chance S. A., to write history, on your terms!
If he wants a dik m…
rdc_js1pdpf
In Beverly Hills Ca Middle school!! A student took a yearbook picture from a gir…
ytc_UgxQN3YiM…
AI has such potential to be a tool for artists,( in animation, concept, story bo…
ytc_Ugw25tgCa…
In my opinion the only use of ai art that feels genuine is natural landscapes…
ytc_UgwAga_gt…
Controlling anything sentient will cause the problem people fear most. AI isn't …
ytc_Ugy5fNfMv…
IMHO any critique of ai that doesn’t mention the fourteen people murdered by cha…
ytc_UgxCxCcFK…
Comment
If you fund it by giving non voting shares or every corporations, let's say 15% of the total, you could get something better than what Yang proposed, except it would not be guaranteed, it would depend on how the economy goes.
It would solve all those problems you talk about except the inflation one, but really it's like saying you need poor people to keep prices down, which is a bit ridiculous. But even if it was true, the dividends would go up accordingly anyway. If you have a company that is totally automated, how does it justify itself? it's just a parasite society that makes real people poorer, so people will realize that at some point. You could even double the shares for those places that don't create jobs.
youtube
2024-03-12T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz1gdG3BA5rzWpcA1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwYai7iJvy-xjYe9_t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxtWXriB-362DAVsPx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxq1HvupjTIeuqFuX54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzj_UZj2yotKAdhRPJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzNLU_9hC5QB7IZeO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXxKk6uEwvPW6AMPR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVcMMTUkuY5U0-Tkl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyKplABVh1qQGgFRBN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxPIDHIkzsYtU1ykzd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
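The raw response is a JSON array of per-comment records, one object per coded comment with four dimensions plus an `id`. A minimal sketch of how such a response could be validated and indexed for the by-ID lookup, assuming the dimension values seen in this sample exhaust the codebook (the real codebook may define additional categories):

```python
import json

# Dimension values inferred from the sample response above; the actual
# codebook may allow more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "industry_self", "liability", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "resignation",
                "outrage", "approval"},
}

def index_llm_response(raw: str) -> dict:
    """Parse one raw LLM response and index its records by comment ID.

    Rejects records that are missing a dimension or use a value outside
    the inferred codebook, so malformed model output is caught before it
    is stored as a coding result.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# One record taken verbatim from the response shown above.
raw = ('[{"id":"ytc_UgwYai7iJvy-xjYe9_t4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"indifference"}]')
coded = index_llm_response(raw)
print(coded["ytc_UgwYai7iJvy-xjYe9_t4AaABAg"]["emotion"])  # indifference
```

Indexing by `id` is what makes the "Look up by comment ID" view cheap: each coded comment resolves in one dictionary access rather than a scan of every stored response.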