Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugzcb_Ppl…`: "Ryne AI Humanizer is so good that sometimes I forget the text was originally gen…"
- `ytc_Ugwn913Ye…`: "every artist i have talked to, disabled or otherwise, has not liked the AI b.s. …"
- `ytc_Ugx2InJIz…`: "well, cruise missiles already run an AI pilot. this is just one step up from th…"
- `ytc_UgzzKa_ob…`: "To be fair, the biggest problem about this, is that the big company can steal co…"
- `ytr_Ugwb632mo…`: "I'm not looking to argue and you obviously care enough about this that any attem…"
- `ytc_UgyQtLDdZ…`: "I don't think generative AI = art theft is a good argument. Legally it's not, bu…"
- `ytc_UgzHqKLsH…`: "I think there's a place for AI IN art, and I've even heard a success story of so…"
- `ytr_Ugx3lIqTk…`: "Look at it this way, you work hours, days, maybe even months, and you wanna shar…"
Comment (youtube · AI Harm Incident · 2024-09-23T22:3…)

> One of the main issues with UBI as a solution for AI is that the value of wealth is a function of distribution. If everyone in the world had all their money deleted and then made $1000/day, the cost of living would be $1000/day (assuming that average production was at least the cost of living; if the average person produces less economic 'energy' than is needed to house and feed one person, then the population would slowly starve).
> If 90% of the population produces NOTHING and has a UBI, then money becomes worthless.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxwvtf9x_gGF9IOgu14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz8YdXPnBa2f-OwteZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxIqPkQP-0ra0PGLvV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugys1faD_2uQmWoQIhZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzXKTZpipmgk_4MEH94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxcdWyrIm6X2w_nvU14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwlt9vvpGQrAPpgTHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzmibkUYQuFx-c3J8J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgylXOGKacvMtEtEukt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyUf98pD33Ezhdk6B54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
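A raw response like the one above should be validated before it is merged into the coded dataset, since the model can emit values outside the codebook. A minimal sketch: the dimension names come from the Coding Result table, but the allowed value sets below are assumptions inferred only from the responses shown on this page and may be incomplete for the full study.

```python
import json

# Assumed codebook: dimension names are from the Coding Result table;
# the allowed values are inferred from the sample responses above and
# may not cover every code used in the full study.
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing or unknown codes."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={value!r} not in codebook")
    return records
```

A record that passes is returned unchanged; a record with an out-of-codebook value (e.g. a hallucinated `emotion`) raises `ValueError` with the offending comment ID, so the batch can be re-queried rather than silently polluting the dataset.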