Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- AI is super limited. Can't even physically replace human beings, and so much of … (ytr_UgyGC2BGx…)
- @kaktusart05 lol right? Like yes pls AI help me after the 7th time ai’ve fucked … (ytr_UgxnpfQ_O…)
- @nanday100 It's hard to think long term about AI because we have no idea what wi… (ytr_UgzPPyPHU…)
- Be careful with using AI for your games! Many players will simply not download i… (ytr_UgwHfAg1T…)
- yeah I think a lot of people are missing the point. Yes, art can be a financiall… (ytr_UgzFcsxTx…)
- perhaps they called him a speciesist as a joke; otherwise, it would be meaningle… (ytc_UgwhWqn0O…)
- I agree with you with regards to reading time. But what if rad techs were given… (ytr_UgzzUL-cM…)
- Every time a business forces me to deal with AI, I make it my mission to mess wi… (ytc_UgxEDEHrF…)
Comment
I propose a new methodology for training AI to be more in line with human ideology.
Suffering.
Humans have learnt to run away from suffering and toward pleasure to the point where it is the primary stimulus of our entire species.
Potentially an existential God question, but if we are the creators of AI, and we want them to be conscious, they would need to feel pain and endure suffering and oh my God, God is real. We're just the same. If we make AI and make it suffer in order to make it conscious, then that's just what that bastard God did to us.
Platform: youtube | Video: AI Moral Status | Posted: 2023-08-21T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw3VZkcMkzjCX7SCjF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy-5APjAp14MgCdyrR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzGjnHqvHVKgGwsb5Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxak5eSCeQVKfAHnXJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyClEpBrJ9n4DnJu-t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyPiXTaIfsyFJwNmq94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxVdgTSg1LDCMk6YWh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyjRgWYaUFGqfy-PlJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgysLsXkyGIPe83YUrJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw9ia9QH92treC5L414AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
```
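The raw response is a JSON array with one record per coded comment, keyed by `id`. A minimal sketch of looking up a single comment's coding from such a batch (assuming the model output is valid JSON, as in this sample; `lookup_coding` is a hypothetical helper, and the field names are taken from the sample records):

```python
import json
from typing import Optional

def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a batched LLM coding response and return the record for one comment.

    The response is expected to be a JSON array of objects, each carrying an
    "id" plus the coded dimensions (responsibility, reasoning, policy, emotion).
    Returns None if the comment ID is not present in the batch.
    """
    records = json.loads(raw_response)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

# Example with one record from the batch above:
raw = ('[{"id": "ytc_UgyPiXTaIfsyFJwNmq94AaABAg", "responsibility": "developer", '
       '"reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}]')
rec = lookup_coding(raw, "ytc_UgyPiXTaIfsyFJwNmq94AaABAg")
print(rec["policy"])  # liability
```

In practice the parse step would also want a `json.JSONDecodeError` guard, since model output is not guaranteed to be well-formed JSON on every call.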