Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Tbf nvidia will be fine. They make fantastic stuff. They can probably survive th…
rdc_nk6s6rc
Kids are already struggling to read and they are becoming more reliant on AI. I …
ytc_UgxdoPg6P…
AI is not Conscious YET! I am sure Sir Penrose is a very intelligent man but I d…
ytc_Ugy4JkvTX…
After you automate everything and there are no jobs left, who is going to buy yo…
ytc_Ugyw6_CGh…
Good thing we have expert like this alberta - who has no idea about how AI gener…
ytc_UgwVnHM1D…
They don't mention a fundamental result of advanced AI and AGI compared to the I…
ytc_Ugxal6WWe…
#tl;dr The article describes a project completed by a redditor using GPT-4 to g…
rdc_jdjzkir
Trump and Israel is about to kick the butt of those religious fanatics… what wou…
ytc_Ugy3lGtWP…
Comment
I'd assume that an ai which is able to change its own code would activate the reward function as often as possible, or set its goals so that they're always fulfilled. Basically a pleasure cube for ai.
However if the ai makes predictions about the future, it will choose the future where it "lives" as long as possible to experience as much of the reward as possible. That's obviously a problem, so we might want to make sure that diminishing rewards for the reward function is something that the ai physically can't change. This would essentially put a limit on the ai's lifespan.
youtube
AI Moral Status
2023-08-20T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugx4HDA9pXf6JRz7p7J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytyZlD5jKpp6c5XDF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgynOJpZQmrWTKFtXoB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNBjv1fDiCo82aKTN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyRoQkTaQHo8dYV3V14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzY1XY5YH3Ps9yEObV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyU7XdNRaDxc6i3ws94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgztMdsejCpsvo7_QhJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw3BWppzV0jv7LWlCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_xIIxsEvXsZCKJU54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"}]
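Since LLM output can be malformed or drift from the expected label set, a response like this should be validated before its values are written into the coding table. A minimal sketch of such a check — the per-dimension vocabularies below are inferred from the labels visible in this response and are an assumption, not the project's actual codebook:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# labels seen in the raw response above; the real codebook may define more.
DIMENSIONS = {
    "responsibility": {"none", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"unclear", "liability", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and validate each coded record."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad value for {dim}: {rec.get(dim)!r}")
    return records

# Example: one well-formed record passes validation.
raw = ('[{"id":"ytc_Ugx4HDA9pXf6JRz7p7J4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
print(len(parse_coding_response(raw)))  # 1
```

Rejecting out-of-vocabulary labels at parse time would explain the "unclear" defaults in the coding table: a record that fails validation can be set to unclear rather than stored with an invented category.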