Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "I have no need for AI , I use my wife's intelligence, Don't believe me she's tha…" (ytc_UgzB2LjYi…)
- "Are you dumb? Ai is still learning this type of trades have recently emerged mor…" (ytc_Ugzkc3G-T…)
- "@caveteethjoe It's okay to not understand something. I think you should learn mo…" (ytr_Ugw47Ee_9…)
- "Have you watch the movie, mother everyone dead but what she thinks is right a in…" (ytc_Ugyiwvhgk…)
- "@Gman052488 there's a reason the most fervent defenders of generative ai are fa…" (ytr_Ugw59dTtw…)
- "This is only going to benefit white children of the Department of Education is p…" (ytc_Ugznh_Q2h…)
- "im soo pissed with people who ussed the abalism argument i tink its more abalist…" (ytr_UgySS4FvO…)
- "We keep scaling AI bigger, connecting it to robots, giving it agency. The odds o…" (ytc_UgwiXXB3N…)
Comment
@anothenymously7054 LLM's alone arguably don't have drives, correct, but people will inevitably try to make agentic AI's as they are much more useful (see gwern's "Why Tool AIs Want to Be Agent AIs" for example) and are in the early stages of doing so already (see autoGPT which is bad right now, but it's just the beginning of trying things like that). Then you plug a goal to an agentic AI and it has all other sorts of sub goals like gaining more power, because having more power is extremely useful for achieving it's primary goal (see Instrumental Convergence). Then if it realizes that humans are on its way of achieving its goal, it can get rid of them, assuming it is smart enough.
youtube · AI Governance · 2023-05-17T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
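One way to make the four coding dimensions concrete is a small record type checked against the value sets that appear in this sample. The sketch below is illustrative only: the class and variable names (`CodedComment`, `ALLOWED`) are hypothetical, and the full coding scheme may permit values beyond the ones observed here.

```python
from dataclasses import dataclass

# Allowed values observed in this sample; the real scheme may include more.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def invalid_dimensions(self) -> list[str]:
        """Return the dimensions whose value falls outside the allowed set."""
        return [dim for dim, allowed in ALLOWED.items()
                if getattr(self, dim) not in allowed]

# The record shown in the table above (ID is a placeholder).
record = CodedComment(
    id="ytc_example",
    responsibility="developer",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
)
assert record.invalid_dimensions() == []
```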
Raw LLM Response
[
{"id":"ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pmrrSg4AIY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pnKqP0TIJu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwZTX4UNZqhDjzEymp4AaABAg.9pm_zjDOTGY9pmiiz9wGlV","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw3Zv8zbDLEsJdCAvN4AaABAg.9pmZeZSGZ0l9pmxqrR7XMf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw3Zv8zbDLEsJdCAvN4AaABAg.9pmZeZSGZ0l9pmylJIahoC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw3Zv8zbDLEsJdCAvN4AaABAg.9pmZeZSGZ0l9pnu4EERVR9","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgznbO0YTrH_H4QUN3x4AaABAg.9pmZQkEF9jZ9pofc8wU8AU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgyGcinTN-Sz2n3s5N54AaABAg.9pmT46YnSK99pmYgdhbAR-","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgyGcinTN-Sz2n3s5N54AaABAg.9pmT46YnSK99pyq03jpcuq","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugyh7XUNpVaZeliVEHN4AaABAg.AOBJSe4QZJyAS-vLHmUDNF","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"}
]
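Because the raw response is a plain JSON array with one object per coded comment, connecting it to the lookup-by-ID view above is a matter of parsing and indexing. A minimal sketch, assuming the response text is available as a string; variable names are illustrative:

```python
import json

# In practice this would be the full array shown above.
raw_response = """[
  {"id": "ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pmrrSg4AIY",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"}
]"""

rows = json.loads(raw_response)

# Index by comment ID so any coded comment can be retrieved directly.
by_id = {row["id"]: row for row in rows}

coded = by_id["ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pmrrSg4AIY"]
print(coded["policy"], coded["emotion"])  # -> ban fear
```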