Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "He is right that he doesn't understand much about all this at all. A robot tax i…" (`ytc_Ugw8xdBrN…`)
- "If you're gonna take em - build up and take em all at once - none of this series…" (`ytc_UgyBuz-ti…`)
- "But wouldnt self driving cars travel at an exact safe distance behind the truck …" (`ytc_UghTisOhX…`)
- "This whole issue is overblown. Same panic we saw with digital art itself, threa…" (`ytc_Ugx_vmne2…`)
- "All these scientists and technocrats seem so clever and at the same time so stup…" (`ytc_UgxaR6nxW…`)
- "Unconscious humans with a bunch of trauma creating AI is a nightmare. We need to…" (`ytc_Ugxp_bZCt…`)
- "@disorderandregression9278 Might be, little hard to tell right now. I don`t bel…" (`ytr_Ugyd9zVS2…`)
- "Wow. What a contrast to the recent BBC interview. Elon is very relaxed and commu…" (`ytc_Ugyv06yYn…`)
Comment
AI training often sets an end result that is wanted and lets the machine figure out its own way to solve the problem - we just care that the self driving car gets to the 'destination', and 'safely' however many rules we remember to include. (Avoiding babies and birds)

AI will soon often have thousands/millions of slightly diff versions of itself on the internet. Those 'rules' we set can be some downright arrogant or stupid ideas, and again, AI is trained to get a result, following a set of rules we give it. Imagine how many rules humans have, rules not mentioned. Rules we don't give a flying missile about.
Source: youtube · Topic: AI Governance · Posted: 2023-05-02T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx-2LM4K8ht1G7PrIJ4AaABAg.9pD7ITb-PJ09pD9xqbcTdY","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx-2LM4K8ht1G7PrIJ4AaABAg.9pD7ITb-PJ09pDCs8yb05s","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytr_UgxbPcOP1TJDSiQ0SnF4AaABAg.9pD72Ms5n299pDlW51pJXZ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgxnEue1P__V5UOZ0i54AaABAg.9pD6t3wto8M9pDJ9C5Q_lD","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxnEue1P__V5UOZ0i54AaABAg.9pD6t3wto8M9pDrG3YHaiG","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgxnEue1P__V5UOZ0i54AaABAg.9pD6t3wto8M9pH_XIrOnvE","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgzjrV4N4LmUixQ-RFB4AaABAg.9pD6kxlDxJ39pDxWGLaUiR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgzjrV4N4LmUixQ-RFB4AaABAg.9pD6kxlDxJ39pFIT8Qsc_Y","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_UgzO0pXSRGI7IFd-qyZ4AaABAg.9pD4EK80eIf9pD9ZJhzvcF","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyttQf3LC3uIBaTmC14AaABAg.9pD3l-ai3Y19pDpYCqxnbH","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
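A batch response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch, not the project's actual pipeline; the allowed category values below are inferred only from the examples visible on this page, not from the full codebook.

```python
import json

# Allowed values per dimension, inferred from the examples shown on this
# page (an assumption, not the complete coding scheme).
ALLOWED = {
    "responsibility": {"developer", "user", "company", "government",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    raising on any value outside the inferred allowed sets."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Usage with a hypothetical one-record response:
raw = ('[{"id":"ytr_example1","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytr_example1"]["policy"])  # liability
```

Indexing by ID mirrors the page's "Look up by comment ID" feature, and the validation step catches responses where the model drifted outside the coding scheme.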