Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I sure do love natural selection! With this, all the dumb CEO that are misinform…" (ytc_UgxcUcWRs…)
- "Maybe the right answer is to let people post content, but just curate the recomm…" (rdc_e7ixpzk)
- "I was just about to call you out for AI slop until the last line.…" (rdc_oi08e0n)
- "@i@inohopel it depends on the idea. Sure other AI could implement it, but it doe…" (ytr_UgxqYSF3p…)
- "I too prefer AI that hides in bushes and attacks in the middle of the night!…" (rdc_dwv659a)
- "Basically, we need a civilisation collapse (grilled grid) before super AI sees t…" (ytc_Ugy4h4BJR…)
- "AI is very dangerous....computers have already ruined humanity AI is the final n…" (ytc_UgxiL9CiY…)
- "Well, this robot need electricity energy. If someone took down the electricity p…" (ytc_Ugz7DVPvy…)
Comment
The idea is that something like that could be a solution against competing human tribes (and as a consequence possibly also future competing AI tribes) which fuels the dangerous unchecked AI race. So that humanity can be united in developing safe AI. Barreling ahead towards possible extinction is his worry, not that he wants a human world dictatorship.
Platform: youtube
Topic: AI Governance
2025-07-02T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyCLBEJAZHeRl4oaYV4AaABAg.AK-wckndrYCAK3sdOPNRza","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzMwA-2As4laPuRoVp4AaABAg.AJz3mGo7qjuAK598EIrC3o","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyENOH579RxGauK-Fd4AaABAg.AJxQi3GpjsJAJyNJyA82pX","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgxGcR7_R7Qvr8MeJN54AaABAg.AJxDAzLcnKoAK53ZbbSD8N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzQ4G3PwmHBfIFbOoV4AaABAg.AJwsdfgS75tAJwtX_TBG5C","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwRzp1RMgsGaXXIHYR4AaABAg.AJwRrAGjn4UAK4EOuziUyG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxZT6LuDvkTW0mRrNt4AaABAg.AJwL-E2cvILAK4FBe6qO1x","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytr_Ugz8kXJ54R54LGC4pDl4AaABAg.AJw8sFdpsn3AK4L8pyijZP","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytr_UgxuKqS5SR4qTNHORi94AaABAg.AJuhcfLiaW9AJv0uOzmZHv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyCLi-a9qrVIhXsxQB4AaABAg.AJu8hia4-F-AJv5cYcGsPA","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
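The raw response is a JSON array with one object per coded comment, each carrying the four dimensions from the table above. A minimal sketch of how such a response could be parsed and validated is shown below; the allowed value sets are an assumption inferred from the values visible in this page, not an official codebook, and `parse_codes` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values observed in the raw responses above; the real codebook may
# include additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows:
    a dict with an 'id' and an allowed value for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop malformed entries instead of failing
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytr_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
print(parse_codes(raw))  # the one valid row survives
```

Dropping malformed rows rather than raising keeps one bad object in a batch from discarding the other codes in the same response.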