Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Man ai has come so far in 3 years these old images look awful 😂…" (ytc_UgxtjtSuj…)
- "this guys was found dead in his apartment a few days after this video. AI Bots a…" (ytc_UgymusXV-…)
- "Nobody can take a joke for what it is intended. Modern humans are magnifying the…" (ytc_Ugw9EDQ1a…)
- "Ai art has been terrible towards schools, people, relationships, and even creati…" (ytc_UgxJQ11Nh…)
- "Absolute hero. *Begins to explain why you want to poison AI* "You had me at poi…" (ytc_Ugz1Vy7Ai…)
- "Holy fucking shit, the watch dogs series was RIGHT. Soon they're going to be usi…" (ytc_UgwrHiTua…)
- "Is there a term for having a crush on a robot? Besides derogaroty ones. My wife …" (ytc_UgxAnYG84…)
- "I'll believe it when Elon uses AI to finally deliver "Full Self Driving" which h…" (ytc_Ugy_QLT5T…)
Comment
Author: @google-google2357
Let's say you make Ghandi AI it's totally aligned AND there is nothing you can do to make it unaligned. What can Ghandi do to prevent us from killing ourselves with the next AI assuming he is a total pacifist? Like the only thing I see is Ghandi AI convinces us to stop and focus on our own safety?
This is not a problem for us to speculate on, it's a problem for us to solve, we get one chance to do it right. Just like we had one chance to set the rules of the game to prevent oil companies from causing climate change. Loved Max's example of the drug industry.
If it involves all of us, which it does we should do it safely, you don't move fast and break things in this context.
Source: youtube
Category: AI Governance
Posted: 2023-06-26T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugz8xg_TAUp50sGdgEh4AaABAg.9rPvpEz94vU9rTx70S0Rsz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rTlC_DGHlx","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rU14m4eHLF","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rhCV0ZL18D","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rPp6KkFuGT","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rU0BErm0H6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwMSBDoNzy8g3RLmlt4AaABAg.9rPH0awsbg09rj8XXtugv2","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwp8jS3Ka-LbhS0UCx4AaABAg.9rPEb_4SgMm9rSm6Y2E2Km","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxB7Y9xAPQXXJSV6m94AaABAg.9rPDxNI2VJc9rQ-8TiYDMl","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwkyKlTs7O7KBb2pCV4AaABAg.9rP5RLTN4nr9rR7sXcgyOH","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
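The "look up by comment ID" view above can be sketched in a few lines: parse the model's raw JSON response and build an index from comment ID to the four coding dimensions. This is a minimal sketch, not the tool's actual implementation; the two records are copied verbatim from the raw response above, and the `index_codes` helper name is my own.

```python
import json

# Two records copied from the raw LLM response above (full comment IDs).
raw_response = """[
{"id":"ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rPp6KkFuGT","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwkyKlTs7O7KBb2pCV4AaABAg.9rP5RLTN4nr9rR7sXcgyOH","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]"""

# The four dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Map comment ID -> {dimension: value}, skipping malformed records."""
    index = {}
    for rec in json.loads(raw):
        if "id" in rec and all(d in rec for d in DIMENSIONS):
            index[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return index

codes = index_codes(raw_response)
print(codes["ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rPp6KkFuGT"]["policy"])
# regulate
```

Indexing once and looking up by ID is what makes "inspect the exact model output for any coded comment" cheap: each lookup is a dictionary access rather than a re-parse of the raw response.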