Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Seriously how many people wish to actually draw and waste thier time reply 0:30 …” (`ytc_Ugx7ajD7r…`)
- “Not yet. >A Waymo spokesperson said **drivers of its self-driving vehicles**…” (`rdc_ebv2lem`)
- “God his constant jumping to people who don't like AI use wanting them dead is so…” (`ytc_Ugxpjp-yL…`)
- “Who said it was 100% automated? In the end you'll run into a human, it's…” (`ytr_Ugyg5pbCq…`)
- “Lets be honest - most of 'modern' artist do same as AI generators do but slower …” (`ytc_Ugyct38bv…`)
- “I'm a software engineer with over 20 years experience. For a year now I rarely …” (`ytc_Ugy9XnyWq…`)
- “I've been reading a book called The Presidents Club about the partnerships and r…” (`rdc_e2wbh0i`)
- “Im getting into programming and im 15 and i really feel like AI will only make p…” (`ytc_UgyNyFFy1…`)
Comment
@TheAIRiskNetwork I understand your hope, but unfortunately, if we're perfectly honest and rational, we won't succeed, even if international agreements were signed to stop ai research (which is already highly unlikely, not to mention the fact that we have very little time left), agi would still be created one day in a clandestine way or something. If we consider the gravity that s-risks could represent, it seems clear that we should try to avoid experimenting with them, by all means.
youtube · AI Governance · 2024-03-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugz54guo-Rrl2UoadRh4AaABAg.AGSsSi9dADPAGSzo8Lh92a","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxKrKiWKHhLp15lLAx4AaABAg.AIuYWSUDskcAKaxaapKkhH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzthgKiYLPxDaEy9l94AaABAg.A0hV4RJ45q6A0kE_aer8jw","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwhplvsOdZVM5U0Uph4AaABAg.A0fB2zPK9MCA0fI1_3nwMx","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0fGy6Ssz-H","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0fdtxp4mAL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0g_C1R0n1e","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwSn2kymZ9bFozQigx4AaABAg.A0eo_YxGiMMA0ffWFc_EOw","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwSn2kymZ9bFozQigx4AaABAg.A0eo_YxGiMMA0zIcVKqBwB","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwb0ySrT5vC1MfaEG94AaABAg.A0eVlcZeKrUA0fUk0AOc1O","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
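The raw response above is a JSON array of per-comment records, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing and validating such a response might look like the following; the allowed value sets are assumptions inferred only from the samples on this page (the real codebook may define additional categories), and `parse_coding_response` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are assumed from
# the sample records shown here; the actual codebook may include more.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "fear", "resignation", "indifference", "outrage"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into a comment-ID -> codes map,
    rejecting any record whose values fall outside the assumed schema."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in SCHEMA}
        for dim, value in codes.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded
```

Keying the result by comment ID makes the "look up by comment ID" view a plain dictionary access; a record with a value outside the schema fails loudly rather than being silently stored.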