Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
So- pandora’s AI? Honestly though, what would it matter if an AI is evil. So are…
ytc_Ugw3mqYO4…
to finish, all three systems monitor all I do there, it is lonely doing this, wh…
ytc_UgwEPdh57…
There needs to be restrictions on it, laws in place to control the development a…
ytc_UgyS_mo4w…
There's no difference between AI and other technology, it's just software. Synth…
ytc_UgwRdU3N-…
Then the only remaining problem will be what to do about the $134 Trillion in de…
ytc_UgxInsozM…
It's hilarious that he thinks artists don't use AI because it's "hard".
Nope, n…
ytc_UgwllYbob…
i personally am currently going to college to studie art, and i am just terrifed…
ytc_Ugz2zz0sA…
lemme just pull up my ai...
dont think I aint gonna f*ck.
I DO THAT WITH EVERY…
ytc_UgwemdDAf…
Comment
At the dawn of In-Vitro Fertilisation (Test-tube babies) and mammalian cell cloning technologies (Dolly the sheep) stringent legal frameworks were rapidly put in place. The same needs to happen here - asap. If we ignore the warnings from Elon Musk, Steve Wozniak and over 1,000 other signatories including leading experts in AI and CEO's of IT & AI companies, we do so at our great peril.
youtube
AI Governance
2023-03-30T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
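The four coded dimensions appear to draw from fixed label sets. A minimal validation sketch, assuming the categories observed on this page are exhaustive (which the page itself does not confirm); `ALLOWED` and `validate_coding` are hypothetical names introduced here for illustration:

```python
# Label sets observed in this view; assumed (not confirmed) to be exhaustive.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record; an empty list means valid."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in allowed set")
    return problems
```

A record matching the table above, e.g. `{"id": "...", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}`, validates cleanly; an unknown label is reported per dimension.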
Raw LLM Response
```json
[
  {"id":"ytc_UgzLH1hkr2L6VqcJmwx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpnMLTfLhMqDTKv_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-G1faHlVI0kwCaEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpqiSJsME8JvIE2E94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJyAxUuLMx9cEqUol4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzF_SQl2udIrOLYC2N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxF9RQGxusNpHwLiCp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwiHI7_NutSiS3Jwz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzPkUH0Gf6lh9SK4Ap4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxptKgs5yZuQPVmi4J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
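The raw response is a JSON array of per-comment codings, so lookup by comment ID reduces to parsing the batch and indexing it. A hedged sketch (field names taken from the response above; the IDs and `lookup` helper here are hypothetical, for illustration only):

```python
import json

# A stand-in raw response shaped like the array above (IDs shortened here).
raw_response = '''
[
  {"id": "ytc_AAA", "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_BBB", "responsibility": "company", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"}
]
'''

# Parse the batch once and index it by comment ID for constant-time lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

def lookup(comment_id: str):
    """Return the coding dict for one comment ID, or None if it was not in the batch."""
    return codings.get(comment_id)

print(lookup("ytc_BBB")["emotion"])  # prints "outrage"
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: each batch is parsed once, and every subsequent inspection is a dictionary hit.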