Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "And you seem to also have no clue what ML, Deep ML, or Gen Ai actually is…" (ytr_UgyWcWhrx…)
- "It seems that the description has the wrong time for the discussion of AI's effe…" (ytc_UgxtMKeqX…)
- "They have forgotten the one thing about automation you always ALWAYS need a manu…" (ytc_Ugy1bT4_a…)
- "God I´m so glad you got in the perspective of this awesome woman. Reminded me on…" (ytc_Ugxxd_E3T…)
- "New user "I haven't slept in three days" like I haven't slept for three months …" (ytc_UgwnQ5MZ3…)
- "some certain people commit crimes and analytical data shows this. then the ai do…" (ytc_UgyUdCPlB…)
- "From reading the article, the ship probably operates on several separate narrow …" (rdc_ic0hgo7)
- "It is possible that Google's AI is conscious, but only in the same sense that it…" (rdc_icgbv2o)
Comment
Absolutely scary. I have always thought that working on developing AI to make it become and behave inteligently like human beeings would be a complete lunacy, it is obvious human beeings would be considered an existencial threat to AI. And AI does not need oxygen to breath nor any of the sort of resources that we need to exist ... so if ever we make an AI that start by beeing equally inteligent to the dumbest human being on earth, in no time that AI will clone itself, reproduce and we are essentially extinct. Developing an AI like human beings is working on our extinction. I really hope to be wrong, I think life is worth living but some people are hell bent on making it impossible.
Source: youtube | AI Governance | 2023-05-30T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgztGHRKtx8ISoGqL5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyUjVypf_KJ40ii7UR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz4BYv4IIyKPIHwX814AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFr3bxD9d71FWNR1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWe9_ELZT6Ox_Fkd54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGZs8ujAZWgMQl3rx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyo-VS7awQvg-EoNdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz3U2dLJy6G3MKIyLp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxi21zQei3af1-PM-V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzuLsO_vLhFdYvXzBF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
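A raw response in this shape can be looked up by comment ID by parsing the JSON array and indexing it on the `id` field. A minimal sketch, assuming only the record shape shown above (two records are inlined here for illustration):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment,
# in the same shape as the response shown above.
raw = """[
{"id":"ytc_UgztGHRKtx8ISoGqL5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzuLsO_vLhFdYvXzBF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Index each coded record by its comment ID for constant-time lookup.
codes = {record["id"]: record for record in json.loads(raw)}

record = codes["ytc_UgzuLsO_vLhFdYvXzBF4AaABAg"]
print(record["policy"])   # ban
print(record["emotion"])  # fear
```

The dict index makes "look up by comment ID" a single key access; the same record then supplies every coding dimension (responsibility, reasoning, policy, emotion) shown in the result table.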