Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Is that true? Would make sense. I remember who quickly an AI on Twitter was inte… (ytr_UgwhUBrT4…)
- Company CEOs: "Don't worry, as we implement Ai it will free our employees to be … (ytc_UgxhoR5UH…)
- Right now the tech companies should be paying every user a fee and for time usin… (ytc_Ugw7bgkhT…)
- It seems like you're sharing your thoughts on how humans can get lost in their o… (ytr_UgxJWpDm0…)
- I think in the medium-longer term, AI is going to be extremely disruptive and ch… (ytc_UgyWBUn3k…)
- And there is the crux of the issue. Humans will be replaced on all levels and p… (ytc_Ugw0bnh1Z…)
- the weird thing with ai artists is how desperate they are to not only "make" art… (ytc_Ugzayvuht…)
- Waymo has had 2 crashes in the previous 1 million miles? Tesla cars drive collec… (ytc_UgwoyjD1C…)
Comment

> You should talk to someone about Ai who's not a materialist! Fundamentally different views, I respect a lot of what Geoffrey says but his reductionistic view humanity is a big problem, by saying AI has or possesses the possibility to have consciousness he green lights the obsolesce of a human being. Because why bother with biological creatures when digital entities are way better.

youtube · AI Governance · 2025-06-18T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxxBepumWmq66J82lJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzF4NP4gmbzxdBCxRJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwltPBuOgU3QhEu2_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugynn5YXgxQ5bl9YugZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLI3IeVuAullznqUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjfOTPimryYJIJgZd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIajnVzdKD_5rx4wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqPLQXSwGC-T2JT1V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJeamRsX2B2C5qVH94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzSj1hCEQERqYictR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
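The raw response above is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming standard JSON parsing (the variable names are illustrative, not the tool's actual code; the IDs and field values are taken from the first two entries of the response above):

```python
import json

# First two entries of the raw LLM response shown above (truncated for brevity).
raw = '''[{"id":"ytc_UgxxBepumWmq66J82lJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzF4NP4gmbzxdBCxRJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]'''

# Index the codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment displayed above.
row = codings["ytc_UgzF4NP4gmbzxdBCxRJ4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

The second entry matches the Coding Result table above (responsibility: developer, reasoning: deontological, policy: none, emotion: outrage), which is how the displayed comment's dimensions are resolved from the batch response.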