Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- 1 step away from global domination. They just need to give an AI intention. N… — `ytc_UgyaTyuaI…`
- Amodei had no problem providing AI surveillance on Americans and killing Russian… — `ytr_Ugz4LyfiJ…`
- "AI can only know what exists on the Internet." Triggered by this, am I. 1. It … — `ytc_UgwllvQ1G…`
- The "Three Laws of Robotics" by Issac Asimov are reasonable and sensible guides … — `ytc_UgwHZbosk…`
- It can be helpful, but not in the "do my work for me" way, but in the "automatin… — `rdc_mle5pcu`
- Yes, this feels now with AI just like how the internet was in the mid/late 90s: … — `rdc_nk6kbvv`
- I am going to say D. Looking and listening to the video AI doesn’t develop emoti… — `ytc_UgwYfeXuD…`
- @mr.intenz2636 fear is instict, but for survival So if we really want to surviv… — `ytr_UgzYN7sZn…`
Comment
58:34 “Assuming AI has internal state experiencing…”
That is a HUGE assumption where his argument falls apart!
Would have like to see you question him more on that, but it seemed like you (like me) were confused by his skipping thru this point as if it’s a side issue.
Essentially AI cannot currently create CONSCIOUSNESS. That always has bern and is still the area of mystery. Just because AI is close to creating simulations that we conscious human being can go into is not at all even close to creating CONSCIOUS BEINGS inside the simulation.
That is where the discussion needs to be my friend.
youtube · AI Governance · 2026-02-20T07:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwM__XhKTJPxkW5GRV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz4_RAMS-lwRnzLgwl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyMz9PML6_Mbj2lpWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSKVggdvFIXNZpK6R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzS52nr_VSYC7GVIxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxsTCogPUp_8Vm369t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxx8hnlPVxoDY27tdh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy493hhTYm4WvxZ4xJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPntuoEtfMlP2HLG54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy5oXaMXsFNOtVj5sR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
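The Coding Result table above is simply the entry in this raw response whose `id` matches the inspected comment (`ytc_Ugz4_RAMS-lwRnzLgwl4AaABAg`, coded `ai_itself` / `deontological` / `unclear` / `outrage`). A minimal sketch of that lookup, assuming the response parses as the JSON array shown (only the first two entries are reproduced here for brevity):

```python
import json

# First two entries of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgwM__XhKTJPxkW5GRV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz4_RAMS-lwRnzLgwl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]'''

# Index every coded comment by its id for constant-time lookup.
coded = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the comment shown in the detail view and print its table rows.
result = coded["ytc_Ugz4_RAMS-lwRnzLgwl4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"| {dim.capitalize()} | {result[dim]} |")
```

This is why a malformed or truncated model response breaks the detail view: the lookup requires a well-formed array with an exact `id` match.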