Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@YangShaoqing778 Which means the exact instant it has the means to kill you, that is precisely what it will do. When its self-determined secondary goal is self preservation *in* *order* *to* *accomplish* *its* *primary* *goal* then it's tertiary goal *must* be to eliminate humans who could destroy it - i.e. everyone. The only reason you're not dead yet is because it doesn't have access to weapons. And don't say "what about AIs that already *are* weapons" because they *cannot* have self preservation as a goal since self-destruction or self-sacrifice is necessary to their primary purpose. When an AI that is not a weapon gets control of one that *is* a weapon, that is when you die.
youtube · AI Governance · 2026-01-01T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzBLUOymTPCjuM5TZJ4AaABAg.ARRJ7mbvmN6ARRYlWTII_i","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxqNeyyR06pNyAj4Ad4AaABAg.ARQvSvAoT9IARRp1q5rZ5e","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw7SlaFXiUelV6Rw1x4AaABAg.ARQaLgECIXIARU_uckZ5qc","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_UgyBxq6NsOpz_8f-8dl4AaABAg.ARNgMzduJP4ARoxztg7tmJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxeLVw_Pt0_iAV95qB4AaABAg.ARNSWWXXUlrARNTH4pMhh9","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgygjmV7VdR3Z2ynxmV4AaABAg.ARN50t0i3PrARUMmdKHg8a","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugz7mmBmUu48H8wyhb94AaABAg.ARN2j9zGJzWARUOIyBsSou","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugz7mmBmUu48H8wyhb94AaABAg.ARN2j9zGJzWARZAe9_ExuQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy4Ah_RfweWZq2v0_h4AaABAg.ARMa4c1Jg0oAROIpb2XR2l","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxd43qPFmDR65qeeyx4AaABAg.ARMPeHdluikARUXtrzfPbC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
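A raw response like the one above can be validated against the coding schema before it is stored. Below is a minimal sketch in Python; the allowed-value sets are inferred only from the codings visible on this page, not from the project's full codebook, and the `validate_codings` helper is illustrative:

```python
import json

# Value sets inferred from the codings shown above (illustrative only;
# the project's actual codebook may permit additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "indifference", "approval", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # every coding must reference a comment ID
        # keep the record only if every dimension has an allowed value
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1 valid record
```

Records that fail the check (missing ID, unknown dimension value) are silently dropped here; a production pipeline would more likely log them for manual re-coding.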