Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
anyone who thinks auto pilot in this day and age is true autopilot that negats t…
ytc_Ugxr3KETI…
@ihl0700677525 What you said has been debunked by all actuall conparissons betwe…
ytr_UgwWrOFCI…
I think we are already starting to see what it could look like. Companies are sh…
rdc_nxpst77
That's true,as with many industries atm,Ai is not advanced enough to replace mos…
ytr_UgyqfYDTk…
Imma kidnapp 1 of the A.i robots to bring home its works for me now 😂…
ytc_Ugz9p6x2S…
From my perspective I've always believed that AI, which is really just pattern r…
ytc_Ugyd9_mTi…
Twitter moment:
The artist: posts an art piece, and a link to every timelapse an…
ytr_UgylNEXEU…
For many of the algorithms, when a person says "thank you", it creates a positiv…
ytc_UgwwqIvAQ…
Comment
@imthinkingthoughts take the example given & remove the bits about the verifier, controlling the AI, or any of that. Just, we create & improve an intelligence which surpasses ours greatly & has goals. The emergent goals we can imagine it having are things like survival, resisting change, & acquiring additional freedom of action. Just from that, it has reason enough to eliminate humanity, on the off chance humanity might try to deactivate it because it is doing something it senses they do not value.
youtube
AI Governance
2025-01-08T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
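Each coded comment carries the four dimensions shown in the table plus a coding timestamp. A minimal sketch of that record as a Python dataclass (field and label names assumed from the table above; the label vocabularies are illustrative, taken from values visible in this page):

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, mirroring the result table above."""
    responsibility: str  # e.g. "ai_itself", "developer", "company", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "outrage", "resignation"
    coded_at: str        # ISO-8601 timestamp of when the coding was produced

# The result shown in the table above as a record:
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:24:59.937377",
)
```

Keeping the record flat like this makes it trivial to dump a batch of codings into a dataframe or CSV for downstream analysis.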
Raw LLM Response
[
{"id":"ytr_Ugz_yKuoKnTOTwpefPt4AaABAg.AAiaptz-j_1AAmOMFZF9M6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx8g_5sHOdatpNS1ql4AaABAg.AAiaMK8MSDxAAl5UmKk8NB","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugy6CiuOR6gKecGxKsd4AaABAg.AAiaGgR1H05AAkEW-cmu-_","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzCvXXYuGSCHhd8t-R4AaABAg.AAi_R3e0iIkAAjHps0E2Kq","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyJ8K6sL0wfC6SMlq54AaABAg.AAi_DdDRj5UAAl8z-LKwiI","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAjIytb5cUL","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAjJ1__3PS3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAAl3hJVF6k9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwePVVbMUGmOuwAgch4AaABAg.AAiZnSWW2klAD2JjEBDIcN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAigX-3Eus1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
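The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a batch could be parsed and indexed by comment ID for the lookup view at the top of this page (the IDs and the `index_by_id` helper here are hypothetical; the field names follow the response shown):

```python
import json

# Illustrative stand-in for a raw batch response; IDs are made up.
raw_response = """[
 {"id":"ytr_example1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytr_example2","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"resignation"}
]"""

# Fields every row must carry, per the response format above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a batch coding response and index rows by comment ID,
    dropping any row missing an expected field."""
    rows = json.loads(payload)
    return {row["id"]: row for row in rows if REQUIRED <= row.keys()}

coded = index_by_id(raw_response)
print(coded["ytr_example1"]["emotion"])  # fear
```

Indexing by ID is what makes "look up by comment ID" cheap; validating the field set up front also surfaces malformed rows before they reach the coding table.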