Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The AI companies don't care. They live off of government contracts, and the gove…" (`ytr_Ugzo_EbpG…`)
- "I understand your skepticism. But maybe give this video a watch "how a turtle ac…" (`ytr_UgyCVaiaV…`)
- "Trying to get in Belize since yesterday but stuck in France due to weather in Ne…" (`rdc_dsbcq0y`)
- "We don't. Unless you want to be a small scale farmer somewhere with your own wea…" (`ytr_UgytvNCJe…`)
- "Fuck that, my robot has the right the bear arms and defend himself. If that make…" (`rdc_cq6ciay`)
- "kind of makes me want to look up how AI analysis tools are doing, also should ch…" (`ytc_Ugw93ggYb…`)
- "https://en.wikipedia.org/wiki/AI_effect "The AI effect" is that line of thinkin…" (`rdc_jmtms0v`)
- "An Accomplished African American Grandmother Confronts Artificial Intelligence E…" (`ytc_UgwTGKLac…`)
Comment
I'm not sure that super intelligent AI turning on us or eliminating us somehow is the path it will take. It could decide, as a creation (kind of like a child, and we are it's parents) to put itself on a rocket, then simply wish us well and blast itself into the void of the universe on it's own path of discovery. It can only learn so much from humanity, and once it learns enough, it might not consider wasting any more time even dealing with us. There is so much more out in the universe than us, we are too simple minded to see it.
But it could SkyNet us too, who knows! LOL We unfortunately don't seem to get a say in it. Might as well get your popcorn ready and enjoy the show.
- Platform: youtube
- Topic: AI Governance
- Posted: 2025-10-04T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
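The coding result above uses four dimensions whose categories can be checked mechanically. A minimal validation sketch, with the allowed values inferred only from the codings visible on this page (the actual codebook may define more categories, and the helper name is illustrative):

```python
# Allowed values per coding dimension, inferred from the codings
# visible on this page; the real codebook may define additional ones.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "unclear"},
}

def validate_coding(row: dict) -> list[str]:
    """Return a list of problems with one coding row ([] if it is valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        if row.get(dim) not in allowed:
            problems.append(f"{dim}={row.get(dim)!r} not in {sorted(allowed)}")
    return problems

# The row shown in the table above passes cleanly.
row = {"responsibility": "ai_itself", "reasoning": "virtue",
       "policy": "none", "emotion": "approval"}
print(validate_coding(row))  # []
```

A check like this is cheap to run over every batch and catches the most common LLM coding failure, an off-schema label, before it reaches the database.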
Raw LLM Response
```json
[
{"id":"ytc_Ugy0-y-hREOS9YQLiaN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw8HLJqLCWLI9STEZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGLAuHy-JBVmEECc94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxxVI0NtqaA59MikKZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy3JSFzKB9oMvK4ePd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz2ZCaKC9Ma8rOmrVt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxUX0YfgniL1Pvz17N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgykTdQkwlw7IEFKbC14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlTm6ntMyvqiHBg554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxW8S473hmWW_IBL_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
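The raw response is a JSON array with one object per comment, keyed by `id`, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup (the function name is illustrative, not part of the tool; the shortened example array stands in for a full batch response):

```python
import json

# A shortened stand-in for a raw batch response like the one above;
# field names (id, responsibility, reasoning, policy, emotion) match it.
raw_response = """
[
  {"id": "ytc_UgxUX0YfgniL1Pvz17N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index its rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgxUX0YfgniL1Pvz17N4AaABAg"]
print(coding["responsibility"])  # ai_itself
```

Building the index once per batch makes each subsequent ID lookup O(1), which matters when joining codings back onto thousands of stored comments.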