Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID.
Random samples (click to inspect):

- There is one way AI could be profitable but it has nothing to do with art. You c… (ytc_UgzEcvqG-…)
- Autonomous Robots could have AI "brains" or could be hacked by an AI and take co… (ytc_UgxmZqyWd…)
- AI logical conclusion could be that humanity is a threat to Earth. Is there a g… (ytc_Ugy348a0-…)
- However, being a male dominated world men will choose the attractive, obedient r… (ytc_UgycJcp09…)
- I hope they create ai weapons literally out of spite of a bunch of idiots who wa… (ytc_Ugwqj1F3W…)
- They hate artists because they misunderstand the fact that we find joy in making… (ytc_UgyCq7pF6…)
- Nobody ever talks about how fantastic AI is. Everyone just panics and freaks out… (ytc_UgxcDfCD4…)
- Me when someone sees my chatgpt history / 😃 I’m gonna look / 😏 this will be so … (ytc_UgwnPlAGo…)
Comment
> You still don't get it. You are assuming that AI/AGI will be an intelligence that is an equal to you — something that you can reason with, in the same fashion that you reason with other humans (I might point out, honestly/realistically, we can't even get fellow humans on the same page -- without pointing guns at each other). AI/AGI will become much more intelligent than us, will have it's own wants, needs, agenda, etc. You will have no more control over it, then you have over another human being... another super intelligent human... why would it share the world's resources with humanity? Are we sharing the worlds resources with any less intelligent species on this planet? NO... we don't care about any other species, yet we expect that this new super intelligent "species" to care about our needs/wants... especially when it won't need us in the long run.

youtube · AI Governance · 2025-10-15T13:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
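
A minimal sketch of how one coded record could be validated before display. The field names come from the table above; the sets of allowed values are only those observed on this page and are an assumption, not the full codebook:

```python
# Allowed values per dimension, inferred from this page's samples only
# (assumption: the real codebook may define more labels).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "resignation", "outrage", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field, allowed in ALLOWED.items():
        value = record.get(field)
        if value not in allowed:
            problems.append(f"{field}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result shown above passes:
print(validate({"responsibility": "ai_itself", "reasoning": "consequentialist",
                "policy": "none", "emotion": "fear"}))  # -> []
```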
Raw LLM Response
```json
[
  {"id": "ytc_UgzVMycQ_q4C0IHmFSF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyS9hVoezf_CTiXDp94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyFJcVKYVYUlE8lRMJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwri8NiUTaG35DUDIB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz_HaMGkkONKFXgdfd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw4fegkhpEZ3ufwAPJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwwrc0koYUHVT_Zv414AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxmY-SpaVPD3MiBGQR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgySnjkGV4TD_4SA1RV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyE8PWRjmF_Gt9BXTV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
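
Because the raw response is plain JSON, the look-up-by-comment-ID behavior is easy to reproduce. A minimal sketch, assuming the model output parses cleanly (a real pipeline would also need to handle truncated or malformed batches):

```python
import json

# First entry of the batch shown above; the full array parses the same way.
raw_response = """
[
  {"id": "ytc_UgzVMycQ_q4C0IHmFSF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the batch by comment ID so any coded comment can be found directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

record = codings["ytc_UgzVMycQ_q4C0IHmFSF4AaABAg"]
print(record["responsibility"], record["emotion"])  # -> none resignation
```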