Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- They should also do Miyazaki with Totoro’s catbus (especially bcs he hates Ai) ,… (`ytc_UgwDBFR20…`)
- 14:27 I'm a disabled artist who has known a LOT of disabled artists (most of all… (`ytc_UgyTMsVse…`)
- Let’s go over the wealth gap a little more. Why was that topic glossed over or c… (`ytc_UgzhSN2oK…`)
- There are so many nice things that Art-School students could do in collaboration… (`ytc_UgyOpmztH…`)
- It’s gonna be up to the point that A.I will build humans. This world is done. I… (`ytc_Ugzj4RbPS…`)
- It's honestly very tasteless for AI supporters to bring up disabled people as so… (`ytc_UgyJQWXjQ…`)
- tens of millions of us citizens are already unemployed and unemployable. ironica… (`ytc_Ugwn6IGac…`)
- Ai should only be used to help mathematicians and scientists everything else sho… (`ytc_Ugz2oSBI9…`)
Comment
"...with this technology the probability of doom is lower than without this technology..."
Naively delusional or deceitful liar.
Either way, we are being *told* we should "have a say" in how AI is developed, produced, and deployed while truth is that a relatively small handful of people push this technology, control it, and will continue to profit from it.
And, none of us will have any choices about any of that, and this won't change.
The change will be the level of invasive power over all humans, globally, to such an invasive extent that will render us prisoners to AI's controllers - until they themselves lose control.
But sure, tell us again all about how we "have a say" here...
youtube · AI Governance · 2024-01-04T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyJnXmJnUjI4BZw5cd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwpYg_nreZwNpknQop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzjYp7Oaf7u7i6xvml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzm3XB7nXqrSR1ypFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTPRPTwe106_KghyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNfYpULfh_Ir9Ix1R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwO_VzN-pF3q4Py3c54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkgvHDi4UVDVYQ8Ml4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxm3y0DqVd6ftgejit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxD2Yiu9gI7Z8nwxY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
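The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of records keyed by a `ytc_*` comment ID, matching the shape shown) and index it into a dictionary. This is a minimal sketch, not the tool's actual implementation; the function name `index_by_id` and the two inline sample records are illustrative, with field values taken from the response above.

```python
import json

# Raw model output in the shape shown above: a JSON array of coded
# comments, one object per comment, keyed by its ytc_* ID.
# (Two records reproduced here for illustration.)
raw_response = '''[
  {"id": "ytc_UgzjYp7Oaf7u7i6xvml4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzm3XB7nXqrSR1ypFZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzjYp7Oaf7u7i6xvml4AaABAg"]["emotion"])  # -> outrage
```

A dictionary keyed by ID makes each lookup O(1), which matters when cross-referencing thousands of sampled comments against their coded dimensions.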