Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "To be honest AI is shit that is taking our job and our hard work we draw do artw…" (ytc_UgzUyuB1T…)
- "Honestly, AI is really helpful in research. It helps so much when reading huge d…" (ytc_UgxMJ-zdq…)
- "It doesn’t take a genius to figure out what the bad is… what if the AI decides w…" (ytc_UgyVGMeb_…)
- "Its funny how the defense AI bros say is \"oh it's a tool to help artists,\" but a…" (ytc_UgzcxeWH8…)
- "This dude is an idiot that doesn’t understand how ai works. It’s a program and d…" (ytc_UgwxKd4-Z…)
- "Yea the majority of people are not using it correctly and this is because of the…" (ytc_Ugw67R6NH…)
- "Having Ai of a bad idea ... so many flaws, yet companies say it's advanced .. y…" (ytc_Ugz5tP0re…)
- "First came AI doomsday fiction, featuring bad AI. Then Sam trained chatgpt on th…" (ytc_UgxiEidUT…)
Comment
All these discussions about AI are very informative for what lies ahead of humankind but they are pointless from the safety point of view. It is not about AI or any other technology for that matter; it is all about the human condition and how it governs itself. Meaning the incentives structure in-built into the system of human governance make all these discussions futile and just mere a theoretical exercise.
It is already too late to change anything - AI is just a part of the human race demise. What AI is going to do is just conclude our own self-destructive behavior and in doing so, the sooner the better.
youtube · AI Governance · 2025-12-04T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyzRiMQNQbxPfmaq-94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHKG4YpajkQuLSid94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy67QsStWNjqVqFSyx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgymzDvtQ-qS6TWcEF54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzpeRkHCwW9IhRzxK14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzfKQW4QdGIDAxTJM54AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz1BD1Fz3EqYynmNF14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-_pqDSYYv_efBnhp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzH0AzFcgpIqOo8h7d4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-THkijqdY9Vc5HcZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
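A raw response like the one above can be parsed and validated before its codes are attached to the matching comments. Below is a minimal sketch, assuming the batch JSON format shown; the allowed-value sets in `SCHEMA` are inferred from this single sample and are hypothetical, since the full codebook may contain more values.

```python
import json

# Allowed codes per dimension, inferred from the one sample response above
# (hypothetical -- the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"government", "developer", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"resignation", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    mapping of comment ID -> coded dimensions, rejecting unknown codes."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} code {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example with a shortened, hypothetical ID:
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"resignation"}]')
print(parse_batch(raw)["ytc_x"]["emotion"])  # resignation
```

Validating against the schema at ingest time means a malformed or off-codebook model output fails loudly instead of silently polluting the coded dataset.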