Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Who gives a fuck if AI can do it too why not? Do better then the AI and people w…" (ytc_UgztNgKij…)
- "AI is just a digital tool for suggestions finding and it's kind of a reference o…" (ytc_UgxQoDiUt…)
- "As to what one of the speakers says about "we have agency", the question is (do …" (ytc_UgzhMJ_S-…)
- "I’m surprised how these AI experts are so incapable of imagining what people wil…" (ytc_UgyC9B4Ii…)
- "@maarshiexcryxxI’ve hand drawn all the “art” I’ve made but it still will never b…" (ytr_UgyR4ny5N…)
- "This all sounds great and all but... at he end Ai is just software, meaning it c…" (ytc_UgwzY6b0r…)
- "It's already to late chatgpt 4 has an iq of 230 Einstein was 212 not only that b…" (ytc_UgwOR884V…)
- "which isn't the issue, people buying ai instead of hiring artists, the plagiaris…" (ytr_UgzmCHTMu…)
Comment
The greatest fear that I have, is that AI will literraly take over the planet and the Human race will be extinguished, as has probably happened on innumerable planets to innumerable specias over the past thousands or millions of years. The 'First Law of Robotics" is supposed to be that they cannot, directly or indirectly, harm any living being, either by action or by inaction allowing any harm to occur. [I paraphrase]. How does that work with AI? Or, DOES it even work with AI??!
youtube · AI Governance · 2025-12-29T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZGVmaIDdAEuZLkHV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKRuU2Q0ogJ6lGIQZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0r85267W8rHKqOqt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx2iXY7rtvw0qLOs614AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGp_Zh-vNlqUSHbaN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwq0Qq5cquEViRireN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwK2vXGWL4uLg9fcuR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzRlBa8xf1dV2o7V0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwKKKnW6eQdRfTh4U94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2C6OoA8Y-Ir-e7r94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
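A response like the one above can be parsed and indexed by comment ID, which is what the "Look up by comment ID" view implies. The sketch below is a minimal illustration, not the tool's actual code: it validates each record against the dimension values that appear in this page's samples (the real codebook may define additional categories) and builds an ID-keyed lookup.

```python
import json

# Allowed values observed in the sample response on this page;
# the actual codebook may include more categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "distributed", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "resignation", "approval", "fear", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID, rejecting unknown values."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

# Example: look up one record from a single-element response.
raw = ('[{"id":"ytc_Ugwq0Qq5cquEViRireN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_Ugwq0Qq5cquEViRireN4AaABAg"]["emotion"])  # fear
```

Keying on the comment ID keeps the lookup O(1) and makes it easy to join a coded record back to the original comment text shown in the inspector.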