Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why the hell are so many people counting the days before this becomes a part of…" (ytc_UgwrX2C6Q…)
- "these commie doomsayers have been wrong since forever. They used to compain that…" (ytc_Ugxuk6Gkb…)
- "Right. And all cars were supposed to be self driving 10 years ago. And the Metav…" (ytc_UgytKlX8S…)
- "I love it when companies like MS hallucinate that AI will be the biggest time-sa…" (rdc_nu4oso3)
- "Those law tests etc only require reasoning if you are not \"trained\" on the respo…" (ytc_UgzTMrRH1…)
- "The thing is, the AI artists can also do all those things to bring their vision …" (ytr_Ugzn2Z1xN…)
- "@christopherabelet4672 if the employer knew nothing about CS then yea, but the …" (ytr_Ugwr_DMiO…)
- "Maybe? But people are also heavily using AI at work... so there is some truth to…" (rdc_obvam5z)
Comment
The reason why the EU regulations don't deal with military or defense use of AI (or public safety and criminal application of AI) is because the member states have not released any authority (or sovereignty) to the EU in these fields of government. EU does not have a federal government in this respect. Presuming Russia, China, the USA and Great Britain are more likely than not to further develop AI for military purposes, it will be a hard sell to have the EU member states restrict themselves.
youtube | AI Governance | 2025-07-05T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTbPPLe0_1Jm6R5554AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyudsmXDch-AyG7DWh4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyh7VdsLlCGTRlLNPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxRcEmO2rEAzqFLT014AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTeYXwQl43o2sIjz54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz7wTivutkRmh2wPmR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzlwosRm4mNaCOhwMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVZx_-yCgeoD9Fcxl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyLvtgnsL2-OYuome94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgydW-in4CmERicjdx54AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
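The raw response above codes a whole batch, so finding one comment's coding means indexing the array by `id`. A minimal sketch of that lookup in Python, assuming only that the raw output is a JSON array of objects with the fields shown above (the function and variable names are illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw batch response above, as a stand-in for
# the full model output string.
raw_response = """
[
  {"id": "ytc_UgydW-in4CmERicjdx54AaABAg",
   "responsibility": "government", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyTbPPLe0_1Jm6R5554AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Map comment ID -> coding dict for O(1) lookup by ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgydW-in4CmERicjdx54AaABAg"]
print(coding["reasoning"], coding["policy"])  # contractualist regulate
```

Indexing once up front keeps the per-comment lookup constant-time, which matters when the same batch is inspected for many different IDs.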