# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by picking one of the random samples below.

## Random samples
- "I’m very glad that I live in Europe right now. I turned of the automatic brake f…" (`ytc_UgweRuNGg…`)
- "What would happen if humanity just gets sick of being controlled, manipulated a…" (`ytc_UgzIiAl4h…`)
- "Imagine how hard it is for the person doing the facial recognition, they all loo…" (`ytc_UgxbF1AOv…`)
- "there are few articles why they switched to camera from lidar, and none of them …" (`ytc_UgwXyz7Gu…`)
- "If AI makes a single nano bot capable to replicating itself from carbon, it woul…" (`ytc_UgxuhdHd1…`)
- "We should take him seriously. Same as human, intelligence can be use for good or…" (`ytc_UgwlpdV9K…`)
- "Is the right one AI? It changed the time but the microwave window looks off as w…" (`rdc_oi2n6al`)
- "Great commentary on AI. 🙌 It's interesting to see differing views, especially…" (`ytc_UgyxiNQxJ…`)
## Comment

The problem is control. A small percentage of people control the vast majority of people in societies, once this system becomes broken then revolutions occur. Generally a small percentage of people still control the new order post revolution.

There is a high probability that governments and large corporations will use AI to control the vast majority of society. More of an unknown is whether there will be a tipping point which results in AI dictating how humans should be controlled, in essence side-lining the powerful humans who are in control.

youtube · AI Governance · 2025-10-27T11:2…
## Coding Result

| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
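A coded record like the one above could be checked against the codebook before being stored. A minimal sketch, assuming the dimensions and labels are exactly those that appear in this section's results (the real codebook may define more categories, and `validate_coding` is a hypothetical helper, not part of the tool shown here):

```python
# Allowed labels per coding dimension, assumed from the values seen in this
# section's results; the actual codebook may be larger.
CODEBOOK = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing comment id")
    for dimension, allowed in CODEBOOK.items():
        value = record.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unknown label {value!r} for {dimension}")
    return problems
```

For the coding result above, `validate_coding({"id": "...", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"})` would return an empty list; a misspelled or hallucinated label would be reported instead of silently stored.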
## Raw LLM Response

```json
[
{"id":"ytc_UgxW08Xq39gQqV1wvZV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylZyxHFUoy5iUQpLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyYkY30dO3UpgumFHt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUMoZkL1QWBYiwIqN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwxTmW4b-qqsD80I-R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwtfZW4atAQDEoywDt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh1LyuB_XoXjjuT0t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzW334WqoFQKyXKvUx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugznva-rv-5KzM4IoNR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyT2y5NdFvHvDNLHBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
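Because the model returns one JSON array per batch, looking a coding up by comment ID amounts to parsing the response and indexing it. A minimal sketch (the helper name is hypothetical; two records are copied verbatim from the raw response above):

```python
import json

def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw JSON batch response and index the codings by comment id."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Two records copied from the raw LLM response above.
raw = '''[
{"id":"ytc_UgyUMoZkL1QWBYiwIqN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxh1LyuB_XoXjjuT0t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

by_id = index_by_comment_id(raw)
```

Here `by_id["ytc_UgyUMoZkL1QWBYiwIqN4AaABAg"]` recovers the government/deontological/regulate/fear coding shown in the Coding Result table above.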