Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The question then is whether consciousness is something that is purely algorithm…
ytr_UgxLWJ3uO…
But what about all the data that the AI trained on? It you used it without permi…
rdc_jwvqxyl
I’ve been doing art for years, it’s so ignorant to see people use ai instead of …
ytc_UgzPfpe_1…
One thing I learn about drawing is that at a certain point People WILL Recognize…
ytr_UgzBDkctw…
Can we have co-pilot integrated? And please do not let people disable it. Force …
rdc_ohwgop5
Don't get AI to lie listen to Elon Musk he said there could be bad consequences…
ytc_Ugz74MGLL…
Imagine the robot got hack and turn the gun back at him and aim directly at him …
ytc_UgwzrMOkA…
(I mean that artificial intelligence creates huge risks in the communication sec…
ytr_UgwhtOyVh…
Comment
Are we racing toward an AI we can’t turn off, an operating system that is AI, running everything from personal devices to government systems, using skills it can create on demand instead of traditional programs or applications? I asked ChatGPT AI for its opinion, and this is what it said:
The real danger is a public “safe” AI for everyday use alongside a secret, more powerful AI built by governments. The moment one country uses its secret AI to gain an advantage, the race is on, rules vanish, and AI becomes the control layer for supply chains, security, and public services.
At that point, you can’t just turn it off.
youtube · AI Governance · 2025-08-11T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzmNan-CHyWnbHVYtl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4OS9lgpk7hQNOJUB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFFZqCEFJH5Zxz1cV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-7xX0Gbh-iG0jYgp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgygYx54ZIyGL8azyJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyuWVRrERhqWvzJGTR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwWVOKaA5hyq5h-g4J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1L48m-fYIGt99VTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlUd4F0CNNYepPG154AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBu0YgAaUrmUArefB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
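As a minimal sketch of how a raw response like this can be consumed, the batch can be parsed and indexed by comment ID for lookup; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding table above, and the single record shown is copied from the raw response. The lookup helper and variable names here are hypothetical, not part of the tool itself.

```python
import json

# Raw model output: a JSON array of per-comment codes (one record
# copied from the batch shown above for illustration).
raw_response = """
[
  {"id": "ytc_UgwWVOKaA5hyq5h-g4J4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

# Index every record by its comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# "Look up by comment ID", as the page above describes.
code = codes_by_id["ytc_UgwWVOKaA5hyq5h-g4J4AaABAg"]
print(code["policy"], code["emotion"])  # regulate fear
```

Indexing by `id` mirrors the dashboard's lookup-by-comment-ID flow and tolerates the model returning records in any order.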