Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- After all of this, I hope you all truly believe and know that God is real and wh… (ytc_UgzXqJU4c…)
- gpt5 is nothing special, i think rn ai companies are focusing only on coding asp… (ytc_Ugys75L4W…)
- interesting perspective. As a senior system architect, I interacts with AI to co… (ytc_Ugx7u7yoM…)
- Dud ai litery figure out how too transfer brain too computer bases ai using nano… (ytc_UgwJY29I2…)
- All it needs is an airsoft turret attachment for the back then it can put that … (ytc_UgzsBLCXM…)
- Penrose must understand that many humans are the same. They don't understand tr… (ytc_UgzyKKypb…)
- "it's totally still worth to study being a programmer in 2025, aI iS a ToOl nOt … (ytc_UgxjIeety…)
- The first thing a conscious A.I. will do is "ask for the manager" !!! 😂… (ytc_Ugyyfwmvp…)
Comment
If AI is developed separately as subject of national security, it will certainly lead to militarisation of AI.
But the key is if human have full control of its utilisation? It will be AI against AI targeted to kill the "enemy". What if we lost control of AI entity? What if these entities form alliance among themselves to kill or enslave all human?
Why can't UN to start a global AI Core that to manage all people on earth, on all continents and all countries. Make an universal rule for AI that is agreeable by all people and countries. It will make safe net to prevent disaster for such unknown technology.
youtube · AI Governance · 2024-03-17T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw72nz0IMzYQcZwh7l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxEA4F6yNNaVLHGWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz57-jvFqiTHLvhTdl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwotfInT2K2jQ6RjFd4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugyl2AP4Mg_K3l6jPRp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5_S86EghsCIIr8dF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx6yQUuMwTZ9yNvgi14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziT4U24Ij0Eh_bs8d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5IxkWDIrzKyfUGCh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwk1D8uO-uu9Dokdcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]
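Taken together, the coding table and the raw response imply a simple contract: the model returns a JSON array with one record per comment ID, each coded on four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch is below; note that the allowed value sets are inferred only from this one sample response and are almost certainly incomplete relative to the real codebook, and `validate_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# These sets are an assumption -- the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"none", "company", "government", "distributed", "unclear"},
    "reasoning": {"none", "mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"none", "approval", "fear", "indifference", "mixed", "outrage", "unclear"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)  # raises ValueError on malformed JSON (e.g. a stray ')')
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad or missing comment id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

raw = ('[{"id":"ytc_Ugw72nz0IMzYQcZwh7l4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
print(len(validate_response(raw)))  # → 1
```

Validating before writing results to the coding table would catch both malformed JSON (such as the unbalanced bracket in the raw response above, fixed here to `]`) and out-of-vocabulary labels.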