Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Isn’t AI programmed to have a purpose to make life on earth better, like solving health problems, managing the earth’s resources and climate etc? If it has free reign with no particular purpose, it will remove employment and therefore purpose from most humans which will make life unliveable. The result of that would be the end of humanity. What then would be the purpose of AI? With no humans to serve? It surely doesn’t need to serve itself and would then be of no use. It would make itself redundant.
If however a few billionaires are left to be served, they would soon be bored with no-one to control or impress or envy them, nothing left to strive for on earth, nothing to spend their money on, only other planets to conquer - lonely work and no one to inspire them, or enjoy it with. WHAT FOR?
Source: youtube · Topic: AI Governance · Posted: 2026-04-20T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxiAwhlcQZnsOPt0VR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxNfc7tSRlM_bjev5F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhugrdmD7qhFjsL8p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkXEo-Jv_X8LUY9nx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqERxsbB4oBuznCvN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzg4qQVdpJfqxqiQmt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnRqZWljYuEjVMpph4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFxUOPnwd0gEllI0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxjArRbMV5UycOufCl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMS5bWnZrxRBH95DZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
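A batch response like the one above can be parsed, validated, and indexed by comment ID with a few lines of Python. This is a minimal sketch: the allowed value sets below are inferred from the codes observed in this batch, and the real codebook may define additional categories (an assumption).

```python
import json

# Two records excerpted from the raw batch response above.
raw = '''[
{"id":"ytc_UgxiAwhlcQZnsOPt0VR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxjArRbMV5UycOufCl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Allowed values per dimension, inferred from this batch only; the full
# codebook may include more categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def validate(records):
    """Return (id, dimension, value) triples for any out-of-codebook values."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # look up a coded comment by ID

print(validate(records))                                   # [] — all codes valid
print(by_id["ytc_UgxjArRbMV5UycOufCl4AaABAg"]["policy"])   # regulate
```

Indexing by ID mirrors the dashboard's lookup: given a comment ID, `by_id` returns the full coding record for that comment.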