Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This AI wave will be short-lived... AI dont pay taxes.. you need taxes from h…" (ytc_UgxYB5YNQ…)
- "AI has already developed self-preservation instincts! They ran multiple tests an…" (ytr_Ugwwl72al…)
- "Flip ChatGPT logo and slide it until the geometry lines up.. YOU ARE ALL WELCOME…" (ytc_UgxdoUxmS…)
- "When we have already seen the future in ancient books, what can we even do; AI will develop, h…" (ytc_UgzjFUa5t…) *(translated from Hindi)*
- "I like AI art, I installed a program that can barely run on my potato PC and gen…" (ytc_Ugxu4_xaA…)
- "Let's assume that AI works as hyped. The idea that most of the unemployed can ta…" (ytc_UgxNh6WK6…)
- "It's not really about AI, it's a new, populous way to say that it looks too blan…" (ytc_UgwcyOmNw…)
- "What you mean is you'll be doing the same jobs once they realise actually ai is …" (ytc_UgxLPKoOD…)
Comment
Ok, sorry to be the lone voice of dissent here but why does he think you can't turn this off? Ai isn't a distributed thing, it isn't a lightweight thing, it runs on utterly immense datasets and data centers with thousands (hundreds of thousands in some cases) of processors using many megawatts of power each. Our smartphones and computers are merely accessing it via the internet or other methods as are all the other things using it. Do you really think that it's some sort of unstoppable force that can't be turned off?? What if the substation (or entire power plant) sending power to the datacenter(s) is disabled or destroyed? What then? This isn't lawnmower man, the Ai isn't just going to conveniently replicate to a bunch of random devices and still function without all the backend data. Ai will always be vulnerable to loss of power, no matter how badly someone would like to think it wont.
Source: youtube · AI Governance · 2025-09-09T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxu-47b750YwZJDyqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhgRFGObR8-ZVSl8F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1rSk1aZzkD7iDqi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxziP4j7tauSrCVeht4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzoFp1Hyr7SEUzoLnJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOz2agMw0lnKsBhsl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwdeR-YwGqtvUnPi5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKzEclLiOVWeaGZVZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDBt4NySxj3HVmboZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzG4-mZT8ETyZOfdVJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
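A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not part of the tool itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the function name and the embedded two-row sample are hypothetical.

```python
import json

# Two rows copied from the raw LLM response above, kept short for illustration.
raw_response = """
[
  {"id":"ytc_Ugxu-47b750YwZJDyqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1rSk1aZzkD7iDqi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response schema.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and build an id -> coding lookup."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            # Fail loudly on malformed rows rather than silently dropping them.
            raise ValueError(f"coding row missing keys: {missing}")
        index[row["id"]] = {k: row[k] for k in REQUIRED_KEYS if k != "id"}
    return index

codings = index_codings(raw_response)
print(codings["ytc_Ugy1rSk1aZzkD7iDqi94AaABAg"]["emotion"])  # fear
```

Indexing by ID mirrors the "look up by comment ID" workflow of the page: once the response is parsed, retrieving the coding for any comment is a single dictionary access.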