# Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "At this point the only people safe are CEO’s and tradesman. From AI that is. Tra…" (`ytc_Ugx05MFc7…`)
- "AI is good, but i spend half my time fixing programming errors it can generate. …" (`ytc_Ugw2M_vxV…`)
- "This guy is such a con artist.. A couple of weeks ago he was trying to get Congr…" (`ytc_UgzMHNJXA…`)
- "The amazing thing is I have no idea what these people are taking about. AI is a …" (`ytc_Ugzq9vsNI…`)
- "i agree with him. properly using AI does require you to think. if you are not th…" (`ytc_UgwDJ1HA9…`)
- "The absolute best interview I’ve seen. This beautiful young lady is well on the …" (`ytc_UgzqSsSrs…`)
- "AI pulled the cover back from the Google racists that programmed it. AI even mi…" (`ytc_UgwQvcb4X…`)
- "Dr Yampolskiy is undoubtedly an expert in his field but I don’t think he was ask…" (`ytc_UgwjHZq3b…`)
## Comment
> What all these people always miss is the high overall costs of implementing AI. To make basic needs cheaper and provide free time to humans, AI would need to be able to operate on free energy, or have no source of energy, like a perpetual motion machine. In the physical world (outside of thought) this is impossible: energy can only be consumed and transferred. Currently AI consumes a large portion of our worlds energy. If the control were to pass to the AIs they will likely suck up all of the energy and resources from our planet very quickly making it inhabitable and deadly for humans and ultimately for AI itself unless it can find another source of energy to keep its data centers and its electrical generators running. Ergo There is NO version in where AI makes our lives easier or better or more cheap. It will only serve to make one man super rich and powerful for a few moments maybe a few years only to destroy everything and ultimately making that same man the last person that AI will enslave.
Source: youtube · AI Governance · 2026-03-13T07:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugxdx5rV8DGQFlmGbx54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSiOl6goEqLM_2gkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwUorKqXDxRnkSMQmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOPNkphegAH9jzjwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYAXhzMfmYdO5FmZF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyiiwlthXyUcwkWjst4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxqHnO8Ei_InhDUoMB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxTJ4jSebFt20BJgAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymYX2rYfvOblR_Tkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9RYZg_lpa19SbfCJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
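The raw response is a JSON array with one record per comment, each carrying the same five keys shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such output might be parsed into a per-comment lookup, rejecting malformed records; the `parse_codings` helper and its error handling are illustrative, not the tool's actual code:

```python
import json

# Two records copied from the raw response above, truncated for brevity.
raw = """
[
  {"id":"ytc_Ugxdx5rV8DGQFlmGbx54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSiOl6goEqLM_2gkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

# Key set taken from the records shown in this section.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response into {comment_id: coding_dims},
    raising if any record is missing a coding dimension."""
    records = json.loads(text)
    out = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        out[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return out

codings = parse_codings(raw)
print(codings["ytc_Ugxdx5rV8DGQFlmGbx54AaABAg"]["emotion"])  # mixed
```

Keying the result by comment ID makes it straightforward to join each coding back to its source comment, as the inspector above does when you look up a record.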