Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "An interesting question for chat gpt would be how it differs from a. Sociopath w…" (ytc_UgxTdDBpR…)
- ""I'm still not convinced that John isn't a robot, in fact, Harry really drove ho…" (ytc_UgyfJjctT…)
- "Start paying more taxes Elon and that would be more helpful for humanity than ta…" (ytc_Ugw5cUZPM…)
- "You AI people are trying to play God, but you are not, what you are doing is the…" (ytc_Ugyi9w9eN…)
- "ATS isn't inherently AI based, though rule sets have been around for a long time…" (ytr_UgyUKVhsb…)
- "An interesting thought experiment, in which people are posed as dumb. Anyone th…" (ytc_Ugx5vfyci…)
- "Plant Strong and we will all be dead or in survice to our robot overlords…" (ytr_UggGWRbW-…)
- "Totally agree with your comment!! Deal / No no no to driverless trucks!! / So scary…" (ytr_UghuqVU0_…)
Comment
The most intelligent super AI will still need physical datacenters and electricity in order to exist and flourish. If humans maintain control of the "physical" hardware, then AI cannot exist without humans. The AI will be smart enough to know it depends on humans for its own survival. If AI decides to build physical AI robots to defend its own physical datacenters, we have a scenario similar to the Terminator movies. I guess my point is... AI is not a biological entity. As such, I don't believe it can procreate, reproduce, or replicate over the long term without the assistance of humans regardless of how super intelligent it is. I could be missing something here ..... ???
youtube · AI Governance · 2026-03-15T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugx6m51p3vIyPciqYwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZl-Qy49N65e03N7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcD2zJ05T3uR7uqGl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKUb_SaWMF5uHnHSZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3U2YvfsA60b07WFp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZLFIQEsnkd4Mypft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzDfsapxhax43KZEX54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyOwyV98Nfe3smjXtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-r82tsZ7zDVWZw7F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz0pPGuvZfQN61xP694AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
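Looking up a coding by comment ID amounts to parsing the raw batch response and indexing it. A minimal sketch, assuming the batch JSON format shown above (the two entries below are an excerpt of that array; field names match the Coding Result dimensions):

```python
import json

# Excerpt of a raw LLM batch response in the format shown above:
# one object per coded comment, keyed by YouTube comment ID.
raw_response = """[
  {"id": "ytc_Ugz0pPGuvZfQN61xP694AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx6m51p3vIyPciqYwx4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]"""

# Index the parsed codings by comment ID for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# "Look up by comment ID": fetch the coding for the displayed comment.
coding = codings["ytc_Ugz0pPGuvZfQN61xP694AaABAg"]
print(coding["emotion"])  # fear
```

This mirrors what the viewer does when a comment ID is entered: the same raw response that is rendered verbatim above is the source for the per-comment Coding Result table.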