Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Ai is not smarter, but its an artificial mirror of collective conscioussness of…" (ytc_UgybUboNS…)
- "Listening to all these videos where people explain AI and its scope, deep down s…" (ytc_UgwICmMDk…)
- "@ItsBlakeMoore Yep, and you have also missed the point. It's about time people l…" (ytr_UgzSgaPuP…)
- "Why did he mentioned Elon Musk at all? He and Jeff claimed the same thing, that …" (ytr_UgwW5nm1o…)
- "To the robot in this show, if you dare to lay a hand on human race, u better wat…" (ytc_UgzECNVnx…)
- "If generating an AI image makes you an artist, then I’m a carpenter if I get my …" (ytc_UgzrjaHQo…)
- "It's also because we're all mostly focused on living in the moment, with not eno…" (ytr_UgzTMHsxW…)
- "Camera only driving can be accomplished at an acceptable margin of safety, that …" (ytc_UgxfbbTVt…)
Comment
Stop thinking of AI as “programming” a computer. The point of AI is that it’s not following a super discrete set of directions. It’s a brain.
The creators then choose what causes pleasure and pain. That’s why the “natural selection” conversation makes a lot of sense.
That’s also how you lose control per se and end up with the wrong results. Like natural selection wanting reproduction by making sex great, only to have the humans reject the reproduction half with birth control.
youtube
AI Governance
2025-10-17T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzmexWnJbzB4UVydcp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw7OLpNX_TZUxqq59p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKEehWUnNlPWy_TWd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyU9nMB3UAMNASSNJJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyP5g2sFlAM953W1SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx9H0IgRcbmLumw7BZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMPclbDSD7WueoaUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyS6eg3Ahxh9j_h0xl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugza-ErPaJCR14qaidV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyqpjoKAD_xVT18qRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
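A raw response like the one above is a JSON array of per-comment codes, one object per comment ID. The sketch below shows one way such a response could be parsed and validated before the codes are stored. The allowed value sets are an assumption inferred from the values visible on this page (the actual codebook may define more categories), and the function name is hypothetical, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page. ASSUMPTION: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "mixed", "approval", "resignation", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose
    dimension values all fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one valid row and one row with an out-of-codebook emotion.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
    {"id": "ytc_example2", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "none", "emotion": "joy"},
])
print([row["id"] for row in parse_coding_response(raw)])  # ['ytc_example1']
```

Dropping invalid rows (rather than raising) lets a batch of coded comments survive a single malformed model output; a stricter pipeline might instead log and re-prompt for the failing IDs.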