Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated excerpts)

- "Apparently, no one seems to realize that if the entire economy were automated, a…" (ytc_UgyQVBtuL…)
- "He is right but also wrong. The Turing test is a method to test AI but it will n…" (ytc_UgwQmly6L…)
- "@peterbaruxis2511That analogy is not valid. A sunset is an optic phenomena perce…" (ytr_UgyVA66pZ…)
- "Counterpoint, some people simply lack every visualization skill to draw. Not say…" (ytc_UgzQUx8jl…)
- "I don't think this compares to the technologies of the past. The moment AI gets …" (ytc_Ugz0iTyLm…)
- "I'm currently working with an old legacy system that no one but me understands. …" (ytc_Ugx6tthzD…)
- "AI can be a multiplier, but creative people will always have their place because…" (ytc_UgxwSVYGH…)
- "Here's how to make safe ai. Stop working on ai now lol. It's so good now and you…" (ytc_UgynnHIJh…)
Comment
AI will eventually decide that humans fight a lot over not much.
That they’re not just greedy (just) but their sole objective is to amass more wealth than the next person irrespective of the consequences to others
That they’re selfish
That they’re ruining a planet they believe they own
So step one will be to remove their purpose. Subsequent steps will be to rid the planet of them altogether.
The gorillas will be somewhat smug about it
Source: youtube · Topic: AI Governance · 2025-12-04T10:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAhMrhUc_auGh4TTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzuw-2iuq23MEl313x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwwxmib6fBINJO4lRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzCLafFK5kLCkt7oN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwgk0jMLcalxQ_6LC94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxe5qXrCSSmD1A6b4V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy6q0Jl6o_33ZOp_Lt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzlj4NF2U5a3jMqLFx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynMNdnRu4xTzvs7Q94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw7E1lqGesi2nP1WOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
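A raw response like the one above is a JSON array of per-comment records keyed by the four coded dimensions. The following is a minimal sketch of how such a response could be parsed and validated before use; the allowed category values are inferred only from the records shown on this page (the actual codebook may define more), and the function name `parse_coding_response` is illustrative, not part of any real tool.

```python
import json

# Assumption: category values inferred from the responses shown above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "resignation", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={value!r}")
    return records

# Hypothetical single-record response for demonstration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # fear
```

Validating against an explicit set of allowed values catches the common failure mode where the model invents an off-schema label, so bad records fail loudly at ingest rather than silently skewing the coded counts.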