Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Pushed Musk out ? That’s not what happened according to Musk at the time so who’… (ytc_UgyThaJfi…)
- The power of unification, artificial intelligence unifies it sounds like in the … (ytc_UgwGhrXx1…)
- You lost the debate. You forgot and AI forgot to mention, the ""israelis"" cur… (ytc_UgxfM9LIN…)
- The problem isn't necessarily the AI art, it's people using these AI and posing … (ytr_Ugx6MJ_Hd…)
- How can the goals be synchronized if the AI can perceive things that humans can'… (ytc_Ugxk5i3OS…)
- Human drivers killing people get prosecuted. Self-driving cars killing people ge… (ytc_UgwL7d1oZ…)
- 7:15 I know what we can do with super artificial intelligence, and that is get t… (ytr_UgxpzCnzA…)
- I have a question Some robots already took place i know i seen people losing jo… (ytc_Ugxt6GreY…)
Comment
imho, the only reason to create AI is to relieve us humans from work, actually. So in the end, you won't have to pay anyone for work, because the work is already done. Forever. And therefore, people also don't need money anymore.
The crucial point here, is that there is a lawful instance, or better said, a system, which provides fair distribution of the goods. At the moment, there is no such justice. Maybe because currently, the only purpose of living is to earn as much money as possible.
youtube
2013-07-09T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzRdds8geqxGSesPER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyp3LGlxOmwEJOt8RB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwrp7z7tl5B1ih2hxJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgjSfl_eRQ-9uN62h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLx_Cvi89YfIPc7WJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzk3K8VyJRwtpLpi154AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyo0tgZJRStQiTh32Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzUXPcULSE_eEPe3QN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-GTon8HnzwoGfy154AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyeCJclil-YhZW05t14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
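The raw response above is plain JSON, so a by-ID lookup like the one this page offers can be done directly once the payload is parsed. A minimal sketch in Python, using two rows from the response above (the variable names are illustrative, not part of any tool shown here):

```python
import json

# Two rows copied from the raw LLM response above.
raw = """[
  {"id":"ytc_UgzRdds8geqxGSesPER4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyp3LGlxOmwEJOt8RB4AaABAg","responsibility":"government",
   "reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for one comment.
row = codings["ytc_Ugyp3LGlxOmwEJOt8RB4AaABAg"]
print(row["policy"], row["emotion"])  # regulate approval
```

The same pattern scales to the full response: parse once, index by `id`, and each inspection is a dictionary access rather than a scan.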