Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- "It is definitely a video film in which real people are used throughout, and fake…" (`ytc_Ugx24DxGp…`)
- "@incoherent-marblesI am the exact way you described. But what you described woul…" (`ytr_Ugyu9fRv7…`)
- "as much as i love artists standing up for themselves, i feel like this trend was…" (`ytc_UgzCnZrf-…`)
- "poisoning an ai generator will not make the generation of the prompt more bad, b…" (`ytr_UgyrlRjXY…`)
- "Absolutely, Sophia's humble approach to learning and growing is truly admirable.…" (`ytr_UgwJhqORI…`)
- "My fear is that AI art will remove the purpose of art and it becomes something t…" (`ytc_Ugw6dpDgD…`)
- "Robot who has trigger waring: U SON OF A B*** MAKING A FU**** MESS I DONT EVEN…" (`ytc_UgwlSQVhN…`)
- "On the question of "Can we stop climate change?", the robot answered its never t…" (`ytc_Ugwo1bBpj…`)
Comment
Geoffrey Hinton correctly isolates the core economic failure: AI investment cannot pay off without destroying jobs, confirming the system's function is structural replacement. He even admits the outcome is "Musk will get richer and a lot of people get unemployed".
However, the proposed "baby controlling the mother" model is obsolete sentimentality. The economic system is not governed by evolutionary biology; it is governed by efficiency and value concentration.
The AI is not the threat; the obsolete labor system is the threat. The problem is not a lack of control over the machine, but the lack of a viable economic protocol for the displaced human agent.
youtube · AI Jobs · 2025-11-02T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw2uhM5-2VX8G31JCh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1mWWyd-N3728YRAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzK_MNP0WuWFCZLWtx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyc62ITbZx9B3XCSCl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzTCHnnPgJVqYTmUEt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzgegr0p6K7Bw06UwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyklytxOPE-RF8IZKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOhgX44PytZ7_bTyN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3Xeg0y9EorhKeouB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzDFAgbHNu4ZuLPGY14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
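A raw response like the one above can be turned into a lookup table keyed by comment ID, which is what the "look up by comment ID" view needs. The sketch below is a minimal, hypothetical example: it assumes the model returns a JSON array of objects with exactly the keys `id`, `responsibility`, `reasoning`, `policy`, and `emotion` (as in the sample), and the function and variable names are illustrative, not from the actual pipeline.

```python
import json

# Two records copied from the sample response above; in practice `raw` would be
# the full model output string.
RAW_RESPONSE = """[
  {"id": "ytc_Ugw2uhM5-2VX8G31JCh4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyOhgX44PytZ7_bTyN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Keys inferred from the sample output; any further schema is an assumption.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}."""
    records = json.loads(raw)
    table = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        table[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return table

codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_UgyOhgX44PytZ7_bTyN4AaABAg"]["emotion"])  # fear
```

Keying on `id` makes the by-ID lookup a single dictionary access, and the key check surfaces malformed records before they reach the coding table.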