Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "But with lighting I can put down a point in my drawing and then the computer sha…" (ytr_Ugymie9UO…)
- "Here’s a different thought. What if AI is our salvation? If everyone had univers…" (ytc_Ugwd7mkT1…)
- "This statement is profound, if I could only share one thing from this whole vide…" (ytc_UgxvsURs1…)
- "I am less worried by an IA that takes over every single human being, than by an …" (ytc_UgxjyROWw…)
- "The primary issue is not about the AI; your majesties, your royal highnesses, ex…" (ytc_Ugw6lm5am…)
- "I just asked ChatGPT what US states have an "R" in their name and it had Hawaii …" (ytc_Ugy1aLq-k…)
- "I didn’t see anything about Billie showing up at the met gala I think she was th…" (ytc_UgzzpVCj1…)
- "Do you remember that old AI video of Will Smith eating spaghetti? That was only …" (ytc_UgxCHHh6P…)
Comment
Technology is unstoppable, and pretending otherwise is foolish. The greatest danger is the concentration of wealth & power in the hands of a few. We’ve already seen humanity sabotage nuclear energy, burying it under fear, propaganda, and politics. We can’t repeat that mistake.
Wealth and power should never have been allowed to accumulate in one place. If AI reshapes the world, the gains must be shared. Call it shared technological benefit, distributed advancement, or refusing to privatise the future. The label doesn’t matter; the principle does.
youtube · AI Jobs · 2025-11-15T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyjq1ELPVlVl57XumB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxMgibOkt9yIYoGD3x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugza36EgYSdStsrGsvF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_-OFiUeBrXXmOeal4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyQgOEs6hNr3TFhHDt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwQs58zM7PAw1ndUSd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwr-FMQIIaYTMsLXf14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxgPUXViB7NWBcZUex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz294ay9E-XFuJ6Q294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzBSKJoqDKlffwjuE54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
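A minimal sketch of how a raw response like the one above could be parsed and checked before its codes are stored or looked up by comment ID. The field names match the JSON shown here; the allowed-value sets are inferred from the values visible in this dump, and the function name `parse_batch` is an illustrative choice, not the tool's actual API.

```python
import json

# Allowed values per dimension, inferred from this dump (an assumption,
# not the tool's real schema).
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "contractualist"},
    "policy": {"unclear", "none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) and
    index the codes by comment ID, rejecting out-of-schema values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with one row from the response above:
raw = ('[{"id":"ytc_Ugyjq1ELPVlVl57XumB4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
codes = parse_batch(raw)
print(codes["ytc_Ugyjq1ELPVlVl57XumB4AaABAg"]["emotion"])  # indifference
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan over the raw response.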