Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I watched Claude build an app in about 15 mins that I built as part of a 3 perso…" (rdc_n3m12dp)
- "Why did he not mention China on his list? They are a world leader in AI and poss…" (ytc_UgzH2cRoK…)
- "Very good point. AI has already become a de-facto "gatekeeper of truth". And if …" (rdc_ohkso1j)
- "Looks real too but obviously it's special effects, no way that guy would be punc…" (ytc_UgxXm-inI…)
- "Let terminator begin or star trek and we reach for the stars time will tell…" (ytc_UgzHvpfCH…)
- "Much of the "business" rationale for AI is a ruse. Groups that are making the te…" (ytc_UgwnIpVzM…)
- "These people themselves might be using AI to write generic scripts and video edi…" (ytr_UgxXynacR…)
- "The whole video you have been avoiding the question, what will happen if all job…" (ytc_UgyC5j46Y…)
Comment
Calls for AI control, especially ones that are detrimental to open-source development, are first and foremost facilitatory to regulatory capture. Even if we listen to these calls, there will be bad organizations who can dodge them, either because they have the money or because they're governments.

If bad AI can be developed, it will. Attempting to prevent it by excluding the public only has downsides without any benefit.
Source: youtube · 2025-11-12T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxAMvj6m2t9zl4Dhux4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhiPV36QvZTm1rf0l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6WQdr0Z2NAZ5YKct4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxlbDFp5NKOd1urW0l4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgxX4kbvAEHwypMzPBp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyHuJq7W3a3tbditH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBj-jLmoFf4p1S0Nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVXGS34OaNLYxu8954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIvsG78bA3dS5Hfdp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
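A raw response like the one above can be parsed and looked up by comment ID with a short script. This is a minimal sketch, not the tool's actual code: the `SCHEMA` dict is an assumption, listing only the category values that appear on this page (the real codebook may define more), and the function name `index_by_id` is hypothetical.

```python
import json

# Allowed categories per coding dimension — ASSUMED from the values visible
# on this page; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"government", "developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) and index the
    rows by comment ID, rejecting any value outside the known categories."""
    rows = json.loads(raw_json)
    coded = {}
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

# One row from the response above, used as sample input.
raw = '''[
  {"id": "ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg"]["policy"])  # industry_self
```

Validating against a fixed schema at parse time catches the common failure mode where the model invents a category label that the downstream analysis does not recognize.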