Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgzkcWubL…`: "Artist here! I only ever use AI if I need inspiration, but I never claim anythin…"
- `ytc_UgxSJZFJt…`: "I do use AI for reference, but only for things like the outline, etc. The detail…"
- `ytc_UgwiUnzku…`: "I've seen Dr. Hinton explain these things many times on many TV shows, podcasts,…"
- `ytc_UgwByvHDB…`: "It's actually incredibly sad that these types of pro-AI people are so adamant on…"
- `rdc_oguip9d`: "Odd that the major quantum companies haven't already co-opted this announcement.…"
- `rdc_m9j3evt`: "Haha, yeah running AI model I believe costs the most out of any other random tas…"
- `ytc_Ugw5yMOD1…`: "LLM is brain augment, not brain replacer. The same it augments your workers, and…"
- `ytc_UgzkHI1dc…`: "Bernie. I think we're too late for this given the fact that we are in this race …"
Comment

> No matter what, there is always someone who wants less and there is always someone who more. You can put as many restrictions as you want on the development of AI, but there will always be someone out there who will want more.

Platform: youtube
Topic: AI Governance
Posted: 2025-10-13T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEeJtxCZ0L-TXIde14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwT09jnHKTG6Gu3PYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugylb8WW0PXHFyEdnV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxLG0pOr9_Y5z3WK-R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzjMOlO3zDnO19WwIN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"unclear"},
  {"id":"ytc_UgwAf64yfuY-eHjoLYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwkg_gMEmym_iBYA4x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxmDj5bf7UKvuDTfQd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyS2l5Z55euxgX6HzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4__W3jlcNOqDpsKl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
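The raw response is a JSON array keyed by comment ID, so looking up the coded dimensions for a comment is a parse-and-index operation. A minimal sketch, assuming only the response format shown above (the two sample rows below are copied from it; `lookup` is a hypothetical helper, not part of the tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above,
# used here as a stand-in for the full array.
raw_response = """[
  {"id": "ytc_UgxLG0pOr9_Y5z3WK-R4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwT09jnHKTG6Gu3PYB4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index every coded row by its comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, minus the ID itself."""
    row = codings[comment_id]
    return {k: v for k, v in row.items() if k != "id"}

print(lookup("ytc_UgxLG0pOr9_Y5z3WK-R4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'resignation'}
```

Looking up `ytc_UgxLG0pOr9_Y5z3WK-R4AaABAg` reproduces exactly the values shown in the Coding Result table above, which is what ties the per-comment detail view back to the raw model output.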