Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I don't think the music standard is a "double" standard so much as it is a diffe… (ytc_UgyPr3cuR…)
- I have used ai to make some images and did some lora all for myself but got tire… (ytc_Ugz5mZHs9…)
- Art in every aspect...from music to film making to painting to photography...AI … (ytc_UgzxSmSUe…)
- funny not funny...ppl don't learn critical thinking anymore. You are told you do… (ytc_UgyQbMOTL…)
- The Dinosaur-mammal to human-AI metaphor made me shit my pants. Like holy Christ… (ytc_Ugyw50kPM…)
- Truckers do a lot more than just drive. Who'll do pre-inspections? Who will chec… (ytc_UgxeihMLP…)
- None of these are AI safe … some can be taken over, and a lot of them without pe… (ytc_UgxLh8CLO…)
- at :45 the video claims artificial intelligence is all around us, but this is a … (ytc_Uggc1lpMf…)
Comment

> I ran away from AI in the late 1980s for the same reasons. AI will tell you how it made its logic decisions. That has never been the issue. The problem was control. I couldn't solve that problem. They can't control it still. My solution was hard coding everything, but that was a dead end. They needed machine learning and neutral networks. There is no way to control it completely.

Source: youtube · Topic: AI Governance · Posted: 2026-03-17T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxU0BrR8Ng_B8q2rEd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxA_AOLkUwpa6cUM5h4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzWK55406De5A-6tSR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzmSI05Tkuwd_QGcFt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxwgVWKteUB-GMCqBV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwGTm8EadowUNOdw2B4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgyeqTrAaXILy-sM0nF4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxKrVTZXf39x8ThwTJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyQhPht2gFnzUypKN94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwAblayAej0t2wXSxZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```
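The lookup this page performs can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codes, so indexing it by `id` gives constant-time retrieval of a comment's coded dimensions. This is a minimal sketch, assuming responses are stored in this array-of-objects format; the `lookup` function and variable names are illustrative, not part of any real tool shown here.

```python
import json

# A raw LLM coding response in the same shape as the array above
# (one entry reproduced here for the sketch).
raw_response = """
[
  {"id": "ytc_UgwGTm8EadowUNOdw2B4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "resignation"}
]
"""

# Parse the array and build an index keyed by comment ID.
codes = json.loads(raw_response)
by_id = {entry["id"]: entry for entry in codes}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, or raise KeyError."""
    return by_id[comment_id]

code = lookup("ytc_UgwGTm8EadowUNOdw2B4AaABAg")
print(code["responsibility"], code["emotion"])  # developer resignation
```

Indexing once up front keeps repeated lookups cheap even when a batch response codes many comments.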