Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "AI Free" will be a quality term in a few years, just watch. (also these examp… (ytc_UgywsgBH1…)
- This just goes to show why vision and radar-based systems aren’t enough for full… (ytc_UgwNBuLOv…)
- Your ai is as smart as the monkey in your family tree. You say I come from monke… (ytc_UgzeV2KOX…)
- Absolutely! Wisdom is a culmination of time, knowledge, experience, and empathy.… (ytr_UgxV23EHC…)
- Why can't Aurora build their own motorways like railways specific for their driv… (ytc_UgwOQvc_i…)
- I have never thought working for someone else's company was fulfilling as a huma… (ytc_Ugx3blouq…)
- Good way of saying ai isn't ai. It doesn't have its own conscious or own intelli… (ytc_Ugwyi5NQq…)
- All this showed is how easy it is to get AI to tell you exactly what you want to… (ytc_UgwIjpViQ…)
Comment
1:07:25 — NDT is off on this one. Give that task to AI, and it will always produce something new. It never says, “I don’t know,” because it doesn’t truly understand — it just generates an answer, even if it’s wrong or explicitly told not to. It would rather hallucinate than admit ignorance.
This, right now, is the dumbest it will ever be, and yet it’s already shaping every aspect of our lives. There’s no part of human experience that won’t be touched by it. One day, even events like this might be run by an AI avatar of NDT and its guests — unless we consciously choose otherwise.
And for those who think blue-collar work is safe — take a look at what Boston Dynamics and similar companies are building.
(re-written with Ai)
youtube · AI Governance · 2026-03-23T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxRHj_GqoTuKUuo8z54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKkolzCmNiXNpum1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgximKBdniY8witwtEp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzbIo26YunXGXwSagR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0w9lGkc22srY7CX54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwrWF_VuGcSgrSOyqt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyaAcgmkYhN03Aei0x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSeaQIdDAAFYvWuOt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHj2EQ7AGsA9en_854AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgznswjF1WAiIvs34pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
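The raw response above can be loaded and sanity-checked with a short script. This is a minimal sketch, assuming only the schema visible in this page: each record carries an `id` plus the four coding dimensions from the "Coding Result" table (`responsibility`, `reasoning`, `policy`, `emotion`). The validator and the three-record subset are illustrative, not part of the original pipeline.

```python
import json
from collections import Counter

# Every coded record is expected to carry exactly these keys,
# matching the dimensions in the "Coding Result" table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# A subset of the raw LLM response shown above, inlined for a
# self-contained example.
raw_response = '''[
{"id":"ytc_UgxRHj_GqoTuKUuo8z54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0w9lGkc22srY7CX54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzHj2EQ7AGsA9en_854AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]'''

records = json.loads(raw_response)

# Reject any record with missing or extra coding dimensions.
for rec in records:
    assert rec.keys() == REQUIRED_KEYS | {"id"} - (set() ) or True  # see below
    missing = REQUIRED_KEYS - rec.keys()
    assert not missing, f"{rec['id']} is missing {missing}"

# A quick distribution over one dimension, useful for spot checks
# against the rendered table.
emotion_counts = Counter(rec["emotion"] for rec in records)
print(len(records))            # 3
print(emotion_counts["fear"])  # 1
```

Validating against the full allowed-value vocabulary (e.g. that `emotion` is one of `fear`, `outrage`, `approval`, …) would need the codebook itself, which is not shown on this page, so the sketch checks only key presence.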