Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
While I enjoyed the conversation and it was rather entertaining it was also full of misses. Or maybe…. Predictions that are not fully thought out. You can place autonomous systems on existing vehicles replacing the driver very quickly and easily once demonstrated. And “vibe coding” will never work the way people think it will. Are you going to allow all those vulnerabilities and gaps in security on your system? Only to be found out later. The Marines have a saying— slow is smooth— and smooth is fast. Meaning it’s always more efficient to do things right the first time. Then to fix mistakes. In their business it means lives. In ours?
Platform: youtube | Topic: AI Governance | Posted: 2026-04-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxLyMFeSGD2PB6Eaqd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVfDF40vgP9iuIu114AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx5nfGdKZPTpdSrvnx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzQ8EJmG907qtfukiV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwHpN_eHyOqD3CsWah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugzpwy4uzTQsh098R5p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzwrWP4wflaTreSbQ94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyetzeULfoEz1fBPSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwP1ec9ztJoroclnYx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz0G0MElVH-9mPqFfl4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"fear"}
]
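A raw response like the one above can be checked mechanically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are those visible in this output (the full codebook may define more categories, and the `SCHEMA` vocabularies here are an inference, not the tool's actual schema):

```python
import json

# Allowed values per dimension, inferred from the codes visible above
# (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "unclear"},
    "emotion": {"mixed", "fear", "outrage", "disapproval", "indifference",
                "resignation", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose every
    dimension holds a known value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

raw = ('[{"id":"ytc_UgzwrWP4wflaTreSbQ94AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
print(validate_codes(raw))
```

Records that fail validation (e.g. a hallucinated category) are dropped rather than coerced, so downstream counts only ever contain codebook values.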