# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up by comment ID or by picking one of the random samples below.

## Random samples
- "Generative AI should also count as large-scale plagiarism. No credit or even sma…" (ytc_Ugydp1sEp…)
- "I worked in operations for a high-tech electronics firm for 38 years. In electro…" (ytc_UgzIr-4-C…)
- "This video will age so poorly. Watch it end of year again, mid next year at late…" (ytc_UgwWRoJV2…)
- "We really need Person Rights for all persons, no matter if they're animals, robo…" (ytr_UgjA4P2zv…)
- "I don’t think most of them I think it will introduce more people / Thus being sai…" (ytr_Ugxb2ql2G…)
- "Don’t panic, it’s only Automated Intelligence and not Artificial therefore not s…" (ytc_Ugz2Efdt9…)
- "That amica better watch out if it comes near me i will distroy it!!! i aren't ab…" (ytc_UgwAPDDWR…)
- "Drove an older woman on Uber other day and she was treating me like a Waymo! she…" (ytc_UgzYSS8_3…)
## Comment

> Thinking we must spend more time and effort on designing a Prime Directive (startrek) so humanity is the machines' purpose for existence. i.e. we maintain our 'value' to the machine. For example, the machine's Prime Directive is to assist humanity in their quest for health, wealth, and happiness (all words clearly defined). From that Directive, we can build laws and rules and tax codes and hopefully put guardrails on AI companies 'creativity. I know I would feel safer if I knew where we were going (Directive)

- Platform: youtube
- Topic: AI Governance
- Posted: 2026-04-20T03:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugx71CTRCoNv1Xslqh14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzL1s_Pb2A8dG628gB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxexAzCruGDdc8OmdF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwLUufIsVHnCcN64GJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxbTF1Xv6h7Ta_PChZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx0bhSo5HYE_h0WA4p4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwmyNtLNN-VPhxMLth4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyFuLtQNDCAYnuK8zN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxL-7U3T_BkXmixFUd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugz05fIP8Kt0DJ-0ilt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
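A raw response like this is a JSON array of per-comment codings, which must be parsed and validated before it can back the per-comment table above. The sketch below shows that parse-and-validate step in Python; it is a minimal illustration, not the tool's actual code. The function name `parse_batch` is hypothetical, and the allowed category sets are inferred only from the values visible in this batch (the full codebook may define more).

```python
import json

# Allowed categories per dimension, inferred from the values visible in this
# batch only -- treat these sets as assumptions, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself",
                       "government", "user", "unclear"},
    "reasoning": {"unclear", "virtue", "contractualist",
                  "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "ban",
               "liability", "industry_self"},
    "emotion": {"mixed", "fear", "approval", "outrage", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) into a dict
    keyed by comment ID, rejecting any out-of-vocabulary dimension value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Look up one comment's coding by ID (this record appears in the batch above):
raw = ('[{"id":"ytc_UgxexAzCruGDdc8OmdF4AaABAg",'
       '"responsibility":"distributed","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"approval"}]')
coded = parse_batch(raw)
print(coded["ytc_UgxexAzCruGDdc8OmdF4AaABAg"]["policy"])  # regulate
```

Keying the parsed batch by comment ID is what makes the "look up by comment ID" view above a simple dictionary access.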