Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Government wrecks everything! My fondest hope is that AI will only be able to sp…" (ytc_UgwPX8Nsw…)
- "See, I as an artist and slightly connected to the programming understand that th…" (ytc_UgxK4t6eF…)
- "Robots as intelligent as humans will most likely happen by putting a human mind …" (ytr_UghAVC5Uy…)
- "Are you sure it’s full self driving? This can happen with auto pilot. Supervised…" (ytc_UgxTJab6f…)
- "If this guy is making A1 robots then the only thing we have to fear is the mass …" (ytc_UgykqC5ye…)
- "The problem is we use the term "AI" as this anthropomorphized catch-all for many…" (ytc_UgzzbfewF…)
- "I love to see this shit unfolding. The billionaire who built his empire about hy…" (rdc_m9gq85t)
- "What does Trump's plan have to do with forcing you to do anything with AI? If yo…" (ytr_UgzamOqHJ…)
Comment
"Yoshua Bendigo, don't put AI Agents on the internet, instead let Scientist AI deep researchers look for solutions to AI alignment mean while. Bendigo's company ZeroLaw (Asimov's 0 law, save humanity) all good ideas"
Source: youtube · Topic: AI Governance · Posted: 2025-10-15T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxvblgWarV_XSpi7Wt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnCXqm1RZbl5iWUFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZOx2TvyBPt-KsW8N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxaNnI_bAbU-dftvTx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgycipP42UwlR70Uzx14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyBiaANjE85vI0-lMV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsdXfh3BkZ5rh9ast4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzW4q7r39b4P7UlFyh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzbCkJe-xFbqv7dytR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugz6cEq0UdFY6DRTflF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
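The lookup-by-comment-ID step this page offers can be sketched as follows: parse the model's raw batch response and index the records by `id`. This is a minimal illustration, not the tool's actual implementation; the variable names are assumptions, and the two records are copied from the response above.

```python
import json

# Two coding records copied from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_UgzbCkJe-xFbqv7dytR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgycipP42UwlR70Uzx14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be fetched directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzbCkJe-xFbqv7dytR4AaABAg"]
print(rec["policy"])   # -> liability
print(rec["emotion"])  # -> approval
```

Keying on the model-echoed `id` is what ties each coding record back to its source comment, so a malformed or missing `id` in the response would make that record unretrievable.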