Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The entirety of human history can be viewed through a lens of deploying technology as "tools". Hence, we are deeply programmed to regard any and all technology we deploy as merely our latest and greatest "tool". But General AI won't be a "tool".....it will be an "agent"....and aside from having biological children (who are notoriously unwieldy and unpredictable but are thankfully subject to human limitations) Human's have ZERO experience with developing or deploying "agents" with equal to or greater than human intelligence.
We literally have no appreciation for the dangers involved. Some very informed people are trying to warn us. But we are hellbent and determined to F.A.F.O.
youtube · AI Governance · 2025-09-04T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx0ZsdHPDXJxIShiWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvjbMxd81f2M3MUSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxCGOJjkng6dDbDht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWWrDfErp2Nd9WpQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBGjTJIBYttPQ_KfV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlD30wXrc8WzIbVmZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5zftANSmRpq7xXyd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxy5fOl0byMBtz004t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJfySQEIn2bf8nK1l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8r0Xx0xO8tVvq9B94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
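The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a response could be parsed and indexed for look-up by comment ID (this is an illustrative snippet using Python's standard `json` module, not the tool's actual implementation; `raw` holds two entries copied from the array above):

```python
import json

# Raw LLM response, abbreviated to two of the ten entries shown above.
raw = '''[
  {"id": "ytc_Ugx0ZsdHPDXJxIShiWp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwlD30wXrc8WzIbVmZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

codes = json.loads(raw)

# Build an index keyed on comment ID so any coded comment can be inspected.
by_id = {c["id"]: c for c in codes}

# The comment displayed above was coded distributed / consequentialist /
# unclear / fear, matching the Coding Result table.
print(by_id["ytc_UgwlD30wXrc8WzIbVmZ4AaABAg"]["emotion"])  # prints "fear"
```

Indexing by `id` mirrors the page's look-up-by-comment-ID feature: each dimension of the coding (responsibility, reasoning, policy, emotion) is then one dictionary access away.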