Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.

Random samples
- `ytr_UgxZ5KT7_…`: "The difference is that NFTs weren't really promised to DO anything (except go up…"
- `ytc_UgwQFEiSb…`: "My job is to automate machines, robots and production lines, so I guess I'm safe…"
- `ytc_UgxS3iKDV…`: "You need to command ChatGPT to answer yes or no only. If not, it will loop…"
- `ytc_UgzoW53rB…`: "Imagine resorting to copy right as your argument. That's true desperation. You'v…"
- `ytc_UgxEQiQSA…`: "You should definetly try Antigravity better than using any model alone, also d…"
- `ytc_Ugzvp2hU6…`: "Bruh police should not pull over driverless cars. there is no human on the car a…"
- `ytc_Ugxd7E-E_…`: "Your kid was talking to a damn computer. A fake chatbot. The younger generation…"
- `ytc_Ugzw3ID1V…`: "guys.... this is AI slop... The voice the imagery, the scenery... its all AI why…"
Comment
> I disagree. I think he is overestimating the speed at which technology can reach superintelligence (assuming AI can even gain agency or free will at a level that could cause "mutually assured destruction") and underestimating the stupidity of mankind and how much we inherently prevent progress. There is far too much greed, lust, power mongering, and testosterone in this world for society to ever put this genie back into the bottle.
>
> The analogy of the dog is problematic. We're assuming a hierarchy of species where humans are far superior to the canine. However, superiority depends on what you're measuring. Sure, I can do algebra, but the ability my dog has to smell and track the scent of a kidnapped child. That's a much better party trick...
Source: youtube · Topic: AI Governance · Posted: 2025-09-04T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwo8mdO5NmUewBKJe54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzAxs4ex4QhQ93IUux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwMZz0ZEysZgXab1x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwXiOKovFRlfqDD9u94AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx04xxwxVSCw575cDV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLS2mD7s515J36TGd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHNlf6qQ-a-Yx0veF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxGldJnPVjHJN5kqdp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCynoDO1rkGorGogN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyG6SB8c3gLqxvH85V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
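A raw response in this shape is a JSON array of per-comment records, each carrying the four coded dimensions. As a minimal sketch (not the tool's actual code; `tally` and `DIMENSIONS` are illustrative names), the batch can be parsed and its labels counted like so, shown here with two records copied from the response above:

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above,
# standing in for a full coded batch.
raw = '''[
{"id":"ytc_Ugwo8mdO5NmUewBKJe54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyG6SB8c3gLqxvH85V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]'''

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(raw_json: str) -> dict:
    """Parse a coded batch and count label frequencies per dimension.

    Records missing a dimension fall back to "unclear", mirroring the
    label the coder itself uses for ambiguous cases.
    """
    records = json.loads(raw_json)
    return {
        dim: Counter(rec.get(dim, "unclear") for rec in records)
        for dim in DIMENSIONS
    }

counts = tally(raw)
print(counts["responsibility"])  # Counter({'developer': 1, 'government': 1})
```

Counting with `Counter` rather than asserting a single label per batch reflects how the table above reports mixed values (e.g. Reasoning: mixed) when the sampled comments disagree.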