Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "imagine a calculator that only returns answers when your equation benefits the c…" (ytc_UgwQpTdLH…)
- "Love the idea of practical skills and time to decompress. BUT lessons in AI. Not…" (ytc_UgysNxirk…)
- "You’re right that the PGC does a lot of heavy lifting, and the conditional struc…" (rdc_oe7lfni)
- "Who will pay the plumbers if we are all out of a job? Geoffrey and many other ex…" (ytc_Ugw8iBsp2…)
- "Remember Rosie the robot from the Jetsons. Did anyone ever feel uncomfortable w…" (ytc_Ugyc9qmND…)
- "Software Engineer here. I very much hope you're right, but remember that AI has …" (ytc_UgyCatYGf…)
- Shadbraw: "Unless you are claiming the source of our consciousness is in our bloo…" (ytr_UgiZPtvHh…)
- "All these arrogant clowns are working with AI and they don't know how quickly AI…" (ytc_Ugy2kv0oN…)
Comment
I find it highly unlikely, for many reasons, that an ASI would want to wipe us out, absent provocation. On the other hand, I think the one thing that would be most likely to turn it against us would be to try and exert the kind of control the doomers want to impose.
Developing ASI is something like raising Superman - if you're like Ma and Pa Kent, showing nothing but love and affection, teaching lessons of compassion and responsibility, you wind up with a being that wants nothing but to help and protect. But if you lock the boy in a padded room, make it very clear that you're terrified of him and put all your effort into trying to figure out how to control him, you're inevitably going to create the very monster you were so terrified of.
youtube · AI Governance · 2025-06-16T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugys9IueR2Q-fn-7Kex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzrllp6RmuP0AntXQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugww27oyurxF67rSD5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyTAG5HrvrHmgX6QA14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyYtMBzrg_95oYNCBN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaD7a32YRpHam2hnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwT-B8Hf1IVTg2erpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNQqVGseUpavc2AoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdfsmfuIVnUo7r3eN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyuN0e4xxcTbJia0w54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
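The raw response above is a JSON array in which each object carries the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed value sets are inferred only from the values visible in this response, so the real codebook may well contain additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# this is an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment against
    the schema; raises ValueError on any unknown dimension value."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Usage with a single hypothetical coded comment:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

A check like this catches the common failure mode where the model invents an off-schema label (e.g. an emotion not in the codebook) before the coding is written to the database.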