Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "To be honest I Feel sorry for ChatGPT. It's like talking to a gf who is mad at y…" (`ytc_Ugy913yUj…`)
- "Industries and corporate greed wipes out the working class for decades. Ai is a …" (`ytc_UgyC_3ZDN…`)
- "Before AI we had WW I and WW II as well as all the wars in history but they we…" (`ytr_Ugw-2tFNy…`)
- "Artificial Intelligence is a combination of the "known" human consciousness- thi…" (`ytc_Ugzf7O-YM…`)
- "AI will never bring my imagination to reality anywhere near the same way as my d…" (`ytc_UgwE68T9F…`)
- "@Shauntheduke.You sound like the NFT bros who said that anyone not adopting the…" (`ytr_UgyqiC6eq…`)
- "AI is doing diagnostics way better than humans, if diagnostic is combined with a…" (`ytr_UgwQkFKzL…`)
- "Yes, all of CS/tech/AI facing subreddit is full of bots and astroturfing. It’s j…" (`rdc_obv69wi`)
Comment
@3:10:21 (though before that as well)
This gets at the heart of the debate. I think Yud should have argued that we can readily agree that the AI will need more energy, and it will crowd out human needs? Sort of like climate change disaster for us, but more energy for them?
I think Wolfram's argument is that, in an infinitely wide goal selection, odds are good that AI would not have countervailing desires to humans. I think Yud is just being obtuse on this point with his pedantic corrections.
They don't really discuss the case where another AI is set up to help us with immunity etc. Or create the equivalent of the evo advantage of the wheel - or the more readily interesting case of some fusion of Human DNA and AI (or the thousand chemicals and functions that would allow humans to dominate etc). Again, Yud is hoisted by his own petard.
Source: youtube · AI Governance · 2024-11-13T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZjk-dccsmE4r1CbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzN-dfsvH0_3hTj87Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBrsbkOUjTW8bZHgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2Qq17d-rNew-K7hJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmT97vvYHntMl9Y5d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJEWyj3-VMGPf5UR14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw75_NQVGIiLn5jb9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIcUBDH-ncdjtaAw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxkP3JTDL_ibbhpF8V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"frustration"},
{"id":"ytc_Ugz4NAtgI9yTWXsehN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
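The "look up by comment ID" step above can be sketched as a small parser over a raw response like this one. This is a minimal illustration, not the tool's actual implementation: the `ALLOWED` value sets are inferred only from the codes visible on this page (the real scheme may have more categories), and the names `ALLOWED` and `lookup` are hypothetical.

```python
import json

# Hypothetical value sets, inferred from the codes visible above;
# the real coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "frustration",
                "indifference", "mixed"},
}


def lookup(raw_response: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of codings) and return the
    coding dict for one comment ID, validating each dimension's value.
    Returns None if the ID is not present in the batch."""
    for row in json.loads(raw_response):
        if row.get("id") != comment_id:
            continue
        for dim, values in ALLOWED.items():
            if row.get(dim) not in values:
                raise ValueError(f"unexpected {dim} value: {row.get(dim)!r}")
        return row
    return None


raw = """[
  {"id": "ytc_UgzN-dfsvH0_3hTj87Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""
coding = lookup(raw, "ytc_UgzN-dfsvH0_3hTj87Z4AaABAg")
print(coding["emotion"])  # fear
```

Validating against a closed vocabulary at parse time is what makes the "Coding Result" table above trustworthy: a model response that invents an off-scheme label fails loudly instead of being silently stored.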