Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "Robots fail automation is nowhere near where it needs to be to totally take over…" (ytc_UgwMW64Vr…)
- "This is the equivalent to “black” hairstyles when you create a character in a ga…" (ytc_UgwAjfpJn…)
- "Companies start implementing AI. Some workers get laid off. Everyone starts gett…" (ytc_Ugwh9rzFy…)
- "You dont need AI to fire employees... Trump single handedly fired government wor…" (ytc_UgxU2UP3v…)
- "This is unacceptable! If the European commission does not stop this non sense, E…" (rdc_d0fugfr)
- "5:12 The art itself is not what makes us angry, it's the principle and the conce…" (ytc_Ugz_eCfUz…)
- "It's STYLED like a professional work but that's the entirepoint of AI: Mimicking…" (ytc_Ugx9y5RgG…)
- "He wont say it but hes talking about your digital twin and how you are connected…" (ytc_UgwSYYrvP…)
Comment
Interesting conversation! This is Eliezer's bread and butter, and he always has (or comes up with) illuminating examples to illustrate his points. This isn't Wolfram's field, which is fine, but unfortunately he too often nitpicked in ways that showed that he was getting the wrong end of the stick (e.g. Eliezer describing Stockfish's chess-playing as an example of how he conceives of wants, beliefs, predictions etc. in the context of AI, whereupon Wolfram got hung up on whether the mechanisms that represent these wants, beliefs, predictions etc. can actually be identified in the scary types of AI too, which doesn't matter for Eliezer's point; it only matters that they're in there somewhere).
David Deutsch I think would be a good "opponent" for Eliezer in such a debate since he's a high-level thinker and won't get stuck at the hurdle of accepting agency in the context of AI or on claims of value relativism. He's simply an optimist in opposition to Eliezer's pessimism.
youtube · AI Governance · 2024-11-12T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZjk-dccsmE4r1CbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzN-dfsvH0_3hTj87Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBrsbkOUjTW8bZHgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2Qq17d-rNew-K7hJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmT97vvYHntMl9Y5d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJEWyj3-VMGPf5UR14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw75_NQVGIiLn5jb9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIcUBDH-ncdjtaAw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxkP3JTDL_ibbhpF8V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"frustration"},
{"id":"ytc_Ugz4NAtgI9yTWXsehN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
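The by-ID lookup described above can be sketched in Python: parse the raw LLM response (an array of coded rows in the schema shown) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; `raw_response` holds two entries copied from the response above, and the helper name `lookup` is hypothetical.

```python
import json

# Two rows copied verbatim from the raw LLM response shown above;
# a real run would load the full JSON array the model returned.
raw_response = """
[
  {"id": "ytc_UgzZjk-dccsmE4r1CbF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzmT97vvYHntMl9Y5d4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Build a dict keyed on comment ID for O(1) lookup of any coded comment.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy,
    emotion) for one comment ID."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgzmT97vvYHntMl9Y5d4AaABAg")["emotion"])  # → outrage
```

The same index supports the "Coding Result" view: selecting a comment is just `lookup(comment_id)` rendered as a table.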