Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great discussion! I had misunderstood from the trailer that s-risk would only apply to our "copies," but I now understand you meant us and/or our possible copies. Also, I don't think that talking about job loss is disingenuous. Large populations being replaced by AI means that they are of less utility, applying less value not only to them but to their representation, making politicians gradually obsolete. All while those controlling AI gain more and more power, as long as they can control it anyway. And we are talking about totally unchecked power. In a video about the new iteration of the Boston Dynamics Atlas robot, the presenter said that it would be too expensive at $250k to be of practical use. I commented that working three shifts a day, nonstop, it will pay for itself in 1-2 years. You would be crazy not to buy it as a business owner once it reaches human capabilities. Job loss may not be the main problem, but it could "disarm" humanity, while UBI can have a transitory placating effect. Of course, all this comes with a slow takeoff. If it is fast, we are toast anyway. As for the general public, a main course of job loss and terrorism with a side dish of x-risk and a small hint of s-risk could be the best recipe indeed to make them start thinking.
youtube AI Governance 2024-03-06T17:4… ♥ 2
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
 {"id":"ytc_Ugw9WkHmvC4AMz7ZWmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxpBfLqTIqZCjgpWqh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyheoQsqk8pc9gsJaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzIFA03EzGEiATA-ed4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxAv-nyO8c0Yto2cjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyynK7fVl5EvpFBjXd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwSn2kymZ9bFozQigx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx89wXlMwsjPX1Osht4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugwb0ySrT5vC1MfaEG94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
 {"id":"ytc_UgxJnW-BCHKI9ayzEBB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
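The Coding Result shown above is one row extracted from this batch response. As a minimal sketch of that lookup (assuming the JSON shape shown here; the variable names are hypothetical, and the array is abbreviated to two entries copied from the output above):

```python
import json

# Abbreviated copy of the raw LLM response: a JSON array of per-comment codes,
# each keyed by the YouTube comment id.
raw = '''[
 {"id":"ytc_Ugwb0ySrT5vC1MfaEG94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
 {"id":"ytc_UgxJnW-BCHKI9ayzEBB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]'''

# Index the batch by comment id so any coded comment can be inspected directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Pull the entry whose dimension values match the Coding Result table above.
row = codes["ytc_UgxJnW-BCHKI9ayzEBB4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# distributed consequentialist regulate indifference
```

This reflects the apparent pipeline design: the model codes a whole batch of comments in one response, and each inspection page then selects a single entry by id.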