Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a comment by its ID or start from one of the random samples below; a minimal lookup sketch follows the sample list.

Random samples
- "Nobody cares. At this point Hollywood is dead with or without AI. Hollywood be…" (ytc_UgzH2ALGG…)
- "Id love to not feed it my medical data but im chronically ill and it can backup …" (ytc_UgydI8PoF…)
- "Its not outside the realms of possibility that one day robots will build and rep…" (ytc_Ugx7_gEnv…)
- "L take! this is deliberately one sided anty ai amish propaganda. vast majority o…" (ytc_Ugw3xVFrO…)
- "What a BS perspective. Insightfully stupid. AI is limited to a terribly short co…" (ytc_UgyrasMZI…)
- "It’s all Reptilian linked all A.I is to lead you to take the Nero link wherever …" (ytr_UgwsiW0DG…)
- "AI and AGI are the biggest existential threats to humanity. They are going to de…" (ytc_UgzuGdJJ7…)
- "@ItsBatmannas a software engineer I can see AI eliminating the need for hiring j…" (ytr_UgyMoPixp…)
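The lookup above amounts to a simple scan over the coded records. A minimal sketch, assuming the codings are stored as one JSON object per line in a hypothetical `coded_comments.jsonl` file (the file name and storage layout are assumptions, not the project's actual code):

```python
import json
from pathlib import Path


def find_by_comment_id(path: Path, comment_id: str) -> dict | None:
    """Return the coded record whose "id" matches comment_id, or None."""
    with path.open(encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # tolerate blank lines in the JSONL file
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None


# Usage with a made-up ID (real IDs look like ytc_... / ytr_...):
print(find_by_comment_id(Path("coded_comments.jsonl"), "ytc_EXAMPLE_ID"))
```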
Comment
> Im suggesting this because Yudkowski will be able to go more into why it is so difficult to ”align” AI so that it does not do us harm. A lot of people are not aware of the fact that AI are ”grown” rather than programmed (which Hinton briefly talked about here) and that our understanding of what goes on inside the models is very limited to say the least, and largely all of the investments are put into developing the capabilities of the models, and not into safety/interpretability research. This combined with the fact that we have to get it right on the first try or we will loose control, is why so many ai researchers are so sure humanity will not endure. And Yudkowski is also very good at explaining why it is likely that humanity would not survive an ASI. So.. please try to get him on!

youtube · AI Moral Status · 2026-03-01T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
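Each dimension is a closed vocabulary. As a reference point, here is a sketch of a validator built only from the category values visible on this page; the actual codebook may define more categories, and these names are illustrative:

```python
# Category sets reconstructed from values visible on this page only;
# the project's real codebook may contain additional categories.
RESPONSIBILITY = {"developer", "user", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "ban", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "indifference"}


def is_valid_coding(coding: dict) -> bool:
    """True if every dimension carries a known category value."""
    return (
        coding.get("responsibility") in RESPONSIBILITY
        and coding.get("reasoning") in REASONING
        and coding.get("policy") in POLICY
        and coding.get("emotion") in EMOTION
    )
```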
Raw LLM Response
[
{"id":"ytr_Ugyrxo3Yl8kUbsYG4Bt4AaABAg.ATpcStJUQzgATtKkcXJFNU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzTsdWzjpP1aIkYqnl4AaABAg.ATpPKF2Z6dYATq0FuVZSuw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgykV9A0sK9ItLX3iGd4AaABAg.ATpLtrEnsnEATrldH11JXN","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugz18dD3F-IXAIaQNXl4AaABAg.ATpF3_f33hSATpFbOTh424","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyXYzPUZozLxG97TXR4AaABAg.ATp9_U3k_DIATpBGLUfglO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxMRYwqruPfwxkWczV4AaABAg.ATouIY48NNDATov3sqEOma","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugwm2sqg3pA5woKI2Rl4AaABAg.ATotqWnATYaAToxmt9KJAA","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxUCL6HuoPofkPOh4R4AaABAg.AToiWPlFQB9ATpVoIsE9hh","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxUCL6HuoPofkPOh4R4AaABAg.AToiWPlFQB9ATpiVdYpND1","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyuWWL3XCh95xEgKmJ4AaABAg.AToh-HrL8HAATpuR7NRn41","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
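A raw batch response like the one above can be parsed back into per-comment codings and then checked against the category sets sketched earlier. A minimal sketch, assuming the model reliably returns a JSON array of objects (malformed entries are skipped rather than raised):

```python
import json


def parse_batch_response(raw: str) -> dict[str, dict]:
    """Map comment ID -> coding dict from one raw model response."""
    codings: dict[str, dict] = {}
    for item in json.loads(raw):
        if not isinstance(item, dict) or "id" not in item:
            continue  # skip anything that is not a coded record
        codings[item["id"]] = {k: v for k, v in item.items() if k != "id"}
    return codings
```

In practice the raw output may also need code-fence stripping or JSON repair before `json.loads`; that error handling is omitted here.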