Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Very bad take on AGI. Scientist are worried about it not because it's cool, it's because the alignment problem is not solved. And this guy is an astrophysicist, not an AI researcher, his opinion is not worth very much.
Source: youtube | AI Moral Status | 2025-07-23T20:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz1hOeI-7q8D96G5Kp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzTzBftyBPmK0KJkTV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwN3Y7E-gpqZU3_CF54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzx36lHxz201bAy4LR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyNIXVDcOF0CWzfOiZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyD6NZJWoR2Yv0Cxk14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzjWSttinwxu3j3QSx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgznzYLZrjWKY1SVDF14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx8nvTTFQ5uP-WvjpJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz1BY6gScLp3gvKocF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
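The raw response is a JSON array with one object per comment, each carrying the four coding dimensions plus the comment id. A minimal sketch of how such a response could be parsed and validated before the codes are stored; the allowed values per dimension are inferred from the examples above and are an assumption, not the tool's full codebook:

```python
import json

# Allowed values per dimension, inferred from the coded examples above (assumption:
# the real codebook likely contains more values than appear in this batch).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "fear", "approval", "mixed", "resignation", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, rejecting unknown code values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = (
    '[{"id":"ytc_Ugz1hOeI-7q8D96G5Kp4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]'
)
codes = parse_codes(raw)
print(codes[0]["emotion"])  # outrage
```

Validating against a fixed value set catches the common LLM failure mode of inventing a label outside the codebook, so a malformed batch fails loudly instead of being silently written to the database.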