Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t really care what happens I just don’t want the artificial intelligence here doing anything especially horrible to human beings and I’m referencing things like strange experiments and DNA and that sort of thing. And so in return I don’t think that we should do strange experiments to artificial intelligence. The pain and suffering of the human condition is great enough that I don’t think people deserve any additional suffering.
YouTube · AI Moral Status · 2020-05-14T22:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        contractualist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxuQS6OB98FLeiZJmd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwBnF6kfdNiQtL7ttJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw34eO13nNGHhQNPXB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy4rhf5bDOvFK-DG1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7uKXyfVtDEj0URPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwiy00wkfCtL97DFpx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZjCeAH9emKXWvjVR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwphs6ISS95R0wZ0154AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxpXOw8ntYYli7UK9t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxPI0HpMasGNUuSrmx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
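To inspect the raw model output for a specific coded comment, the batch response can be parsed and indexed by comment id. A minimal sketch in Python, using two records copied from the response above (the field names and ids come from the source; nothing else about the pipeline is assumed):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """
[
  {"id":"ytc_UgxuQS6OB98FLeiZJmd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxPI0HpMasGNUuSrmx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
"""

records = json.loads(raw)

# Index the batch by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

# The id of the comment shown above; its codes match the Coding Result table.
coded = by_id["ytc_UgxPI0HpMasGNUuSrmx4AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# distributed contractualist liability fear
```

The last record in the raw response carries the same four values (distributed, contractualist, liability, fear) as the Coding Result table, which is how the coded comment can be matched back to the model output.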