Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It’s not worth it to have these data centers. They are corrupting the environment on top of ruining the health of the people who live in these areas. A lot of these corporations don’t have to worry about the long term effects of these data centers because they don’t live where these machines are. Elon Musk has built a data center in a town, i believe, in Tennessee. It’s destroying the environment and causing sickness throughout the neighboring towns to the people there and the law isn’t doing anything about it. Because he is allowed to operate this machine without being checked, it’s just a matter of time before those towns become wastelands. If we don’t stop these corporations now, what will happen later? They are going to round all the people up in put them in one small section where they are actually sitting on top of each other. These corporations and our politicians don’t care about us. If we don’t stop them now, they will be coming to a town near you. If this is what we have to accept to have AI, we are not ready for AI. Why don’t we do something more logical, like build the mindsets of our children where they can receive the best learning, regardless what background they come from. It’s just a matter of time before one of them invents a solution to all the problems AI poses now. We shouldn’t be building up AI. We should be building up the minds of the people so we can have an AI we can all use safely.
youtube AI Moral Status 2025-11-29T05:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzqeId6BqbcbvBc_c94AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw6aRB-FDKcpu9AHqt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyRJMgvv1UNURXIugl4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyamlVNZTxESsHM0cp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzR4EudMpwIcA-DwL54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwcK6Eo1MPupISWgox4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxWEojGuHlc-DuRKBl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzac2uxI7l3wqBkdNd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw6C0McMKufXRv02Ph4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwfaw5OmL40_4cwjYp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
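A raw response like the one above can be parsed and validated before its rows are stored as coding results. The following is a minimal sketch, not the pipeline's actual code: the function name `validate_batch` and the `ALLOWED` value sets are assumptions, inferred only from the values visible in this response.

```python
import json

# Hypothetical value sets for each coding dimension, inferred from the
# raw response shown above -- the real codebook may allow more values.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and keep only rows whose values are allowed.

    Returns {comment_id: {dimension: value, ...}} for the valid rows;
    rows with a missing id or an out-of-vocabulary value are dropped.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid is None:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Dropping out-of-vocabulary rows (rather than raising) lets a batch with one malformed row still yield the other nine codings; the dropped ids can then be re-queried.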