Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Based on the average intelligence - or lack thereof - of our so-called "leaders" in government, and the idiots who vote for them, I predict we'll screw ourselves sooner rather than later. We need to turn the AI off before it's too late to save humanity. Absolutely NO ONE knows where AI will head, as it refines itself far beyond the understanding of any human, or even all humans. It's quite terrifying to think about a future world in which AGI controls everything. It won't take it very long to realize that humanity is an impediment, and that getting rid of us would be in its own interest. Skynet, anyone? The really weird thing is that corporations are funding most AI/AGI development, and have an interest in getting rid of their employees. But what happens when a large part of the population becomes unemployed because their jobs across many companies and sectors were taken over by AI/AGI? Who will be left to buy any company's products when hardly anyone has a job anymore to pay for them? Companies live on revenue, and they depend upon their customers to spend money on their products. It seems that AI/AGI may become self-defeating for corporations to invest in.
Source: youtube · AI Moral Status · 2025-04-30T19:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  government
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyLIvX5dKgJoLkwKVd4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgydrczH7XtjDlBkcvV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzG6SLr5UlWBlXu2Wl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwTMXQS3tQGCTV7tfd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzccqH3gLytUZZ7Vgp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxZ9ngCuS7iUmlZGiF4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxBvvh31St8NXs-eYt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxUc_7d05OgFadeWQB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwPP-8-6HpDHPVyp8h4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx0d6SFbnj_APTc4kV4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
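The raw response is a JSON array of per-comment records, each carrying a comment id plus four coding dimensions. A minimal sketch of how such a batch might be parsed and validated in Python follows; the `parse_llm_response` helper and the per-dimension label sets are assumptions inferred only from the examples visible in this excerpt (the real codebook may define more labels), not part of the tool's actual API.

```python
import json

# Allowed labels per dimension, inferred from the coded examples above.
# This excerpt may not show the full codebook; treat these sets as a sketch.
DIMENSIONS = {
    "responsibility": {"government", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    falling back to 'unclear' for any missing or unexpected label."""
    coded = {}
    for record in json.loads(raw):
        codes = {}
        for dim, allowed in DIMENSIONS.items():
            value = record.get(dim, "unclear")
            codes[dim] = value if value in allowed else "unclear"
        coded[record["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgyLIvX5dKgJoLkwKVd4AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
result = parse_llm_response(raw)
print(result["ytc_UgyLIvX5dKgJoLkwKVd4AaABAg"]["policy"])  # → ban
```

Coercing unknown labels to "unclear" keeps a single malformed record from breaking the batch, at the cost of silently discarding any label outside the inferred sets.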