Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It’s kinda disappointing to see you platforming this guy, who is neither an ai researcher to spread the exact same pseudo religious nonsense about ai as the tech bros are. Ai models aren’t smart, they are not sensing they are being tested insomuch as they learn that high test scores are good and specializing the behavior towards the things that are being tested for results in better test scores. Reasoning models are still just text prediction, they just are pretending to have a train of thought. The most egregious lie this man tells, and his kin, is that we are losing control of the year systems. We aren’t. When grok did the mechahitler it was because it was following the system prompt xai engineers gave it exactly. They don’t lie intentionally it’s just that the statistically most likely continuation of a prompt for which the answer isn’t in the data set is an answer which sounds correct but is nonsense. Stop anthropomorphizing these models, stop putting people on who anthropomorphize these models. You are whether intentionally or not contributing to this narrative that is causing all this societal harm.
Source: youtube · AI Moral Status · 2025-11-02T03:5… · ♥ 2
Coding Result
Responsibility: developer
Reasoning: consequentialist
Policy: none
Emotion: outrage
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugz76YjTejlRChgtTEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},{"id":"ytc_Ugxcxf0gJAiQtzBNwop4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgyOMx9a2BFMFgDbDA14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgySN-abIs7pbS2EZYx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugx-v2R0EPv609PcQVJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgwqmII7nBBgfCIPvVN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},{"id":"ytc_UgxrSG-EmQwsMRcae0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyJ3DNR32VZyCxgfaF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgzrRBKRlB6xsPfkWSx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},{"id":"ytc_UgyaCD8ZK0rXRoXjsYB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"fear"}]
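To inspect the codes assigned to a single comment, the raw response can be parsed and indexed by comment id. The sketch below is a minimal illustration, assuming (as in the response above) that the model returns a JSON array of records, each with an `id` plus one value per coding dimension; the function name `codes_for` and the truncated two-record sample are illustrative, not part of the pipeline itself.

```python
import json

# Abbreviated sample of a raw model response: a JSON array of per-comment
# records, each carrying the comment id and one value per coding dimension.
raw_response = (
    '[{"id":"ytc_Ugz76YjTejlRChgtTEt4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_UgyOMx9a2BFMFgDbDA14AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]'
)

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id (KeyError if absent)."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    rec = records[comment_id]
    return {dim: rec[dim] for dim in DIMENSIONS}

print(codes_for(raw_response, "ytc_UgyOMx9a2BFMFgDbDA14AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'consequentialist', 'policy': 'none', 'emotion': 'outrage'}
```

Indexing by `id` rather than array position keeps the lookup robust if the model returns records out of order.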