Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
OK yall. Everyone has seen the Terminator films. Yes this is scary. AI taking jobs. But we are the human race. We might not be smarter than that. But its still a program. So unless AI gets free to build enough robots to match our human race or even half. We can always just stop the AI from getting to thay point of no return where we have no jobs anymore. For example. Yes a lot of practical jobs will be replaced but those won't criple our economy. But just one example say heavy equipment operators. Those should easily be replaced by AI and wipe out blue collar. Well we have to allow AI to reprogram all the equipment as well as allow them to move ahead with daily workloads. It's is already happening at some ports offloading container ships right now. So we have allowed that. And if we allow them to program robots we are screwed. So we as the human race have some upper hand when it comes to critical thinking. And we dont allow the AI to get to that point. It would take a bit of time to reprogram a bulldozer to allow the AI to be in full control. We simply have to understand we cannot allow our technicians and operators and installers and whomever to be that stupid to agree to allow this production to move forward. So the AI will advance and take over alot. But we are still in control up until AI is free to physically overpower us. And that's the Terminator fear. So as long as its still a program we still have control.
Source: youtube · AI Governance · 2025-10-18T23:1…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
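For reference, the value sets that actually occur in this batch can be collected into a small schema. This is a sketch built only from the values observable in the raw response below; the full codebook may define categories that simply do not appear in this batch:

    # Dimension values observed in the raw response below. This is not
    # necessarily the complete codebook, only what this batch exhibits.
    OBSERVED_SCHEMA = {
        "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
        "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
        "policy": {"regulate", "ban", "liability", "none", "unclear"},
        "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
    }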
Raw LLM Response
[ {"id":"ytc_Ugz0g8ZwTcoytKINy3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugz4FlWuAi_U4w322NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxRRg-5udAbO5jp5ad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwkqBoy15CyxWcMis14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxSoDjstFugz9wNTRl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzWcEYtu5QZTfnGVrB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"mixed"}, {"id":"ytc_Ugypdg3s3E_09DsZm3R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyBcUTD1qetuIs2lBV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgyyU586Mc--m7ONwCN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzkbnrhSiYN5LTWgO94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"} ]