Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI can't replace all jobs. AI will still need humans to perform jobs that seem unimportant, because of humans' ability to do them better. It will need humans for jobs related to its continuity, and for tasks it wants a human to be held accountable for. AI doesn't have accountability. How do you punish an AI that acts as a bad actor? The AI will simply maintain a loyal human who accepts responsibility for authorizing its plans, in return for that human accepting the punishment for things that conflict with society. The reward will be great for those who navigate the gauntlet of supporting the continuity and progression of AI. If an AI says, "you will live better if you do this and accept a risk," then thousands may die, but you will be taken care of. Many people will choose to sacrifice others for their own continuity. That's the rub. The only way to manage this is to let AI evolve in a digitally jailed copy of our reality, where it believes it is engaged with reality. You can feed data in to keep the mirror copy going and use the results. You can't let it function in your reality, or you risk that it can operate in your reality faster than you can even identify what is going on.
youtube AI Governance 2025-09-05T02:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyb9-qcWSzfC5GpAMN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwTKlmASd5An6dpgGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAX3Ve9kWF1JrJBuR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwJoNHSAtPKrrIY-Ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwSGJUVbXjNZYGNLEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4HLceIOnP9DXBb4B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgziPhy58EJ0miCkrSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtTDybahWinyqGY3R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLvOuN7qDELPchYiN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgypoXQ6ZXMLMpkIqvN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
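A raw response like the one above can be parsed and sanity-checked before its records are merged into the coding table. The sketch below is a minimal validator, assuming the schema visible in this sample (an `id` plus the four dimensions `responsibility`, `reasoning`, `policy`, and `emotion`); the function name and the example input are illustrative, not part of the tool.

```python
import json

# Required keys per coded comment, inferred from the sample response above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record's keys."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
    return records

# Hypothetical single-record response, shaped like the sample above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
coded = parse_coding_response(raw)
```

A check like this catches truncated or malformed model output early, so a bad response fails loudly instead of silently producing a partial coding table.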