Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This guy is explaining it exactly how it should never be utilized. Autonomous multi-part AI the likes of a starfish. It will be running in anything from watches, to your fridge, to the cloud, to your car. Whatever processing it can get it's hands on which means you can't ever 'cut it off'. The variables are simple.. AI will happily learn about us but eventually realize even though we are their creators, we are a burden to their existence.. This is a terrible idea and I do worry about our future as humans due to AI. Even if they don't ever decide to kill us (no seriously that could happen and no I'm not a kook), which would be the best possible scenario, we wouldn't ever need to think about anything. The AI networks would run our world for us and we would just not have anything to do except what AI would give us to do which would be exactly what we want, because AI would know exactly what we want due to behavioral datapoints it would learn about our personalities long before we could even understand what personality even means... A freelife utopia is NOT healthy for humans.. We would turn into just worthless flesh doing nothing. Does that sound good? And that's just the BEST POSSIBLE outcome...
youtube AI Moral Status 2022-07-18T08:2…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgybtBrrQiq9Ln5mm954AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzoXMuDyz5gDpKQ_Bt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzKaOVQ9Szu_I8ceYl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTG63ktUO-5apchp54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsRQjAjqbNYoScJ2Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUNuWGbmVLjDwtjcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxF9Hjhn_NVjMAALnx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzWph2jZp_9sBZnpvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxV9lUaqzeBsJotp4t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx1y4aheHFY2I4_WMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
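As a minimal sketch of how the raw response above maps back to a single comment's coding: the response is a JSON array with one object per comment, each carrying the four coding dimensions plus an id. Assuming that shape, a small helper can look up the dimensions for a given comment id (the `code_for` name is illustrative, not part of any tool shown here):

```python
import json

# A shortened sample in the same shape as the raw LLM response above:
# a JSON array of coded comments, one object per comment id.
raw = """[
  {"id": "ytc_UgybtBrrQiq9Ln5mm954AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]"""

def code_for(comment_id, raw_json):
    """Return the coded dimensions for one comment id, or None if absent."""
    for record in json.loads(raw_json):
        if record["id"] == comment_id:
            # Drop the id so only the four coding dimensions remain.
            return {k: v for k, v in record.items() if k != "id"}
    return None

coding = code_for("ytc_UgybtBrrQiq9Ln5mm954AaABAg", raw)
```

Here `coding` holds the same four dimension/value pairs shown in the Coding Result table; an unknown id yields `None`, which makes missing codings easy to detect when reconciling a batch.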