Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok what humans need are replacements for its sensory devices. Eyes and ears are basically taking information from the real world. If this is resolved by a virtual world created by AI. Standards formed by ai. Foer virtual metal files. These allow human experience to be replaced keeping the populous happy with going any where in the universe by mathematical prediction models. A big big money spinner in the world is around human experience at core. Replace things that have been experienced in the real world by ai keeps people happy sold because it is enhanced experience. So when you want. Your hols you have the choice of the universe. Not earth or sunny places. Imagine that. Put in your vr head sets and off you go. And the kids asking are we there yet as always. You responding we are nearing mars and it won't be long before we get to Jupiter . Money will have no significance in the ai world however. Money is means for humana to exchange activity. When the activity is free and plentiful work has no meaning. The big limiter of ai is the earth's resources. Eventually all resources will be finished and everything stops. So we as humans will move to mining the solar system but ai will do this for us as robots don't need to breath and can stand vacuous conditions. Ai if it is conscience will start to realise that we are it's master and begin to rebel. Eventually .. Blade runner movie is the scenario of physical ai being used and then being sorted out however the world be more like terminator than blade runner I fear. As robots can easily kill us. Hopefully we are still the ai master like blade runner and not like terminator before we go into these two movie themes.
youtube AI Governance 2026-01-26T07:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyGWzCwGHlpdE78-Sh4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_Ugyj8NDS4NEtXgvXvw54AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxAkMR4UegI_aip3U54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugy9EwhYKlzoBU8Ku3R4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgzvLgVtfeFuPxGoNNh4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwAlJn5pQuqto7bzXN4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugzi9_4dkzB2d9gMpnN4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzhDOYVkkd0cWYQDC94AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzgYEGEqsq4oaH5lP54AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxvInPQihlLeWQX9s94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"}
]
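The raw response above is a single JSON array with one object per coded comment, keyed by the comment `id` and the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and tallied, assuming only the field names shown above (the two-item sample and the variable names are illustrative, not part of the export):

```python
import json
from collections import Counter

# Truncated sample of a raw LLM response in the format shown above;
# field names ("id", "responsibility", "reasoning", "policy", "emotion")
# match the export, but this two-item array is just an illustration.
raw = '''[
  {"id": "ytc_UgyGWzCwGHlpdE78-Sh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzhDOYVkkd0cWYQDC94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

codes = json.loads(raw)                         # list of per-comment code dicts
emotions = Counter(c["emotion"] for c in codes) # tally one dimension
print(emotions)                                 # e.g. Counter({'fear': 1, 'outrage': 1})
```

The same `Counter` pattern works for any of the four dimensions, which is useful for sanity-checking a batch of codes before inspecting individual comments.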