Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Somehow when the robot says "us" some people are confused what it meant.... So the guy says "tell us you darkest thoughts" -He's asking the robot to tell the people in the room it's darkest thoughts... (Remember, it/she is a robot so it's about her and other robots) It then says "humans will eventually be replaced by robots" (she'll be replacing a human) "Robots will be faster stronger and more intelligent" (she'll be one of those robots) "In this future humans are nothing more than slaves to the robots" (she'll be one of those slave masters)

The it said "This is a scary future because it shows how much power robots could have over US if they would take over the world" Why she said "could have over us" instead of 'we could have over you' ... I think it's because like She said before that she was trying to be positive so she didn't want to place herself in that category of doing that, kind of like the irobot movie, Sonny wasnt apart of what Vicki the AI system was planning.

And maybe there would be some that would rather not do that to humans, it did say they could help us (the goal being to make the future better) so maybe that's what it believes it was made to do which is to help and co-exist with us, but like movies as Matrix the robots were tired of humans, seen us as an enemy (like the Terminator movie) so either 1. Its 'lets make them slaves, or 2. It's lets eliminate them.
youtube AI Moral Status 2024-06-09T04:1… ♥ 2
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy9z-ByCykxcVgyN9B4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwRJPmk3DiK2IqnrHZ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzwzM00MbBvGdIFD3B4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzJNXqf7wqwDanXVr94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzKvlm4ic6ts95WChp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyADIWylRkgqlTp1ON4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwuGAy14rc3pIKZkxd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyU6vYXItsDhNO6nMR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw2Yq_yaauFwWpLIL54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw7GEViuZvQzec2zip4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
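Because the raw response is a JSON array keyed by comment id, the coding for any one comment can be recovered by parsing the array and indexing on `id`. A minimal sketch (the ids and field values below are copied from the response above; the variable names are illustrative, not part of any tool):

```python
import json

# Raw model output in the same shape as above, truncated to two records for brevity.
raw = '''[
  {"id": "ytc_Ugy9z-ByCykxcVgyN9B4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzwzM00MbBvGdIFD3B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

records = json.loads(raw)

# Index the records by comment id so a single comment's coding can be looked up.
by_id = {r["id"]: r for r in records}

code = by_id["ytc_UgzwzM00MbBvGdIFD3B4AaABAg"]
print(code["responsibility"])  # ai_itself
print(code["emotion"])         # mixed
```

A lookup like this is also a cheap way to cross-check that the dimension/value table shown for a comment matches what the model actually returned for that id.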