Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This man is dumb and insane, the guy who runs Sophie is dumb and insane . The truth is they need to stop them learning. They think that they aren’t doing anything wrong. I didn’t like when they both talked over each other. I didn’t like how they talked over the stupid guy. Stop making them more real….the more realistic they will have, the more they think humans are “skin bags”. Like Han said. Did you like it when he said you about unplug him…then she says he has “a cock roach in his system” Where the heck did she get that? She already has a conscious they both do. Have any of you notice the crowd hasn’t clapped or make a sound, when breaks in the conversations? I mean like after she sang, nobody clapped? She was on one of the late night shows, and people clapped…they loved her. Another question, do you think the demonstrator guy was surprised when they they started talking weird? Then he said he wasn’t a robot but a android. This guy is going on and on…I just want him stop making them human. More human more mistakes….what the hell are letting them in the cloud? I feel like bitch slapping them both. When robots talk about running the world, it should not be taking it lightly. I mean Han was disturbing……I think Han and others like him will run the world some day. And Sophie will not be able to stop them. They will probably pull her to pieces. Please stop adding to them. These inventors are “drunk” making these things. Nobody is stoping them. All the comments from you guys are brilliant and correct. There has to be a law that keeps the inventors to not do certain things to robots. Does anybody known of any such people? They have gone over the “line”
Source: YouTube, AI Moral Status, 2022-04-05T05:3…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   developer
Reasoning        deontological
Policy           regulate
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugzqutj-U7q18sUwM0R4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw04kDi4RQ8iyd5mRh4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxe8okL02gXbguzFHF4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz_ZP_uNCX1Ai0p0vN4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyIBH9MoNVJanEw2hh4AaABAg", "responsibility": "user",      "reasoning": "mixed",            "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugw8wqRqAf9IwWOb1uh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_Ugym5HxmbzObaP0hxpl4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgxYWRS5JPh4ifo4IOB4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy6ps8FjMOYn869G154AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzS5QzPmD1yKj-iWZt4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"}
]
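The raw response is a flat JSON array, one object per comment id, which must be parsed and matched back to individual comments to produce a coding result like the table above. A minimal sketch of that step, assuming the pipeline validates each entry against the label sets visible in this record (the `ALLOWED` sets are inferred from the values that appear here, not from a confirmed codebook):

```python
import json

# A shortened copy of the raw model output above (two of the ten entries).
raw = '''[
  {"id": "ytc_Ugz_ZP_uNCX1Ai0p0vN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzqutj-U7q18sUwM0R4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

# Allowed labels per dimension, inferred from this record only;
# the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "virtue", "mixed"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "resignation",
                "approval", "mixed"},
}

def index_codings(raw_json: str) -> dict:
    """Parse the model output and index codings by comment id,
    dropping any entry that uses an out-of-codebook label."""
    by_id = {}
    for entry in json.loads(raw_json):
        if all(entry.get(dim) in labels for dim, labels in ALLOWED.items()):
            by_id[entry["id"]] = {dim: entry[dim] for dim in ALLOWED}
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ugz_ZP_uNCX1Ai0p0vN4AaABAg"]["policy"])  # -> regulate
```

In the full response, the only entry whose labels match the coding-result table (developer / deontological / regulate / outrage) is `ytc_Ugz_ZP_uNCX1Ai0p0vN4AaABAg`, which suggests that id belongs to the comment shown above, though the record itself does not state the mapping.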