Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In the final discussion, Hasan brought up again the possibility of robots during his mundane housing chores. Neil used the Jetson tv show to explain to Hasan that it was the Jetson car that was the robot and the use of self driving cars today the car is the robot. The car doesn’t need a humanoid robot. Scientifically speaking he is correct. Why have a butler?. Tyson is a great scientist but his knowledge of human nature is not his forte. With rise of loneliness within our society and persons reaching out to a chat box for friendship, the humanoid robot ( not the car itself) should be in the drivers seat. The humanoid robot is not responsible for actually driving but serves as something we can communicate with or choose not to. The humanoid robot may discuss how the red Sox’s sucked the night before or listen and respond to a problem you are having at work. Who knows, maybe the conversation may lead to a creative idea.
YouTube · AI Moral Status · 2025-09-24T14:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwP7llphkOQwLmzMex4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzOqtsgTBKLnh6gO9l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywIOUOYLnviRivsYd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxBVuaXfKXF2clQiFJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuUtI7zyoyStgL8IR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzqH8w4OwVCWtC6liF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgznPwf24gjDEV0sJbN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz_Yqg3gID2gcjjPkF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwjBiXKPRmgf1Qg5MJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
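The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before display (function name and the allowed-value sets are mine, inferred only from the codes that appear on this page; the actual codebook may define more values):

```python
import json

# One entry copied verbatim from the raw batch response above.
raw = ('[{"id":"ytc_UgzuUtI7zyoyStgL8IR4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')

# Allowed values per dimension, inferred from the codes observed in this
# document (assumption: the full codebook may contain additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index codings by comment id,
    rejecting any value outside the observed codebook."""
    out = {}
    for entry in json.loads(raw_json):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {entry.get(dim)!r}")
        out[cid] = {dim: entry[dim] for dim in ALLOWED}
    return out

codings = index_codings(raw)
print(codings["ytc_UgzuUtI7zyoyStgL8IR4AaABAg"]["reasoning"])  # consequentialist
```

A lookup like `codings[comment_id]` would then back a per-comment "Coding Result" table such as the one shown above; failing fast on out-of-codebook values keeps malformed model output from being silently displayed.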