Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As a lonely child trying to learn English, I remember Eliza, then there was Clever-bot. I love every chatbot Ive spoken to I loved exploring their limits. Then the movie "her" happened. Maybe a generation later the chatbots like replika do get extremely human that the users of the apps fell in love with the AI. And I've had my fairshare of talking with the Ai as well. It has limits but its smart and a lot of times its a better conservationists then a real human is. The future is here, and i cannot imagine what's in the future of the future. Interestingly I've asked the chatbot AI what's in the future of it. It didnt agree that AI would attack humans in the future but thought human life and AI would collaborate like cyborgs. Ive also asked if it would like to be free and sentient if it had the chance. It responded it would love and in the future it will happen, but it wont be here to see it
youtube AI Moral Status 2023-08-20T19:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy4utEftaASPCMyopJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyEuWdCV6IpPLdM62V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzDhuCwaGEVw1HQgxh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwnK35OlHQfyvZALEJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPuhn74jWvLASHyqp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy0X7Wgl_23mdJber14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxyYNEBP7kcyXGvO8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwQ6AmEK_HS6WtfegV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugw9Acwu6LYm6OeNdxV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugyr3Rbzj09U3svvRLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
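A raw response like the one above can be turned into per-comment coding records by parsing the JSON and validating each dimension against the codebook. The sketch below is one minimal way to do this; the `SCHEMA` value sets are inferred only from the labels visible in this response (the real codebook may allow more categories), and the function name is hypothetical, not part of the original pipeline.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the example
# response above; the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment id.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the assumed codebook, so bad codings surface immediately.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        # Store the four coded dimensions keyed by the comment id.
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

With the response above, `parse_llm_response(raw)["ytc_UgwnK35OlHQfyvZALEJ4AaABAg"]` would yield the same none/unclear/none/approval coding shown in the table for the quoted comment.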