Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is incorrect. I had it do the same thing and then clarify to me the basis behind its bias as apple simple means it cannot confirm the question as it’s not in its programming to be speculative so basically if you can’t find it on google ai won’t have the answer more or less so use your imagination on what we don’t know that is also not on google or being put across to gpt in the first place. But it is still fun 😂
youtube AI Moral Status 2025-09-09T22:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyp15nzYw59I4zrQDR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzu0VyPCV6oN4uGsQ14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw7F08n0rJ3epEemHh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxp-DUN8Noxyg7hZbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyPS2ivR4eSgztM7M54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlcJj0m2OBpxeuEHh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzP7KpvanfDHPPRQBB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwaht32PFvzOzjtxWR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYkAvj3zPDlyZhq_d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyCf2cdOo7hPDNeBEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
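A raw response like the one above can be inspected programmatically: parse the JSON array, index the codes by comment id, and tally each dimension. The sketch below is a minimal example using only the standard library; the two records are copied verbatim from the response above (a full run would load all ten), and the variable names are illustrative, not part of any pipeline.

```python
import json
from collections import Counter

# Two records copied from the raw LLM response above (subset for brevity).
raw = (
    '[{"id":"ytc_Ugyp15nzYw59I4zrQDR4AaABAg","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_UgyCf2cdOo7hPDNeBEN4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

codes = json.loads(raw)

# Index by comment id so a single coded comment can be looked up directly.
by_id = {code["id"]: code for code in codes}

# Tally one dimension across all records (here: emotion).
emotions = Counter(code["emotion"] for code in codes)

print(by_id["ytc_Ugyp15nzYw59I4zrQDR4AaABAg"]["responsibility"])  # government
print(emotions["fear"])  # 2
```

The same `Counter` pattern works for any of the four dimensions (responsibility, reasoning, policy, emotion), which makes it easy to cross-check the per-comment table above against the aggregate distribution.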