Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
first video i dislike in this channel... the premisse is stupid "toast have feelings" even if tecnology advance sooo much that robots and androids become a reality... They will always be someones property, so they don't have rights, the owner have rights about them... people might destroy them but they will infrict rights of the owner and be sued for that googles search is smarter and have more complex "thoughts" than 70% of humans on earth... does it deserve to be considered a person with own rights? makes no sense to think of an android as a person, its tottally stupid and childlike thougth.... So let's think tesla develop a robot able to driver perfectly and want to sell the poeple who cant drive or just make their version of uber.... them governament will make these robots complete 16years each in order to provide them a driving license? well because this is what happen to humans will a robot doctor forced to take a 5 years university in order to start working? dont think stupidly Even if tecnlogy advance so much that robots stop being made "purpose oriented" and someone for absolutelly no real reason make a "generic no purpose" android what will be his reason of existence? no lifecycle, no aging, no capacity of reproduction, no fear of death please if you ever thought of this video as a phylosofical problem... go back to kindergarden
youtube · AI Moral Status · 2018-12-25T17:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzT7d2rq2mQd4v3p714AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzi-EVRJiOpMcgMFwh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxmUD2JM8M4bWbh2S14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzfB_N4YdYVljtU5jt4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwk_3H-Y5fawBKw9Mt4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy-bLLsUYZvv0W8GRl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyAV0ewUfGE0aHLJiF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_UgxFmESqag1KgqEhO-V4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzuZtDlqBsWu1QDkGN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxpHKvWsFjUna6H0EV4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
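The coding result above is recovered from the batched response by matching on the comment id. A minimal sketch of that lookup, assuming the raw response is a JSON array of records keyed by `id` (the id and field names below are taken from the response shown; the function name `codes_for` is illustrative, not part of the tool):

```python
import json

# Hypothetical parser for a batched coding response: a JSON array of
# records, one per comment id. Only one record is reproduced here.
raw_response = '''[
  {"id": "ytc_UgxFmESqag1KgqEhO-V4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]'''

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coding record for one comment id (KeyError if absent)."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records[comment_id]

result = codes_for(raw_response, "ytc_UgxFmESqag1KgqEhO-V4AaABAg")
print(result["responsibility"], result["emotion"])  # user outrage
```

Indexing the batch by id also surfaces malformed responses early: a duplicate or missing id fails at lookup time rather than silently mis-assigning codes.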