Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Except that it doesn't work. When I first asked a question to Bing Copilot, I started with: Good morning, would you kindly know if <and then my question>. Thank you for your help. And it would give me complete rubbish as an answer. IT only worked when I skipped the pleasentaries, which I really am not comfortable with. Decent courtesy should be mandatory, even talking to a machine. IT does not matter who I am talking to, as it says more about me than about them.
Source: youtube · AI Harm Incident · 2024-08-08T15:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxGyreSSoTeiw30Gll4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_Ugy0yuyZH8IkYEW6sc94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzJdM-T5X2_vNx-aWx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugxem_U7cuklljfOdex4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzFNPJVBpY-X6_oYUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxnGTasiOp7YEyLTFx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzWkyW41-dwD8KKuyd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgzttQLvhFpaxsjsSPF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwWJvvFsCDBk9VcaYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzhLkAs6_FsWUvDZgx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}]