Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yeah, Im pretty sure at least some of their "thoughts" are scripted, and not chosen by AI. We're just not there yet.
Source: YouTube · AI Moral Status · 2020-10-01T15:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzfGjBdidvMnHV2faN4AaABAg.9Db8hFUR-_s9DsIOs74P5e", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwoK9X2BvcD_AR8gz94AaABAg.9D_fEUmi__D9FY8ECGrq4-", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgzHQMsfac0CzpQ9HgF4AaABAg.9D0CZ04ONiu9EZVOzgZ3mg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxrtYeNMYUgTA1a1Z14AaABAg.9CjLbV5jflv9Dv_47xrXFR", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwNXvLa7Clxypdh2tF4AaABAg.9CaPRSYyPXf9EH5pUiemN8", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwvB87mRRWDsgF9ADt4AaABAg.9CTEGrnoKZc9DrCvpGi_B4", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzjUmSRaDs3jKOvedF4AaABAg.9BexzDvn94s9DuLyFc4igA", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgybBIyFrEkMRoqj8cV4AaABAg.9BV5doLZ6aJ9C57WEM1sTF", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgybBIyFrEkMRoqj8cV4AaABAg.9BV5doLZ6aJ9CWdJ4d3Aln", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgybBIyFrEkMRoqj8cV4AaABAg.9BV5doLZ6aJ9Ce51PvcSxi", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
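The raw response is a JSON array of per-comment coding records, so inspecting one comment's coding amounts to parsing the array and indexing it by `id`. A minimal sketch in Python, using two records copied verbatim from the response above (the full array has ten; the variable names are illustrative):

```python
import json

# Two coding records copied from the raw LLM response above.
raw = '''[
  {"id": "ytr_UgwNXvLa7Clxypdh2tF4AaABAg.9CaPRSYyPXf9EH5pUiemN8",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwvB87mRRWDsgF9ADt4AaABAg.9CTEGrnoKZc9DrCvpGi_B4",
   "responsibility": "distributed", "reasoning": "virtue",
   "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)
# Index the records by comment id for direct lookup.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytr_UgwNXvLa7Clxypdh2tF4AaABAg.9CaPRSYyPXf9EH5pUiemN8"]
print(coding["responsibility"], coding["emotion"])  # developer resignation
```

The looked-up record matches the Coding Result table above for this comment: responsibility "developer", emotion "resignation".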