Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
the scary thing is the more we use this the better they will get and thats great but once they get good enough to replace real people you will get less people becoming doctors and then if the ai ever fails we would be screwed
YouTube · AI Harm Incident · 2024-07-13T09:5…
Coding Result
Responsibility: distributed
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzUVR79bGJtQR310S94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz4qidbZLlgsWxqeah4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzWQWNUWy_zG27eb-54AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzL4fjUpekCYJyfuKl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxlwoQE_feXWGaACwh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwILJKHW69GzYFZbTB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyhDOwZ1QmyiXG5JgN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwxJoURbVjWWlJZ0nt4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxKHbzWsSRAf9CWfZN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzaKuec7tvGL4lWGFx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
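A raw response like the one above can be parsed and schema-checked before its codes are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the field names come from the JSON shown here, but the allowed category sets are inferred from the values visible on this page, so treat them as assumptions.

```python
import json

# Allowed codes per dimension. ASSUMPTION: these sets are inferred from the
# values seen in the response above, not from an authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only entries whose values are in-schema."""
    entries = json.loads(raw)
    return [
        e for e in entries
        if all(e.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Excerpt of the response above, plus one deliberately out-of-schema entry
# (hypothetical id "ytc_bad") to show the filtering.
raw = '''[
  {"id":"ytc_UgzaKuec7tvGL4lWGFx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_bad","responsibility":"robots","reasoning":"unclear","policy":"none","emotion":"fear"}
]'''

for entry in parse_codings(raw):
    print(entry["id"], entry["responsibility"])  # the out-of-schema entry is dropped
```

Dropping (rather than repairing) out-of-schema entries keeps the coded dataset clean; rejected ids can be logged and re-queued for a fresh coding pass.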