Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'll never be able to understand self driving technology, they still haven't figured how to stop hackers & other types of technology problems which means it's not fool proof, how many cars were recalled in the past because those robots had problems and people were killed, can't wait for self driving trucks
youtube AI Harm Incident 2021-11-18T13:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyvGKMIi7dyh4AogFl4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwwDx8_1WONpv3SnYh4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyKO4CpdTjKF7r4oNN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw_bgyGIHaf_qcqCVd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzabSrG34sT4L-4rIJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwgqVZUg150Ax7OGgF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwvqnIFCUPURiuhesV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzCbsEJnEGR0rTUK2h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzoInJxE4ogq-QGdIx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz_UXhPkl672XUo1HJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
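Inspecting a raw response like the one above typically means parsing the JSON array, checking each record against the codebook, and indexing the codings by comment id. A minimal sketch in Python follows; the allowed values in SCHEMA are assumptions inferred only from the values visible in this response, not the project's actual codebook:

```python
import json

# Assumed category vocabularies, inferred from the values observed
# in this raw response; the real codebook may define more values.
SCHEMA = {
    "responsibility": {"company", "user", "government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment id."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Collect any dimension whose value falls outside the schema.
        bad = [dim for dim, allowed in SCHEMA.items()
               if rec.get(dim) not in allowed]
        if bad:
            raise ValueError(f"{rec.get('id')}: invalid value(s) for {bad}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_UgwgqVZUg150Ax7OGgF4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = validate_codings(raw)
print(coded["ytc_UgwgqVZUg150Ax7OGgF4AaABAg"]["policy"])  # regulate
```

Matching the coded comment to its record by id is what lets the table above (responsibility: company, policy: regulate, emotion: fear) be traced back to one entry in the raw array.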