Raw LLM Responses

Use this view to inspect the exact model output behind any coded comment.

Comment
WARNING: What I'm about to use as an example should NOT be taken seriously. Now imagine a robot in 2040 engages in a mass shooting and is the main responsibility of the incident....... This would mean the deadliest shooting in modern history. Nobody would be able to stop the robot. Now think about that for a minute.
Source: youtube · AI Harm Incident · 2024-11-13T07:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugw4aUI220u4eCtgZJd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxVq1BXtAun_KPNoX14AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgwR36JkDPb9nbplioZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqMlJ0lFXN36GdiyF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxN7ZJ9r_bCzer3c_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVvBuPD_QI8nkOoBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxjnvku_N2A_y_9x5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLmtXRh2p5N7ls-Ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgytDge9EP_iPAEGKFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxhjtKKq89c4tB1Sh94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
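A minimal sketch of how a response like the one above could be parsed and validated before storing the coding result. The allowed label sets below are inferred only from the values visible in this response; the actual codebook may define more labels, and the function name `parse_coding_response` is hypothetical, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the raw response above
# (assumption: the real codebook may contain additional labels).
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and reject any record whose
    dimension values fall outside the known label sets."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugw4aUI220u4eCtgZJd4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # → ban
```

Validating against a fixed label set catches the common failure mode where the model invents an off-schema label, so bad codes fail loudly instead of silently entering the dataset.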