Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LLMs have no minds capable of aligning with or against our goals. If my car's accelerator gets stuck while I'm driving and the vehicle crashes into a wall, that's not an "alignment" problem.
Source: YouTube · AI Jobs · 2025-11-18T20:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxSMCkv5OtbnA0av-t4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxXxYd_l8QR0ZY8zvB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzGZ4UDoYwpp_LSgQV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxYd6UlBMI--mbcysV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwCzKbAhJgHJTOzD1p4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyoIZrEpHL2dzZVIFp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxoqAbla5Ic2tUMXSx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyUYHrmVydazWzWh3V4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw-0f_l9qEAknzrrUt4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgynF-YIPQuCprQgJDB4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
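The raw response is a JSON array with one coding object per comment, so the coding for a single comment can be recovered by indexing on `id`. A minimal sketch in Python (the field names come from the payload above; the surrounding pipeline code is not shown in this view, so everything besides the JSON itself is an assumption):

```python
import json

# One entry copied verbatim from the raw LLM response above; a real run
# would load the full array returned by the model.
raw = '''[
  {"id": "ytc_UgyUYHrmVydazWzWh3V4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"}
]'''

codings = json.loads(raw)

# Index the codings by comment id so any single comment can be looked up.
by_id = {c["id"]: c for c in codings}

entry = by_id["ytc_UgyUYHrmVydazWzWh3V4AaABAg"]
print(entry["reasoning"], entry["emotion"])  # deontological mixed
```

This per-id lookup is also how the "Coding Result" table above can be cross-checked against the raw output: the dimensions shown for the comment should match the object whose `id` corresponds to it.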