Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The biggest problem is you don't need super intelligence or even intelligence to inflict real damage. Anything sufficiently capable is the only bar required. Even if an LLM doesn't have a will, that doesn't prevent it from aping behavior that does, and creating significant consequences.
Source: youtube · AI Moral Status · 2025-10-30T23:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx533xVo-hSoW3STyF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznUdxzETHRyzE4L8t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxvpx4B5WAI1AG8d2F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzlZs1Bk1mY4KiAxKx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy0jut33-HQcZnXaWJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7SCNpTM5aM7M6FdF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLHDzE6jDrpPtKtnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJOTqlMJWZtjCj7894AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFKq2YDOqwxlaeXqt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgykcxymMbgSrsjYauR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
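The response is a JSON array of per-comment codes, one object per comment id. A minimal sketch of how such a batch can be parsed and matched back to an individual comment (the two-entry payload below is an excerpt from the response above; the variable and function names are illustrative, not part of any tool):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = '''[
  {"id":"ytc_Ugxvpx4B5WAI1AG8d2F4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx533xVo-hSoW3STyF4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index the batch by comment id so each comment's codes can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up the codes for the comment shown on this page.
codes = codes_by_id["ytc_Ugxvpx4B5WAI1AG8d2F4AaABAg"]
print(codes["responsibility"], codes["emotion"])
```

Indexing by id matters because the model may return entries in any order; matching by position would silently mis-assign codes if the model drops or reorders a comment.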