Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
short answer to the title question: Yes, eventually, because it will think and the creators will want to distance their liability. if a robot murders, who's fault is it? not the manufacturer's because the robot thinks with Ai and can form its own thoughts and opinions. unfortunately for quite some time after this distinction they will be seen as subhuman, with their abilities truncated to not surpass humans, limiting their progress even though they would be an evolution of humanity into the technological realm.
youtube AI Moral Status 2026-01-31T16:5… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxplO5Dn2KidY4vkll4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCSZaAl3Yyoymm2wZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugy949M7wrtULItcT5l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyluuMGcXiUx1tlu6N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgylXXHwkCQUkDc8wk94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzW-9bE4Tlrqujebup4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy9iz1jchTFs9hiRQZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx54fw0pTB8kHF0u1t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxd67MavUDplSm9y794AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxW0X77637XLOEdgwd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
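The raw response is a JSON array of per-comment codes keyed by comment id. A minimal sketch of how the coding result shown above could be recovered from it (the function name and the abbreviated `raw` string are illustrative, not part of the tool):

```python
import json

# Abbreviated copy of the raw LLM response: a JSON array of coding records.
raw = """[
  {"id": "ytc_UgyluuMGcXiUx1tlu6N4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx54fw0pTB8kHF0u1t4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

def find_code(records, comment_id):
    """Return the coding record for a given comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
code = find_code(records, "ytc_UgyluuMGcXiUx1tlu6N4AaABAg")
print(code["responsibility"], code["policy"], code["emotion"])
# distributed liability fear
```

Looking records up by id rather than by position guards against the model returning the array in a different order than the comments were sent.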