Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The path to removing hallucinations involves more checking of truth. Regardless of the reality of "Ai knowing Truth" let alone "caring"... more checking costs more more electrical power and chips. and where do they stop checking? That's a way to predict what's not worth doing for the companies.
youtube AI Moral Status 2025-11-14T19:3… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          industry_self
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwQOiyO3OCzih3F6Nx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwZ8r2PkMx5xS5VR0Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwRRDIxC8EnRV-xB1Z4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx43e7kWLFGYaa3hQF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwLoZ9ZwlSd0rYQOzl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzh1aZjpluQuI8zrZB4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzhjdP__n9vjBYJVOB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzuFNU9Q7nla7Usgv14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzGIF4ab5tdnSxd3Th4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxoIpBwJY9WVY8T0Rt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
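A raw batch response like the one above can be parsed and checked before the codings are stored. The sketch below is a minimal validator, assuming the allowed codes per dimension are those that appear in the responses shown here; the actual codebook may define additional categories, and `validate_batch` is a hypothetical helper, not part of the pipeline.

```python
import json

# Allowed codes per dimension (inferred from the responses shown above;
# the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid codings by comment id.

    Raises ValueError if a row lacks an id or uses a code outside SCHEMA,
    so malformed model output is caught before it reaches storage.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError("row missing comment id")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {row.get(dim)!r} for {dim}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

For the fourth row above, `validate_batch` would yield an entry keyed by `ytc_Ugx43e7kWLFGYaa3hQF4AaABAg` whose values match the coding-result table (responsibility `company`, policy `industry_self`, and so on).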