Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My guess is that if we ever get a generalized AI that is exposed to the outside world, we will all be turned into paperclips (or whatever) because the AI doesn't know when to stop making paperclips because whoever created the AI made a mistake. My hope is that a generalized AI that is exposed to the outside world has our best interests at heart (or rather, all living beings' best interests at heart), and figures out what consciousness is and how to guarantee similar alignment out of any future generalized AI.
YouTube — AI Moral Status — 2023-08-20T23:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyybC4nL5cFgsKiB0p4AaABAg", "responsibility": "none",       "reasoning": "mixed",            "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwmEIiUPypn7Tlo_xx4AaABAg", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgzSx3p89IuxTaihi4l4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwBAnq1ORdiWrXSRsx4AaABAg", "responsibility": "distributed","reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgwruCKHmOzcwXDNb154AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugzs2CXLCiIwB9PiUWJ4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgwaxQOveC8RMgLe0PV4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgywwIuqN8sINnT5wsh4AaABAg", "responsibility": "distributed","reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_Ugyxn-fVukocWtR-QCl4AaABAg", "responsibility": "none",       "reasoning": "mixed",            "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwGZxylCbatCfhm3O54AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
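A minimal sketch of how a raw response like the one above can be mapped back to an individual comment's coding — assuming the raw response is a valid JSON array and comment IDs are unique (the `coding_for` helper and the two-entry excerpt are illustrative, not part of the pipeline):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_UgwruCKHmOzcwXDNb154AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwGZxylCbatCfhm3O54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
'''

def coding_for(comment_id: str, raw: str) -> dict:
    """Return the coding dict for one comment id, or raise KeyError."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)

# The dimension table for a comment is just its entry in the array.
coding = coding_for("ytc_UgwGZxylCbatCfhm3O54AaABAg", raw_response)
print(coding["policy"])   # liability
print(coding["emotion"])  # fear
```

Note that the "Coding Result" table shown for this comment corresponds to the last entry in the raw array (matched by comment ID), so a lookup like this reproduces it exactly.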