Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When making robots, I expect the key will be to give them motivations. Robots would have to be given motivations in order to be autonomous, because a logical being wouldn't really do anything unless it had a goal. We would have to program a goal into a robot, and I expect that most robots would be given goals seen by us humans as "virtuous." I expect these goals could be made to be open to change, but in humans that allows for extremists and such.
youtube 2013-07-02T03:3…
Coding Result
Dimension: Value
Responsibility: developer
Reasoning: deontological
Policy: none
Emotion: unclear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxlymtF7AZIsF8hs8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy5ELZrEi8odmijOFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQK9QtiCZymnsajzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxCf61kfFLddphnoc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgziRsg2tmEBgr54Rhx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgytIx0dEEb_9BWBLxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_8PwpPqJMkTVv31h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLetMJJYZK8PPsqkt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpQdgG9z2dYci7a3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"}
]
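A response like the one above can be checked and indexed with a small parser before the codes are stored. The sketch below is illustrative, not part of the actual coding pipeline: the `parse_codes` helper and the `EXPECTED_KEYS` set are assumptions, and the sample data is truncated to two of the records shown above.

```python
import json

# Two records excerpted from the raw LLM response above (illustrative sample).
raw = '''[
  {"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzLetMJJYZK8PPsqkt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions plus the comment id; a record missing any of
# these is rejected rather than silently stored with gaps.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse the model's JSON array and index the records by comment id."""
    records = json.loads(text)
    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codes = parse_codes(raw)
print(codes["ytc_UgzN2SDs4-ukmmWP7K14AaABAg"]["responsibility"])  # developer
```

Indexing by id makes it straightforward to pull the coded dimensions for any single comment, as the result table for this comment does.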