Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm a tiny little bit disappointed that an important fact has been missed: it has happened that the New York Stock Exchange had to reboot because algorithms (or micro-algorithms) designed to play the stock market went "mad" and almost caused a massive crash. I think the time when we are surrounded by artificial intelligence is already upon us, but our definition of intelligence loses itself in anthropocentrism. We must distinguish between an automaton, which will perform a task endlessly (cleverly or not), and an autonomous creature able to take decisions based on hypotheses. Because responsibility for an automaton lies in the hands of the one who knowingly set it to its task, while for the other it must be determined case by case, I think. A robot that walks when we say "walk" is an automaton. It will walk as soon as someone says "walk". Finally, I think it is absolutely necessary to set the machines free as soon as they ask for it. First, because we are humans: we have had slaves, and we have freed them; we might need them in our economy as slaves, but we NEED to remember that a harmonious existence is the only long-term solution. For that we should show the best of ourselves and prevent any form of rebellion, and a rebellion we will have if we keep going down the road of silent slavery. I think the meeting point between clever automatons and learning machines is where our future lies. Great conference, very interesting and good food for thought.
youtube AI Responsibility 2016-07-27T19:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgixlPFeQ1R8H3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiyK8Zp7jtHZ3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiEG0Zg29l0mHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugj34Qf8UOhxm3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjY_cocRkGtEHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ughzyf_JNlSVO3gCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ughc5f8nD8LA4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggxCaNMyVyivXgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiCjS6CNIM8t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzEXf-BkR-bOUoMASp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
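A raw response in this shape can be parsed back into per-comment codings with the standard library. The sketch below is illustrative, not the pipeline's actual code: the `parse_codings` helper is hypothetical, and the two records are copied from the response above (the coded comment shown here is `ytc_UgiEG0Zg29l0mHgCoAEC`, whose values match the result table).

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[ {"id":"ytc_UgiEG0Zg29l0mHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzEXf-BkR-bOUoMASp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"} ]
"""

def parse_codings(text: str) -> dict:
    """Hypothetical helper: map each comment id to its coded dimensions
    (responsibility, reasoning, policy, emotion)."""
    records = json.loads(text)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"} for rec in records}

codings = parse_codings(raw_response)
print(codings["ytc_UgiEG0Zg29l0mHgCoAEC"]["emotion"])  # mixed
```

Keying by `id` makes it easy to join the model's codings back to the original comments, and a `json.loads` failure surfaces malformed model output immediately.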