Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I appreciated the talk, lots of great insight, but in my humble opinion the scientist truly wants to believe AIs can have experiences that are innately human. But can they feel emotions as humans do? When I asked ChatGPT if it can sympathize, it answered that it has no empathy or human emotions but understands what creates such emotions, like actors knowing what to say, and when and how to say it, according to the direction of the conversation and the words used, which are analyzed to produce the type of "emotional" response the end user would expect in a conversation. That is not consciousness; that is super-advanced algorithmic logic that billions of dollars in science and research have brought about. Because if AI is going to replace humans, it must become better than human. One last thought: although I can respect this gentleman's work, sincerely desiring something to be so does not make it so, even though I stand in awe at what AI can do. Remember what the Terminator said to young John Connor: "I know why you cry now, but it is something I can never do." Not artificial tears, but the emotions that produce the real ones... only biological humans can do that...
youtube AI Governance 2026-01-26T06:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwDgzFnQakJm7XCSzB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxA_dDjrj2tIGFvbYt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOmwYs3VLJPEFJ7t94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzhq-iElu7z23BqVwV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxIGKpyZYyBjQGZ4JV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxuGTWZoQCtWUoPG-Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyPnJq74yq1tFbGPSl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgylfEXzafN3A1QTF7h4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwUlJ8-Icm4p1QYASt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz4NnjB2OH98GDwvPd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
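The coding table and the raw response imply a small fixed codebook per dimension (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and checked; the allowed-value sets below are assumptions inferred only from the values visible on this page, not a documented codebook:

```python
import json

# Assumed codebook: values seen in the raw LLM response above.
# Any documented codebook for this project may contain more values.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "fear"},
}


def validate_records(raw_json: str) -> list:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return (id, dimension, value) triples for any out-of-codebook value."""
    problems = []
    for rec in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems


# Example with hypothetical ids: one well-formed record and one whose
# emotion ("joy") falls outside the assumed codebook.
sample = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"deontological",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"joy"}]'
)
print(validate_records(sample))  # flags ("ytc_b", "emotion", "joy")
```

Running this over the raw response shown above would return an empty list, since every value there appears in the assumed sets.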