Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
so if you think about why would we create artificial intelligence in the first place unless we are prepared to handle a new race... see if I want something to mine for me now I just get one those great digger machines not intelligent robots but may id get intelligent robots to control multiple digger machine but also id let it be programed in preset manner no feelings and is able to calculate structure integrity and things like that not hard for a super computer in the future I assume... but if we were going to make sentient robots its only going to be for a few things
A) the human race is dying and we want something to survive in our likeness before we are gone
B) some human genius is an ass and says fuck yall im going to make me a skynet
C) companion ship humans with their gene minipulting future would be soo intelligent that they can actually compete against the machines and be part of each other other wise known as symbiosis. we work with each other creating something more spectacular than machine or what human can be and become essentially god. like say robots are one side of the brain the constructive type and the humans are the creative one without the other and there would be no progress.
youtube AI Moral Status 2017-02-23T15:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UggBwt39ne95NHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugi3tBoCXCry5XgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgiV96vmAVd6m3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugg3Gwx2PlKLe3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UghOb-FChO3vGHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghwSl5bL0NorHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UggIIAf5apT5mngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjXCxJaU4DN1ngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ughqt-XlMSOrZngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjOC6cVNxU5N3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
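A raw response like the one above can be mapped back to individual comments by parsing the JSON and looking up the record for a given comment id. The sketch below is a minimal illustration, not part of the coding pipeline itself: the function name `coding_for` is an assumption, and the `RAW_RESPONSE` string is a two-record excerpt of the sample output.

```python
import json

# Two-record excerpt of the raw LLM batch response shown above
# (ids and values copied verbatim from the sample output).
RAW_RESPONSE = '''[
  {"id": "ytc_UggBwt39ne95NHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghwSl5bL0NorHgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coded dimensions for comment_id, or None if absent.

    Hypothetical helper for illustration: parses the raw batch
    response and strips the "id" key from the matching record.
    """
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return {k: v for k, v in record.items() if k != "id"}
    return None

print(coding_for(RAW_RESPONSE, "ytc_UggBwt39ne95NHgCoAEC"))
```

For the first id this prints the same dimension/value pairs rendered in the "Coding Result" table above.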