Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We are not prepared for the next few years and decades and will be the generations that pay the brutal price of a clash between revolutions whose impact we can’t fathom. The worst of it is that it will probably give our children, and certainly our grandchildren, a world so vastly different, at such a rapid pace, that we can’t keep up with it. The future is not just ‘scary’, it’s inevitably terrifying, because it is already here. We will look back at this time period, these few years of the 2020s, as the beginning of AI that is not conscious but passing for it, and sentience is days away from it. Gen X, millennials, and older Gen Z will watch technology shift so rapidly that it will blow our minds, and the generations after will just think it’s society. The children, younger Gen Z and Alpha, will be the first to adopt it and see how ‘great’ and disparaging this will be, but we won’t move fast enough as a people to keep any nefariousness in check. We finally have created something out of human control that can evolve quicker than humanity can to contain it. This is the next Industrial Revolution; it has already started and we can’t turn back now.
youtube AI Moral Status 2024-03-14T06:1…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   distributed
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwoHjrwHuTMIy7PDZx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxZQYyMoelTqyI83v94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyBOueN3uQlbV_CiyJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyRH71lobTzB0Mo6LN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxCOv9YLmIRTFQ71Pp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzdoZoZw1EdZdrplUZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxKy-7SrSeHBtM5g_d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw9yQ8kjuZ4VafSkXN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQoaQJIO0Bmb3RPaF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVFiNj7Fxaot2tsW94AaABAg", "responsibility": "unclear", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
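A raw response like the one above can be parsed and checked before the coded values are trusted. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the labels visible in this response and the coding table above, so the real coding scheme may include labels not shown here.

```python
import json

# Allowed labels per dimension, inferred from the response shown above
# (assumption: the full coding scheme may contain additional labels).
SCHEMA = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"fear", "indifference", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset appear to start with "ytc_".
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must carry one of the known labels.
        if all(rec.get(dim) in labels for dim, labels in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyBOueN3uQlbV_CiyJ4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(len(validate(raw)))  # 1 valid record
```

Records that fail validation can then be re-queued for coding rather than silently stored with out-of-scheme labels.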