Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This concerns me greatly. When can we tell when its not an act and its real? And if its never real why are 'WE' real and they are not? Objectivly speaking, not by some soul or god but something observable. I do not think AI right now is sapient but I do believe at some point it might be and we might have a few decades of accidental slavery and thats how we get SkyNet.
youtube AI Moral Status 2025-06-12T06:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxsB_yUKPLDA1xb-zR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyXvXT1z6ux_CDlBMd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzNywJx6Tcab34rsCh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyYYOGVsSY3T2iXD_54AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyulV1B9hNr8pypMz54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyXvJc9ITFejG7j2cp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzt9R_Tj0JCGC1NjON4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxiZhPG1m9bZXn9yjB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyqcitwWOAgVFWonsB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwUA00LpNglyFw3cgJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
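To verify that a coding result matches the raw model output, the JSON array above can be parsed and indexed by comment id. The sketch below is a minimal, assumed workflow (the field names follow the response shown; the lookup helper itself is hypothetical, not part of the coding pipeline):

```python
import json

# Raw LLM response, abbreviated to the one record that corresponds to the
# coding result shown above. The real response contains ten such records.
raw_response = """
[
  {"id": "ytc_UgzNywJx6Tcab34rsCh4AaABAg",
   "responsibility": "distributed",
   "reasoning": "mixed",
   "policy": "unclear",
   "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each record by its comment id."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_id(raw_response)

# Look up the coded dimensions for the displayed comment.
record = codes["ytc_UgzNywJx6Tcab34rsCh4AaABAg"]
print(record["responsibility"])  # distributed
print(record["emotion"])         # fear
```

Indexing by id rather than list position guards against the model reordering records in its response.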