Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@herlandercarvalho I personally think it is absurd to think of giving up personal liberties just because something really bad *might* happen. I'm just never going to be okay with the kinds of things a govt would have to do to enforce an AI regulation and the reason is it requires some serious encroachment to my basic sovereignty as a free thinking human being and all over some extremely rare and hard to predict event that will most likely not happen. Pick any of these extreme events and they probably will not happen. One or more will happen but any one event is extremely unlikely to happen. That means any example scenario used to create legislation will by definition be extremely unlikely to actually come to fruition. These blackswan events are extremely rare and just because we increase the surface area for them that doesn't make them any more likely. It does increase the overall likelihood of experiencing one of many possible blackswan events but any particular event is just as unlikely to happen as it always has been. So giving up personal liberty for something bad that is unlikely to actually happen just seems absurd to me
youtube 2023-05-10T08:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  government
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugy04J7PUpcnCna9BC14AaABAg.9pT2dIhJ8mO9pTC0VcWbTW", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugy04J7PUpcnCna9BC14AaABAg.9pT2dIhJ8mO9qafkphc0RK", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzkqcOguPlmghigzOd4AaABAg.9pT-3iBzvVZ9pTrPxI3lfR", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx8hkgo8y4kuhCHIZh4AaABAg.9pSvbLukvSp9pX5whpqbBm", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugwa1ko2ZwgPbJKqEzt4AaABAg.9pSt6T0y-Oa9pV7yqO7u5h", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxcZnsxBBL52SFBylh4AaABAg.9pSsFXIsyJ19p_jVSIASkp", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwavBXu0OnJxynvG754AaABAg.9pSn5nNUNgf9pV7ZqcQsXC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwqOfQIe1TGmNGlY4l4AaABAg.9pSdmFLjsh89pSujdQSbMH", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugwk5-sDtlCX90DvkrJ4AaABAg.9pSdRSLYNX69pT-uoiTr_i", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugwk5-sDtlCX90DvkrJ4AaABAg.9pSdRSLYNX69pT4e05U-d0", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
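The raw response is a flat JSON array keyed by comment id, so linking a coded comment back to the row that produced its Dimension/Value table is a single id lookup. A minimal sketch in Python, using two of the records shown above; the names `RAW` and `find_coding` are illustrative, not part of the actual pipeline:

```python
import json
from typing import Optional

# Two records copied from the raw LLM response above (truncated for brevity).
RAW = """[
  {"id": "ytr_Ugx8hkgo8y4kuhCHIZh4AaABAg.9pSvbLukvSp9pX5whpqbBm",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugwk5-sDtlCX90DvkrJ4AaABAg.9pSdRSLYNX69pT4e05U-d0",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "indifference"}
]"""

def find_coding(raw_json: str, comment_id: str) -> Optional[dict]:
    """Parse the model output and return the coding row for one comment id."""
    for row in json.loads(raw_json):
        if row.get("id") == comment_id:
            return row
    return None  # id not present in this batch

# The comment shown above was coded under this id:
row = find_coding(RAW, "ytr_Ugx8hkgo8y4kuhCHIZh4AaABAg.9pSvbLukvSp9pX5whpqbBm")
```

For the id above, the lookup yields the government/deontological/none/outrage record that matches the coding table.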