Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My biggest concern here is an AI developing some self preservation without it being programmed to do that. Did that come naturally from the code?
youtube 2025-11-05T13:1…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | developer                  |
| Reasoning      | unclear                    |
| Policy         | unclear                    |
| Emotion        | fear                       |
| Coded at       | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_UgzmJUVtScKfn_SOqYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzhg5sCVsn6H1n8l9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5tbkgho4y2P9j4qp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzvwYGq59SCVUK6maF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyAxa_nh_o2LVm80XN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzvd7ON_b-Q8vareTt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOdR_v8sO45gwQlzl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzQ1cdVMVmPZPEeGnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyL_f3Gb1gNhQQ4g_R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzCaHoUAnwokgInuLl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
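The raw response is a JSON array of per-comment codings, and the Coding Result table above is simply one record from it looked up by comment id. A minimal sketch of that lookup, assuming this workflow (the `find_coding` helper is hypothetical, not the tool's actual code; the id used is copied from the raw response and only assumed to belong to the comment shown):

```python
import json

# Two records copied verbatim from the raw LLM response above,
# abbreviated here for illustration.
raw_response = '''[
  {"id":"ytc_UgzvwYGq59SCVUK6maF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzhg5sCVsn6H1n8l9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def find_coding(records, comment_id):
    """Return the coding dict for a given comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
coding = find_coding(records, "ytc_UgzvwYGq59SCVUK6maF4AaABAg")
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# → developer unclear unclear fear  (matches the Coding Result table)
```

Parsing the whole batch once and indexing by id also makes it easy to flag ids the model silently dropped or duplicated, a common failure mode in batched coding.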