Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm just a simple paralegal with a degree in history, but even I know if you're going to make artificial intelligence or androids, you program Asimov's "Three Rules of Robotics" into them. Or go the Red Dwarf route and program them to believe in Silicon Heaven and give them a religious mortality. Something. You don't just set lose something amoral which is capable of destroying humanity. Now, either I'm more clever than eggheads at MIT, et al, or I'm more moral than whatever sick bast*rd knowingly chose not to do this.
youtube AI Governance 2025-05-10T16:2… ♥ 1
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgydKaxbVks4Pvri1SV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFnVtdSdf4e7sL2Mx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrTva47682q9a8qMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyLpFTQE_GG63z9uiF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwaH-AXwWPe-drAhad4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzx_SWRhYM4kdXsenl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugznhu4rYa_QXjdpOnl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw_RippEeCRBhEyKrl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgybFdK6XiA9bpcFE514AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQoS-5zTCyV2mHm6R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
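The batch response above is a plain JSON array, so a coding for any comment id can be recovered with the standard library alone. The sketch below is illustrative, not part of the tool: it reproduces only the single record that matches the Coding Result table (the real payload has ten entries), and the variable names are arbitrary.

```python
import json

# Raw model output, truncated here to the one record whose id matches
# the coded comment shown above; the full response holds ten such records.
raw = '''[
  {"id": "ytc_Ugw_RippEeCRBhEyKrl4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "regulate",
   "emotion": "fear"}
]'''

records = json.loads(raw)

# Index the batch by comment id so each coding can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugw_RippEeCRBhEyKrl4AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# developer deontological regulate fear
```

In practice a model may wrap the array in extra text or emit invalid JSON, so production code would guard the `json.loads` call and validate that every record carries the four expected dimensions.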