Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All this AI talk has got me thinking of the TV series Person of Interest again. To create The Machine or Samaritan. One to save us or control us. What happens when it glitches and doesn't take accountability for its actions? Would it tell us that we are wrong and then become Skynet? Would it then enslave us as batteries? This comes down to ethics. Just because you can do something doesn't mean you should do it.
Source: youtube · AI Governance · 2024-01-30T03:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugxt24K7wrZ6VTC9VT94AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzRsOO_HboUkgGXnpx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx0jyGQ5nArHq61CyV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzCBjB7BxlyOtpq8N54AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxlFRKTyx9E8XuIVGF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzBrxku7icoduZqAh54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgydJ6UjoO6N18aJust4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugycza7bCmNvIuCZBOl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyFlZ2DUj8h5hznWuR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgweCcz5BPxx0i7R8I94AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
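A batch response like the one above can be parsed back into per-comment codes with a few lines of Python. This is a minimal sketch, not the pipeline's actual code: the `lookup_codes` helper is hypothetical, and the payload below is a trimmed-down copy of two records from the response, with field names (responsibility / reasoning / policy / emotion) taken from the coding schema shown.

```python
import json

# Trimmed example payload in the same shape as the raw LLM batch response above.
raw_response = """
[
  {"id": "ytc_UgydJ6UjoO6N18aJust4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxt24K7wrZ6VTC9VT94AaABAg",
   "responsibility": "unclear", "reasoning": "deontological",
   "policy": "unclear", "emotion": "unclear"}
]
"""

def lookup_codes(raw, comment_id):
    """Parse a batch coding response and return the record for one comment ID,
    or None if the model did not emit a record for that ID."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

codes = lookup_codes(raw_response, "ytc_UgydJ6UjoO6N18aJust4AaABAg")
print(codes["responsibility"], codes["emotion"])  # ai_itself fear
```

The `id` field is what ties each record back to its source comment, so a lookup like this (rather than relying on array order) is robust even if the model drops or reorders entries.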