Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The law of unintended consequences will bite us on this one. Precisely why techies are clever by half with their AI pursuits: they care not if they should, only that they can. It's a Pandora's Box, this.
youtube AI Governance 2023-08-06T00:1…
Coding Result
Responsibility: developer
Reasoning: deontological
Policy: regulate
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzwSYjca-GsSU-XkoB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyz1kq930Q_kmTi3XN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwQCLXz5GQwNPnepkx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzyu2ocOrJa8vAtCgt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzF2RK3bWOZ5pEWlWt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw8CuA-d61vqSyqLAZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxIiBEOyg-41pUCs3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwFwNqono3skvI8DVt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwfFpwx_9A55U_hPqp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_FFt9EsvClSHeXaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
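As a minimal sketch of how a raw response like the one above can be matched back to a single comment's coding result: the snippet below parses the JSON array and looks up one entry by its `id`. The function name `coding_for` and the `raw_response` variable are illustrative, not part of this tool; the excerpt below uses two entries from the response above.

```python
import json

# Illustrative excerpt of the model's bracketed JSON array.
raw_response = '''[
 {"id":"ytc_UgzwSYjca-GsSU-XkoB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugyz1kq930Q_kmTi3XN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

def coding_for(comment_id, response_text):
    """Return the coding dict for one comment id, or None if absent."""
    for row in json.loads(response_text):
        if row.get("id") == comment_id:
            return row
    return None

result = coding_for("ytc_Ugyz1kq930Q_kmTi3XN4AaABAg", raw_response)
print(result["responsibility"], result["emotion"])  # developer fear
```

A lookup like this is how the per-comment Coding Result shown above (responsibility: developer, emotion: fear) would be recovered from the batch response.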