Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@1:20:06 - “The public interest of these technologies is at the core. And if it is going to be using our land, our air, our energy, and our water, then we need to have a say in it. And these tools should be used for the public good. And that actually means making this something that serves all of us, not the few.” [Read also: EMPIRE OF AI, AI SNAKE OIL, and all of the books by forensic epidemiologist Harriet A. Washington]

@1:20:35 - Calliston-Burch: “There (are) Isaac Asimov’s Laws of Robotics, (as follows): 1. A robot must do no harm to an individual, including through inaction. 2. A robot must obey what the human says. 3. The robot must not do harm to itself, barring laws 1 and 2. And then later there was a Zero-th law, which is, the robots cannot harm humanity.” [Read also: JUSTICE FOR SOME, THE NEW JIM CROW, and THE COLOR OF LAW to see how laws have been and are being used to harm humanity.] It’s 3.28.2026 and this presentation is obsolete.
youtube AI Governance 2026-03-28T21:1…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       deontological
Policy          regulate
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzWhMQf-fWyYRC0RQN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwFNcr-iTYJNlYWxNV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw7YGiuUMNoGVywD7V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgycwsUmKqm4OczGMxN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzXV3bNuWnww8U2-KN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwtvB9lp3PCyGgud7Z4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx_Ev-M9lkiBtAXFyF4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgylRrXtZbD14m8WgSt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwbpMwdbQuOvKCPFDR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxEBlpi5snRAYIlw6V4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
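Since the raw response is a JSON array of per-comment codes, one coded comment can be inspected programmatically by indexing the array by `id`. A minimal sketch, assuming the raw response string is valid JSON as shown above (the single entry and variable names here are illustrative, not part of any tool's API):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Truncated to one entry here for illustration; in practice this would be
# the full array captured from the model.
raw_response = '''[
  {"id": "ytc_UgzXV3bNuWnww8U2-KN4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]'''

# Index the coded rows by comment id so a specific comment's coding
# result can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

record = codes["ytc_UgzXV3bNuWnww8U2-KN4AaABAg"]
print(record["policy"])   # -> regulate
print(record["emotion"])  # -> approval
```

Looking the record up this way makes it easy to cross-check the dimensions shown in the Coding Result table against the exact values the model emitted.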