Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is a tool.. just like any other tool people look up for various medical or other things on the internet.. the problem with AI is that it has the word "intelligence" and people forget the word "artificial".. people are used to questioning medical things they read on the internet via forums, book sites, etc.. however since AI appears to be "humanly" people dont question it near as much... I use AI in my coding, web design, engineering, etc.. I use various AI tools including a couple CHatBots and API's... they get it wrong... ALOT.. while it sends me in the right direction to solve a problem often.. it doesnt have the definitive answer hardly at all.. I stll have to go think for myself and develop a solution.. just as everyone can point out the deficiencies in AI "photos".. esp technical ones.. you know the "look how bad my home wiring is" and most anyone with any knowledge of electrical will point out 20 things within a few minutes that make the pcture obvious AI.. others simply believe the pic is real and actually talk about how you would fix said electrical system... alot of people dont think for themselves these days... they choose not to learn deep... and are letting the computers they assume are right run everything.... again they assume thr 'I' in AI is always correct..
Source: youtube · AI Harm Incident · 2025-11-25T01:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugxu9LMz7vCXppum27l4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFPn8hBVJZRZdXqgF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw4pZtxJ9gguF5jhPN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzR9_PXFn9reSjXQL14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzfs8BErOKUt3tVLHN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyL-7MwJki1sBJBOrJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyiOqWqiD9HU9LKrdl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwTedqfJsLTHd8Z1md4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwRSfdtBtdNs9tWUn14AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxzXmMuLhgXS0v1N7R4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
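The raw response is a JSON array of per-comment codings keyed by comment id, so the coded row for any comment can be looked up directly. A minimal Python sketch of that lookup follows; the payload is truncated to two entries for brevity, and the assumption that the first id (`ytc_Ugxu9LMz7vCXppum27l4AaABAg`) corresponds to the comment shown above is inferred from its values matching the Coding Result table, not stated anywhere on this page.

```python
import json

# Truncated copy of the raw LLM response shown above (first and sixth entries).
raw = """[
  {"id": "ytc_Ugxu9LMz7vCXppum27l4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyL-7MwJki1sBJBOrJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Parse the array and index the coded rows by comment id for O(1) lookup.
rows = json.loads(raw)
by_id = {row["id"]: row for row in rows}

# Assumed id of the comment displayed on this page (matches the Coding Result).
coded = by_id["ytc_Ugxu9LMz7vCXppum27l4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # → user indifference
```

Indexing by `id` rather than scanning the list each time matters once a batch response covers many comments; it also makes missing codings easy to detect with a plain `in` check before lookup.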