Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Nah, not going to happen. I use the latest, most cutting edge AI for engineering that currently exists, and have been for quite a while. Current AI is FAR FAR away from taking my job. There is no way AI, currently, can even come close to replacing me. Sure, it is a valuable tool for me, but that's all. I don't see this changing in the near future. Perhaps way down the road, maybe, but probably not likely with LLM's .. large language models simply are not capable of such things. The best LLM's in the world cannot function without expertise guidance, not in the engineering world anyway. I don't see this changing for LLM's. There are too many barriers for LLM's. Other, more sophisticated model constructs are going to have to be developed in order to replace real engineering. Simple probability neural networks cannot reason, and it is unlikely, if not impossible for a computational machine to EVER be able to actually reason. Computational machines are bound by rules, rules they cannot operate outside of. In order to reason, one must have the ability to transcend those rules. Computational machines are not able to do that as they specifically function by rules, rules are what makes them work in the first place. This is just another "AI is going to kill us all" type of video. AI is never going to "escape" and run the world. ... and NO! .. they cannot "think for themselves" .. that is impossible .. for the reasons I already gave.
youtube AI Governance 2025-11-28T02:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugz_9zr2PfPbNwZxgPx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgycNgDE5MdO_16nrQB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxT1L8B--c3IqfXnSl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx8_rmMuTcc6pmjcxZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyojqJu_3Q02sPasW94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyFmDrP0tGou2aObAp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyuvwdLL10Dfc_iE5l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxAUwXja3-pvW_CAY14AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwjyPFNv232woz3And4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyJA4nq5N_yT8iSrpJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
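To turn a raw batch response like the one above into a per-comment coding table, the JSON array can be parsed and indexed by comment id. This is a minimal sketch, not the tool's actual implementation; the record structure matches the raw response above, but the choice of id shown is only an example drawn from that batch.

```python
import json

# Raw model output: a JSON array of per-comment codings, as in the batch above
# (abbreviated here to one record for illustration).
raw = '''[
  {"id": "ytc_Ugx8_rmMuTcc6pmjcxZ4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

records = json.loads(raw)

# Index codings by comment id so any comment's result row can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugx8_rmMuTcc6pmjcxZ4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim.capitalize()}\t{coding[dim]}")
```

A real pipeline would also validate each record against the allowed category values (e.g. `responsibility` in {none, user, developer, government}) and flag ids present in the batch but missing from the response.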