Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think one problem with LLMs is the premise that all knowledge and intuition can be captured in text, but text itself is written on the premise that the reader is a human with human experience, and it takes a lot for granted, given that both its writer and reader are human. Watching snowboarding videos for infinite hours can't guarantee you then know how to snowboard; the experience itself is different.
YouTube · AI Governance · 2025-08-28T11:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgwV8kFXbzrA7ncRSZV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiozCxUll1cVRMUkB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz1dbU2M779PvDtxbB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyeJlpTjm2VvRXX4PR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw4E3t3T_zFW6ablZt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
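The raw response is a JSON array of per-comment codings, so extracting the coding for a single comment reduces to parsing the array and indexing by `id`. A minimal sketch of that lookup, using a subset of the response above (the `by_id` index and variable names are illustrative, not part of any tool shown here):

```python
import json

# A subset of the raw LLM response: a JSON array of per-comment codings
# along the four dimensions (responsibility, reasoning, policy, emotion).
raw = (
    '[{"id":"ytc_UgwV8kFXbzrA7ncRSZV4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"approval"},'
    '{"id":"ytc_Ugz1dbU2M779PvDtxbB4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)

codings = json.loads(raw)
# Index the codings by comment id so one comment's coding is a dict lookup.
by_id = {c["id"]: c for c in codings}

coding = by_id["ytc_Ugz1dbU2M779PvDtxbB4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
# → developer consequentialist indifference
```

This is also where a malformed response would surface: `json.loads` raises `json.JSONDecodeError` if the model emits anything other than valid JSON, which is a common failure mode worth catching when coding batches of comments.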