Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You rarely see a thicker pair of rose glasses than on this guy here. Refers to colonization and the industrial revolution so archaically, yet conveniently ignores the direct parallels of automation, how it in essence will colonize massive swaths of job markets. Or how about the total dissolution of reliable media? To maintain human agency he encourages we personify and entrust our emotions to an unfeeling tool on the basis of evolution? Disregarding the almost inevitable likelihood that if sentient, any of AI's true intentions would be entirely safeguarded from us--likely for good reason--we still have to consider the all-too-human self-interests of the overlords who pioneer these innovations. And going inline with the historical anecdotes presented, how they will and currently do care very little of the ethical ramifications towards the public and common good. This talk feels more like a transhumanist "drink the Kool-Aid" coercion tactic than anything honestly optimistic.
youtube 2025-02-15T23:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyNved0ZOeqj4VTwPJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2hVjQnI_sRKJ9Kc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy16wJr1osWJMt73uF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxn8TKylzvd150PSOh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhMgqDe0H2VMSSNC14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx2tU_gpB2t4YFwt6t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxp4KVrBZYYq14srbJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwhEGU6LUTE_PpwNOh4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzApIU3DserV8DTO9J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXawm7a5Bn3TaQeI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
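A minimal sketch of how such a raw response can be inspected: parse the JSON array, index the coding records by comment id, and look up one record. The two records below are copied verbatim from the response above; the variable names are illustrative, not part of any pipeline.

```python
import json

# Two coding records taken verbatim from the raw LLM response above.
raw = (
    '[{"id":"ytc_UgyXawm7a5Bn3TaQeI54AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"},'
    '{"id":"ytc_Ugx2hVjQnI_sRKJ9Kc94AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

records = json.loads(raw)

# Index the records by comment id for quick lookup.
by_id = {rec["id"]: rec for rec in records}

# Fetch the coding for one comment and read its dimensions.
rec = by_id["ytc_UgyXawm7a5Bn3TaQeI54AaABAg"]
print(rec["responsibility"], rec["emotion"])  # developer outrage
```

Keying on the `id` field is what lets a coded table row (like the one above) be traced back to the exact record in the model's output.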