Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t understand; how do you get such certainty about something so uncertain? Some outcomes can be predicted, but the sensationalized subsequent catastrophic consequences of those very general, benign outcomes, which were not predicted, seem to be no more than captivating storytelling. Citing unintended and unexpected consequences to bolster an argument for specific and extreme scenarios is puzzling. It’s like saying that because you’re unpredictable, I know exactly what you’re going to do …and it’s horrifying. Fear is useful insofar as it makes one cautious and thoughtful in order to find solutions to problems and to avoid pitfalls; on the flip side, it’s patently self-defeating when employed as a paralysis and fatalist lever. Of course, it’s also of utility to authors, news outlets, and anyone else standing to profit/benefit from employing it. It’s no surprise that the most established and thus far successful AI companies are the ones most clamoring for regulation. I find all of this fascinating. The evidence of all the events of our lives seems to carry no weight with regard to this illusion that we are so fervent in maintaining, that somehow we can predict the future with respect to some very specific and life-altering event. The belief that you can predict the consequences of invention is a total fiction. We can’t even predict catastrophic events that will happen this afternoon. I reject “no solution” as an answer. Unexpected things will happen; it’s the nature of the future.
YouTube · 2024-07-18T20:0…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
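
For anyone scripting against these pages, the coding result can be modeled as a small typed record. The sketch below is illustrative only: the field names mirror the table above, but the class and its layout are assumptions, not the project's actual code.

```python
from dataclasses import dataclass
from datetime import datetime

# Minimal sketch of the per-comment record implied by the table above.
# Field names mirror the displayed dimensions; the type itself is an
# assumption, not taken from the project's codebase.
@dataclass(frozen=True)
class CodedComment:
    id: str              # YouTube comment id (the "ytc_..." keys below)
    responsibility: str  # who is held responsible, e.g. "none"
    reasoning: str       # moral-reasoning style, e.g. "consequentialist"
    policy: str          # policy stance, e.g. "none"
    emotion: str         # dominant emotion, e.g. "indifference"
    coded_at: datetime   # when the coding was produced

# The record for the comment shown above:
example = CodedComment(
    id="ytc_Ugxbc-G2VWJ6gI3_LPh4AaABAg",
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```
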
Raw LLM Response
[ {"id":"ytc_Ugxbc-G2VWJ6gI3_LPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxQi3_hZrb3jyO_gNN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}, {"id":"ytc_Ugw8vRXJawrz6SjnVth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxNgiY34q1aWsKIJq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz_UWo_i06vU0eFbrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgyDBS36b-RADYW0hgt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytc_UgywshKs58OeMmGVFiF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgzbqSPApXQrEYGSwtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugw2fS5_bKNSA4KFDWZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgxIN-C-3W4UzGE6SUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"} ]