Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
​@incription With all due respect, but I think you should do a bit more research into the topic itself. As far as it goes, most of the models(mostly variations) we have now aren't trained from scratch, but rather fine-tuned to a data set. Why? Because it needs a ton of resources to do so, and people figured that using good models by others(that have more than enough resources to train from scratch) highly benefits them because the model already knows what features to be aware of. 1. Yes, hyperparameter tuning is sometimes hard to get right, but in the end it's just tuning, you didn't invent or build the model architecture, did you? Simply another user. 2. No, you don't really need to learn how to code per say, as a lot of new tools have interfaces already, heck ready to run python notebooks are available everywhere. But knowing just a bit would help to troubleshoot and modify, personally I think it's not that hard to learn compared to drawing. As for having a degree, NAHH CAP it's just like a CS degree, you can learn everything online! 3. As for which is harder, I beg to differ, as an AI prompter only needs a miniscule amount of skill to write and simply tune(if that's even a necessity with all the ai models, notebooks, and tools available nowadays) compared to an artist who spends years doing what they love, understanding and perfecting anatomy, colors, composition, and semiotics. I would write more about how anyone could learn but I rest my case as it is unnecessary to clarify. My apologies if this offends anyone ;) , have a great day!
Source: youtube — "Viral AI Reaction", 2024-10-12T16:1…
Coding Result
Responsibility: none
Reasoning: mixed
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
 {"id":"ytr_UgwIa_A-7pQmtdIZFbV4AaABAg.A9Sbn9B2XKlA9SdkAd7OWX","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgwIa_A-7pQmtdIZFbV4AaABAg.A9Sbn9B2XKlA9Xoy3rSXPD","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgyxzkXvyMIIaLxtZCJ4AaABAg.A9Sam5ThIXEA9SnEg0AWRj","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytr_UgyxzkXvyMIIaLxtZCJ4AaABAg.A9Sam5ThIXEA9Tv4FRnbeE","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytr_UgyxzkXvyMIIaLxtZCJ4AaABAg.A9Sam5ThIXEA9VT25hiHTQ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgxDBk_fIrVV1pcW0nl4AaABAg.A9SalNyOhJtA9Sj5X3o6L0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytr_UgxDBk_fIrVV1pcW0nl4AaABAg.A9SalNyOhJtA9SqOQz5N83","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgxDBk_fIrVV1pcW0nl4AaABAg.A9SalNyOhJtA9SqlX0JqjQ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgxDBk_fIrVV1pcW0nl4AaABAg.A9SalNyOhJtA9WiPKJcxs_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
 {"id":"ytr_UgyAePilubzC-cDTDnN4AaABAg.AVDjuxeMhFXAVZqCNncs9U","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
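The coding result above is derived from this raw response by matching the comment's id within the returned JSON array. A minimal sketch of that lookup in Python (the two records inlined here are copied from the response; the variable and function names are illustrative, not part of any pipeline shown in this document):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
# Two records copied verbatim from the response above, for illustration.
raw = '''[
  {"id": "ytr_UgwIa_A-7pQmtdIZFbV4AaABAg.A9Sbn9B2XKlA9SdkAd7OWX",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyxzkXvyMIIaLxtZCJ4AaABAg.A9Sam5ThIXEA9SnEg0AWRj",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"}
]'''

# Index the records by comment id for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Pull the four coded dimensions for the comment shown on this page.
target = "ytr_UgwIa_A-7pQmtdIZFbV4AaABAg.A9Sbn9B2XKlA9SdkAd7OWX"
coded = records[target]
print(coded["responsibility"], coded["reasoning"],
      coded["policy"], coded["emotion"])
# prints: none mixed none indifference
```

This matches the coding result shown above (responsibility: none, reasoning: mixed, policy: none, emotion: indifference).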