Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"In the style of X" is a very broad term, so being inspired by works by Van Gogh could look completely different from anything the AI will ever do; the AI will just copy his oranges and blues and the swirls. A human who has actually studied art will more likely change the colors he used and simulate his small, scribbly strokes. There is a lot more to get inspired by than making a photo look like it was onion-skinned and liquified to fit into the other picture. Not to mention AI mistakes and human mistakes are completely different: a human sucks at drawing hands, so the hands come out a little too round; AI sucks at drawing hands, and suddenly there are 8 fingers and some fading into the wrist. Prompt stories don't carry the level of imagination or nuance that humans do. Giger literally made a biomechanical gun that shoots babies as a statement on what schools can do. If you write a prompt like that, I guarantee the AI will include an actual school, or the gun will literally be firing out babies rather than being shown with the mechanisms. Emotions dictate a lot of human output; AI doesn't have emotions, so it shows no restraint and can't avoid the blatant and obvious. AI can turn out some cool stuff, but it doesn't attach a message to it and doesn't carry the same weight as a kid drawing you with a giant yellow pet alligator because they wanted to.
YouTube · AI Responsibility · 2023-01-24T04:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy7PmXp353jbB_xovd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx4raJh6N_IyZNI4w14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx2ytDu1cRWqbcu6DF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwZgtM9EUoYMFkCbVp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw2hcULIapb5sfMiQ94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw9h_tuJnO2g69Xrwt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxs-rSemxphE5gflZR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwXQx4xsvJJZ0PTxYx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgwZM34zH1cAKIAbCyl4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyfvJFx5dr3W4O58U14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
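The "Coding Result" table is one record pulled out of this batch array, matched by comment id. A minimal sketch of that lookup step, assuming the model returned a clean JSON array with no surrounding text (the two records and the ids below are copied verbatim from the response above; real LLM output may first need stripping of code fences or chatter):

```python
import json

# Raw LLM response text (two records excerpted verbatim from the array above).
raw = """[
  {"id": "ytc_Ugx2ytDu1cRWqbcu6DF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw2hcULIapb5sfMiQ94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Parse the batch and index each coding record by its comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for one comment; these values are what the
# per-comment "Coding Result" table displays.
coding = by_id["ytc_Ugx2ytDu1cRWqbcu6DF4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

If a comment id is missing from the array (the model dropped or mangled a record), `by_id.get(comment_id)` returns `None`, which is a useful signal to re-run that batch.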