Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The writers want the studios to promise not to replace them with A.I. writers, but can the writers promise the studios that they will not use A.I. to help them with their writing? If I had a million dollars to blow on a pet A.I. movie project that's going to fail, am I taking away jobs from humans? Jobs that were never going to humans to begin with? If it was well received and made a billion dollars, do I owe human beings anything all of a sudden? Do I have to now hire human beings for my next movie? If writers/actors/creatives want more money because the studio made more money, will they allow for a pay cut and share the risk with the studios if the projects fail? A lot of writers and actors negotiate less money for points and more backend. Would they be interested in those kinds of deals, like Stallone's deal on Rocky? Or Cameron's deals on Terminator and Titanic?
youtube AI Jobs 2023-08-28T21:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzHpCA6FswYfv0g-bR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4AU6P40VksfWj_fB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzaw4SxN9MkbF-MLLl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxcMyxaW4I8LJP--sZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxxfdAs6VDAb1vEvAl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxLci1GXkp9usF0rwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugx8P3h5EoL7lSHjVb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4niYGNRjtDFwdKCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZ-aBerpomm4YdzuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx1s3GqoFS3l4MdOC54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
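A raw response in this shape is a JSON array of per-comment records, each keyed by comment `id` with one value per coding dimension (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a response could be parsed and tallied downstream (the field names match the response above; the helper itself and the two-record sample are illustrative, not part of the pipeline shown):

```python
import json
from collections import Counter

# Two records copied from the raw response above, as a small sample.
raw = '''[
  {"id":"ytc_UgzHpCA6FswYfv0g-bR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4AU6P40VksfWj_fB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]'''

def tally_dimension(raw_response: str, dimension: str) -> Counter:
    """Parse a raw coding response and count the values of one dimension."""
    records = json.loads(raw_response)
    return Counter(r[dimension] for r in records)

emotion_counts = tally_dimension(raw, "emotion")
print(emotion_counts)  # Counter({'indifference': 1, 'fear': 1})
```

The same helper works for any of the four dimensions, e.g. `tally_dimension(raw, "responsibility")`.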