Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “GHB is also consumed as a playful drug in less quantities than it use as a sedat…” (ytc_UgxZhXUOH…)
- “The 99% to 1% analogy doesn’t make sense. If there’s a 1% chance of dying in a c…” (ytc_Ugy0C40Kl…)
- “I backed off the moment Chat GPT offered to help me install an AI on my computer…” (ytc_UgwTPKIH9…)
- “When someone with his level of institutional power warns about AI becoming 'too …” (ytc_UgxuQnp0T…)
- “Well, Amazon’s laying off thousands of people because they have AI now it’s alre…” (ytc_Ugw4D7De6…)
- “@LavenderTowne people will write a whole essay with the most dogshit opinions…” (ytr_Ugy-TL1JI…)
- “At the pace AI is moving and with keen regard for it's potentially harmful socie…” (ytc_UgxA2UK6I…)
- “AI literally has thousands upon thousands of points of view all downloaded into …” (ytc_UgyFVyyAI…)
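Clicking a sample (or entering a full ID in the lookup box) resolves to the stored coding record for that comment. A minimal sketch of such a lookup, assuming coded batches are persisted as JSON arrays like the raw response at the bottom of this page; the file name coded_comments.json is hypothetical:

```python
import json

def load_index(path: str) -> dict[str, dict]:
    """Build an id -> coding-record index from a JSON array of coded comments."""
    with open(path, encoding="utf-8") as f:
        return {rec["id"]: rec for rec in json.load(f)}

# "coded_comments.json" is a hypothetical file name, for illustration only.
index = load_index("coded_comments.json")
record = index.get("ytc_UgzHpCA6FswYfv0g-bR4AaABAg")
print(record["emotion"] if record else "no coding found for that ID")
```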
Comment
The writers want the studios to promise not to replace them with A.I. writers, but can the writers promise the studios that they will not use A.I. to help them with their writing?
If I had a million dollars to blow on a pet A.I. movie project that's going to fail, am I taking away jobs from humans? Jobs that were never going to humans to begin with?
If it was well received and made a billion dollars, do I owe human beings anything all of a sudden? Do I have to now hire human beings for my next movie?
If writers/actors/creatives want more money because the studio made more money, will they accept a pay cut and share the risk with the studios if the projects fail? A lot of writers and actors negotiate less money for points and more backend. Would they be interested in those kinds of deals, like Stallone's deal on Rocky? Or Cameron's deal on Terminator and Titanic?
youtube · AI Jobs · 2023-08-28T21:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
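The four dimensions in the table form a fixed schema. A sketch of that schema as Python types; the value sets below are assumptions inferred only from the codes visible on this page, not a confirmed codebook:

```python
from dataclasses import dataclass

# Assumed value sets, inferred from the codes visible on this page only.
RESPONSIBILITY = {"distributed", "ai_itself", "company", "user", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "liability", "unclear"}
EMOTION = {"indifference", "fear", "outrage", "approval", "resignation"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        """True if every dimension uses a known code."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```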
Raw LLM Response
[
{"id":"ytc_UgzHpCA6FswYfv0g-bR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4AU6P40VksfWj_fB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzaw4SxN9MkbF-MLLl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxcMyxaW4I8LJP--sZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxxfdAs6VDAb1vEvAl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxLci1GXkp9usF0rwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx8P3h5EoL7lSHjVb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4niYGNRjtDFwdKCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZ-aBerpomm4YdzuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1s3GqoFS3l4MdOC54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
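Since the model returns a bare JSON array, each batch response can be parsed and screened before records are stored. A minimal sketch, assuming the response text is exactly an array like the one above:

```python
import json

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only well-formed records."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # unusable batch; the caller might re-prompt the model
    if not isinstance(records, list):
        return []
    return [r for r in records
            if isinstance(r, dict) and REQUIRED_FIELDS <= r.keys()]
```

Records that survive this check could then be validated against the value sets sketched above before being written to storage.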