Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews)

- ytc_UgwSR0zkz…: "and sorros think this is our future. dont buy into giving up your freedom for A…"
- ytc_UgzZllfuT…: "We only feel pain to condition ourselves to make decisions which will make us mo…"
- ytc_UgzawZm_t…: "Might aswell replace every ceo and politician with an Ai agents 😏 now that will …"
- ytc_Ugyg1wDDs…: "Shadiversity should stick to medieval weaponry. Clearly being a dinosaur was doi…"
- ytc_Ugz_wlLtC…: "AI imagery as I prefer to call it is not art and those that use AI technology to…"
- ytc_UgxHEAPZZ…: "Progression without morality will lead to dystopia. Progression with morality w…"
- rdc_fn5q66a: "Also Canadian here, kinda in the same boat because I lost 2 of 3 jobs I had befo…"
- ytc_UgzZOLFdi…: "Very crazy topic. We will have much more time to think about these things with a…"
Comment
This may sound like a crazy solution, but this is a solution: companies are not allowed to own AI. AI can only be owned by individuals (people). Each individual can only own e.g. two models. Individuals can rent their model to companies. So when your job is to train your AI for a task and then rent the AI to the companies that need to get the task done. Companies are allowed to host AI. Companies are allowed to offer training. The state will protect the individual from monopoly; like they did before.
This is the only way to make sure that corporate America still needs individuals. (could also need some capacity restraints and other fine-tuning, but I hope you get the general idea.)
youtube · AI Jobs · 2026-03-20T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxuK7gq7rJz5WnAicF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwYbaWJcMiXCG0G4b14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy2NDeXAxkQ41wpyO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyf66GMy8R__fdU2cZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwOzLHSALjjKL39trB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzkv4ScP5UPL1K-uRp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyzp5o9pZWGiCYxZQJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzHc0SwmYLVenkGOOp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwR4L1Iq6hyuIGmXjV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQQrK8uw3l4-lxr914AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
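The raw response is a JSON array in which each element carries a comment `id` plus the four coded dimensions, so looking up one comment's codes is a parse-and-filter over that array. A minimal sketch of that lookup (the `lookup` helper is illustrative, not the tool's actual code; only two entries from the response above are reproduced for brevity):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = """[
 {"id":"ytc_Ugzkv4ScP5UPL1K-uRp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgzHc0SwmYLVenkGOOp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def lookup(raw_response, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

codes = lookup(raw, "ytc_Ugzkv4ScP5UPL1K-uRp4AaABAg")
print(codes["policy"])  # prints "regulate"
```

Note that the entry returned for `ytc_Ugzkv4ScP5UPL1K-uRp4AaABAg` matches the Coding Result table above (company / contractualist / regulate / approval), which is how the per-comment view and the raw batch response line up.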