Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I hate AI so much, not just art, but in all of it's forms that I've refused to u… (ytc_Ugzo6o_c2…)
- I don’t see the problem with telling people the truth about the origin of your i… (ytc_UgwuLnEkz…)
- @akadreku7327 basically the cops put you into an ai that decides weather or not … (ytr_UgytblxRd…)
- I would imagine extraordinarily ‘privileged’ people (in exclusive networks) have… (ytc_Ugyg2vtSQ…)
- While I understand the basics, stuff in the US and some stuff outside will becom… (rdc_e2vnl1k)
- Quoting Martin Heidegger: "The man's capacity to do overtakes the man's capacity… (ytc_Ugx7aW39n…)
- They give each other info through wifi. Have you ever seen the movie iRobot wher… (ytc_UgxEJlvQ5…)
- The inner will of life encompasses everything. As soon as a 'superintelligence' … (ytc_UgwLrAdxd…)
Comment
I think regulation is the key here : maybe it would be smart to require companies that make a certain amount out of using robotics and/or AI to employ human labour that is worth a percentage of that profit. The companies can choose what kind of labour they need, and what they need it for -- but it has to amount to a certain percentage. It is also important to collectively guide future employees towards learning the skills they will need, and this again requires the cooperation of potential employers, and a regulated transition mirroring the shift in demand. What worries me more about the automation of tasks though, is the question of legal responsibility. Robots cannot be tried. Robots cannot be indicted in the justice system. So if an AI or a robot break the law, who will be deemed responsible ?
youtube · AI Jobs · 2025-10-09T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzkHVbGd_287ZGAuvd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugwx21RIBhe6Wg3dQDt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWrQ8oZPodB4zh0Dh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4bxjzfscTUhgax8h4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw8G5rLHZLP4K7tQW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwP9ARBpJfvTpepnMR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWWoJOoNq3Qk4P6fB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgztkJGABowTr1UKAxZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy2Lmqm9oBkk03e_TR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4iYObm6vi0wyCwN14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
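The lookup-by-ID workflow above can be sketched in Python: the raw LLM response is a JSON array of per-comment codes, so indexing it by `id` gives direct access to each comment's four coding dimensions. This is a minimal sketch, not the tool's actual implementation; the helper name `codes_by_id` is hypothetical, and the two sample rows are taken verbatim from the raw response shown above.

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# These two rows are copied from the response displayed above.
raw = '''[
  {"id":"ytc_Ugy4bxjzfscTUhgax8h4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy2Lmqm9oBkk03e_TR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw_json: str) -> dict[str, dict[str, str]]:
    """Index coded comments by their ID, keeping only the coding dimensions."""
    rows = json.loads(raw_json)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

lookup = codes_by_id(raw)
print(lookup["ytc_Ugy4bxjzfscTUhgax8h4AaABAg"]["policy"])  # -> regulate
```

Note that the coded values for `ytc_Ugy4bxjz…` match the "Coding Result" table above (company / contractualist / regulate / approval), which is the consistency check this view is meant to support.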