Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I mean, that’s kinda how competitive advertising works, they are just saying “ou…" (ytc_UgyNg5Mv-…)
- "Most people don't know that Florence Nightingale came from a very wealthy family…" (ytc_UgysbP-Iz…)
- "The attention should be on the programmer, not on the robot. Focus on the right …" (ytc_UgyC2bkVe…)
- "Ive covered this exhaustively AI alignment will always produce \"hyper ethical c…" (ytc_UgzPtOPzV…)
- "American capital owners have this obsession to eliminate human labour hence the …" (ytc_Ugz7jiK8a…)
- "Much handwringing about doing \"something\", but nothing specific, which is only n…" (ytc_UgzUgjrdx…)
- "It’s like a video game… Start as strangers, then acquaintances, boss fight to es…" (ytc_UgyYsop9d…)
- "The Atheist Materials as with Richard Dawkins are up a creek without a paddle. …" (ytc_UgxHO-k4e…)
Comment
Your asking the wrong question. If everybody going to loose their jobs to robots, what value will the majority of humanity have? If workers are eliminated, why do the super wealthy need you around? Its been estimated that 60% of CEOs are psychopathic. If this is the case with CEOs, what percentage is it for super wealthy owners. Why should these psychopatic people keep hordes of people around that they precieve as having no value? After all, they know true wealth isnt money, its resources. Why allow so many people to consume up resources when they can just kill us off? You have to look at the entire picture rather than just the positive prospects. And resources in raw material is a wealth that they will eventually not want to share. Yeah. AI is that much of a threat to all of humanity.
youtube · AI Jobs · 2025-06-27T23:0… · ♥ 138
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
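The coding result maps each comment onto four dimensions. Below is a minimal validation sketch in Python, assuming the dimension names from the table above; the allowed-value sets are inferred only from the raw responses shown on this page and are an assumption, not the tool's actual schema:

```python
# Coding dimensions from the result table. The allowed-value sets are
# assumptions inferred from the raw LLM responses on this page; the real
# codebook may permit additional values.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"unclear"},  # only value observed in this sample
    "emotion": {"fear", "resignation", "outrage", "indifference", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding fits the schema."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

print(validate_coding({"responsibility": "company", "reasoning": "deontological",
                       "policy": "unclear", "emotion": "fear"}))  # -> []
```

A check like this can flag malformed or off-codebook values before a batch of LLM codings is accepted into the dataset.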
Raw LLM Response
[
{"id":"ytc_Ugzmj66HUXJxKeDn3J14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz5yzsoerhUG5EF2vN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgygvYzOPLp0TZ1Ve094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwQd7m4estI9erqnlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxzHLOV9o8oLq97dEl4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4qHPpdr0Xg4SZwwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx1uKy8hL3_Npkszll4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjW9GDB4osA6A7dh94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzxyMlxkeTJUr6lnsd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNxCJt4FYJ1Rx8V4h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
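The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how a lookup by comment ID could work, assuming that shape; `index_by_id`, `lookup_coding`, and the two abbreviated sample entries are illustrative, not part of the tool:

```python
import json

# Example raw LLM response in the shape shown above (two entries for brevity).
raw_response = """
[
  {"id": "ytc_UgzxyMlxkeTJUr6lnsd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxNxCJt4FYJ1Rx8V4h4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index its codings by comment ID."""
    codings = json.loads(raw)
    return {entry["id"]: entry for entry in codings}

def lookup_coding(index: dict[str, dict], comment_id: str):
    """Return the coding for one comment, or None if it was not coded."""
    return index.get(comment_id)

index = index_by_id(raw_response)
coding = lookup_coding(index, "ytc_UgzxyMlxkeTJUr6lnsd4AaABAg")
print(coding["emotion"])  # -> fear
```

Returning `None` for an unknown ID lets the caller distinguish "comment not in this batch" from a coded comment, which matters when responses are paged across many LLM calls.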