Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Open AI whistle blower suicide, Boeing whistle blower suicide, Jeff Epstein suic…
ytc_UgzZHa2zA…
To a point I wonder if the argument is whether or not we care about human expres…
ytc_UgzQX3cpD…
Are you sure about that
Nukes are stronger than ai
If 500 nukes destroy all of…
ytc_Ugw79Ed00…
AI is being sold that it can just do all kinds of stuff that you won't need peop…
ytc_UgySnbn_W…
Loud explosions near parliament now on Institutska St. Clashes beginning, accord…
rdc_cfkycz1
I like how they say compassion! Why would they push that? Thats how they sell th…
ytc_UgxC9tQYM…
So this question never crossed anyone's mind, when they heard of self driving ca…
ytc_Ugx8wbQwU…
Honestly, all this is giving me a headache! Why can’t we just let people just ma…
ytc_UgwWYuN95…
Comment
The argument is flawed to a certain extent because we have to make a distinction between AI and AGI.
As long as AI remains imperfect, humans will be needed for some tasks in the chain of service, like the radiologist example, as AI cannot yet do it itself. But once AGI emerges, it should be able to do everything that humans can, and with the addition of prolific robotics, humans will become obsolete in many professions.
So, there are differences in short- and long-term impact. Prepare for both.
youtube
AI Jobs
2025-10-14T17:5…
♥ 25
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzolyc3bCdyNMe0BXJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwL1oAxG8zPfSI4SDR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGP-Zjuk0h7RpmqtB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxeDvYrQrs5pluiFCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwDYN5qHOFIGhObCTN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyvLNOiijjoZXbg_ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzerXOJ1auXuf8eerR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4cEQkR8yafitnSSR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy0_9w6XR2xnUYgso54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjDnEEH-TXaUTEbD94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
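A raw response like the one above is a JSON array with one object per coded comment, so looking up a coding by comment ID amounts to parsing the array and indexing it on the `id` field. A minimal sketch (the variable and function names are illustrative, not part of the tool; the sample objects are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
raw_response = """[
  {"id": "ytc_UgyvLNOiijjoZXbg_ll4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzolyc3bCdyNMe0BXJ4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

def index_codings(response_text):
    """Parse the model output and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)

# Look up one comment's coding by its ID.
coding = codings["ytc_UgyvLNOiijjoZXbg_ll4AaABAg"]
print(coding["emotion"])  # fear
print(coding["responsibility"])  # ai_itself
```

In practice a lookup tool would also want to handle malformed model output (e.g. a `json.JSONDecodeError` when the response is not valid JSON) before trusting the index.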