Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- no matter how much eveyone complains, tweets or tries to fight back, AI WILL re… (`ytc_Ugyw4NMpo…`)
- He’s the face for his company. He has to make sure we get a warm fuzzy for the A… (`ytr_UgyXn4j--…`)
- I fell for the whole, "people will always prefer human art over ai art" bit, but… (`ytc_UgxWjyhkf…`)
- If a child needs chatGPT to commit suicide what kind of life did he have? One wo… (`ytc_Ugy8Y-1nH…`)
- dinosaurs: exticnted bcoz of natural calamities / humans: will extinct bcoz of the… (`ytc_Ugzndoh_R…`)
- A.I. is not the danger. Human beings and the code logic incorporated into the t… (`ytc_UgxBRttoQ…`)
- This is going to add a brand new opportunity for the funeral business with a per… (`ytc_Ugwu6nK4I…`)
- Not at all....time spent commuting between systems would not be an issue for Ai.… (`ytr_Ugw_EHAAn…`)
Comment

> This guy is just pulling numbers out of the air. What is his basis for "40% of jobs"? What we call "AI" is just a tool that is good at handling best-case scenario jobs. The reason the jobs he described is done by humans is because it regularly relies on a person that can do a mundane task, but then actually respond reliably to an unexpected situation. Companies that seek to rapidly switch to AI will only do so for the short-term savings, despite it being a risk to the public. In short, it'll be executives like this guy that will get rid of the jobs, using "AI" as the justification. AI can't do the job anywhere near as reliably as humans. But, it is a useful buzz word to get investors to give you money.

youtube · AI Jobs · 2023-11-07T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxobv0H7Ju9UxjUqWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQCgf8NAL4yvV3tLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1BUR5-4IpjY5JGlF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyV4ljme31jn2wGaex4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxU2-E947rBejqD_9N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxziA7Baka0hhDQu4d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwr6nGWU2SMx_fWi8h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyvm4Mlm2J8JL4KIHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPC4dRrfT_NVZC2Dd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxR8_e4JYoOOIL6qIh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
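The raw response is a JSON array of records keyed by comment ID, one record per coded comment. A minimal sketch of how such a batch could be parsed and validated before the values reach a per-comment view like the table above. The allowed code sets below are inferred only from the values visible in this one sample; the actual codebook may define additional categories.

```python
import json

# Allowed codes per dimension, inferred from this sample batch (assumption:
# the real codebook may include categories not observed here).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and index valid records by comment ID.

    Raises ValueError if any record carries a code outside ALLOWED, so a
    malformed batch fails loudly instead of silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        # Store everything except the ID itself under that ID.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = validate_batch(raw)
```

Looking up a coded comment is then a plain dictionary access, e.g. `coded["ytc_example"]["emotion"]` yields `"indifference"` for the record above.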