Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "[autonomous semis is not a terrible idea. there are pros and cons to everything.…" (ytc_Ugy17_PbA…)
- "I would totally train an AI on my own work so that it could help me produce MY a…" (ytc_UgxH646l9…)
- "- You aren’t making your own fridge though, your asking a bot to do it for you (…" (ytr_UgzxXvg8m…)
- "The idea of self driving semi trucks on the roads terrifying. What if something …" (ytc_UgwwWV45e…)
- "I wouldn't worry too much. Considering what the industry is currently defining "…" (ytr_UgzU0-x_H…)
- "The ai companies cant make money if no one has a job. You need people to be maki…" (ytc_Ugw1KPgI8…)
- "These are the same people who think producing AI is "skilled" and makes them an …" (ytc_Ugw6OsiIN…)
- "OK, now we need an AI to bitterly respond to this AIs posts and we can all just …" (rdc_eczqc3c)
Comment
I work in ML. You're conveniently leaving out that 1) current AI capabilities are massively overhyped to the point of fraud and 2) current AI fails on 95% of office tasks. This video is needless and outdated fear-mongering in the early-2023 tradition. The only valid task you cite is protein sequencing, which is at its base a fairly simple, rote procedure that is voluminous rather than extremely complex. Nobody is going to be harmed by automating that.
youtube
Viral AI Reaction
2025-11-23T12:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
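A coded record like the one above can be checked against the category values visible on this page. This is a minimal sketch that assumes the allowed value sets are exactly those appearing in the codings shown here; the real codebook may define more categories:

```python
# Hypothetical validator for one coded record. The ALLOWED sets are
# inferred from the codings displayed on this page, not from an
# authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation", "indifference"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if not record.get("id"):
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

Running this over each element of a raw response catches both missing dimensions and values outside the coding scheme before they reach the results table.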
Raw LLM Response
[
{"id":"ytc_UgywELhHQPBs1N5RdRN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwKpF8S8sFSPRQ5csN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyYVcPQJsnWa4v77Hh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJ4Hd8gFi0K3kIRfx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw8_sORyWu9Q-HdDzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzxc41WZDwkg5F5CAV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzHXmBAqfbV98p3bQx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzI8xauyQUb74W1-dJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOk6YunbulzjODkJt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyg12lXQx2I7ys6M_N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]