Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI is only a problem if you use it just for the sake of using AI…" (ytc_UgysiqkAv…)
- "With AI focus shifted much more on PMs and Product Owners, I feel. At this point…" (ytc_UgztGYVbT…)
- "Somewhat true. 🤣. I built and deployed some applications with Ai, that are reall…" (ytr_UgyroRU_i…)
- "Because its cheaper. Software engineers are expensive, have to give them benefit…" (rdc_m6y53vb)
- "well... humans are WAY more complex than automation... sooooo you are REALLY adv…" (ytr_UgxrOmElW…)
- "AI will make everything massively less expensive. The problem is we have broken …" (ytc_UgxXGCfyA…)
- "You cant be replaced if you dont use there services. Boycott all businesses tha…" (ytc_UgyuDoQ5F…)
- "It's sad that I now live in a world where creativity is being taken from a soull…" (ytc_Ugxnvm8mN…)
Comment
The reasoning is flawed. AI is not just creating one genius Einstein in physic or biology and creating breakthrough in those narrow and specific fields, but is giving companies the ability and capabilities of creating millions and billions of agents to replace job tasks. AI goal is not for more job creation but will result in Net Jobs Loss of over 90%. Please discuss.
youtube · AI Jobs · 2025-10-14T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrRdN9S92Lh2LrLrd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxhudYPl-Fus8kgfPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy2mQA494wmxgzvwft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHZl0J6gJ7-X9lNG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxPx4p-8fY6vmHBrvh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxK9GquBb0nJ7pra4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwaKw5M_infA5eKk994AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz23n0XNOD0GKowDQh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxs3-Piw3M5kZc8FTR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxPgWntf_OD4A4mFy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
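The batch response above is a JSON array with one object per comment ID, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch in Python of how such a response can be parsed and indexed for lookup by comment ID — it assumes the response arrives as a plain JSON string, and uses two rows copied from the sample above:

```python
import json

# Raw batch response from the coding model (two rows copied from the
# sample above; a real batch would contain one object per comment).
raw_response = '''[
  {"id":"ytc_UgwrRdN9S92Lh2LrLrd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxs3-Piw3M5kZc8FTR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]'''

# Parse the array and index the coded rows by comment ID for O(1) lookup,
# mirroring the "Look up by comment ID" view above.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

print(codes_by_id["ytc_Ugxs3-Piw3M5kZc8FTR4AaABAg"]["emotion"])  # outrage
```

Indexing by ID rather than scanning the list each time keeps per-comment lookups constant-time, which matters once batches contain thousands of coded comments.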