Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or open one of the random samples below.
A.I. scale up is in human nature. It's more about greed than advancement. These …
ytc_UgzNfXWKC…
I’m not a father but I’ve gone to school and university. Ai wasn’t needed so for…
ytc_UgxTkkvtP…
> AI models **now** have tainted data
Yeah, because up until the last couple…
rdc_mk7wihd
@Alexander_Kale
these are tools humans use. AI, AGI would be different it would…
ytr_UgzHLQ93n…
The matter becomes very dangerous. He will control every robot with his power an…
ytc_UgxFyzf8Z…
One of the ironies here is that he openly admits to having a narrow view of the …
ytc_UgweLt_o0…
Well, guys, check topics like "evaluation awareness". 1) LLMs are pattern discov…
ytc_UgwcDVVcw…
I love how all those guys are feeding of the fears if the people while spreading…
ytc_Ugw8nfYm5…
Comment
There is such an overhype with AI. When you mass produce because you rather to not pay wages to human beings, who happen to be your consumers, then the economy will be at a standstill. Who is going to buy your mass produced products? Also, robots can never have cognitive abilities like humans. We are more intelligent than machines. Machine do repititive tasks that someone has to program it to do. It can never think for itself and if in a surgery, thw machine can only do one thing but it can't see unexpected human reactions to certain things to know how to adjust. I would rather have a human surgeon work on me. These people have no idea what they are yapping about.
youtube · AI Jobs · 2026-04-19T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxAEU6lPXg6Djioj854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzMGIwigll55WZIDKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6ugvO3dfbnJbQsBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyDXLh6SkaOU7OcyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytnYY-dL6RjExui1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyLvXlnQQw9aCWgevJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsJrKBoVTPEy7nbTZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyx8b7siAcAgVsmAs54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy4EJskLu_fGYGo_hR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZnhaaMrdAHfg45zV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
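The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and validated in Python, assuming the codebook only contains the values observed in this output (the allowed-value sets and the `parse_codes` helper are illustrative assumptions, not the tool's actual schema):

```python
import json

# Allowed values per dimension, inferred from the codes shown above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid codes by comment ID.

    Entries with a missing ID or an out-of-codebook value are skipped,
    so one malformed object does not discard the whole batch.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded

# One entry taken verbatim from the response above.
raw = ('[{"id":"ytc_UgxsJrKBoVTPEy7nbTZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
```

Indexing by ID this way is what makes the "look up by comment ID" view possible: the displayed Coding Result table for a comment is just `codes[comment_id]` rendered as rows.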