Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples (click to inspect):

- "One day technology will control humans . One step closer . Crazy to see how many…" (ytc_Ugx7cIBHJ…)
- "Human is can't make perfect robot like a real human so it fake robot 😅…" (ytc_UgxFE29GP…)
- "No matter how effective or efficient AI may be, the need for the human touch and…" (ytc_UgyGqv-kX…)
- "I have a “friend” who constantly posts ai images in the gc and it is REALLY anno…" (ytc_Ugz9ytQVM…)
- "what do you mean that the AI tried to leave the building, at story of Claude Opu…" (ytc_UgxjS381Y…)
- "I feel like the important thing to know is that AI needs *actual art* to make wh…" (ytc_Ugwa45su7…)
- "We were not ready for responsible social media use and look at the state of the …" (ytc_UgzAUdnoU…)
- "As fat as a sumo / Eating nutrient paste from a tube / UBI / AI overlords / Dystopia …" (ytc_UgzHqrh7E…)
Selected comment:
LLMs cannot directly replace humans. In your examples, junior employees are being fired, and their managers are managing AI agents, but it doesn't work this way at all. You still need the junior employees to interact with the agents, they are the ones who can do the right prompts, review and approve the work, their managers just cannot do that.
If AI agents are to replace humans (and they probably will), it will not be by directly replacing humans, but indirectly, by boosting the performance of fewer employees, so instead of having 100 devs, you can achieve the same work output by having 50 devs using LLMs, and this will take time, learning and adjusting to new ways of working. Or aggresive management who buys the hype and doesn't appreciate or trusts their employees, so bad employers by definition.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-11-24T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
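
To make the schema in this table concrete, here is a minimal illustrative sketch (not part of the actual pipeline) of how one coded comment could be represented in Python. The field names mirror the keys in the raw LLM response below, and the example values in the comments are taken from the responses shown on this page.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodedComment:
    """One comment's codes across the four dimensions in the table above.

    Illustrative only: field names mirror the keys in the raw LLM response,
    and the example values come from the responses shown on this page.
    """
    id: str
    responsibility: str   # e.g. "none", "government", "developer", "company", "ai_itself"
    reasoning: str        # e.g. "consequentialist", "unclear"
    policy: str           # e.g. "none", "regulate", "liability"
    emotion: str          # e.g. "indifference", "resignation", "fear", "outrage", "approval"
    coded_at: Optional[str] = None  # ISO timestamp, as in the "Coded at" row
```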
Raw LLM Response
[
{"id":"ytc_UgziaHaOXUmVBgGmR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQZDg52RxZLlClm-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnZguwon_i8jC2IFd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzs93dq2-uRqOPDgEx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwLFzHoFjXui86u_bp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy5EwvThPOz-L2OkFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHOfc5RCndaSTFdJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhDbGVNPe3abIE-v94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzW4Tve7Fxntv_hO5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrcIExoVhNby0AncJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
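
The "look up by comment ID" view above can be reproduced offline with a few lines of Python. This is a minimal sketch that assumes the raw response is a plain JSON array like the one shown; a real pipeline would also need to handle malformed or truncated model output.

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse one raw LLM response (a JSON array of per-comment codes)
    and index the records by comment ID for direct lookup."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Hypothetical usage, with `raw` holding the JSON array shown above:
#   codes = index_by_comment_id(raw)
#   codes["ytc_UgxnZguwon_i8jC2IFd4AaABAg"]
#   -> {"id": "ytc_UgxnZguwon_i8jC2IFd4AaABAg", "responsibility": "government",
#       "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
```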