Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI Still need humans, they might be smart and control all electronics, but we ca…" (ytc_UgxkUPx6l…)
- "Well, that is not how AI workd, generations don't work by recombining data, and …" (ytc_UgzvL46Nu…)
- "the brush comparison is just stupid. but the camera one has some merit... using…" (ytc_UgyXc_SMx…)
- "Remember boys and girls open ai was okay with it in fact they swooped in for the…" (ytc_UgzUiJqjS…)
- "US has a power problem for AI, but not China. Also, not only Chinese AI firms li…" (ytc_Ugx2oArJW…)
- "Yea blame it on AI instead of corporate greed. That’s how you keep the rich rich…" (ytc_UgySAvq9v…)
- "if ai becomes self aware whats to say it could launch nucliar missles.on its ow…" (ytc_Ugx2kC7Qh…)
- "8:12 if most AI were to think about this question, they'd probably quickly reali…" (ytc_Ugw_EHAAn…)
Comment
As a senior developer (20+ years experience) using Ai on a daily basis, my opinion is that this video greatly underestimates what AIs skills are. The skill humans need is bringing architecture, product, performance, security and legacy into the attention of the AI, so balancing the context, what is important or not. I don’t think humans need to fully master these 5 skills themselves in detail anymore, just as long as they know how to have a good conversation with the AI about it and to be critical of the AI. (So yes, you still need some of these skills but you can also ask the AI to help you fill in your gaps).
youtube · AI Jobs · 2025-11-28T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyA6LMAYstAuX7S_i54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6Fi81rrrWHN4ggi54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHPliiAZJapuW45jV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTQhRvzvo4zdt7Hzl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKr0-M97m3WmnEX2d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCepkWsPGrQ40ZjQl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgykEoiDBpN9euMCbiZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwbXmlxAYHU1EZAoth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBWliZbSPuTWDupvd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeyrSR0NDy7UzPcft4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
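A raw response like the one above can be loaded and indexed for the ID lookup shown earlier. The sketch below is illustrative, not the tool's actual code; it only assumes the five fields visible in the JSON (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the helper name is hypothetical:

```python
import json

# Minimal excerpt in the same shape as the raw LLM response above.
RAW_RESPONSE = """[
  {"id": "ytc_UgyHPliiAZJapuW45jV4AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]"""

# Dimensions every coded row carries, per the Coding Result table.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse model output and build a comment-ID -> codes lookup,
    rejecting rows that are missing any expected dimension."""
    index = {}
    for row in json.loads(raw):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing {sorted(missing)}")
        index[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return index

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgyHPliiAZJapuW45jV4AaABAg"]["emotion"])  # approval
```

Validating the key set up front means a truncated or malformed model response fails loudly at parse time rather than surfacing later as a blank cell in the result table.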