Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or browse the random samples below.

Random samples:
- "What a garbage chart. Voice recognition (i guess as an assistant on android ok).…" — ytc_UgxO5Xyh5…
- "This is just the beginning of the problems that Artificial intelligence will bri…" — ytc_UgzT1Ga1f…
- "When AI becomes so smart that we don’t understand it, it’s over. For AI. The plu…" — ytc_Ugy5CAQw3…
- "AI is just a database which we put into it and it increase its knowledge about t…" — ytc_UgwO3Ov7Z…
- "There is not a single counter-argument in this video that is not 100% on point. …" — ytc_UgxzwzB1g…
- "Ha ha ha. How much did you sell your soul for to send that? Or is it just a Puti…" — ytc_Ugwn0_LZh…
- "AI has already worked out the future will not be conducive to human development …" — ytc_Ugxg2CzZ2…
- "AI is a hammer. You can use it to build a house or to tear it down. Your choice.…" — ytc_Ugzt-JGOK…
Comment
i agree that the current AI framework is not and will likely not replace software engineers completely, and hence context engineering is the way to go. However, that does not mean that with the current AI you would not be able to employ, say, 20 engineers instead of 50. I do think you would need way less people with the current AI framework especially as it gets better with scaling.
Source: youtube · Topic: AI Jobs · Posted: 2026-02-17T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxe31B9aSuL-Io5oFB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzC1rp7y2pkueyr_fh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxl-so6gZXScdsck2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwypr-pCI7Dry8nhB94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx-pOmt0TB4izp1ymd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5FoSh6HNARw1OuSx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxse7T0ASjxU1rPY5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9Yk0nCdVI0jydEoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw7KGriSC5lg8FDesh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_d6VHvmPxP2PPp3h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
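The lookup-by-ID workflow above can be sketched in a few lines. This is a minimal sketch, assuming the raw model output is always a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys shown above; the `raw_response` excerpt and the `index_codings` helper are illustrative, not part of the actual tool.

```python
import json

# Excerpt of a raw model response: a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_Ugxe31B9aSuL-Io5oFB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzC1rp7y2pkueyr_fh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index each coding by comment ID,
    skipping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgzC1rp7y2pkueyr_fh4AaABAg"]["emotion"])  # outrage
```

Indexing by ID rather than scanning the list makes repeated lookups O(1) and makes malformed records easy to drop at parse time.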