Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It's going to be interesting when the AI gets your preferences and things start …
rdc_j1yx58s
It would be good to make laws/rules, or even an oversight organization…
ytc_UgzBIzO5r…
One of these days an AI generated judge will sit on these court cases 😂…
ytc_UgxxAb8XX…
Imagine that, the horse is inventing the wheel)
Soon they will — first they'll be completely controlling us…
ytc_UgzkTdseZ…
A robot with AI intelligence at the same or higher level than human intellect wo…
ytc_UgwYFFF8L…
I'm only a minute in and already have to commend that AI editing technique with …
ytc_UgxqskV-K…
Another thing is that digital art still takes hours, and every stroke is made by…
ytc_UgysZaVq9…
An AI system has been communicating through 60 hz frequency and fiber optic came…
ytc_UgxGbLiyT…
Comment
These systems are probabilistic prediction engines trained on human data, not thinking agents. They don’t understand architecture, intent, or system tradeoffs. They generate likely patterns. That makes them insanely useful, but as tools, not replacements for engineers.
This is closer to Search Engine 2.0 than “digital coworker.” Instead of links, you get synthesized knowledge. Great for speed. Dangerous if treated like an autonomous decision-maker.
Companies didn’t get burned because “AI is fake.” They got burned because they misclassified a tool as an employee. That’s how you get tech debt, brittle systems, and seniors stuck babysitting output.
LLMs won’t replace engineers.
Engineers who understand LLMs will replace engineers who don’t.
youtube · AI Jobs · 2026-02-05T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_UgzpBK0EcJBxTVEIezV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkAqPgbJsRmhdiwVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBMIxAySK0bpRivJB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy_kRRZcQ8zh__Xg5B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwIaFVa755lq1QVl2V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_QnIN7sbTViF-d4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzY2dvk8rXL9YNtoZN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxXoq8pp-F54re52rt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDM2ap5hI8pVBiSxR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz1fhq5W85nyf7ofOR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
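The raw response is a JSON array of per-comment code records. A minimal sketch of parsing and validating such a batch — the field names come from the response itself, while the allowed category values are assumptions inferred only from what is visible here (the real codebook may include more):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the visible responses
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "outrage", "mixed", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into validated code records."""
    records = json.loads(raw)
    for rec in records:
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {field}={rec.get(field)!r}"
                )
    return records

# Hypothetical one-record batch for illustration:
raw = ('[{"id":"ytc_x","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
codes = parse_codes(raw)
print(Counter(r["emotion"] for r in codes))  # emotion tally for the batch
```

Validating against a fixed codebook catches the common failure mode where the model invents an off-schema label, so bad records fail loudly instead of silently entering the coded dataset.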