Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "it's actually very sad and this digital artificial intelligence it's getting out…" (ytc_Ugz8LB1Q8…)
- "I'm just wondering why a dev would run AI on a production server? 2TB drive and …" (ytc_UgzXLLbX0…)
- "It always amazes me that even a brilliant mind, such as hers, can't understand t…" (ytc_UgzXAt8Y9…)
- "As a contractor running a small business. Even with 25 years experience, I lever…" (ytc_UgxdT1rw0…)
- "AI robots are like a dead body and data the soul of AI. Now it depends upon the …" (ytc_UgxdICYsy…)
- "You can not be replaced by ai,if only you know how to use them intelligently…" (ytc_Ugz7mUKcF…)
- "What I don't like of AI haters is the fact that they call it steal while they ha…" (ytc_UgyIWMa1Y…)
- "Before i met my current art teacher i only drew with pencils and pens, without a…" (ytc_UgzguoeZl…)
Comment
I'm not afraid of AI. Everyday it proves to me that it's still dumb way too often.
Imagine if you had the same kind of access and speed of information as an LLM, as well as the additional tools (create documents, images, use online tools). Essentially every question asked would be the equivalent of me asking a human and allowing them 500 years to formulate the best possible answer, or complete whatever task you are asking an agentic LLM to do.
The fact that they have this much instantaneous access to tools and information, with the level of processing we're talking about, the results aren't impressive.
More than that, while the competency of these models sharply rose initially, the development is slowing to a near halt. These models aren't getting noticably smarter anymore.
youtube · AI Governance · 2026-03-27T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugx_1unNLFFQZcxxK3B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy5uHdXPJ2viswN0ct4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyPx9Y-Z7Kme4mIUDl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugwd3BD4kqIcoO3j36h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx_ObNw_PkS_E2F3sB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgynHpG8o9A-zPGCciN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxovCncCUrw75DgjBZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxLjo9d_P2zaaYg7EJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugzrj0KeTvakUVTUD5V4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgwEbyi3vvfKk-v8FK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
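Since the raw response is a JSON array of per-comment codings, looking up the coding for a given comment ID amounts to parsing the array and indexing it by `id`. A minimal sketch (the variable names and the two-row sample are illustrative; only the IDs and dimension values are taken from the response above):

```python
import json

# A shortened version of the raw LLM response: a JSON array with one
# coding object per comment (two rows shown here for illustration).
raw_response = '''[
  {"id": "ytc_Ugx_1unNLFFQZcxxK3B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxLjo9d_P2zaaYg7EJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxLjo9d_P2zaaYg7EJ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

The same index supports the "look up by comment ID" workflow: any coded comment's dimensions can be retrieved directly from its `ytc_…` identifier.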