Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "People making more money today working with computers- BS - I know for a fact ab…" (ytc_UgwAaaLUX…)
- "I feel like striking is only going to solidify AI in the industries. There’s nev…" (ytc_UgzTdRIDC…)
- "Why are there no regulations to control AI and these freaking rich arseholes tha…" (ytc_UgyVQkWpw…)
- "these signs and more like loss of taste and smell, huge swollen legs, getting it…" (ytc_UgzSMNqBG…)
- "Anyone who knows how current \"AI\" works is not worried one bit. Its pattern matc…" (ytc_Ugy4oW3Zu…)
- "To date Japanese corporations are the ones doing nearly all of the R&D on androi…" (ytc_Ugx5dP8NJ…)
- "Maybe I should kms using chatgpt so OpenAI can be sued again and my family gets …" (ytc_Ugw_8gV0L…)
- "AI: Agrees with most of the stuff, but just letting you know it is for a specifi…" (ytc_UgwrhhXFt…)
Comment

> Current AI is garbage at actual hard tasks, like solving complex problems. Including software engineering or navigating legal hurdles. It takes 5 minutes as an experienced professional to see how it hallucinates all sorts ‘solutions’ that do not exist. And no one knows how to fix this, they are always confidently incorrect (or correct)

Source: youtube · AI Governance · 2025-06-21T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugwe6p494MSZUiKNeFp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyXcGe1ucGGXsRVGct4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxcVdoVb4SA9XtXZdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw4HRTNK7LG8Z_guIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQJeg5xFxp-OJEI5p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyQTRHtHfaSrhO1z8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyoKPVrHozCtQDY-xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzokjAV7-XipTOGuV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzsnsuMY5cJHqOXw8N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyI2T7eXrZ-2rJlHvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
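The raw response is a JSON array of per-comment codes keyed by `id`, so the look-up-by-comment-ID view can be reproduced directly. A minimal sketch (the field names and the sample record come from the response shown above; the loading code itself is illustrative, not the tool's actual implementation):

```python
import json

# First record from the raw LLM response shown above,
# used here as a stand-in for the full array.
raw = (
    '[{"id":"ytc_Ugwe6p494MSZUiKNeFp4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"fear"}]'
)

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coding result for one comment.
row = codes["ytc_Ugwe6p494MSZUiKNeFp4AaABAg"]
print(row["emotion"])  # prints "fear"
```

Indexing by `id` up front mirrors the "Look up by comment ID" control: each coded dimension (responsibility, reasoning, policy, emotion) is then a plain dictionary field on the retrieved row.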