Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "People can barely understand what they read in YT comments. Who thought that ope…" (ytc_UgzQw4Ahl…)
- "I think I need to find me a cave to live in because I do not want to work with s…" (ytc_Ugzxy0H5I…)
- "One of the things that gets me about AI art is that it is ONLY imitation, meanin…" (ytc_UgwhFkbVB…)
- "Every country needs to pull their weight, and we are in no way close to doing so…" (rdc_da3urxo)
- "At least we can identify the people who's jobs will be replaced by AI first…" (ytc_UgxsxlQ0e…)
- "If you "poison" a train drawing, it would be the best. Ai doesn't know how to do…" (ytc_Ugx3aMOKh…)
- "Here's the most "Bernie Idea" that I've never heard Bernie say: 50/50 profit sha…" (ytc_UgyUFJM99…)
- "It is important to question the relevance of this question. It would be foolish …" (ytc_Ugjl0Zmjj…)
Comment
i'm not sure i agree, AI needs that last 20% of improvement to be REALLY useful. Anyone who uses AI knows how CONFIDENTLY incorrect it can be, and not some small % of the time. That 20% is going to get harder to get, when conventional LLM training becomes less effective, compounded by the fact that so much of the internet an AI scrapes is just other AI content.. LLM training becomes almost incestuous.
youtube · AI Governance · 2025-06-23T11:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw5OnOq1apyhxfSInd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6GMwGphlc4zcAETB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyplHmgyL3envBc9i54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3QhWnjyuD8NIAG1t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwWfjVYCJpa52r7Cb94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyR74bbngVvomG_UQh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwo5CEQJ8pbMaoUsq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwi7uMkQj4_bJB2BeZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtX2prwNhbxONBT9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzbgJShS7Z7OsqVSm14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
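The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response might be parsed and checked before loading it into a database (Python assumed; the allowed value sets below are inferred only from the codes visible on this page, not from the project's actual codebook):

```python
import json

# Two records excerpted from the raw response shown above.
raw = """
[
  {"id": "ytc_Ugw5OnOq1apyhxfSInd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzbgJShS7Z7OsqVSm14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Value sets inferred from the responses on this page; the real
# codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def validate(records):
    """Split records into (valid, errors) by checking each coded
    dimension against its allowed value set."""
    valid, errors = [], []
    for rec in records:
        bad = [k for k, allowed in SCHEMA.items()
               if rec.get(k) not in allowed]
        if bad:
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors

valid, errors = validate(json.loads(raw))
```

A check like this catches the common failure mode where the model invents an off-schema label (or drops a field entirely), so malformed codes can be flagged for re-coding instead of silently stored.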