Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
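How the lookup works depends on where the raw responses are persisted, which this page does not show. As a minimal sketch, assuming the records are stored one JSON object per line in a hypothetical raw_responses.jsonl file (same record shape as the batch output at the bottom of this page):

```python
import json

def lookup_raw_response(comment_id: str, path: str = "raw_responses.jsonl"):
    """Return the stored coding record for one comment ID, or None.

    Assumes each line of the (hypothetical) raw_responses.jsonl file holds
    one JSON object with an "id" field, as in the batch response below.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

print(lookup_raw_response("ytc_UgzTPmXNv6u_1bx1rNJ4AaABAg"))
```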
Random samples:

- ytc_Ugz-UfGH3…: "In Roman’s defense @ 58:54, would be… case in point, how many times have we hea…"
- ytc_Ugxf1ljBq…: "Maybe it till get to a point where some people own a few robots and rent them ou…"
- ytc_UgzxB7bQV…: "I don't understand why Colossus, Colossus 2 or XAi were never mentioned. Musk on…"
- ytc_Ugz5byaaG…: "I see absolutely no problem in using the work of an artist to generate more piec…"
- ytr_UgzGpJtJF…: "So tell me if putting a prompt into an AI is not 100% prompting then what is it…"
- ytc_UgxhN3lkL…: "Its clear to me that the fed in conjunction with private entities are angling to…"
- ytc_Ugw3RZ1Xk…: "There needs to be a cost benifits analysis on the annual cost of having a human …"
- ytc_Ugwl17iy1…: "The problem with this is that AI is sycophantic, innit? It will always lean to w…"
Comment (youtube · AI Governance · 2025-08-10T09:1… · ♥ 1)

> LLMs have reached a ceiling. There's no more training data available. They're still stupid as a door. Great at reciting studies and identifying patters, but if you coded anything with AI you know how frustratingly idiotic it is. The current tech won't be able to scale up to AGI, and by 2027 when the agent produces its first robot it'll forget to place a charging port on it 😂😂😂😂😂😂
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
The verbatim JSON the model returned for this batch; the final record matches the coding result above.
```json
[
{"id":"ytc_UgzfQdT58BPiDy5daDB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgydZW2l5QKvh6HuDGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyH-ByG9G9kpLGUOK14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBCREkQ-z749LJGyV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzSf5AJtPAyluVHHQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz6861p--5-kY1H7z94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZL9FZ2NDUJ0aZyFJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxxWDPgunOQTvsNBS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxb8zXARZMgGSibYHJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzTPmXNv6u_1bx1rNJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
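Since each batch record is plain JSON, downstream code can check labels before storing them. A minimal sketch, assuming a closed label vocabulary inferred from the values visible on this page (the real code book may define more categories):

```python
import json

# Allowed values per dimension, inferred from the labels visible above;
# the actual code book may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject out-of-vocabulary labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
    return records
```

Failing fast on out-of-vocabulary labels makes it easier to notice when the model drifts from the prescribed code book.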