Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The main issue that was not asked is cost. This isn’t a case of a $30 subscription replacing a job. That subscription isn’t covering a fraction of the total cost of the operations. This isn’t just in money - LLM operations are far more power intensive than the operations they replace. Take search - an older Google search took 1/2 a watt of power to process. Gemini takes 5 watts. Future models use even more electricity. AI advances are being fueled by a huge debt bubble that assumes that not just all IT budgets will be consumed - but all budgets for everything. If we normalize LLM vs human into operation per calorie we will quickly see that this method of problem solving is not viable. The tech bros keep saying that the future will be here when the AI creates itself. Heh - news flash - it’s stalling and the AI companies are fighting over the tiny pool of engineers that can create the models. In other words - the coming tech stock crash is going to make the dot com crash look like a minor blip.
youtube
AI Governance
2025-10-09T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGt8zLGGwS5Ije1jF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiQYfjl57zhCfC2mh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRshQakT0rFaInsLt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwffKRritCvaVzznPB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwrwFFnTn7pOdIY3Yl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwO-4lmXtEHQH9UTdx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAdHzEuJbdhJkfyEZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHlrNRc6D-WbPDJ1l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEOebJy7HjwTjvLh54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQsa3qSECOG1Nhw1l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
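The lookup-by-ID step can be sketched in Python. This is a minimal illustration, not the app's actual code: it parses a raw model response like the one above, validates that every record carries all four coding dimensions (raw LLM output can be malformed), and indexes records by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown; the function and variable names are invented for this sketch. Two entries are reproduced from the batch above to keep the example self-contained.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# (Two entries copied from the batch above; field names match the source.)
raw_response = '''[
{"id":"ytc_UgwGt8zLGGwS5Ije1jF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiQYfjl57zhCfC2mh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw model response and index coding records by comment ID.

    Raises ValueError if the output is not valid JSON or a record is
    missing a dimension -- both are real failure modes for raw LLM text.
    """
    records = json.loads(raw)  # raises ValueError on invalid JSON
    coded = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgxiQYfjl57zhCfC2mh4AaABAg"])
# → {'responsibility': 'company', 'reasoning': 'consequentialist',
#    'policy': 'regulate', 'emotion': 'fear'}
```

Note that the second record's values match the coding-result table for the displayed comment (company / consequentialist / regulate / fear), which is how the view ties a batch response back to a single comment.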