Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The true problem is that somebody's is making money whilst appropriating knowled…" (ytc_UgxxoWW0x…)
- "AI is a tool, just like the internet and many other things. It gives us more op…" (ytc_UgyjBLc8s…)
- "Hi, I am disabled. I'm autistic and have aphantasia and very poor spatial reason…" (ytc_UgyE671Qb…)
- "I don't think I've ever heard a politician scream "I'M DOING INSIDER TRADING AND…" (ytr_Ugz964xqR…)
- "We have had robots in production for decades now, some jobs are still done by hu…" (ytc_UgzdRQ7Hs…)
- "But, with the introduction of computers in the workplace it was supposed to make…" (ytr_UgzEy_32M…)
- "Ai will wipe us out, not because they deside to wipe us out, but because it is a…" (ytc_UgwJZ9nsa…)
- "Ai prompter:"But the IDEA is MINE!" Having a "good idea" doesn't make you an art…" (ytc_UgxR1O-EF…)
Comment
The real issue is energy; it's highly likely there isn't enough energy on the planet to run AGI. ChatGPT alone took enough power to run an entire suburb for a whole year just to train its 4th model. Human-level AGI would require the entire United States energy infrastructure to maintain itself. A human brain only requires around 20 watts sustained; it's actually cheaper to use humans to do our thinking. And that's assuming constant power: if the grid goes down, the AGI dies. Humans can hit starvation, fall into a coma, clinically cease to exist, and still come back with full memory and cognition.
Source: youtube · Topic: AI Governance · Posted: 2025-06-24T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzggaTBHzHZbffaHBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8gYeWsViy1EkLlYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3yJi8bMVXGtxoQLd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxioJ6OWIryOkvsEvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyip9cpna_ev3nrny14AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyVDTQ_Co8759GfV5J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI1SJdGhu8RAb7j0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBVjtBRhd2mO__Zf54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUUyw7VGem5NU3G_V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBBtMaEtkBGk9zHrZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```