# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- My problem with the "philosophers will figure out whether this qualifies as 'rea… (`ytc_UgwhJHE0X…`)
- How Ironic that we can tell they are all humans, because humans try so hard to i… (`ytc_UgwD3Gy_m…`)
- The robot in the Middle with the Hat on looks like it's come from a laboratory i… (`ytc_Ugzp7pQ6G…`)
- What struck me was how forthcoming Sophia was, yet in other cases remained neutr… (`ytc_Ugyl3CXen…`)
- I'm probably not be deep enough into AI but to me as an informatics student AI i… (`ytc_UgwYcPaYT…`)
- Just Hate AI Videos Voiceovers Pure 💩💩💩💩 Also make me want to watch less of yo… (`ytc_UgyhfxIPC…`)
- Look, I have a problem with artificial intelligence to begin with I think to a b… (`ytc_Ugz48H9rI…`)
- I do paid surveys amd last year, 2 of them really stood out. The first was AI ar… (`ytc_Ugx8XFuk_…`)
## Comment
I'm very down-to-earth in my analysis, and this whole scenario strikes me as unrealistic for one simple reason: energy.
Even if AI systems did start advancing rapidly, we currently don’t have nearly enough electricity to support a truly autonomous, global-scale AI infrastructure. Running powerful models, massive data centers, robotic systems, and surveillance grids—all of that requires enormous energy input, and we're already struggling to meet demand.
Without a major breakthrough in energy production—like viable, scalable fusion—this kind of AI takeover just isn't technically possible in such a short timeframe. It’s like trying to power a spaceship with a car battery.
Before worrying about AI becoming godlike, we might want to ask: how is it even going to keep the lights on?
youtube · AI Governance · 2025-08-03T16:0… · ♥ 15
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugx_ol9zdBr3S4n3n6N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQckF60wgNJEY-5y54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxYJiNbPMLSq-KFMvl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-tkzcd2QELpVdav54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWT7oLxRkMDhCrqLl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMwbK71tm0OVg311Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBcQNyM4ieGCsTLsJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgweE9lcC8FHIYUW6mJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwStHNNqt9FDrwghbZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzywjcoY0JgGy2JQ6B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
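Because the model returns one JSON array per batch, mapping a comment ID back to its coded dimensions is a simple parse-and-index step. The sketch below assumes only the response shape shown above (an array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys); the `index_by_id` helper name is illustrative, not part of the tool itself.

```python
import json

# An abbreviated raw LLM batch response, in the shape shown above.
raw_response = """
[
  {"id": "ytc_Ugx_ol9zdBr3S4n3n6N4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQckF60wgNJEY-5y54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse a raw batch response and key each coded record by comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
# Look up the coding for the energy-skeptic comment shown above.
entry = codes["ytc_Ugx_ol9zdBr3S4n3n6N4AaABAg"]
print(entry["emotion"])  # → indifference
```

The same lookup reproduces the Coding Result table for any comment in the batch, which is what the "look up by comment ID" view does.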