Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We become pets at best (if AI want them) or we use multiple H>E>M>Ps in low orbi…" (ID: ytc_Ugx0gQBdL…)
- "Why are yall scared tho? Ai cant harm you its pixels just turn the screen off. …" (ID: ytr_UgzSty7Cj…)
- "For a long time people in the know have been talking of how AI will be used to e…" (ID: ytc_Ugyw2zlZi…)
- "Humans are the dumbest creation on God's earth. Why do we need robots made by ma…" (ID: ytc_UgxSfOYrB…)
- "So, now one of my planned and dreamed jobs are in top 3 to be replaced (engenirr…" (ID: ytc_UgyW3msqt…)
- "No wonder this David guy hasn't won a Nobel.... He sounds like a corporate shill…" (ID: ytc_UgxzCTjPL…)
- "I work with high schoolers. Not a single one thinks AI is for old people…" (ID: rdc_n0lvsfu)
- "The power usage of AI data centers actually wouldn't be a problem if we switched…" (ID: ytc_Ugx_e747Z…)
Comment
It's not the AI that I'd bet against. It's our ability to power it. Sam Altman has called for an additional 250GW of power moving forward. The most the US has EVER added annually is 50GW. We do not have the copper, the manufacturing capacity, the people, the regulatory environment, in short the basic resources to grow that much without removing significant power from the existing grid. Read driving energy costs up exponentially for current consumers. Not to mention water needs. It's not the AI itself I'd bet against (Although LLM's are close to their maximum as stand alone AI's.) It's our ability to fuel the industry. That's just Open AI btw. Not all the other players.
youtube | AI Responsibility | 2025-10-27T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
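Each dimension in the result above takes one of a small set of categorical labels. A minimal sketch of that record type in Python, using only the label values visible in this sample's output (the pipeline's actual schema and field names may differ):

```python
from dataclasses import dataclass

# Label values observed in this sample's output; the full coding scheme may define more.
RESPONSIBILITY = {"none", "developer", "government", "ai_itself"}
REASONING = {"consequentialist", "deontological", "contractualist", "mixed", "unclear"}
POLICY = {"none", "regulate", "ban"}
EMOTION = {"fear", "outrage", "indifference", "approval", "mixed"}


@dataclass
class CodedComment:
    """One record of the raw LLM response: a comment ID plus four coded dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension uses a label seen in this sample.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```

The `is_valid` helper is illustrative; it only checks against labels that appear on this page.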
Raw LLM Response
```json
[
{"id":"ytc_UgwGrjEBSDff3mJS1Xp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzt-0Ny5geqbG1VNmN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz0PxcoFMikfAWUJEF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyXWh_3ZqyOZ3OA7jl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZdHFj3tdT5ZzwWd94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxm-0t2jsES6YOq91h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxKl4_xMMyy9svw3h4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyhfIkhYlQELOec2914AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHtMNaLErqJXzNwmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXyGV9CjQ3MLuorqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
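Because the raw response is a plain JSON array of these records, looking up the coding for any comment ID is a single pass over the array. A minimal sketch, assuming the response text is available as a string (the function name and variables are illustrative, not the tool's actual API):

```python
import json


def lookup_coded_comment(raw_response: str, comment_id: str) -> dict | None:
    """Return the coded record for comment_id from a raw LLM response, or None if absent."""
    for record in json.loads(raw_response):  # the response is a JSON array of objects
        if record.get("id") == comment_id:
            return record
    return None


# Example with an excerpt of the response shown above:
raw = (
    '[{"id":"ytc_Ugz0PxcoFMikfAWUJEF4AaABAg","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
print(lookup_coded_comment(raw, "ytc_Ugz0PxcoFMikfAWUJEF4AaABAg"))
# {'id': 'ytc_Ugz0PxcoFMikfAWUJEF4AaABAg', 'responsibility': 'government', ...}
```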