Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This is why I strongly support hardening ourselves by, every day, speaking to AI…" (ytc_Ugx6lt36D…)
- "Thank you for your comment! If you're interested in more AI interactions, rememb…" (ytr_Ugwu6rqGi…)
- "What gets me interested in this video? What's the video about Sophia the robot W…" (ytc_UgxLgbqRR…)
- "it's sooo wild to me how people write those comments like "oh, i use AI to do th…" (ytc_Ugz-yY2wc…)
- "Yes, i use it almost the same way, but I would suggest requesting (rather requir…" (ytr_UgwgmLGGE…)
- "This is how AI art is meant to be used. not for anything public, monetary, or or…" (ytr_UgxOJxaiu…)
- "By soullessly mimicking human outcomes (ai doesn’t try to be human it just tries…" (ytr_UgwwuhIZT…)
- "The big lie of this video is AI outperforming humans. They don't have to. They n…" (ytc_Ugw4Fd0lY…)
Comment

> 1:08:00. A kind reminder, when speaking of AI solving environmental problems, such as the carbon impact of cement plants: those huge servers on which AI models are running are becoming some of the world's largest consumers of energy (and at the moment, not only clean energy).
> A 200 MW AI data center running 24/7 will consume roughly 1,750 GWh of electricity per year. In comparison, a standard-sized cement plant producing about 1 million tons of cement per year uses roughly 1,000 to 1,500 gigawatt-hours (GWh) of total energy annually.

Source: youtube · AI Moral Status · 2026-02-28T18:3… · ♥ 19
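The energy figures quoted in the comment above can be sanity-checked in a few lines (all numbers are taken from the comment itself, not independently verified):

```python
# Check the commenter's arithmetic: a 200 MW data center running
# continuously for a year, compared against the quoted cement-plant range.
MW_DATACENTER = 200          # assumed constant draw, per the comment
HOURS_PER_YEAR = 24 * 365    # 8760 h (ignoring leap years)

annual_mwh = MW_DATACENTER * HOURS_PER_YEAR   # energy in MWh/year
annual_gwh = annual_mwh / 1_000               # 1 GWh = 1,000 MWh

print(annual_gwh)  # 1752.0 — i.e. "roughly 1,750 GWh" as the comment says

# The quoted cement plant (~1 Mt/yr) uses ~1,000–1,500 GWh of *total*
# energy annually, so the hypothetical data center sits just above that band.
```

So the comment's figure checks out: 200 MW sustained for 8,760 hours is 1,752 GWh per year.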
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz_vmNm2AVvxcMwb1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu7WuBnJmXbf1WQDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy_U5XcU66OnSwML114AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzIJyDJEjfeTxNRSQt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvA94Ny2zOXkAgJPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwGU1M7_qfnTLKnaz14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJSiOJ6xwaTYBaeGp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwgczi9xXpVeubJHoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzV3Lp-4PFD9SI-poB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtLC5215Bd4Dsm7TN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
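The raw response is a JSON array of per-comment codings. A minimal validation sketch follows; note that the allowed values below are only those observed in this one output (the project's full codebook may define more), so treat the sets as an assumption:

```python
import json

# Allowed values per dimension, inferred ONLY from the sample output above.
# The real codebook may include additional categories (an assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

# One record from the raw response, used here as test input.
raw = '''[
 {"id":"ytc_Ugxu7WuBnJmXbf1WQDp4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = json.loads(raw)
for rec in records:
    # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
    assert rec["id"].startswith(("ytc_", "ytr_")), rec["id"]
    for dim, allowed in ALLOWED.items():
        assert rec[dim] in allowed, (rec["id"], dim, rec[dim])

print(f"validated {len(records)} coded comment(s)")
```

A check like this catches the common failure mode of LLM-based coding, where the model invents a label outside the schema, before the record reaches the results table.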