Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Comparing the amount of energy used to TRAIN an LLM to the amount of energy used to POWER a human brain is a mismatch error. It might only require a light bulb's worth of electricity to run a brain, but it takes a whole lot more to train one.
The amount of energy required to TRAIN a human brain involves, at minimum, 18 years and multiple people, inside buildings that require light, heat, water, ventilation, using books (or e-books) that require their own energy and resources to create. There's millions of calories of food involved, both for the human brain being trained and for the other humans involved in training it.
youtube · AI Moral Status · 2026-02-07T10:3…
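The comment's back-of-envelope claim can be checked with rough numbers. A minimal sketch, assuming a ~20 W brain (the "light bulb" figure), ~2,000 kcal of food per day, and an 18-year "training" period; all of these are illustrative assumptions, not measurements:

```python
# Back-of-envelope comparison of the energy figures in the comment above.
# Assumed values (illustrative only):
#   - the human brain runs at roughly 20 W (the "light bulb" figure)
#   - one person eats about 2,000 kcal per day
#   - "training" a brain takes 18 years

SECONDS_PER_YEAR = 365 * 24 * 3600
KCAL_TO_KWH = 1.163e-3  # 1 kcal is about 0.001163 kWh

brain_power_w = 20
training_years = 18

# Electricity-equivalent of just *running* one brain for 18 years
brain_run_kwh = brain_power_w * training_years * SECONDS_PER_YEAR / 3.6e6

# Food energy consumed by the learner alone over the same period
# (the comment's point: the real total also includes teachers,
# buildings, heat, light, books, etc.)
food_kcal = 2000 * 365 * training_years
food_kwh = food_kcal * KCAL_TO_KWH

print(f"Brain runtime energy over {training_years} years: {brain_run_kwh:,.0f} kWh")
print(f"Food energy over the same period: {food_kcal:,} kcal ~ {food_kwh:,.0f} kWh")
```

Even this deliberately incomplete tally lands in the "millions of calories" range the comment describes, before counting any of the surrounding infrastructure.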
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw65ddg8VmmnUwyhZl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwrwf5djvzf1A4SzoJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxU93Imm0fGEf2_EG94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyCY8IUcgLecOVXmvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxte5dilGNP9aCBjLt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6k6Zxf9fxHuBflVt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyGGsy-Cc7bvs5zlWR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxWPwevF96UMECGT_h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBRQ_LaqDIqjkmBAN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk695FWI9EapL4Lb14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
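A raw response like this can be parsed and sanity-checked before the labels are stored. A minimal sketch, assuming the allowed label sets are exactly those observed in this batch (the real codebook may define more values):

```python
import json

# Label sets observed in this batch; the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate(raw: str) -> list:
    """Parse the raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this tool are prefixed ytc_ (top-level) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# First row of the response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugw65ddg8VmmnUwyhZl4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
rows = validate(raw)
print(f"{len(rows)} row(s) validated")
```

Rejecting unknown labels at ingest time keeps a single malformed model response from silently polluting the coded dataset.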