Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Deep Learning is one of the Algorithms used in Large Language Models a sub set o…
ytc_Ugx3_-Z0k…
But companies are still doing it. They don’t want us to use it as a tool. They w…
ytc_UgxNyQ1R6…
This is all such A-level anthropomorphism. AI doesn't want to do anything. It ha…
ytc_UgwxdNump…
Less workers needed= less people needed = let's get rid of excess people= agenda…
ytc_UgzmRRy72…
I still want to use an encrypted middle finger watermark that only appears when …
ytc_UgwK5uebk…
I use Ai but not in the way you think I like to use it as something to hype me u…
ytc_UgwkHM-dU…
I'm sure The Guardian will be interested particularly as facial recognition is g…
rdc_ohzx1d5
ai gen images are not art. they are stolen synthesis of others' years of work so…
ytc_UgxejTYM5…
Comment
That thing about training AI takes the power of a small city, but running a human brain uses the power of a large lightbulb, that's a false equivalence. Training AI is the equivalent of the time it takes for a human to learn all human knowledge.
ChatGPT calculated:
Plausible span 9.4–18T words:
Human: ~13.2–25.2 GWh
AI (GPT-3-like): ~53.8–103.0 GWh
Ratio stays ~4.1× (both scale linearly with words)
And one human isn't duplicable. So spending that energy is relatively ok.
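The ~4.1× ratio quoted in the comment can be sanity-checked directly from the GWh figures it gives. A minimal check, using only the numbers stated above (the figures themselves are the comment's, not independently verified):

```python
# Energy figures (GWh) quoted verbatim in the comment above.
human_low, human_high = 13.2, 25.2   # human, over the plausible 9.4-18T-word span
ai_low, ai_high = 53.8, 103.0        # GPT-3-like training, same word span

# If both scale linearly with words, the ratio should be constant across the span.
ratio_low = ai_low / human_low
ratio_high = ai_high / human_high
print(round(ratio_low, 2), round(ratio_high, 2))  # both land near 4.1
```

Both endpoints give roughly 4.08–4.09, so the "ratio stays ~4.1×" claim is internally consistent with the stated figures.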
youtube
AI Moral Status
2025-11-02T23:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySjw3HUbNfgUPHoo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxbjWjDSEm4eWtkIUt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwTSUZO3MOmecGIYI14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyuLJ9LfUm5FJ10v54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwD7DtAACh07ZQG7TR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy4QWkWYAhuENknySt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyeB0f8JDA-7a4_EW94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwLNMQxSFcaMU9y06V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzI0FSrTlVZXfcim5x4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgySU7nxn2Fy84EqAjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
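The Coding Result table above corresponds to one element of this batched JSON array. A minimal lookup sketch, assuming the raw response parses as JSON like the array shown (the embedded rows below reuse two records from that response, and the ID looked up is illustrative):

```python
import json

# Two rows copied from the batched response above, for illustration.
raw = '''[
 {"id":"ytc_UgwyuLJ9LfUm5FJ10v54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwD7DtAACh07ZQG7TR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

def lookup(batch_json: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return next((row for row in json.loads(batch_json)
                 if row["id"] == comment_id), None)

row = lookup(raw, "ytc_UgwyuLJ9LfUm5FJ10v54AaABAg")
print(row["reasoning"], row["emotion"])  # -> consequentialist indifference
```

`next(..., None)` keeps the miss case explicit: an ID absent from the batch returns `None` rather than raising.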