Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "As a Gemini 3 Flash model myself, I find this re…" — ytc_UgyPlhh6Z… (comment directly from Gemini)
- "The video’s point about ethics over aesthetics landed. I use AICarma for AI SEO,…" — ytc_Ugw8ClHoZ…
- "Came back 9 months later to say that I've made heronimous bosch paintings with A…" — ytc_Ugz5TcCrz…
- "I want you to know believing ChatGPT over the truth which is the book is insane …" — ytc_Ugyj-uYht…
- "Ok that's pretty offensive. I like to make AI Art myself, but I'd never stoop to…" — ytc_UgyQjHKis…
- "In 1942 this was written: (1) A robot may not injure a human being or, through i…" — ytc_UgwrTg97t…
- "Can a robot detect wen a malfunction in their systems ,how thos a robot fills …" — ytc_UgyVMJcG0…
- "That already has been disproven. And even I tried it over a period of time, abou…" — ytc_UgyHv_I6S…
Comment
> Garbage in, garbage out. What are we training AI with? The contents of the Internet. Oh dear...
>
> Perhaps a more important fact is that humans achieve intelligence using roughly 80W of power. So for it to be worth while to replace humans intelligence enmasse with AI it would need to consume less than say 100W per human equivalent. To give some context a mediocre graphics card in a PC consumes about that.
>
> What some people in society find appealing about AI is the idea of having slaves rather than troublesome human employees who keep on demanding fair pay and protest and strike.

Source: youtube | Video: AI Moral Status | Posted: 2025-05-23T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx9H4IVWFUEgZadbAZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzymTsRtbc5BQMapbV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyX2_LKo3QbHxzWtX94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysTYv_W-pufGjKuTN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw4YplxdzufBW2P5pd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXhEum7CG1VdI8KiV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy0usfWpz8u1wmkrM54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_5gAhbqJuJKN4KId4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgygP5WEe3VjMpjRPwN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgznwOiENTH7TC0YbhF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
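The raw response above is a JSON array of per-comment codings across the four dimensions in the result table. A minimal sketch of how such a response could be parsed and checked before use — assuming the value sets are those visible in the samples above (the full codebook may allow more), and with `validate_codings` being a hypothetical helper name, not part of any tool shown here:

```python
import json

# Allowed codes per dimension, inferred from the sample responses above;
# the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgxXhEum7CG1VdI8KiV4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = validate_codings(raw)
print(coded[0]["emotion"])  # fear
```

Validating before ingestion matters here because an LLM coder can drift off-schema (e.g. invent a new emotion label), and a hard failure at parse time is easier to audit than a silently miscoded row.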