Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Grok and Gemini are the white knights, those who will always do what's correct. …" (ytc_UgwG7qjfN…)
- "Soon no one will have time to be empathetic for anyone …. If your electricity bi…" (ytc_UgzEi4Cl5…)
- "AI doesn’t need to kill us. we’re already doing a great job of it. It literally …" (ytc_Ugx8m8moo…)
- "Its all well and good until these AI employees/systems fail critically. They're …" (ytc_UgwW5Gd07…)
- "They’re nearly self driving actually! Most tractors / combines now run on sat na…" (rdc_ksksf2p)
- "Not happening, sorry. The energy and infrastructure costs (GPU chips) of AI mak…" (ytc_UgzK4EnTE…)
- "Fan works are legal under fair use. AI training is a way to make money and doesn…" (ytr_UgzBHjUfQ…)
- "Thank you for this conversation! I learned so much from it ❤. Very important les…" (ytc_UgxQwHtQM…)
Comment
Great video, love that you take a very differentiated stance. Only issue I have is that you say the AI's consult a database of human made art, this is not true, the AI's are trained on a database of human made art but when actually generating stuff they simply consult their weights which are usually pretty small in file size (4gb for stable diffusion) in other words much too small to hold that many images.
21:19 Welcome to science. In all seriousness this is just how humans do things in our era, ever since the renaissance we've been making incredible progress nonstop and nothing other than a massive catastrophe is ever going to stop that.
youtube · Viral AI Reaction · 2022-12-05T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxypsB6LpfmMzZ27hZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdLsDYW0rgGOtAMRN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLVxJDiYhHKtEB9XZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWcLQZD6QbKF0zFXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx6oMqQ4iQn0LOND654AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4MHLXQE7oMImmsTZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzXavUALCfJ2m24AGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBJFqajRLaMR4wv1B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxfOE8o-5M9yq12BTR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPTzunsu4JbpPHuxR4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"}
]
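The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing such a response and indexing it by comment ID (the allowed values per dimension are inferred from the visible sample and may be an incomplete subset of the full codebook):

```python
import json

# Allowed codes per dimension, inferred from the sample records above;
# the full codebook may define additional values.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "fear", "mixed"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Reject records whose codes fall outside the known value sets.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = (
    '[{"id":"ytc_UgxypsB6LpfmMzZ27hZ4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
coded = parse_coding(raw)
print(coded["ytc_UgxypsB6LpfmMzZ27hZ4AaABAg"]["emotion"])  # indifference
```

Indexing by ID is what makes per-comment lookup (as offered at the top of this page) a constant-time dictionary access rather than a scan of the array.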