Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Some of you people are actual nuttos. Quit watching click bait slop. Ai is not real as of now, the current generation of ai are only a danger to the mentally unwell, and the ignorant; who simply do not know better. The idea that an autocorrect program is "aware" is as ridiculous as thinking a calculator is aware; it's not, full stop. Llms do not think, they aren't capable of thought, have no sense of self, and no opinions. The only real danger to society are the billionaires that think they can replace humans with machines as sharp as a sledgehammer; they literally cannot do basic math.
Source: youtube
Posted: 2026-01-18T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwCTbjdunOtb81FoVp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzSNeRJBUfWECmxHTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyfCR5xiSqOBzYYEV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyThEwXNy_Av1tsiHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgyUHNMPonG29ivC0NJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNSAHZvJYfo8ElcFJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwrX2C6Q2-8bPH8GI94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzslc19O05nRQZiJP94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxGySEgtUAVi7J25Y14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwpg0FE5xIEU9p_ogh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
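The raw response above is a single JSON array with one object per coded comment, each carrying the five coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`, plus the `id` key). The lookup-by-ID view can be reproduced by indexing that array. A minimal sketch in Python; the function name `index_codings` is illustrative, not part of the tool, and the two entries are copied from the batch shown above:

```python
import json

# Example raw model output: a JSON array of per-comment coding objects.
# The IDs and values are taken verbatim from the batch above.
raw_response = '''[
 {"id":"ytc_UgwCTbjdunOtb81FoVp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzSNeRJBUfWECmxHTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwCTbjdunOtb81FoVp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → ai_itself fear
```

In practice the model output may not be strictly valid JSON (truncation, trailing commas), so a production version would wrap `json.loads` in error handling before indexing.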