Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples
- "I love that this video popped up right after I started reading about this stuff …" (ytc_UgzvACb-p…)
- "This question looks at the way ur curser moves not the actual answer to the ques…" (ytc_UgzOdXwsh…)
- "I mean, it kinda is still kind of interesting , if it was a landscape for exampl…" (ytc_Ugx3nxcTL…)
- "Omg in 2 years an already outdated paper says that 6% av jobs AI can replace…… V…" (ytc_UgzmQDYbZ…)
- "16:30 unironically, this is about the one and only time i'll use ai art, to gene…" (ytc_Ugw1ADiFq…)
- "is my art bad? yes / is it slop? yes / is it ai? hell no / my art can be as slop as i …" (ytc_UgzP5jJDG…)
- "How is not the right question. The question should be how can we defend democra…" (rdc_fdgpk64)
- "No it wont. At least, not without a significant change in the underlying archite…" (rdc_n3mkddg)
Comment
Could be. Could be. I am waiting every 6 months to be replaced since LLMs became big in 2022. Right now, it is the biggest money-burning scheme. No profits and not even sure if they ever can be profitable. Data Centers won't help except getting government money. They are just not profitable and outdated after 5 years. Layoffs in the name of "AI" and companies hiring offshore (at least in some industries). The Internet we know is broken, as slop is created everywhere (e.g., 50 % articles you see today are already AI-generated). I would love for people to get accountable at some point. If that blows up, the normal people have to pay for it......
youtube · AI Governance · 2025-12-04T08:1… · ♥ 21
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwREN4FyXG7tN2FwFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1KSkc_CBCF8vkm_R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBm_4sMJemVzr7VF14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwvZ7_pmmm43evBynF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWAKDdP4sAqpkSl-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwD54SpD1NMdfuSlEt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyUKU3Q7ZZTTm8jF454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxprB8UiCWnAo8ULDt4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwNqjG86yxxs0pXUId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz098x72_JnwYp9xex4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
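The raw response is a JSON array with one record per comment, keyed by comment ID and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-comment-ID step, assuming exactly this response format (the two records below are copied from the response above for illustration):

```python
import json

# A truncated sample of the raw LLM response shown above:
# a JSON array of coded records, one per comment ID.
raw_response = """
[
  {"id":"ytc_UgwD54SpD1NMdfuSlEt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxprB8UiCWnAo8ULDt4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
"""

# Index the records by comment ID so any coded comment can be
# looked up directly, as in the "Look up by comment ID" view.
coded = {rec["id"]: rec for rec in json.loads(raw_response)}

record = coded["ytc_UgwD54SpD1NMdfuSlEt4AaABAg"]
print(record["emotion"])  # -> outrage
```

In practice a real response may contain malformed JSON or IDs that were not in the batch, so a production version of this lookup would validate both before indexing.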