Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.

Random samples
- I have done the testing.. I wrote a few sentences using completely my own words,… (ytc_UgxanEW0K…)
- The reason he is against AI is that he understands that AI learns by reading thi… (ytc_UgyGNWju9…)
- Art is better than ai, with ai u can only do so many things. But when u put ur m… (ytc_Ugwazx1Jc…)
- I was barely learning C on 1980's computers back then (Romania "Politehnica" Uni… (rdc_nntner9)
- Hi Yuvan, we are sorry to say that you got the wrong answer but in any case, the… (ytr_UgwXYpZSf…)
- Let's just have an autistic savant design A.I. to attempt a definition of human … (ytc_Ugxc80B2n…)
- We need companies like AstroForge to continue there work on mining smaller aster… (ytc_UgztMNdvE…)
- This man is great. He champions decentralization in education, hopefully reducin… (ytc_UgxETHq9j…)
Comment
IMO, poisoning your own content is absolutely the right thing to do. As a programmer, I wonder if such a thing exists for code. And I am not talking about just protection, because that is not so difficult, but about active punishment for these scrapers. I used to share quite a lot of my code, for PEOPLE who may want or need it. I don't do that anymore. I am not fine with feeding my code to AI models that some big company is going to capitalize on while taking more and more jobs from people like me in the process. Not to mention that it is us, coders, who played a big role in creating these models in the first place. Also, if I publish something to be shared for free (I mostly use non-commercial-use-only licensing), I want it to stay that way.
| Field | Value |
|---|---|
| Source | youtube |
| Video | Viral AI Reaction |
| Posted | 2024-10-26T03:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx_oOzJ3kS7jWxmBP14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwnON1G8kmyfDAAxsZ4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwjQm_a3i0HSb_o0Rp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy-zUMNLn_b4xUAMsh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz5owynCKsthKkvTJN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyi64_Sg13L4Kelcg54AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwKus4xoJBF7auM18t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzj5h4pNUsxSV4Zl214AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyYvLZTcbKZWWrW0K14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzUqBmGf4ZnBr_ouOR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
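The raw response is a flat JSON array with one object per coded comment, so looking up a coded comment by its ID reduces to parsing the array and indexing it. A minimal sketch (using a two-entry subset of the array above; the dict-index approach is an illustration, not the tool's actual implementation):

```python
import json

# A two-entry subset of the raw LLM batch response shown above:
# each element carries the four coded dimensions for one comment.
raw_response = '''[
  {"id": "ytc_Ugx_oOzJ3kS7jWxmBP14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwjQm_a3i0HSb_o0Rp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for the comment shown on this page.
code = codes_by_id["ytc_UgwjQm_a3i0HSb_o0Rp4AaABAg"]
print(code["responsibility"], code["emotion"])  # company outrage
```

The lookup returns exactly the values reported in the Coding Result table above (responsibility: company, reasoning: deontological, policy: liability, emotion: outrage).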