Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (preview and comment ID):

- Are we really surprised that the AI would do something humans absolutely already… (ytc_UgwYoufLY…)
- I think the salient point here is that given the two options, if AI takes over t… (ytc_UgxzhGBmj…)
- Poor skilled artist without AI = Poor skilled artist using AI & High skilled art… (ytc_UgwiEpOnc…)
- This was like a 'Not Like Us' moment where the whole art community banded togeth… (ytc_UgzMVvHU2…)
- It's crazy because a lot of companies think " if we dont automate we are dead"..… (ytc_UgxUwiols…)
- My bet is 2035 idk why but 2027 just feels too close, also I don’t think LLM’s l… (ytr_Ugw1y2fBz…)
- The idea of ubi is to tax the big corporations then distribute those wealth to e… (ytc_UgyJAJvuN…)
- We ignore human- rights violations suffered by MILLIONS of living labourers , al… (ytc_UgyFLMz9R…)
Comment
The problem with the subject of "AI" is that to understand it well, you have to study a lot, and I mean a lot, of mathematics, logic, and programming. It's not easy, even for those who are in the field and have specialized in it. Thousands of pages of linear algebra, advanced probability, advanced matrices, differential calculus, vectors, among other areas. All these subjects are covered in mathematical modeling, applied to what is called "AI," along with advanced knowledge of databases and programming. There's no way to discuss these subjects through any YouTube channel. You would need 1000 hours of videos and still not cover the topics enough. For these and other reasons, there is so much confusion surrounding the subject of "AI."
Source: youtube · 2026-01-31T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxSrgG4azayqQFuvNl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4Cjs40ocSkoDQ3Xx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxnEupeUCEYi-5Qg94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwhixNhwyn_8EqdvAB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzoUPOsJ2Bt_RammZx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBe--i0uJPQij6LWR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw81g-4vzwt4-cDTUl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSFjQKAbCjUPs7cMd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzj6m3U545DJXY-7rx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyAFHc2IbnSEORErep4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
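A response like the one above can be consumed by parsing the JSON array and validating each record against the coding dimensions before loading it into the results table. The sketch below is a minimal example, assuming the category sets inferred from the values visible in this response; the project's actual codebook may define more categories, and the validation rules here are illustrative, not the tool's real implementation.

```python
import json

# Allowed values per dimension. These sets are an assumption, inferred only
# from the values observed in the response above; the real codebook may be larger.
CODEBOOK = {
    "responsibility": {"unclear", "ai_itself", "developer", "distributed", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"unclear", "mixed", "fear", "approval", "outrage", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it is a dict with an "id" field and every coded
    dimension holds a value from the (assumed) codebook.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record response: the second has an out-of-codebook value
# ("alien") for responsibility and is therefore dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
    '"policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(parse_coding_response(raw)))  # → 1
```

Validating before ingestion matters here because LLM output is not guaranteed to stay inside the codebook; silently storing an off-vocabulary label would corrupt the dimension counts downstream.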