Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Bro i am 14 right now and want to become a software engineer is it worth it, cau…" — ytc_Ugx4XUYwQ…
- "I think it can be helpful to use AI to get a boilerplate and waste less time on …" — ytr_UgylQh5y1…
- "Hi Aneesh, we are sorry to say that you got the wrong answer but in any case, th…" — ytr_UgyR6Mma_…
- "Why is AI being wasted for generating shabby images when it could be used for AN…" — ytc_UgzLELE2Z…
- "OpenAI is already bankrupt and the data centers Microsoft built for them don’t h…" — ytc_UgylHoLsX…
- "It doesn't matter whether it is built by man or AI; the jobs won't come back to …" — ytc_UgyPsUepU…
- "This has to be one of the best 1 and half hour of podcast i have listened about …" — ytc_Ugzjc6YhC…
- "What benefit can AI provide that is so great that it is worth the risk of DESTRO…" — ytc_UgySmU9y-…
Comment
> We’ve definitely jumped the gun. The very fact that we call LLMs ‘AI’ shows that we’ve overestimated their capabilities, since they are not truly intelligent. We then had to invent the term ‘general artificial intelligence’ to describe what AI was supposed to mean in the first place.

youtube · AI Responsibility · 2025-09-30T21:4… · ♥ 377
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxhS3ICO-mhXOf4xkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyz7Ay55fH6sD1kmXJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-I-US2trVYXaCEVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSfKPM_sEcIaPvgOp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1ixDuhwBMajoNn9t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxd1ZqLFtS22uTrVfp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwykB5gNJ5-uA7arLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzkr3y5wMj-1hm3Crx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSpeZCuwALgKi8z8d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbKVN4B3pbuseMdX14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
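A minimal sketch of how a raw response like the one above could be parsed and validated before the codes are stored. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself; the allowed values are assumptions inferred from the samples shown on this page, and the full codebook may permit more.

```python
import json

# Allowed values per dimension, inferred from the output shown above.
# (Assumption: the real codebook may define additional values.)
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "approval", "fear", "resignation",
                "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Example with one entry from the response above:
raw = ('[{"id":"ytc_Ugw1ixDuhwBMajoNn9t4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"none","emotion":"resignation"}]')
rows = validate_batch(raw)
print(rows[0]["responsibility"])  # developer
```

A check like this catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label) before bad rows reach the database.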