Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect
- "It's crazy to see that alot of people thinks an AI that generates images it's th…" (`ytc_Ugxtws22u…`)
- "Maybe this is why millions of women & children disappear. To have their faces pe…" (`ytc_UgwQjmeOB…`)
- "The working class will put THEMSELVES out of a job by developing Artificial Inte…" (`ytc_UgzhpGWB0…`)
- "We should not be playing with robots and ai. We have already trampled the line w…" (`ytc_UgxBlw2AK…`)
- "Not him but this super intelligent a.i is very real its actual part of prophecy …" (`ytr_UgwPpeyUO…`)
- "Anyone who think Ai won't get rid of us is a really stupid human who need to che…" (`ytc_Ugy_k3ukE…`)
- "Can someone please explain the following to me........If AI depends on a million…" (`ytc_UgwSapLRf…`)
- "I think something like Terminator or I Robot is not really what it would look li…" (`ytr_UgwllDnyU…`)
Comment

> I think LLMs will turn out to just be one useful one off tool, and that they're not going to get that much better. Companies will continue refining them for a while, but will find that there's only so much you can do with them just like how there's only so much you can do to a plane with the same engine.
>
> The primary reason for my prediction is the fact that there's not much difference between the different models; if LLMs could go way further you'd see far more variation between them.
>
> I wouldn't be surprised if in 10 years we get another AI paradigm that is more intelligent, but I think there will be a plateau for a few years.
youtube · AI Responsibility · 2025-10-05T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwepnPRuy9qze-GB0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxqcXEhFPEf7vcf-3h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwflgJzAyKk-eUi1gt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxC8nf_3W43KojxSvd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWfQew0g9jOLQKCFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugww7Y4RI0hEue-UJj94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxoh1JKoPotXbcAuOZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxyAg9rQvt7HzZWEQ54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwa6we-B2WizGwJmpB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxdA2dWHS2l2ksqZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```