Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Time to buy a Starmer mask. AI will note the number of detections is far too hig… (ytc_UgzX11Hvo…)
- cant really give analytical results on something that companies are not measurin… (ytc_UgwnY3AjP…)
- This ban will work as well as the Catholic Church's European ban on crossbows, t… (rdc_cti4y5a)
- I hate ai art it’s going to take everything away from me. What’s the point of ma… (ytc_UgzWGCcPA…)
- Driverless cars? I suspect they won't catch on for a LONG time - there's such a… (ytc_UghkMgool…)
- In fairness for the doritos thing, the human who reviewed it is a bit more at fa… (ytc_UgxnNLOwK…)
- I’m not an artist or digital artist but… I do not support AI art. I understand a… (ytr_UgzLtJidq…)
- Well you got the gen ai part pretty wrong. But at least you managed to do 0,5 %… (ytc_UgzqIhWef…)
Comment
But you still haven't made that 3 points against AGI. I happen to agree that LLMs alone won't get to AGI but I don't have crystalized reasons for it. Just like the prompt injection thing. That has nothing to do with reaching AGI as a technology. Humans suffer from prompt injection as well, by the way, and we consider ourselves GIs, right? Humans suffer from hallucinations, "optical illusions" are actually brain failures to interpret signals from sensors, ideology, faith, psychological manipulation... all of these are actual prompt injections. I think this video is worth redoing with a lot more depth.
Source: youtube, 2026-02-02T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzp5-FzgnUIhgRFz-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzvPvbXeocPbqATHOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-3udv7cXg_U3sc8V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxrDLcTzck0wTQ_YCJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweweTiZQkal2zvM-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwnn2aiMcyY72IIZUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxGsWjL7VjRQVp6mUp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySSHHmoesSA33re6F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxAhnsb3ib8ns0proV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwqsrOn8RKVkv74GA14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
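The raw response is a JSON array of per-comment codes, one object per comment ID, with the four coded dimensions shown in the table above. A minimal sketch of the lookup-by-ID step, assuming the batch response has been captured as a string (the field names are taken from the response above; the function and constant names here are hypothetical, not from the actual tool):

```python
import json

# Abbreviated batch response: two entries copied from the raw array above.
RAW = """
[
  {"id": "ytc_Ugzp5-FzgnUIhgRFz-94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxAhnsb3ib8ns0proV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

# Keys every coded entry must carry, per the response schema shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict[str, dict]:
    """Parse one batch response and index the code objects by comment ID."""
    entries = json.loads(raw)
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} is missing {missing}")
    return {entry["id"]: entry for entry in entries}

codes = index_codes(RAW)
print(codes["ytc_UgxAhnsb3ib8ns0proV4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: after one parse, each inspection is a dictionary lookup rather than a scan of the array.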