Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect:

- `ytr_UgzGRYLwu…`: "That’s an accurate representation of AI bros. They see a decent drawing, think i…"
- `ytc_Ugx0eEqEU…`: "I'd actually love if AI would purposefully give wrong answers if it leads to a r…"
- `ytc_Ugif6gsoL…`: "Firstly, a self driving car would prepare for such a possibility and keep at a s…"
- `ytc_UgwKkTbxC…`: "I love "artist" crying about ai art / "Ai is goint to steal our jobs 😭😭" / Then do …"
- `ytr_UgwHJGYK-…`: "It depends, which BEVs PHEVs,HEVs have electronic doors. My Honda Odyssey has e…"
- `ytr_Ugx-ey_rL…`: "what do you mean "no way" bro. of course it can. AI is still in the babystage so…"
- `ytc_Ugwokca-T…`: "Where is AI plus quantum computer will have control of security we won't need pa…"
- `ytc_UgykEt3k6…`: "The "you're just anti technology" argument is so brainless and you countered it …"
Comment
@horrdev Technically, LLMs do not make 'mistakes'; they just produce output according to input and their internal weights. There's no technical difference between a 'correct' result and a 'wrong' result. Right and wrong only come into play when we compare the meaning of the output against reality. Which an LLM can't do.
Source: youtube · Video: AI Jobs · Posted: 2026-03-23T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxL9PgwSiHFymJWQeB4AaABAg.AUenneThboMAUkJQu6eybZ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugw888M1SwdwZDr2ifl4AaABAg.AUek3BaFpZcAUgSqS45PqC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwacHSPFEsiA19jHOB4AaABAg.AUedffMHmvuAUfBMpsYyV-","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwacHSPFEsiA19jHOB4AaABAg.AUedffMHmvuAUfGo5PdaZW","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw9HaPkIYiUtPtfCz94AaABAg.AUe_rVsCNZrAUiEdIXccpF","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw9HaPkIYiUtPtfCz94AaABAg.AUe_rVsCNZrAUjhgsnc2qv","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxZYCJDud8-ZcB-Gax4AaABAg.AUeN08oNxsnAUiNxd7LCtD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgygIcS4mjnqc6OeewV4AaABAg.AUeKWXiWjMwAUeUWlsUEzF","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgygIcS4mjnqc6OeewV4AaABAg.AUeKWXiWjMwAUgbsbYqI7u","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytr_UgygIcS4mjnqc6OeewV4AaABAg.AUeKWXiWjMwAUgh5DFWKkv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
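Because the raw response is a JSON array of per-comment records, the "look up by comment ID" step is easy to script. Below is a minimal sketch in Python, assuming the four dimensions and the value sets visible on this page (the full codebook may define more values; `parse_raw_response` is a hypothetical helper for illustration, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the records shown on
# this page (assumption -- the real codebook may allow additional values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded records) into a
    dict keyed by comment ID, skipping records with out-of-codebook values."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Usage: parse a small (made-up) response and look up one record by ID.
raw = '''[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]'''
coded = parse_raw_response(raw)
print(coded["ytr_example1"]["emotion"])  # fear
```

Validating each record against the allowed value sets before indexing it catches the occasional off-codebook label an LLM emits, rather than letting it propagate into downstream counts.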