Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugx3mSmo_…` — "If we reach an era where AI can genuinely replace jobs not just assist, money wi…"
- `ytc_UgzIO9lZ0…` — "We need legislation to immediately catch up with this technology AND to imagine …"
- `ytc_Ugw_9L0lO…` — "Does the AI translate the language of the caller to one understood by the call c…"
- `ytc_UgyHBKRfz…` — "When ai can be amarter than humans, why do we even have to learn anything…"
- `ytc_UgyZapFnN…` — "Morgan Stanley conference? LMAO. That's all I need to know. Don't use OpenAI, en…"
- `rdc_hkfevsn` — "THIS is what r/antiwork is all about! Doing only 10 minutes or less of labor eve…"
- `ytc_UgzVM_4MO…` — ""Hey guys look at this strawman i made and won an argument against!". God, the a…"
- `ytc_Ugwhwuxns…` — "And you wonder why we're being told to "Use AI to make your life easier" every s…"
Comment
It's the "G" in GPT that induces the hallucination effect, as the transformer is "predicting" (aka "guessing") the next most-likely word to appear. So, when it is "predicting" the next word after citing a case, it's guess is what seems most likely to follow, which in this case is a particular format for legal citations. When this happens in other fields of expertise, it often seems cute or perhaps satirical, but in this case, it's like a structural engineer asking ChatGPT to design a rail bridge...
youtube · AI Responsibility · 2023-06-10T23:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgweKBRmXw7jhTGFtVp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyy4rhJG_YfCmrAAR54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwv1Vd7YhVeHiZTClx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxeTnSLM-KTkP1tmHx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxNobYELMGkPKfrzjh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz-M4qXZCyHGpvXLvJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz6EcQ5fJuv8aCp8zV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyphYKJZtvSoUY7wdZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx4uDUBVdnQ3oPiEWd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyT596gJenGm2joa9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
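The coded dimensions above come from a closed vocabulary. As a minimal sketch of how a raw response like the one above could be parsed and checked against that vocabulary — assuming the category values visible in this sample are the full codebook, which is an assumption — rows with an unknown value are dropped rather than stored:

```python
import json

# Allowed values per dimension, inferred only from the samples shown above
# (assumption: the real codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array) and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry a comment ID and one known value per dimension.
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(validate_response(raw)))  # 1
```

Dropping malformed rows (instead of raising) keeps one bad coding from discarding the whole batch; the IDs of rejected rows can then be re-queued for recoding.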