Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I have a hard time taking this guest seriously during the key "What jobs will ex…" (ytc_UgwQB441y…)
- "When you mention that human touch it makes me think how A.I will never hide litt…" (ytc_UgxP4PByL…)
- "Meanwhile Musk buys thousands of gpu's to start his own AI startup. But yea "L…" (ytc_Ugx7M9LQy…)
- "I thinks it's funny how people who have never used AI to make a full length vide…" (ytc_UgwX0tEtQ…)
- "4:17 As a person who enjoys drawing characters and OC’s I think AI art is such a…" (ytc_Ugynb-Tb6…)
- "Just remember that Sam Altman's little performance about ending poverty and ushe…" (ytc_Ugw2HmD6G…)
- "The worst one uve heard is someone asking me how im autistic and have that artis…" (ytc_Ugw-9cyWA…)
- "So the parents don't take responsibility for raising someone who would talk to A…" (ytc_UgxCgu6xZ…)
Comment
I asked ChatGPT if, under Massachusetts law, there is an insufficient factual basis to plead guilty to attempted armed robbery where the Commonwealth has made no factual showing that force was used.
It came up with the correct legal answer: It is insufficient to plead guilty (in other words, the state has to allege somewhere that you used/threatened force, or your guilty plea will be invalid).
Knowing this was indeed the correct legal answer — and well written, with case cites and everything — I was shocked. Is ChatGPT really good enough to lawyer? So, I checked the case cites... the cases had absolutely nothing to do with guilty pleas, factual basis, armed robbery, or anything remotely related to the (albeit correct) conclusion. One case was about the rape-shield statute, another was on something else completely unrelated. What garbage.
Source: youtube · Video: AI Responsibility · Posted: 2023-06-11T01:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyRoaCg7x3OZJR4O6x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiLQLlrYp1XYwKxsd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzze33_x1vOLOMvbtx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxrsiClrjgnVYUyrXB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzt_gypvVU0_8YnB3B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz8zsBAsgMaTbCvlCh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgykTuYLzn5a3d20mT14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBSQmIDIkf6q3sc2Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1-IV-b34l33M-zBZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgzBRbP4UfPLeavcek54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
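A raw LLM response like the one above is a JSON array with one record per coded comment. The sketch below shows one way to parse such a response and look up a comment's codes by ID, validating each record against a codebook. The allowed value sets here are only inferred from the codes visible on this page, not the project's actual codebook, and `parse_llm_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values visible in the responses on this page; the real codebook may
# contain additional categories.
CODEBOOK = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "outrage", "approval",
                "resignation", "amusement", "fear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    raising ValueError on any out-of-codebook value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in CODEBOOK.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Usage: look up the codes for one comment ID from a raw response.
raw = ('[{"id":"ytc_UgyRoaCg7x3OZJR4O6x4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgyRoaCg7x3OZJR4O6x4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed codebook at parse time catches the common failure mode where the model invents a label (e.g. a new emotion) instead of choosing from the allowed set.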