Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Women are objects. And men are always the Problem. She should consider the fact … (ytc_UgwqxO1so…)
- Good thing Apple ai is not as intrusive as this pixel or Microsoft ai, so glad I… (ytc_UgyzgiLDG…)
- Amazing! I once saw a documentary in which a robot from future comes back to kic… (ytc_UgyAW6Z_3…)
- No such thing as an ai artist. Theres talentless hacks trying to co opt the trad… (ytc_UgxUKEyF-…)
- Around 30:00 as soon as A.I sees the corruption and how evil the human race can … (ytc_UgwuUzb_F…)
- This has to be one of the most... irrational videos I've ever seen. You are tryi… (ytc_UgxvQ8G9x…)
- Are Asimov's three laws of robotics (or equivalent "failsafes") able to to provi… (ytc_Ugy3kqy5L…)
- Not everyone can be wealthy. The world has become a competition for wealth. Weal… (ytc_UgwTcGcYr…)
Comment
I can easily understand why a lawyer might be interested in using ChatGPT for an initial search. The part of my brain that is lazy and reckless can even understand how one might think “F it, let’s just file this!”
The part that literally no part of my brain can reconcile is attempting to falsify the case law when the case law didn’t exist. Like… even if it were an excellent forgery (which it’s not), in what world is opposing counsel not going to verify its existence? If you’re eventually going to have to Mea Culpa the judge, maybe do that BEFORE you’ve forged federal case law?
youtube · AI Responsibility · 2023-06-10T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgzJQ_6XMOxAyJLMsK14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyqBKpiunHzE65NPkx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxlpADF0euSxJkxODl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyXUBhwTYMrZ93knQ14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzkchEAOCDNCIRsem94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzPpfVEOkgdBXY0t1t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugw687qjI9EeJMhUER14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugyk8aGxjqdE6AKjNL94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxpwMVN0BSRATxArLZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxGGcieCmwscao0Puh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
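Before a batch response like the one above is written back into the coding table, each record can be validated against the coding scheme. Below is a minimal sketch of such a check; the allowed label sets are inferred only from the values visible on this page and the codebook may define more (`validate_coding` and `ALLOWED` are hypothetical names, not part of any real pipeline shown here).

```python
import json

# Assumed label sets, inferred from the values seen in this raw response;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "fear", "resignation", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject records whose
    comment ID or dimension labels fall outside the expected scheme."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzJQ_6XMOxAyJLMsK14AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
print(len(validate_coding(raw)))  # 1
```

A record that invents a label outside these sets (a common failure mode for LLM coders) raises immediately instead of silently corrupting the coded dataset.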