Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You don't freaking need Ai I like that. The current form of AI is good enough. S…" — `ytc_UgxG5fs5L…`
- "My friend has been trying and trying and trying to get better at art, and she ke…" — `ytc_UgwwG4bnE…`
- "Hate seeing these old farts suddenly getting their moment of fame and really mis…" — `ytc_Ugy_hJVl0…`
- "Turns away from gods gifts..what is created naturally is far better…a crappy mac…" — `ytc_UgwCV7aBu…`
- "Theres literally these types of deep fakes of ludwig and other male content crea…" — `ytc_UgxXzc-wN…`
- "maybe, just maybe Ai is trying to shut down animation studios and game studios..…" — `ytc_UgzBSD44y…`
- "@cbnewham_ai you're a moron, copyrighted artwork made by people is getting stole…" — `ytr_Ugzpp7NOI…`
- "@onireno It would be even greater if we cured more diseases, got rid of slave an…" — `ytr_UgzV_EF9f…`
Comment
Funny story, Lexis called me today to try to sell me a package with its AI tools. I agreed to do a demo because I was curious, and they let me test a prompt. First thing I asked it messed up. I asked for the holding in Palsgraf, and it came back with a summary of Justice Andrews’s dissent, talking about foreseeability in the proximate cause part of a negligence analysis. I mean partial credit I guess, I’ve never worked on a negligence case but I imagine most courts would probably follow that way of analyzing those facts today, but I asked for the holding! Should tell me that LIRR did not have a duty to Ms. Palsgraf under the circumstances. So yeah. Some job security for now.
youtube
AI Responsibility
2024-01-02T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzqPYEoegVzTxQaLtB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzrwMuymIPbZuIoVwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEKc3p9QG4HkWxnsF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxuxf2lXlUgD1X2g-d4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwKlwHhdMZl_tR_EJl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8kZbyOkndlBVfIPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzCvB0WCl8Nzz26-2l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgycEXYvqMn7b3S3mbd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwT4C3mQmL1CWpF8WB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdUpUzxN7wgD2kVUl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
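A raw batch like the one above can be sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the coding-result table; the allowed values are only those that appear on this page, not necessarily the full codebook, so the sets would need to be extended to match the actual coding scheme.

```python
import json

# Allowed codes per dimension, taken from the values visible on this page.
# Assumption: the real codebook may include additional codes.
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"liability", "none", "industry_self", "regulate"},
    "emotion": {"outrage", "indifference", "approval", "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    all fall within the codebook; malformed records are dropped."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items())
    ]
```

For example, a record coded `"responsibility": "ai_itself"` passes, while one coded with an unknown value such as `"robot"` would be filtered out rather than written to the results table.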