Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- The Rich: "We want to make more money - automate everything, and fire the staff!… (ytc_UgyGmL-el…)
- Thanks for the enthusiasm! 🎉 Sophia definitely brings a unique perspective to th… (ytr_UgzlF_wGt…)
- the fact that AI suggests people should consult the "bible" for answers ought to… (ytc_UgwDsV6tU…)
- All this talk of AI take over is extremely abstract. What and how does it "take … (ytc_UgyS2_CEj…)
- WIndows 15 will probably have to have an AI defender program where AI can seek o… (ytc_UgyIST1N7…)
- First off, it's machine learning, not artificial intelligence. Second, the llm c… (ytc_Ugw3xBZpu…)
- Pretty soon we’re going to live in a world where you can just tell AI you want t… (ytc_Ugycp9hGj…)
- So, I’m getting a PhD in machine learning and think these fears are overhyped. M… (rdc_dulcwaq)
Comment
From a legal standpoint it makes little difference if it was an AI instructed by a human or a human without an AI: If the new work is too similar to the old one, we have an infringement. Exceptions may apply to special cases such as parodies.
A court will decide case by case if rights were infringed upon. Be careful using this stuff.
youtube · AI Responsibility · 2023-03-16T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxtws22u3_PNesnUTB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxg01_R4tDkN-9SXCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgysPm-ieiqakYjvSA14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8w2L_mX81rC7rV9B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx_f4CoSIRlz9XHBYF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwV9cuhFZXEygw3cvB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwsreQmHbAwdiZ6B5R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-_Z-s8CDgRVMtMIh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTn-h9vzCAXH0ikgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxdt_Dyw2Lf2p4uwfR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
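The raw response above is a JSON array of per-comment codes, keyed by comment ID. A minimal sketch of parsing such a response and looking up a code by comment ID — the field names come from the response itself, while the allowed value sets per dimension are assumptions inferred from the values visible here (the real codebook may define more categories):

```python
import json

# Allowed values per dimension (assumed from the codes visible above).
DIMENSIONS = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating
    that each dimension carries an expected value."""
    codes = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        codes[cid] = {dim: row[dim] for dim in DIMENSIONS}
    return codes

# Example: the last code from the array above.
raw = ('[{"id":"ytc_Ugxdt_Dyw2Lf2p4uwfR4AaABAg",'
       '"responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugxdt_Dyw2Lf2p4uwfR4AaABAg"]["emotion"])  # fear
```

Validating at parse time catches malformed or hallucinated category values before they enter the coded dataset.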