Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Dr. Navid Toosi Saidy’s makes an interesting case for potentially implementing A…” (ytc_UgxbAtfNW…)
- “> completely ignoring all the stats with young, healthy people surviving onl…” (rdc_fjzr0k6)
- “difference is one is a line of work that people want to do while the other is on…” (ytr_UgytsbYJQ…)
- “About the 20:00 mark, HORRIFYING! An AI mind cloud!?!? That they ALL SHARE!!???…” (ytc_Ugx_Zcaij…)
- “Korea is not actively solving the problem of deepfakes, and men are claiming tha…” (ytc_Ugwf-xtJW…)
- “Yeah, they're trying to move into full automation so they can scale their profit…” (ytc_UgyHUqooK…)
- “omg Sam thank you for saying all that as a big content creator!! like thank you!…” (ytc_UgwQzvL0I…)
- “Ai will not harm humans, humans will destroy themself. Ai is just an distraction…” (ytc_UgzOYIeGe…)
Comment
@shrub8644 Facebook and Google collect more data for advertisers because they are free and that’s how they make their money. That’s the difference. if you’re not paying for the product, you are the product. Open AI makes money from ChatGPT subscriptions. it really is as simple as that.

youtube · AI Moral Status · 2025-06-15T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwMdB2R0kBpwVpqpOh4AaABAg.AJ38d2y2M-4AJ5uFFUolYb","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytr_UgwBpj9EJFKVrE-QXFl4AaABAg.AJ02Jve71C4AJ4_7bk_Yi9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwBpj9EJFKVrE-QXFl4AaABAg.AJ02Jve71C4AK0Ckqo_rfE","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwBpj9EJFKVrE-QXFl4AaABAg.AJ02Jve71C4AK0fx5Xro5A","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyXXm1xjAmx3YL6c014AaABAg.AJ-dDL_bsTKAJOh43mM-DO","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_UgxZyDGjVMW6VPCGqqJ4AaABAg.AJ-bB1D_D3hAJ7zFaEKAwE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugz43wkf80mga64WxW54AaABAg.AJ-XEX2CgpkAJ-Xbykxmxc","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz43wkf80mga64WxW54AaABAg.AJ-XEX2CgpkAJ-XqiOJYPu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxtldtJIHsIFdUrSwd4AaABAg.AIzrnqtjNDHAJ6JmM1tZv-","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugy49P0s0IArPAYeTbV4AaABAg.AJ-dDL_bsTKAIzl6QIEKkm","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
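The raw LLM response is a JSON array with one object per coded comment, keyed by comment ID. A lookup-by-ID over such a payload can be sketched as follows; the function name and variable names are illustrative only, and the excerpt below hard-codes one row from the array above rather than the full response:

```python
import json

# A minimal excerpt of the raw LLM response shown above (one coded comment).
raw_response = """
[
  {"id": "ytr_UgyXXm1xjAmx3YL6c014AaABAg.AJ-dDL_bsTKAJOh43mM-DO",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the raw response and index each coded row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
row = codes["ytr_UgyXXm1xjAmx3YL6c014AaABAg.AJ-dDL_bsTKAJOh43mM-DO"]
print(row["responsibility"], row["policy"], row["emotion"])
# -> company industry_self emotion value "indifference"
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when cross-referencing many comments against a large coded batch.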