Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@davidloveday8473 I don’t think corporations are pushing this programs as sentient beings. They have talked about this programs as Large Language Models (LLM) that are trained to gather and compute big amounts of information at a very fast speed. But this programs have no feelings or purpose; same way as a hammer or knife. Its “purpose” is defined by the person who is using it. (Like every other tool) the same knowledge that is used to created electricity and keep us safe is used to create bombs that harm people. Some people should not have access to such tools because they can harm themselves and harm others. It’s our job as family members, parents, friends and members of society to make sure of that; but it’s never the answer to blame technology for the way some people use it
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-11-11T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugyhzf6G0caLxDlPstt4AaABAg.APGWsc2UbSUAPKAfQhbd1v","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwPZHLxBdJ3bnwv4tF4AaABAg.APGQvjDp5x3APGRHlLQVdp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzVWA9mlNqzUPEIpEZ4AaABAg.APGLcbUFtpTAPNU249CnND","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwgv4Z6ASODv_vRdxV4AaABAg.APG-TEVXD80APKlrjlBchw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzRe2t-8QvCZyhkwmp4AaABAg.APFpG6HnHfsAPJQD8g1wDM","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzRe2t-8QvCZyhkwmp4AaABAg.APFpG6HnHfsAPPuKeb_Kl1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzRe2t-8QvCZyhkwmp4AaABAg.APFpG6HnHfsAPW17twn5Wg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzRe2t-8QvCZyhkwmp4AaABAg.APFpG6HnHfsAPYrvGJoEGQ","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzG6tnYCKwfSo4PG_t4AaABAg.APFiym6aw2FAPGVEzcxZCL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgybTxE74saEuWog7mJ4AaABAg.APFC2uUOPc-APOgwng41ni","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
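A response like the one above can be parsed and indexed by comment ID for inspection. The sketch below is a minimal, hypothetical example: the `ALLOWED` value sets are inferred only from the codes visible in this response (the full codebook may define more categories), and `index_codings` is an illustrative helper, not part of any tool shown here.

```python
import json

# Allowed dimension values, inferred from the codes visible in this
# response (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "approval", "indifference", "outrage", "fear"},
}

# One record from the raw response, used as sample input.
raw = """[
  {"id": "ytr_UgzVWA9mlNqzUPEIpEZ4AaABAg.APGLcbUFtpTAPNU249CnND",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]"""

def index_codings(payload: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    dropping any record whose dimension values fall outside ALLOWED."""
    records = json.loads(payload)
    valid = {}
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid[rec["id"]] = rec
    return valid

codings = index_codings(raw)
print(codings["ytr_UgzVWA9mlNqzUPEIpEZ4AaABAg.APGLcbUFtpTAPNU249CnND"]["emotion"])
# prints "indifference"
```

Validating against the allowed value sets before indexing guards against the model emitting a code outside the scheme, which would otherwise silently corrupt downstream tallies.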