Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
There are people that want a ai robot future, with autopilot cars and planes, enhanced genes on future breeds, and there are people against all these things starting with AI. But I think this thing cannot be stopped, because we can’t see that force that’s driven the developers and engineers, they’re doing it for fame? Wealth? Or humanity? We don’t know. Ai and robots and autopilot cars have already caused some unfortunate people to lose their jobs, although very few, it is a start. The engineers always say that robots can replace people on some of the works, so they can do more desired works. But that feels more like the elites trying to gaslight poor people. We a facing a scary future, 10 years? 20? What’s gonna happen?
Platform: youtube
Category: AI Harm Incident
Timestamp: 2024-08-23T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzGjPc9Fjcp-qPtKLl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxBx5DLX4OiqdQ8Qo94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzME1dB0xN5U0z-Vql4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx88mr2KUtKK-G6Byd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzeP-yxPncqVYUjWb54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhKsVJg4nHejBdnBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzO6dwNl3CAsP4km4x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz9iAnwNRaVZ4yucPF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzbKW6NetzVqto6U5R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_6dSfQrcLqdlRaal4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
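The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal Python sketch of how such a batch can be parsed and looked up by ID (the helper name `index_by_id` is illustrative, not part of any tool shown here; the sample entry is copied from the array above):

```python
import json

# One entry copied from the raw model response above; a real batch
# holds one object per coded comment.
raw_response = '''[
  {"id": "ytc_Ugx_6dSfQrcLqdlRaal4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and index the codes by comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codes = index_by_id(raw_response)
entry = codes["ytc_Ugx_6dSfQrcLqdlRaal4AaABAg"]
print(entry["responsibility"])  # distributed
```

The dimension values in the lookup match the coding-result table above (responsibility: distributed, reasoning: mixed, policy: none, emotion: resignation), which is how a raw response can be cross-checked against the displayed coding.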