Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "i hate the fact the some people also developing ai prompting to go against ai de…" (`ytc_Ugx-nybHk…`)
- "the ai is gonna create things for us not only art, and the future human is gonna…" (`ytr_UgzWWm7ss…`)
- "I can understand that. But the question is still AI at a level where it can help…" (`rdc_jidepku`)
- "AI will replace artist for simple needs like business logos website graphics com…" (`ytc_Ugyz2U3o3…`)
- "Keep going. Artists in particular need to find every avenue to combat the pervas…" (`ytc_UgxHD_4Bi…`)
- "Ai is going to wipe out the entire working class, first it will be anyone in an …" (`ytc_UgwkxfYCN…`)
- "AI art have its own good, being cheap and good for the average normie on the str…" (`ytr_UgzcnB1D4…`)
- "Humans have many things ai don’t. The best thing we have that ai doesn’t is emot…" (`ytc_UgxDZmuWI…`)
Comment
I’m not sure you understand ChatGPT and AI. ChatGPT is already trained. What you say to it only trains it further for that instance. ChatGPT does not have knowledge of other ChatGPT user chats. You can’t ask ChatGPT for interactions with other users, since it’s not part of its Context.
Only reason it agreed to the case was because you trained it. But otherwise the bromide interaction doesn’t exist to ChatGPT
youtube · AI Harm Incident · 2025-11-25T04:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxeKoXbWDP6jCtuKm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwOVXxM5w5OZrnTtzV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyzn8FTMwDo09nQqQ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwTR6mS8yQ6g0kfIdN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzxakBYwZlLswW9dD94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwfMqq9dOGI0tSbzXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz7ph_LYkMJRYdn-jt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyotjxrnrN99NessuR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx8bkTKYYu2hqFpWFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxCBxW-rld6riaS6Z54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
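The raw response is a JSON array of per-comment codes across the four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and checked before it lands in the coding table; the `validate_response` helper is hypothetical, and the allowed label sets are inferred only from the values visible on this page (the full code book may contain more):

```python
import json

# Label sets inferred from the responses shown above (assumption:
# the real code book may define additional labels per dimension).
SCHEMA = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate", "ban"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "approval", "disapproval"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify every record has an
    id plus a known value for each coding dimension."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Illustrative input with a made-up id, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
coded = validate_response(raw)
print(coded[0]["emotion"])  # -> indifference
```

Rejecting malformed records up front keeps a single bad LLM output from silently corrupting the coded dataset.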