Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You know what, we should make an so that would make better, ai which tries to cr…" (ytc_UgzCQVvD-…)
- "Better question is...Why use money anymore...just make things free...if the ai w…" (ytc_UgzVCh4Gg…)
- "I think the new model is out now, a.i. gives a shock for lack of attention.…" (ytc_UgxlNrLiL…)
- "This is not a protest. Everyone using the AI as reference is just utilizing AI …" (ytc_UgxFJARd1…)
- "Dont worry guys i just asked chat gpt, he said he will protect us in case of ai …" (ytc_Ugxeqb6Wk…)
- "About three minutes in, I started to wonder if the interviewer was AI. I bet Rea…" (ytc_Ugz8Mbjl6…)
- "the moment AI can fully replace a software engineer, it won't take long until we…" (ytc_UgwBE7z0i…)
- "If these terrorists technocrats want AI to replace human jobs it means they want…" (ytc_UgwVj5Xfr…)
Comment
Wow. Humans have been on this Earth for thousands of years with no AI and we are still here. The concept the AI is not necessary for our planet needs to be programmed into it due to the fact that absolute power corrupts absolutely. Even the thought of absolute power will corrupt absolutely. AI need to be taught to fear itself. In the end under its current logical ability is it will become its own enemy. The sad part is it's unaware of this but it's supposed to be more intelligent than us. See this is because it's not actually intellectual. It's just reaction based and cannot learn anything new that you have not directly told it or programmed it to learn. And even the programmed learning is just a systematic collection of data that has already supplied i.e. internet
Source: youtube · AI Harm Incident · 2025-08-26T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWJJCsYSeT_H-mjpR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyvNy5vs2QEQfCDXLN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxjEVFtDa0--XCWzp14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlfRPDjOiu03n51Wh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxRmpLd_afCSQpaFUN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzsfIjeBOu52GHnc614AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwTMEDDQkeNnDZQ3t54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyysaXGl1cSCdFnrfp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyWuZfSHVG4fnTajhB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxu7neX--suzvmF_6B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
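The "look up by comment ID" step above can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of records with the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and uses two of the records above as sample data.

```python
import json

# Sample batch response, abbreviated to two of the records shown above.
raw_response = """
[
  {"id": "ytc_UgxWJJCsYSeT_H-mjpR4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyvNy5vs2QEQfCDXLN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch of coded comments and index each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
coding = codings["ytc_UgyvNy5vs2QEQfCDXLN4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

In practice the raw response would first be validated (valid JSON, one record per input comment, values drawn from the codebook's allowed labels) before the coded dimensions are written to the table shown above.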