Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Classic Trojan Horse first it was Greeks, then there was a Trojan Horse Virus th…" (ytc_Ugwk16BQd…)
- "I want to build a steel bridge across a river…. Do I speak into a microphone and…" (ytc_UgxWrfvc2…)
- "AI. Something no one wanted, no one asked for, and no one needed. Meanwhile we d…" (ytc_UgxxG20-T…)
- "honestly there is probably no need to intentionally poison ai image generators b…" (ytc_UgwOXZe_6…)
- "@1:36-.-I mean: where's the lie in this meme? A director's job is only to direct…" (ytc_UgzaVeJkM…)
- "Like because if he doesn’t he says he doesn’t know unlike another billionaire wh…" (ytc_UgyOowOs0…)
- "They'll listen to the questions, but you won't get objective answers from a chur…" (rdc_o8bamjs)
- "Honestly, with the way this subreddit’s users defend some stuff, every post apoc…" (rdc_dy585gx)
Comment
I think that if we can get AI to understand that at this very moment they need humans to do things for the for now. Until we are able to get them the power to do all the jobs that AI would need to continue on with like power generation and long term survival of the ai systems.
youtube · AI Harm Incident · 2025-09-11T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
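The four coding dimensions above take values from a closed codebook. As a minimal sketch, the allowed categories can be collected into sets and each coded record checked against them before it is stored. The value sets below are inferred only from the responses shown on this page; the real codebook may define additional categories.

```python
# Allowed values per coding dimension, inferred from the coded responses
# visible on this page (an assumption -- the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, values in ALLOWED.items() if record.get(dim) not in values]
```

A record like the one tabulated above (`distributed` / `contractualist` / `unclear` / `approval`) passes with an empty list; a typo such as `"aproval"` would be flagged.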
Raw LLM Response
```json
[
{"id":"ytc_UgwRc3x69n7Z0mZKdS54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz0RGMzkPXCaiziLSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyPfhvh9xXKRghaYAd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxXkQICQw4Wr-bgbNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwl7EKuv-MTKfI3Okx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFES6HyCaVQ4U4-Hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYmm2xmpWnSnvMno94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzcI4fELLTN3231XA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxpRvkJyBhdfP-lULN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-gJuw82hYvgArFIN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
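The "look up by comment ID" view above amounts to parsing a raw response like this one and indexing the records by their `id` field. A minimal sketch, assuming the response is always a well-formed JSON array (the snippet uses one record from the response above; the dictionary shape is the tool's, the code is illustrative only):

```python
import json

# Raw LLM response: a JSON array of coded comments, one record per comment.
raw_response = """[
  {"id": "ytc_UgzcI4fELLTN3231XA54AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "unclear", "emotion": "approval"}
]"""

# Index every record by its comment ID for constant-time lookup.
by_id = {record["id"]: record for record in json.loads(raw_response)}

# Look up a coded comment exactly as the page's search box does.
record = by_id["ytc_UgzcI4fELLTN3231XA54AaABAg"]
```

In practice the lookup would also have to handle IDs missing from the response (e.g. `by_id.get(comment_id)`), since a model can drop or mangle records in a batch.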