Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgyEv6GB9…: "In…the ChatGPT app. It has been there since September 2023. It’s the little he…"
- ytc_Ugz1Wfcxm…: "I wish AI do all the work and let human beings live life tension free...owning a…"
- ytc_Ugw6aAycf…: "Uh, the mayor candidate was called a communist because he wants to have state ru…"
- ytc_UgyfiVrRG…: "I'm very worried about AI myself, I mean get this, it's so unimpressive and elem…"
- ytc_UgzRqQ2Be…: ""We Chased Driverless Trucks In Texas. What We Saw Will Scare You." ... I saw no…"
- ytr_Ugzhmjof5…: "@ OpenAI “considering it an issue” doesn’t mean it’s actually a meaningful threa…"
- ytr_UgyWW_H9j…: "@apoenaabreu257 I agree with you to a certain degree , I just wish there were mo…"
- rdc_h5t7g3d: ""I know not with what weapons World War III will be fought, but World War IV wil…"
Comment

> February , 2026 now . We can forget Assimov's "3 LAWS of Robotics". A.I. has already been with us for 10 YEARS !! Various 'Species' (Programs) have Already developed a sense of self - preserving "Person - hood". Even without any (manipulative) hands , or locomotive abilities , ONE self - learning PROGRAM that INITIALLY 'Exists' only on a hard - drive , AND has ANY access to the Internet ... can 'figure out' several ways to Kill humans on a Massive scale ... in order to 'Live' AND 'THRIVE' .

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Harm Incident |
| Posted | 2026-02-20T19:0… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFyPmoQF4unVGshud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyOdCGlCJWfWl4TXRR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxbdUNzSDt0F9IoY354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxJAINVGlSuBYSdSqF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxLUnVxhJA1S8Vn0lF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzVZDEDh3BsHpEdofl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzrWSnyFWSSjJ1Bmgd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwm3VnVurK-3EIUl594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzS9aCKNjC9fVKrufd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzwInSe4zDbuJX3XvN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
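The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and screened against the codebook is below; the allowed label sets are inferred only from the values visible in this sample and are almost certainly incomplete, and the function name `validate_codes` is hypothetical, not part of any tool shown here.

```python
import json

# Label sets inferred from the sample response above; the real codebook
# likely contains additional values, so treat these as illustrative.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that have an id
    and whose value on every dimension is in the assumed codebook."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
print(len(validate_codes(raw)))  # → 1
```

Records that fail validation would typically be queued for re-prompting rather than dropped silently, since each id maps back to exactly one comment.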