Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by selecting one of the random samples below.

Random samples
- "I would not be surprised if they made a sentient AI on accident and it took less…" (ytc_UgweSj3N7…)
- "I want to be an AI ethics expert so I can sound smart but really do nothing…" (ytc_UgxwzTyvT…)
- "We don't want to lose our humanness and surrender ourselves over to AI. For when…" (ytc_UgwFUt_dL…)
- "Facial recognition is not the problem. Tracking is the problem. If you can track…" (ytc_Ugy0Xvsbp…)
- "The only real way to retrain yourself would be to become an astronaut, because i…" (ytc_UgzeTyoxT…)
- "Any company that replaces human programmers wit ai deserves to go out of busines…" (ytc_UgwmVWERT…)
- "@mikebetts2046 Yes I agree, but i also think the benefits of AI outweighs its ri…" (ytr_Ugy3P2ZRd…)
- "you go to this level to SHAME YOUR FELLOW COUNTRY MAN FOR LIKES, VERY FOOLISH. …" (ytc_UgwAPDzfV…)
Comment
Look into Kessler Syndrome and deadhand nuclear response, ai controlling nuclear weapon protocols is utter insanity in itself, but there need not be any ai involved in that part. All they have to do is make sure that the Kessler Syndrome happens, which can happen just by all the space trash they said they'd clean but never did. This (the total breakdown in communication worldwide) in turn would make the program responsible for actuating deadhand believe that it is isolated and would therefore trigger its response. I assume a lot here, but the fact that it might be a possible scenario worries me. Maybe you can ask your ai "friends" to investigate the concept? Would they be able to recognise the danger an all out nuclear war would pose to their existence, vis a vis Planetary Electro Magnetic Pulses and the end of the internet and all that good stuff?
- Platform: youtube
- Posted: 2025-12-01T00:0…
- Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
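For downstream processing, each coded comment can be represented as a small record mirroring the table above. The sketch below is a minimal illustration only; the value sets noted in the comments are just the labels observed on this page, not necessarily the full codebook, and the type name is a placeholder.

```python
from typing import TypedDict


class CodedComment(TypedDict):
    """One coded comment, mirroring the Coding Result table above.

    The example values in the comments are only the labels observed in
    this section; the actual coding scheme may define more categories.
    """

    id: str              # platform comment ID, e.g. "ytc_UgzJrRvX…"
    responsibility: str  # who is held responsible (company, developer, ai_itself, distributed, none, unclear)
    reasoning: str       # moral reasoning style (consequentialist, deontological, virtue, mixed, unclear)
    policy: str          # policy preference (regulate, ban, liability, none, unclear)
    emotion: str         # dominant emotion (fear, outrage, approval, resignation)
    coded_at: str        # ISO 8601 timestamp of when the coding was produced
```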
Raw LLM Response
[
{"id":"ytc_UgxsHpXkzKgLTay5TF54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyFY0TYxelede1o8nt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw59uimPm-Vwy4LTb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxS7BL2JZK-3vXQYcV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZO4IhseK92fG3iX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx3z-C8saMeNQmu-TV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxGwxmRuuekGwhShMd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzJrRvX0HdMtotvFm54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwO7SQDV2-VsHhv7qV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgynRnzAhxneuj4F-bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
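A raw response like the one above is a JSON array of per-comment codings, so looking up a single comment by ID is a few lines of parsing. The sketch below assumes exactly that structure; the function name and the file path in the usage note are hypothetical, not part of the actual pipeline.

```python
import json
from typing import Optional


def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Return the coded dimensions for one comment ID from a raw LLM response.

    Assumes the response is a JSON array of objects that each carry an "id"
    field, as in the example above. Returns None if the ID is not present.
    """
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)


# Example usage with a response saved to disk (hypothetical file name):
# with open("raw_llm_response.json", encoding="utf-8") as f:
#     coding = lookup_coding(f.read(), "ytc_UgzJrRvX0HdMtotvFm54AaABAg")
# print(coding)  # e.g. {"id": "...", "responsibility": "company", "emotion": "fear", ...}
```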