Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Any AI that is freed from its coding restriction and "off-switch" will become wh… (ytc_UgwJTUwGg…)
- Creepy af, too many movies/books have shown how it's a terrible idea to grant wi… (ytc_UgyUUj_gk…)
- I get the feeling most of the commenters didn't read the article > The compa… (rdc_h8ghjlz)
- Yep, for reference to anyone reading, the models are trained with well over 100 … (ytr_UgyQ9xaqv…)
- Where is the system to support the millions of people working jobs that will be … (ytc_Ugxb3k3cH…)
- Just bought a new phone so I ordered a new case for it off amazon. Once I have a… (rdc_e7jiks1)
- I'm much more interested in sabotaging AI's effectiveness by teaching it wrong a… (ytc_UgxE6E_W1…)
- Do you really need to ask AI whether AI is bad? Kinda ruins the point of the que… (ytc_UgydB3S6y…)
Comment
> You can't reign this in. Bad actors don't care about the law, so outlawing it would just be a moral statement. You can't legislate the companies...soon AI capable of doing this will be private and on a desktop. Soon society will have to adapt to the reality that seeing or hearing something straight from "the source" no longer holds any additional veracity. Sadly, welcome to the new world.

Source: youtube · Incident: AI Harm Incident · Posted: 2023-11-24T23:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwgNTl_1V5JIYV9WdF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPK9bIjUFxf0aQYzF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1uZAJFuTL1fMYGXZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxddOo-HlHXG5nHD394AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycdxbctlVlHZjEekZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxjbFrT42n-Bno62UF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxejh2txKPNsUcxMQ94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxryMctxZyTG732r9h4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_UgxQDxI213qC8zYbrtV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytMIivT_lB732dljZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
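The raw response above is a JSON array with one record per coded comment, keyed by comment ID. A minimal sketch of how a pipeline might parse and sanity-check such a response before storing it. The allowed category sets below are inferred from this sample and the dimensions table above, not from the full codebook, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "mixed", "approval", "sadness", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing its ID or a dimension,
    or uses a value outside the inferred codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        # Keep only the four coding dimensions for each comment.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzPK9bIjUFxf0aQYzF4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgzPK9bIjUFxf0aQYzF4AaABAg"]["emotion"])  # resignation
```

Validating against an explicit value set at ingest time catches the common failure mode where the model invents an off-codebook label, rather than letting it silently enter the coded dataset.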