Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by comment ID. Random samples:
- ytc_Ugx6lGHbq…: We are creating something that will be like humans. Humans are known to cause al…
- ytc_Ugwgf-WBQ…: I think it's perfectly valid to take the ai images and show how we as artists ca…
- ytr_Ugy5sJPsk…: @AussieGirl at this point yes. But the penultimate of all this is to have self…
- ytc_Ugxd3Szxh…: To be fair, can you imagine the backlash if Trump started releasing regulations …
- ytc_UgzYmnQhI…: Except it doesn't work. It kind of works. The second initial does not deserve …
- ytc_Ugw7A_VuP…: He is talking nonsense about risks with AI ! And that gives not a good reputatio…
- ytc_UgxoTsJZ5…: "if ai art has no soul, then what is this? 😏" *proceeds to post the most soulles…
- ytc_Ugxyrii6R…: I don’t understand why they let AI get this far I they need to just terminate al…
Comment
> In the movie Alien, sigourney Weaver is desperately trying to stop the ship computer from self destruct countdown. She was too late and the ships computer said “the option to over-ride automatic self destruct has expired. The ship will self destruct in t-minus 5 minutes.
>
> This is where we are in terms of the A.I. threat.
>
> We are way too far down the road to turn back.
>
> The greed of Silicon Valley and the huberus factor have created a monster that will only get more and more power and the reach of this monster is almost total. People that are being born today will see the terminator killing fields. If you put a snake in a box with a poisonous mouse, Soon enough the snake will consume the mouse and poison itself.

(Comment text reproduced verbatim, including original typos.)
Source: youtube
Topic: AI Harm Incident
Posted: 2025-07-28T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxIWLF-J3GT1Bhlynh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxnB_cqC4XHAZp-p794AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy4c6GhbwWqcI9M7FV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxizAsLt59dLFSmo1t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7EzLMRwUlCItYTt94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpCmpRqgEk0ove3-x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyauZfH6fmBlUGLTFh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwgbto8rujNRCfEji54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6TtxgrmPBQJl1T1N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3jZ21SSJ032fHNQV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
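Before storing a raw LLM response like the one above, it helps to check that every record carries a plausible comment ID and only known labels for each coding dimension. The sketch below is a minimal validator; the allowed value sets are inferred from the visible codings (the real codebook may define more categories), and the function name `validate_codings` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# These sets are an assumption; extend them to match the full codebook.
ALLOWED = {
    "responsibility": {"none", "unclear", "user", "company", "developer",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example1","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = validate_codings(raw)
print(len(records))  # 1 valid record
```

A validator like this catches the common failure mode of LLM coders inventing off-codebook labels, so bad records surface at ingest time rather than during analysis.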