Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "When a robot says Yeehaw then does the hat tip… you know there’s more tricks up …" (ytc_Ugwp4BcQ0…)
- "this man is sick in the head, he predicts a 20-25% of global extinction , and ye…" (ytc_UgzJPQQ2a…)
- "This hypothesis is still very flawed because it's assuming that all of these com…" (ytc_UgzfTklE8…)
- "I use AI art a fair bit. I can see some logic in what he says but he kind of rui…" (ytc_UgwFbLN2X…)
- "AI has no volition: it can do a wonderful job once clear instructions have been …" (ytc_UgxtXbYlv…)
- "There is another reason for self-driving cars: It's way more relaxed than drivin…" (ytr_UgxIj7lYC…)
- "to become an AI artist, it actually takes studying on computer language models. …" (ytc_UgwCSGeUC…)
- "Some years ago when AI was in its infancy, I asked Google, hey google, make me a…" (ytc_UgxDUzfXF…)
Comment
Sad to see tbh. Looked crazy af though as from movies now reality , but giving a gun to a robot is the softening to arming robots for war and if a robot ever killed a human then there’s no consequences for that . We need to rethink things. Be careful, Ino it looks cool but it’s not when years down the line we could have one turn up at our door and following stupid laws with no empathy and carrying weapons . Which of course will be named for our own good. Which is false.
youtube · AI Harm Incident · 2024-01-06T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugy07hyGB0pJod__Md94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyzvq7GtdSMeWXd1xJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5alMLIOPe5sAN3V94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwn5LwEsORtsedjd9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzEmYc7AJ86YGzAgKN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyEQ-E6fJNRN5s54kp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwHj6hv-qGE5sE3d614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0eGi6eyfGud8T5Bl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz34DIBmjm_14lY8ht4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx6mH1UC7UzStQQMxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
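The raw response above is a JSON array of per-comment coding records, one object per comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of how such output could be parsed, indexed by ID, and sanity-checked is below. The `ALLOWED` sets are inferred only from the code values that actually appear in the responses above; the real codebook may define additional categories, and the `index_by_id`/`validate` helpers are illustrative names, not part of any tool shown here.

```python
import json

# A truncated copy of the raw LLM response above (three of the ten records),
# used here purely as sample input for the sketch.
RAW = '''[
 {"id":"ytc_Ugx5alMLIOPe5sAN3V94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugwn5LwEsORtsedjd9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgyEQ-E6fJNRN5s54kp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]'''

# Allowed values per dimension, inferred from the visible responses only;
# the actual coding scheme may include more categories.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index coding records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

def validate(rec: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, rec.get(dim)) for dim, ok in ALLOWED.items()
            if rec.get(dim) not in ok]

codings = index_by_id(RAW)
rec = codings["ytc_Ugx5alMLIOPe5sAN3V94AaABAg"]
print(rec["policy"], validate(rec))  # regulate []
```

Indexing by ID also makes it easy to join the model's codes back onto the original comment metadata (platform, incident, timestamp) for the per-comment view shown above.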