Raw LLM Responses
Inspect the exact model output for any coded comment; entries are looked up by comment ID.
Comment
I find it somewhat ironic that:
- jamming is used to disrupt unmanned systems
- greater autonomy is used to get around that
- a good autonomous system might still have an abort option but carry on if nothing is received
- such a signal would have to be a short termination code, rather than a continuous literal “keep-alive” signal
- jamming might disrupt such an abort signal, i.e. the defensive measure would make the enemy’s system much happier to kill a wrong target.
I suppose that’s not new. Disrupt GPS and you’ll have a lot more JDAMs landing on apartment blocks.
Platform: youtube · Posted: 2024-07-01T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw5vwEiFeKAcD3_yFl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyvllhNpsengIpVFgx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkUxLHgRV-7roFadd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFbgeIQfRKuzHUv914AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzs8nEoVgYJs6LpsoN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyfFCMSHskHtsAjqP14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxY4zM2ewRbtG6V1NR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwUje5RrkthvBnsIY94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwe02thYTsfVroDvaR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyv78SegZlqTYSfgD14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
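The per-comment lookup above can be sketched as follows: parse the raw batch response, validate each record against the coding schema, and index by comment ID. This is a minimal sketch, not the tool's actual implementation; the allowed category labels are assumed from the values visible in this batch, and the full codebook may define more.

```python
import json

# Allowed values per dimension (assumed from this batch's output;
# the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"unclear", "government", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_Ugw5vwEiFeKAcD3_yFl4AaABAg",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]'''

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response, reject out-of-schema values,
    and return a dict keyed by comment ID."""
    by_id = {}
    for item in json.loads(raw_response):
        coding = {dim: item[dim] for dim in SCHEMA}
        for dim, value in coding.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{item['id']}: bad {dim} value {value!r}")
        by_id[item["id"]] = coding
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ugw5vwEiFeKAcD3_yFl4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed label set is worth the few extra lines: LLMs occasionally emit labels outside the codebook, and failing loudly at parse time is cheaper than discovering stray categories at analysis time.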