Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment preview | Comment ID |
|---|---|
| Is this really that bad? I don't think so. Even better, since AI gets smarter.… | ytc_UgwMAlGR8… |
| I think I can sum up the position on the blue jacket millennial - until AI start… | ytc_UgxdKutmv… |
| Its crazy bcs i was thinking ai will destroy the world but no its gonna be the t… | ytr_UgzO8oLg7… |
| Watch Yudkowsky instead. Hinton, concerned as he might be, gives the viewer hope… | ytc_UgyGj6HA_… |
| Wanna know how to get it illegal in every state over night? Lets do a collab eff… | ytc_UgyhOYlI1… |
| People keep saying ai is going to take over the world… I might believe them now… | ytc_UgxOPx29Z… |
| Sort of. But California is not a particularly unique outlier. Rather, it’s share… | rdc_et7n9kc |
| You can use AI to streamline tedious tasks like sorting through relevant cases a… | ytc_UgxNo5xD8… |
Comment
AI Drone Swarms would be incredibly dangerous, but one of the only defense options is your own AI drone swarm. If someone releases a drone swarm and you don't have one yourself, you're stuffed. Almost like developing nuclear bombs, but drone swarms could potentially counter nukes as well; since you have a swarm of AI drones intelligently targeting missiles and suppressing would-be nuclear Armageddon, while the swarmless countries cannot defend nor retaliate.
So yeah. I would suggest we go hard on swarm research and be the first to that finish line, who knows what Russia or China would do if they made it first. Meanwhile the US was already first to the last finish line with nuclear bombs and we didn't go crazy with those.
Source: youtube
Posted: 2018-05-08T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzcCwCegUbtvngQPJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGa0-tnosDqblNKKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx9dZ_JKNLYmU8ndnp4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJN32BfiHsOID-bad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyCPC1G6daYnx_uaaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxuBtrBfLrj0IdLust4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5YVOYSxB0W7uyJYh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwsy769faCmWMqna-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypyowmCCN501wSRRZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzhzWKI7aW7yWeIdhd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
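The "look up by comment ID" step above amounts to parsing the raw LLM response and indexing it by the `id` field. A minimal sketch (illustrative only; the two rows are taken from the response above, but this is not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two rows copied from the response shown above.
raw = """[
  {"id": "ytc_UgzcCwCegUbtvngQPJl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzGa0-tnosDqblNKKp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve one comment's coded dimensions by its ID.
row = codings["ytc_UgzGa0-tnosDqblNKKp4AaABAg"]
print(row["emotion"])  # fear
```

A dict keyed by `id` makes each inspection a single lookup rather than a scan of the whole response, which matters when a batch response covers many comments.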