Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Nudify AI needs to be banned. Like, not only it's sick to have so many sick peop… (`ytc_UgyDaneOb…`)
- AI right now is garbage and I don't feel sorry for these tech companies losing h… (`ytc_Ugxq5sSMM…`)
- Can’t we just use a AI to go to other planets in outer space first and then take… (`ytc_Ugz7B938Y…`)
- I think people is able to notice pretty quick how bad are LLMs to brainstorm, so… (`ytr_UgysIHfoX…`)
- Clearly AI is only qualified to replace C-suite execs. I mean even a monkey coul… (`ytc_UgydXZIZO…`)
- 34:40 "He's not like Musk, who has no moral compass." What about Sam Altman? Um,… (`ytc_UgxN7uByH…`)
- “What’s the point of even drawing anymore” the fact that like… that’s even a con… (`ytc_UgwALiv-L…`)
- >How do you compensate performers whose likenesses are used to create AI cont… (`rdc_ks69yba`)
Comment
Just imagine the threat posed by the convergence of advanced battery technology, 3D printing, and AI in a couple of years. Long distance drones capable of flying hundreds of kilometers, carrying explosives and guided by AI, could autonomously seek out targets. A swarm of thousands of such drones could effortlessly destroy army units, million dollar military vehicles, equipment, and strategic areas like airports and bases. All that's needed is an AI controlled facility capable of mass producing thousands of drones using 3D printing per day and equipping them with explosives. It will be like a deadly hornet's nest, sending out thousands of killer drones every hour. Something reminiscent of the T-800 factory in Terminator movies.
Source: youtube
Posted: 2024-09-30T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8ZAuaMVvWU2T9UxR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7PMUJ1cvOC5bavFt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9_nTWUnX4a63X_BV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgznhmfRAviOPMvz6ol4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoydP73rfal0vcKyB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyYDjR9UpaL87C9HQV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgytMjrkBfW5t4QQo5x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy5lEc7OydJnikPB1t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwurHhPHtiwk1alC-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyyPXWp4i2FsBMGfmN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"}
]
```
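A batch response in this format can be checked mechanically before the codes are stored. Below is a minimal validation sketch in Python; the `ALLOWED` value sets are inferred only from the codes visible in this dump, so the actual codebook may define additional values, and `validate_batch` is a hypothetical helper name, not part of the tool.

```python
import json

# Code values observed in this dump (assumption: the real codebook may allow more).
ALLOWED = {
    "responsibility": {"unclear", "company", "none", "ai_itself", "user", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept if it has an "id" and every coded dimension holds
    an allowed value; malformed records are silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Two records copied from the raw response above:
raw = '''[
  {"id":"ytc_Ugw8ZAuaMVvWU2T9UxR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7PMUJ1cvOC5bavFt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''
print(len(validate_batch(raw)))  # → 2
```

Dropping bad records rather than raising keeps one malformed line from discarding a whole batch; a stricter pipeline could instead log each rejected record's `id` for re-coding.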