Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "He is not the creator, he don't know shit about AI. Bring people from academic n…" (ytc_UgwHJasA5…)
- "I think the true main reason to be polite to them is because you don't want to p…" (ytc_UgwnJBKHs…)
- "Stupid AI is the most dangerous, AGI is okay, but it can't create, innovate, dis…" (ytc_Ugw3HQzvK…)
- "As someone working in a similar space, his view is very similar to how I would j…" (ytc_UgwG7kGHv…)
- "Why did Google just hire me as a junior engineer if they're gonna replace me wit…" (ytc_Ugy5AqRGx…)
- "DEATH DEATH DEATH THAT'S WHAT YOU WANT!!!!! BRAVO AND CONGRATULATIONS TO YOUTUBE FOR SHOW…" (translated from Italian) (ytc_UgyJqp7Kp…)
- "How AI will dominate the future. people will depend on AI, AI will dominate the …" (ytc_UgwomXK8p…)
- "It is too casual of a conversation for me. How can people be okay with creating …" (ytc_Ugyer1gSE…)
Comment
The definition is a BS term IMO. Brimstone (for example) is highly autonomous, can select one or more targets and intercept them without human intervention once engaged, and has been in service for 10 years. Many of the algorithms on (something like) Brimstone are more sophisticated than most "AI powered" systems that try to brute force the solution. Modern weapons are just a bag of (usually) widely available sensors that use those inputs and clever maths to deliver a "payload". When we get "Skynet" we can talk about "autonomous"; until then there is nothing actually "autonomous" about them. They're just evolutions of distributed "fire and forget" systems, like the Brimstone, Hellfire and dozens of other similar systems that have been around for a long time.
youtube
2026-03-10T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxXDGygvxhKOLB1hbZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw9Q0xh9YzYo7A6g2Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRVCTp_1wHz7lIJv94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy84HB8qHK2enSEPwx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGAcKqP9ouh9fvwPF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIZUJMmUwSwmr3Di14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNAZESJs6SnlTooi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzEGHIj3h0QYVumwUd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy1VMBLGkV6gqrHbW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzOAQfFmmlEv2YYo6B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
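The raw response above is a JSON array, one object per coded comment, with four coding dimensions plus the comment `id`. A minimal sketch of how such a batch could be parsed and validated before the per-comment results are stored (the schema sets below are inferred only from the values visible in this sample; the real codebook may allow more categories, and `parse_llm_batch` is a hypothetical helper, not part of the tool shown here):

```python
import json

# Allowed values per dimension, inferred from the sample output above (assumption).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: coding dict},
    rejecting any value outside the known codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {row.get(dim)!r} for {dim}")
        # Keep only the coding dimensions, keyed by comment ID for lookup.
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

Validating against a fixed codebook at parse time catches the common failure mode of LLM coders drifting outside the allowed categories, so bad batches fail loudly instead of polluting the coded dataset.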