Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@llll2071 I think you may be using a false equivalence. The term "Artificial In…
ytr_UgxSuNv4I…
1% chance all of humanity dies? What about nuclear weapons? The difference is Ai…
ytc_UgzPiCYAj…
No need
Waymos are enough of a plague that the elmo-cult needn't have anything …
ytr_Ugy5QK6Xi…
Back then evolution solved this problem. Now it's on the Internet with something…
ytc_UgwvdrOQ8…
What AI cannot takeover, jobs where you have to interact with real humans and so…
ytc_UgwRsTEiE…
Is ok ai if is realistic reasoning government have bo control ai is goog for us …
ytc_Ugzw06UBN…
Here is the catch, if it's really a sentient it would ask more questions than it…
ytc_Ugw3gLOP9…
What is the site where I can check wether a piece is AI or not?…
ytc_Ugw1gOCpQ…
Comment
I watched this 6 years ago not worrying about it. Now I’m watching again in 2026 as our government strong arms the only AI company with morales into giving them unrestricted access to autonomous weapons. As every other AI company is begging to give away their product for this. As Larry Ellison consolidates the entire US media market and data collection. As they build more and bigger and MORE data centers to support this. Once ASI gets here, we’re overcooked.
youtube · 2026-03-06T07:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzoQa2ltfXKd4nPBW54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzmq9yNCjHsQACPrFh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSJncPf952sXxD6F54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwfRB_F5diNKB4BHsh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"disappointment"},
  {"id":"ytc_Ugy1rdkptAkgQ4RJbEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxddePVK8Bki5XtMph4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy5zwYTPPaepKCu-7V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwqlYL3131GYITbz114AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugx0Ini944P2PXGbqgV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOk1zwvmjJIIXOjlN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
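The raw response above is a JSON array of per-comment codes, one object per comment with four coding dimensions. A minimal sketch of how such a batch might be parsed and validated before loading it into the coding table (the allowed value sets below are inferred from this one sample, not from the project's actual codebook, and `validate_batch` is a hypothetical helper):

```python
import json

# Allowed values per coding dimension. Inferred from the sample response
# above — the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "disappointment", "approval",
                "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed rows.

    A row is kept if it has a YouTube-style comment/reply id ("ytc_"/"ytr_")
    and every dimension holds a value from ALLOWED.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # malformed or missing comment id
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# One row from the response above parses cleanly:
sample = ('[{"id":"ytc_UgzoQa2ltfXKd4nPBW54AaABAg","responsibility":"company",'
          '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(sample)))  # 1
```

Dropping off-codebook rows rather than repairing them keeps the coded table clean; rejected ids can be re-queued for another coding pass.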