Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “AI is very interesting but it’s also a major issue that I feel could be very bad…” (ytc_Ugy7NfuVv…)
- “@gamejedi because the average person in society never consented to being the…” (ytr_UgzOiGrPF…)
- “the thing that boggles me about his supergirl ai slop video as a artist myself i…” (ytc_Ugye1pb-j…)
- “I’m 3 minutes in and I wholeheartedly disagree. As a traditional artist. What yo…” (ytc_UgylBjhwb…)
- “Considering how hit or miss organic intelligence is, I don't think we should sub…” (ytc_UgyfuNgqZ…)
- “Germany women didn’t know about Chinese government you can’t protest 🪧 against C…” (ytc_Ugwle8JQc…)
- “I know an acquaintance who claims to be an innovator for using A.I. for their wr…” (ytc_Ugw31-zDV…)
- “People do know they can just write, ask ai for erros, then change them manually.…” (ytc_Ugyxvi-eD…)
Comment
Alex Jones explained on Rogan 7 years ago that these people producing AI are doing it because of off world signals they’re getting that are giving them the information. It’s faulty info though and is being indirectly used to create a hive mind, god computer with future, past, and present prediction powers to control free will. To usher in the new world order under this AI God. Every day he seems more and more spot on. Whatever they’re learning, whoever they’re getting it from, they do NOT have our best interests at heart.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4Z6MGm6XltSY-7hd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz3LrSDv1jJol-oMON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwvW6bCajxg6y3xNxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz1TC1hkXnRSB35mHZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxyHIRMErY1OPwDkvB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz4XaDGeT6trJLnIxR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxRfrfYd9b_Ix9Arnt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfXpoZkBEORLDPoPN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyVAMC0X0eemsmrcYR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyGbFiwh8TAfUKfNq54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
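A raw batch like the one above has to be parsed and checked before its codes are trusted: the model may emit invalid JSON, drop an ID, or invent a label outside the codebook. The sketch below shows one way to validate a batch in Python. The allowed values per dimension are an assumption inferred only from the labels visible in this sample; the project's actual codebook may define more.

```python
import json

# Allowed labels per coding dimension, inferred from this sample batch
# (assumption -- the full codebook may contain additional values).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}


def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every row has an id and only labels from SCHEMA.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]

    errors = []
    for i, row in enumerate(rows):
        comment_id = row.get("id")
        if not comment_id:
            errors.append(f"row {i}: missing id")
            continue
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{comment_id}: invalid {dim}={value!r}")
    return errors
```

A batch that passes returns an empty error list; anything else is flagged per comment ID, so the offending rows can be re-coded rather than discarding the whole batch.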