Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

Sooner or later, one A.I. Super Intelligence will eat a less intelligent A.I. for lunch. Or, A.I., while running defense simulations, will accidentally sub-divide into two or more distinct personalities with in the same system, and we will see the first A.I. with a digital form of Paranoid Schizophrenia. Or, a large solar flare will corrupt the A.I.'s super code with the result of data contamination. Regardless of which one it is, this will be a bad bad day for the Human Race.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2024-01-05T20:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwKgeRsGsGBv6drwmh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxBmOHYrF7ScRPaPo94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxE6iMNlwQBunrcjeF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyUQxLtVm3z3nadBXB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1PoNs3nvQjWnbRwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLIze3j8_OLtzD0FB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkTkaIorvPFuoAHXd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy327MMdeG2tRZKlsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugycm62ONPsO9Toxn0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgyMcljUoRL2b8zpUC14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
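The raw response above is a JSON array of coded records, one per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and looked up by comment ID follows; the `index_by_id` helper and the abbreviated sample data are illustrative assumptions, not part of the actual coding pipeline.

```python
import json

# Abbreviated sample of a raw batch response; field names match the
# records shown above. This helper is a hypothetical sketch, not the
# pipeline's own code.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugz1PoNs3nvQjWnbRwF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwKgeRsGsGBv6drwmh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    skipping any record that is missing a coding dimension."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if all(dim in r for dim in DIMENSIONS)
    }

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_Ugz1PoNs3nvQjWnbRwF4AaABAg"]["emotion"])  # indifference
```

Indexing by ID mirrors the page's lookup behavior: the displayed "Coding Result" table is just the record whose `id` matches the inspected comment.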