Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples — click to inspect

- `ytr_UgwkX-f0w…`: "We understand that interacting with artificial intelligence can sometimes feel u…"
- `ytc_UgwmjPiSu…`: "Bs... technically, pure bs... Fixing the issues is really simple: Customize inst…"
- `ytc_UgxOnapAS…`: "I was bored at work and just looked at tech news, if I recall a 100 word prompt …"
- `ytc_UgydMA-Id…`: "If in "jailbroken" you mean "the reset every prompt had been stopped" then yes. …"
- `ytc_Ugwa8AoIh…`: "What they could do is to use smaller lightweight vehicles for their self driving…"
- `ytr_Ugy1CQcpH…`: "It's remote controlled look at its head it has no sensors the only ones using ai…"
- `ytc_UgxL-JJ7I…`: "Saagar's Main Points in this Video: 1. AI is scary because it helps people navig…"
- `ytc_Ugwxpu083…`: "The brain of children and people will shrink because of AI they do not have to t…"
Comment
"1 Billion or more will be killed by the CoViD vaccine by 2022" "The world will end in 2012 because the ancient Mayan calendar didn't go past that year" "Global collapse will happen in early 2000 because of the Y2K bug" "By the mid 1990s, all musicians will be replaced by samplers, synthesizer and computer generated music" "The world will be a nuclear wastleland with humanity mostly wiped out by the 1960s" Ok, except for the Mayan calendar thing, at least all the other predictions were based on things that really were happening and could hypothetically lead to the doom scenario. I mean, if a nefariously time-bombed deadly substance were really injected into lots of the world's population, then a billion deaths could happen. Samplers, synthesizers and computers really did change the course of music, except of course it didn't wipe out music that doesn't need it. There really was a Y2K bug, although it only caused mass inconvenience. There really were [and are] enough nuclear weapons in the world to cause global annihilation. The point is that the current doom and gloom predictions of AI are overblown. Many of the them are propagated by bad actors who stand to gain financially from AI, and it goes like this: Something extremely powerful that can be used for bad is still very attractive as a tool for domination. "AI's gonna take 30% of jobs?" - Invest a bit of my money. "AI's gonna take 99% of jobs?" Invest ALL MY MONEY!! Remember, some predictions about AI in 2025 (made in 2023) are comically off.
Source: youtube · Topic: AI Governance · Posted: 2025-09-06T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0DCssGb_g29xje054AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfpspTw4NTVpMO4UZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuO6LgbfQPEfexmD14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyAprV-yPfaFxsY2Dt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzY64Hv4WqWT9sJnRp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxVdwL57qA2QiaH3g54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwLhDTPHtyb2LW8FN94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxL8kD22J4GwV7WM3x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwKiEaMQxn5aaMjlth4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyW9bTiADJZ317jbkZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
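The lookup-by-ID workflow can be sketched in a few lines: parse the raw response as a JSON array and index each record by its comment ID. This is a minimal sketch, not the tool's actual code; the `index_codes` helper is hypothetical, and the four-dimension schema (`responsibility`, `reasoning`, `policy`, `emotion`) is taken from the sample response above.

```python
import json

# The four coding dimensions present in each record of the sample response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the coding dimensions by comment ID for fast lookup."""
    records = json.loads(raw_response)
    index = {}
    for rec in records:
        # Every record must carry an "id" plus all four coding dimensions.
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record missing keys: {missing}")
        index[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return index

# One record from the sample response above, for illustration.
raw = ('[{"id":"ytc_UgzY64Hv4WqWT9sJnRp4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
codes = index_codes(raw)
print(codes["ytc_UgzY64Hv4WqWT9sJnRp4AaABAg"]["emotion"])  # resignation
```

Indexing once by ID makes every subsequent lookup O(1), which matches the dashboard's "look up by comment ID" use case; a malformed record fails loudly instead of being silently skipped.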