# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples
- "I am not against A-I, as long as it remains at the service of mankind; Artificia…" (`ytc_UgydSspyS…`)
- "you guys are acting like this is proof ai is a bigger threat to humanity than hu…" (`ytc_UgzBiiL04…`)
- "29:50 I think it's interesting in all of these AI Doomsday stories the AI never …" (`ytc_UgymFehxV…`)
- "Always making it about race!! You are the EXACT TYPE that AI NEEDS TO REPLACE!!…" (`ytc_Ugw-BFr6r…`)
- "99% by 2030 is fearmongering. That’s such an insane number, Manuel jobs and serv…" (`ytc_UgxRNocyN…`)
- "Whats to stop me heading down there with an Axe and messing this whole system up…" (`rdc_ckqepp6`)
- "If AI is so dangerous why are you and others still building their society? Why …" (`ytc_UgyBjauuI…`)
- "If you ask Googles Gemini, what is its purpose it will tell you that it provides…" (`ytc_UgzGNsSq1…`)
## Comment

> Bernie Sanders. I am a lifetime conservative. And I am 100% with you on this one.
>
> There is nothing CONSERVATIVE about racing blindfolded into the unknown where the _best case_ scenario is one wherein decent human beings are going to be entirely displaced over the next 10 years. And the worst case scenario is one wherein these egomaniacs succeed in building ASI that is misaligned and wipes out humanity in the same way a construction company wipes out an ant hill to build a skyscraper.
>
> AI won't have to hate us to kill us. It just has to _not care enough_ whether or not we exist, and then pursue its goals relentlessly. And the Godfather of AI believes there's a 50% chance that's what happens.

Source: youtube · AI Jobs · 2025-10-21T09:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
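The four coding dimensions in the table can be sketched as a small validation schema. This is a minimal sketch, not the tool's actual code: the value sets below are assumptions collected from the coded samples visible on this page, and the full codebook may contain additional values.

```python
from dataclasses import dataclass

# Assumed value sets, inferred only from the samples shown on this page.
RESPONSIBILITY = {"company", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "virtue", "unclear"}
POLICY = {"regulate", "ban", "industry_self", "none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "resignation", "indifference"}


@dataclass
class CodedComment:
    """One coded comment, mirroring the dimensions in the result table."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # Every dimension must take one of the known codebook values.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding result shown above, expressed as a record (ID is hypothetical).
example = CodedComment("ytc_example", "company", "consequentialist",
                       "regulate", "fear")
print(example.is_valid())  # → True
```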
## Raw LLM Response
```json
[
{"id":"ytc_Ugw2UFfmicveVuuepWt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxX8-fr72C-4DnSBRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxLvog5iaIDok3BOQ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzbKIeFsvbVDr6Sd_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxI9sTep5gXaFK1lPp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyARLpg3S2y27J7uHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHS_bpuR6lcvQqEeF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx414_bwlNfJdrQ3Y14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwevDQKJpY4LTXojGB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxili0hZ2Q44kVotAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
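The raw response is a JSON array with one object per coded comment, so the "look up by comment ID" feature at the top of this page reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the model output has already been captured as a string (`raw_response` below is a hypothetical variable holding a two-row excerpt of the response shown above):

```python
import json

# Hypothetical captured model output, truncated to two rows for illustration.
raw_response = '''
[
  {"id": "ytc_Ugw2UFfmicveVuuepWt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxX8-fr72C-4DnSBRh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

rows = json.loads(raw_response)

# Index by comment ID so any coded comment can be fetched in O(1).
by_id = {row["id"]: row for row in rows}

hit = by_id.get("ytc_UgxX8-fr72C-4DnSBRh4AaABAg")
print(hit["policy"])  # → regulate
```

Because IDs are unique per comment, the dictionary index also makes it cheap to join these codes back onto the original comment records.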