Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It largely depends on how much hype AI companies can keep up. A stock value is b…
rdc_n80wgws
The beginning of SKY NET here we go man…. And if those guys think they won’t bre…
ytc_UgzTRIvNf…
Studies on the efficacy of role based prompting see varied results but in many c…
ytc_UgyLPfJQS…
AI thinks bad things about humans because we think those things….if we loved our…
ytc_UgwhI76lL…
"Context engineering" does not work. You can have say 50 debugged library functi…
ytc_UgzphHG2u…
He's pumping money into his own AI you sheep.. he's just behind so he'd like to …
ytr_Ugy7DdEIK…
As an AI Prompter and an actual digital artist I would never even consider tryin…
ytc_UgyrV0eWp…
Geofenced companies do it not because they can only operate in pre-mapped areas.…
ytr_Ugx-8_K95…
Comment
This is all just FUD (fear, uncertainty, doubt).
Imagine AI taking over 60% of the jobs here in the U.S. (60% of the population works, so that's around 120 million people). AI doesn't pay taxes and neither do any of the big corporations. Taxes run the country. When the feds see a massive reduction in cash coming in from that, guess what? No more military, no more new infrastructure or repair of current infrastructure, no more anything that the U.S. government is addicted to (mostly war).
And then it'll start hitting the oligarchs. When the oligarchs start losing money there will be a backlash against AI. And what would 120 million people out of work do if they got angry about the situation?
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-06-22T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxqPKmJ_vrNLFmAemB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4Fr1Bl-HZM9xPQp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8lwQ90ldlGPHeE7d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxzpdlTU-RZPbhtLkR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweWieshkt1hvZH9Zt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_UgydRh_0wi8TiEFgfCl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx7OU5YcmHWuVofNyd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrtQERCo2_sDgfWZl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP3Uwcz8G2TfTC_1N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgySlMu92YjQI9n26KN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
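The lookup-by-comment-ID feature above can be sketched in a few lines: parse a raw coding batch like the one shown and index the records by their `id` field. This is a minimal illustration, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the records above, while the helper name and the required-key check are hypothetical, and the batch is truncated to two records for brevity.

```python
import json

# Two records copied from the raw response above (batch truncated for brevity).
raw = '''[
  {"id":"ytc_UgxqPKmJ_vrNLFmAemB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4Fr1Bl-HZM9xPQp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Fields present in every record above; treated here as the required schema.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse one coding batch and index records by comment ID,
    skipping any record that is missing a required field."""
    records = json.loads(raw_json)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_KEYS <= r.keys()
    }

codes = index_by_id(raw)
print(codes["ytc_UgxqPKmJ_vrNLFmAemB4AaABAg"]["policy"])  # regulate
```

With an index like this, a comment ID pasted into the lookup box maps straight to its coded dimensions, and malformed records (e.g. an LLM response dropping a field) are silently excluded rather than breaking the view.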