Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Here's why you can't stop AI - even if you force regulation, nobody can assure t…" (ytc_UgzblXJ4z…)
- "Fantastic, anything but those Marxists cesspool that Democrats push on kids. Re…" (ytc_UgzMusSkC…)
- "AI needs to read all of Ayn Rand's books. That will make it understand morality.…" (ytc_UgwDj1Sr3…)
- "In a nutshell...the tech driven industrial way of living has led us to this poin…" (ytc_UgxBgBcIZ…)
- "Key Words: END TIMES. Revelation 13:15 (KJV). After the Rapture, thee Antichrist…" (ytr_Ugyg5eXBD…)
- "I'd rather risk AI apocalypse than continue to live in the current system. It's …" (ytc_UgxIHt2cV…)
- "Ai supporters are an example of one of humanity's worst traits, trying to justif…" (ytc_UgyPb1ScN…)
- "1 of 2 scenarios is going to happen. 1. Billionaires and the mega rich want to…" (ytr_Ugwv15uOi…)
Comment
AI has amazing potential, but like you I would be happy if it disappeared today.
The problem with comparing AI to tools that boost your work with the tractor analogy is that with AI you wouldn't need a liscense to even drive the tractor. Someone can just click a button and it will start generating work, this is something that will take over programming jobs.
I know for a fact that it will be the case because it already has, you see a lot of people with no experience building tools with AI that would require a developer, it's not about gatekeeping, you wouldn't let people do this with doctors or any engineering other than software
Source: youtube · 2025-07-27T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGyU18N8HrmsFP4_l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrLs1cpJek-NzF-Bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxAmX29NXPadi75USF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzP_FaugVblBJY8gkF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxi0DrMNRmMFVVBZ1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxqG7CVAtJu9UeX42N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxy5fq195pbyFeuAfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkC-CTCzXHiZtQbC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAML2yx4imXUoohnF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1oUYLc_Za7Un9P7t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
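The raw response above is a JSON array of per-comment codings keyed by comment ID, which is what makes the ID lookup shown in this view possible. A minimal sketch of that parsing-and-lookup step follows; the `SCHEMA` sets are an assumption inferred from the values visible on this page (the real codebook may allow more categories), and `index_codings` is a hypothetical helper name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension — ASSUMED from the examples on this
# page; the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}


def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, dropping entries that lack an ID or carry
    a value outside the known codebook."""
    indexed = {}
    for entry in json.loads(raw_response):
        cid = entry.get("id")
        if not cid:
            continue
        # Keep only entries whose values all fall inside the codebook.
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            indexed[cid] = {dim: entry[dim] for dim in SCHEMA}
    return indexed
```

With the response indexed this way, "look up by comment ID" is a plain dict access, e.g. `index_codings(raw)["ytc_UgxAmX29NXPadi75USF4AaABAg"]["policy"]` returns `"ban"` for the batch above.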