Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Calling bullshit on Ai taking jobs.. looking for ways that the war could be just…
ytr_Ugz8Vp7g2…
Note that he smuggled in his view that ''what we need is a world government". No…
ytc_UgzX1btz9…
I’ve spent years going to doctors who had no idea what to do with MCAS. All it d…
ytc_Ugz2nc8O8…
well everything can be automated, but its cheaper to hire someone than to build …
ytc_UgxcLVZLo…
No way. Any possibility to control la AI in the future is barely a dream. In 20 …
ytc_UgwnL7R2d…
give them the ability to take what you wrote and said and make a decision with t…
ytc_Ugx0NGww2…
AI is just a tool, like any other tool. It is not "the villain". The villain is …
ytc_UgwNe0UPm…
The terminator didn't seem to need charging up and I've never seen a computer th…
ytr_Ugw0gAPgG…
Comment
I'd be ok with AI content if people were actually paying how much it costs to produce.
We are still in the honeymoon phase, making it cheap/free so they can hook as much people as they can. But those investors will want their money back.
It happened with amazon.
It happened with netflix.
It happens to ANYTHING that seems to be too good to be true. Because it never is true, its just companies pushing away legitimate competitors in the market so they can squeeze every customer after the competition is out, or they reached critical mass.
We would be seeing less AI bullcrap if you had to pay 30 bucks for EACH prompt. And honestly, that probably is cheap if you factor in training and data centers.
youtube
Viral AI Reaction
2025-12-16T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzmugwJSSZhGLiWXTd4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyaaHe26oA_8xLcGFl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwpW1uKoF_kVd4JOrt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxp0yDFP5fzmEFq1Fd4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzAeq5j1xRP0IaOvgp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz0rdwG08U4k-bOctV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz3SOvwDJsd7aNM5_R4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgydU1qLhjO48z-p8bx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy9AGX-Biftzl-3Zc14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzfAazfPpLbFFk5TvB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
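The look-up-by-comment-ID flow above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `index_codings` helper is hypothetical, and `raw_response` is trimmed to two entries for brevity, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the raw response shown.

```python
import json

# Raw LLM batch response, as captured above (truncated to two records for brevity).
raw_response = """[
  {"id": "ytc_UgzmugwJSSZhGLiWXTd4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzfAazfPpLbFFk5TvB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index each record by its comment ID,
    so any coded comment can be looked up in O(1)."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(raw_response)

# Look up the coding for one comment by its ID.
coding = codings["ytc_UgzfAazfPpLbFFk5TvB4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability fear
```

Indexing once by ID keeps repeated inspections cheap, and `json.loads` will raise immediately if the model returned malformed JSON, which is worth surfacing when auditing exact outputs.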