Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "the tarded CEOs buy these AI things and dont realize AI does not work by itself,…" (ytc_UgywIpBhU…)
- "People make me sad. They are seeing it as if we are not human, books and stories…" (ytc_Ugw2MVeuh…)
- "Does anyone know the 1970 film Colossus: The Forbin Project? It's about an Ameri…" (ytc_Ugwm_rKS7…)
- "eerie thing is that google/youtube's algorithm has been eagerly pushing for me t…" (ytc_Ugwv-YmRu…)
- "I feel like you should speak as if this will happen soon, not in decades. HeyPi …" (ytc_UgxwrIo2n…)
- "All those bullying posts are made using a tablet, something that ten years ago w…" (ytc_UgxcoTfdA…)
- "Ai + Universal income is the next economy. We will mostly purchase digital stuff…" (ytc_UgzsYgKix…)
- "In other words, this self-driving technology is perfectly safe... as long as it …" (ytr_UgzkI3obl…)
Comment
It does astound me that people work on things that destroy humanity so willingly. AI is going to be:
1. The end of the internet.
2. The cause of a collapse in the current financial system.
3. A generator of the most tech debt, flakey, buggy software systems you have ever seen - oh, and its software running finance, transport and law.
Can’t wait til we get our two day work weeks… yeh, cos that’s how commercial firms work 🤣🤣🤣
youtube · AI Jobs · 2025-08-11T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBPifNal9_zoLgzqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy61pmXYDbtnTsl6lV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzhh8jfFUHT5hURfoV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgysSaAg0-ULEEeEE5p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz60ICCuOPsVhA-r_p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4jelIe527-SZi5sR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylNGkuQ_DbZwb9NNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx337FkQJgz66mxZrp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw85BKcqtUjyekERTp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgymPTyr38CMkqKC_4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```