Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
We put up a sign saying, "This is a sentient-AI-free zone," so we don't have to …
ytc_UgwRWZ2m3…
Its refreshing to see a kid with a cast on his arm.. that means he doesn't spend…
ytc_UgzrtkTB-…
3:16 Which is illegal in the united states of course, But i can't help but notic…
ytc_Ugxq08GhR…
Having an Gen-AI ad before this video speaks heaps about the current situation. …
ytc_Ugy1aR6IZ…
Both this host and guest can be replaced now with AI . Same with script writers,…
ytc_UgzHQrC2a…
AI could be a family and individual therapist dispensing of good mental health g…
ytc_UgzQTq0sp…
Please know AI calls the cops on people using these cameras. Lowe’s employees li…
ytr_UgxQOK_fR…
Not true 👎🏻 chatgpt banned me for just trying asking him about it and helping me…
ytc_Ugy47JWkG…
Comment
For the people worrying about their jobs, it's not true at all. Because of what they said about regulations. As a government, you don't want jobless people, this can and will cause riots all around the world. Imagine if by 2060 we are maybe with 12-15 billion people on the planet and all the jobs are done by AI. No one is making money, no one is working, this will cause a war between governments vs the people.
This wont happen due to regulations. There will always be jobs. If not, humanity is going extinct or is already dead.
youtube
AI Jobs
2025-11-08T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxyKu-8R1oUIUTyWD14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwppvDMRT4cBc_63U54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-cckfFWQkwyB3nw14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXXH6APWUj1DkKEe54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwURmToERbObg61I-J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwWL0OYkeYHot0UaZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwB1fJvKaLDehOQYaF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwqCfsHUx8A4GlHEYN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyHujA6QRvNXP-yNot4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwbJuvwL8fT-0pHeBt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
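The raw response above is a JSON array of per-comment code records, each keyed by comment ID and carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a batch response into a per-ID lookup — assuming this schema, with `index_by_id` as a hypothetical helper name and the inline `raw` excerpt copied from the sample above — could look like:

```python
import json

# Excerpt of a raw LLM batch response: a JSON array of per-comment codes
# (two records copied from the sample above).
raw = '''[
{"id":"ytc_UgxyKu-8R1oUIUTyWD14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-cckfFWQkwyB3nw14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions, as seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict[str, dict[str, str]]:
    """Parse a batch response and key each code record by comment ID."""
    records = json.loads(raw_json)
    out = {}
    for rec in records:
        # Each record must carry an ID plus all four dimensions;
        # anything else indicates a malformed model response.
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codes = index_by_id(raw)
print(codes["ytc_Ugy-cckfFWQkwyB3nw14AaABAg"]["policy"])  # regulate
```

Indexing by ID makes the "Look up by comment ID" view a single dictionary access, and the per-record validation catches responses where the model dropped a dimension or an ID.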