Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "Ask any AI how to pack enough fuel on a 25 foot boat with something like triple …" (rdc_ntaalxf)
- "generating AI art is basically the same as commissioning art from a person. your…" (ytc_UgzX1IKP3…)
- "Adtually,., those people do not like automative smart cars and are probably agai…" (ytc_UgxEd5t5N…)
- "The real problem is that we have overproduction even without AI. Imagine what wi…" (ytc_UgxDSj5ED…)
- "The problem is that modern AI cannot learn from your guidance. Modern AI is like…" (ytr_UgxdD8shT…)
- "I only want Ai to do tiny mediocre things that help me to get the actual art par…" (ytr_Ugz4CbowN…)
- "The one’s that are making all the money. Inventors,tech,investors want there mil…" (ytc_UgzzGY5Br…)
- "It would great to have a re-ported YouTube channel that had removed the AI video…" (ytc_Ugxzr6l1j…)
Comment
This is the same type of psychological tricks like "cold reading" and outright "leading" that grifters with "psychic abilities" have used for years.
When you limit someone's answers to simply yes or no, or just one-word answers, your mind fills in the blank. And if you've already been told that what you're about to be shown is creepy or prophetic, you're already prompted to go down that road.
If I used this same conversation, but I first cheerfully explained that I just found a way to make AI hilarious, people would pee on themselves from laughter.
Seeing so many people truly freak out about this kinda gets me down...but I gotta keep hope that most people aren't ignorant enough to buy this kinda thing.
"The robots are prophets!" I mean, where have we heard this before, people
youtube · AI Moral Status · 2025-09-26T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyoxBzwO6h0ewqb7jF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAVhXkfmLuYoG25jt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxR7dFzohu3V3Mf4mp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzq4SkAg1xyDN4-ic54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXRP9ak3YhaFjnn5p4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwemm7lOt9Xke7OxjV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwcHDQKLirx06dfNa54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwxO3VUxvc9NbtJvPV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-_fcQ5lO5QHHh0nN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5GHKicFem0HanMht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
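To make records like the ones above addressable by comment ID (as the "Look up by comment ID" feature does), the raw LLM response can be parsed and indexed. A minimal sketch, assuming the response is always a JSON array of flat coding records; the two sample records are taken verbatim from the response above.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are copied from the response shown above.
raw_response = """
[
{"id":"ytc_UgyoxBzwO6h0ewqb7jF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5GHKicFem0HanMht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so a single record can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

print(by_id["ytc_Ugz5GHKicFem0HanMht4AaABAg"]["emotion"])  # fear
```

A dict comprehension keyed on `id` gives constant-time lookup; if the model ever emits malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a reasonable place to flag the batch for re-coding.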