Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgyL4WGj-…` — "This is being put out there to sow the seed of plausible deniability in people's…"
- `ytc_Ugxm8Det6…` — "BUT, what about RIGHTS? If a robot is sentient, conscious, capable of feeling em…"
- `ytc_Ugz2SwgaU…` — "My job will be one of the last to go as i inspect Construction infrastructure…"
- `ytc_UgyJFcYQH…` — "As if ChatGPT and "I" are two different entities... chat bots change multiple ti…"
- `ytr_UgzlH7xf5…` — "@makel3elieve I dont think my conception is lacking. if we invented a chip in th…"
- `ytc_Ugxff0M59…` — "Bro, making things like masala movies, like Bollywood. AI always needs the human…"
- `ytc_Ugy7I4CxK…` — "I like how we have hearing these predictions for years. I remember in 2014/2015…"
- `ytc_UgxMdxWkg…` — "I HATE AI FART, BOOOOOO seeing old paintings from the masters ages ago, makes…"
Comment
> No, we are not in alchemy, that would imply that there is chemistry. That is your mistake here. You are all implying something that has no sign of happening. You are all thinking that if you put enough data into an AI that it suddenly will do something different to the probability stuff it does already now. There is no sign for that and i am REALLY disappointed that someone like Hank Green is so easy distracted from the situation of reality. Like you are literally seeing someone making the first flight and you talk about the risks of becoming a multi universe species. That is what is happening here. You are all talking about something that we do not can target cause we don't even know what the target would have to be. I am so confused why this is so NAILED into the future of AI as if that all is a real possible future. There is right now a possibility of 0% that something magical happens. Sure, there always can something happen and if i ring the bell a monster might appear and plays the flute. But by what we know so far, there is no probability that something magical happens that is so significant that you could use that obscure undefined label "AGI" or "ASI". That is the absolute truth of the situation.
Source: youtube · AI Moral Status · 2025-10-30T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz1lxfTWilYllBJG5F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz4kHbcpJBOP46Ifl14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7WniCkN-N8KLJgbp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxLv3EAXxRBQZrzcH54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwuNrHVIO76mi4l9al4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwhJHE0Xw6pRv7TYz94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw33QRQLgC9LkVEuDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyaGTEsAQ1XU_TmzZR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy2z9Qt1hW3GTC3v4V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwA7PYa6nANsdVzNGF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
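The raw response is a JSON array of per-comment codes, one object per comment, keyed by `id` with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the lookup-by-comment-ID step, assuming the response shape shown above (the sample data and variable names here are illustrative, not the actual pipeline code):

```python
import json

# A small excerpt of a raw LLM response in the shape shown above
# (a JSON array of coded comments). Illustrative sample only.
raw_response = """
[
  {"id": "ytc_Ugz1lxfTWilYllBJG5F4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyaGTEsAQ1XU_TmzZR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the coded comments by their ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding result by its ID.
code = codes_by_id["ytc_Ugz1lxfTWilYllBJG5F4AaABAg"]
print(code["emotion"])  # outrage
```

Indexing the array into a dict once makes every subsequent ID lookup constant-time, which matters when inspecting individual comments out of a large coded batch.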