Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzfF5dHv…: "If you made an ai not racist you'd lose some of its intelligence. I'm not saying…"
- ytc_UgxdirUri…: "to be fair, often times ChatGPT’s answers depend on how nice you treat your sent…"
- ytc_UgzSfR15w…: "Even a stickman made in 30 seconds is greater than any art made by AI…"
- ytc_UgyiUE7lP…: "Honestly, I would still not trust myself in a robot car 0:48 ( 0x0 )…"
- ytc_UgzDXZHhN…: "It makes sense for people from his generations, from all the others, a hybrid ap…"
- ytr_UgwjKAJzI…: "Exactly, but like one person in the video said about democratically deciding how…"
- ytc_Ugy-MItnl…: "Unless you LISTEN to what the painter said in ENGLISH you only got half of the s…"
- ytc_Ugyh-NXRz…: "Here's the thing - let's just say he's crazy, and he's one of those wildly drama…"
Comment
i’m stopping here at 9:12 because the main issue that I’m having with the entirety of the argument is how does AI ever be able to grow or do any of the things that we wanted to be aware of? How is it ever supposed to be super, if we never give it the ability to be aware, if we ourselves do not have the capability of measuring the recursive incursion that we hold in convergence and choose to normalize if we can’t measure the recursive incursions that are bodily fluids and heart and everything that we do, you know, recursively, encouraging in the body pumps blood reminding the cells to stay in coherency how can we ever understand any of it if we can’t even map pattern of awareness.
youtube · AI Moral Status · 2025-10-31T17:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy5DeOvtkWB96avAPN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzTP_K7K6PNvnS0XWJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwa4tbwGI6A2Ek6cXJ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhW0qZHE7ccaOF7094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHTFsHCTpxNZIn7Gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwqp6yhsxHPjV8WKxJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwqkBNBWPgM6_Qb_BN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYY5lSWxRYueZJmnN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlP4xQ8pMwWadq84l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcsIlQtEDndkXABqp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```