Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment (truncated) | ID |
|---|---|
| AI does not "get inspired". It's a 15 neuron neural network. One tenth the numbe… | ytc_UgzME2IvZ… |
| I used to talk to ChatGPT about my stories. My stories that I poured my heart an… | ytc_UgwElxvA7… |
| This video has jump cuts every second or two, is it an AI video of Rick complain… | ytc_UgxBo4c0T… |
| My new hobby is to block every channel using AI-art for a thumbnail. I have made… | ytc_UgxYZkxeS… |
| Because of the uproar once these are implemented to a larger degree I don’t thin… | ytc_UgxOcLRVY… |
| Hey there! new to your channel but I love it, I enjoy when people find fun and e… | ytc_UghXv2x0v… |
| @rade6912 Always better than unemployment. A bad choice is always better… | ytr_UgysnoVrS… |
| @whiteandnerdytuba I did, and it explains what I said. Artists do what the AI l… | ytr_Ugw3tZ9Ck… |
Comment
I'm still conflicted over not anthropomorphizing machines/robots/AI, or negating that they could ever have rights. Now, the rights of actual humans are very tangible, real, and pressing to me, and I don't understand how not borderline deriding or hating machines, and speculating about a future that may or may not come (where we may honestly consider the personhood of AI or robots), takes away from that. Saying that robot rights are not the most important thing right now is not compelling to me as an argument, because very few people are arguing they are. It just kind of sounds like neo-Luddism. Maybe I've read too much sci-fi.
Source: youtube · 2025-09-17T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugww2W5zTXOWwwvcAI14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6mmZZE-rsWWAUClp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxYMfxlNfqJ5ioURcB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwGyvtzlnAn9gPiqB94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyV0dpSKE8Q6KCqupx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwS8IJ24KL2HrLJHlF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHCP1L8CFJFO3-iRh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw6GmsoobauPzhK4NF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyf9lx_peF-oWmCkB54AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzX0fwOegR6BwjN2nh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
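A raw batch response in this shape can be parsed and validated before its values are shown in the Coding Result table. Below is a minimal sketch; the `ALLOWED` value sets are inferred only from the labels visible in this sample output, not from the full coding scheme, and `parse_response` is a hypothetical helper name.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# (Assumption: the real coding scheme may define additional labels.)
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "fear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a placeholder comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"mixed"}]')
print(parse_response(raw)[0]["emotion"])  # mixed
```

Validating at parse time means a hallucinated or off-scheme label fails loudly with the offending comment ID, rather than silently entering the coded dataset.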