Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Anthropology agrees. The convergence of AI and autonomous systems will be more d…" (ytc_UgyfWECNg…)
- "What's BS is you have these rich folks in America trying to tell us that work li…" (rdc_oi2zyui)
- "just create an AI painting and pretend it is human made , the host will find it …" (ytc_UgxvbVERS…)
- "Thank you for your comment! It sounds like you're looking for more control over …" (ytr_UgwwAgZrR…)
- "Execs love ai because it does THEIR jobs really well. Proving that they were nev…" (ytc_UgykQAUd3…)
- "@The_Potato_maybe I mean, you can still use AI for repetitive tasks and a lot ot…" (ytr_UgxQJuEYw…)
- "@SMCwasTaken What i mean what u mean... I mean AI learning fast, you mean its ex…" (ytr_UgyqkoHZl…)
- "So in one video breaking points says AI is a bubble and in another they say it'…" (ytc_UgxKiLk7-…)
Comment
AI is already controlling human behavior, as mentioned in the video about algorithms and how they impact our behavior, but think about this: AI currently needs massive data centers full of NVIDIA chips... and it's already getting us to build it what it needs. Will we notice when it turns the human race into its servants, or has that ship already sailed?
Source: youtube · AI Moral Status · 2025-12-30T05:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwR0KTcJzfZClYUfUp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxmLnN8aUrKs8NdHmd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3atiBF_UAseYEMyN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz8KnisJP2_V8gVV2Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmQ9qsKgncJ4oPhDB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2rapkG0ziXD2ZObh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwn8qj6IYR7McEx7EJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzb-AhnURvnECZExOJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXTdJz7q3st2ci12t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzuDifMoN7y0Md-jT14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
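A raw response like the one above can be loaded and sanity-checked before the rows are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets per dimension are inferred only from the values visible in this dump, so the real codebook may define additional categories, and the function name is illustrative rather than part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the values seen in
# this dump -- ASSUMPTION: the full codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of coded rows) and
    drop any row whose dimension values fall outside the known sets,
    rather than silently storing malformed codes."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Collect every dimension whose value is missing or unrecognized.
        bad = {dim: row.get(dim) for dim in ALLOWED
               if row.get(dim) not in ALLOWED[dim]}
        if bad:
            print(f"skipping {row.get('id')}: unexpected values {bad}")
            continue
        valid.append(row)
    return valid

# Example with one well-formed row (IDs shortened for illustration).
raw = ('[{"id":"ytc_X","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # 1
```

Rejecting rather than coercing unknown values keeps the coded dataset consistent even when the model occasionally invents a label outside the codebook.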