Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or click one of the random samples below to inspect it.
- "imagine AI in 500 years, I believe there wont be humans as we know them today ar…" (ytc_UgxwhwHmk…)
- "@SomeRagingGamer and if she got in there and the robot started moving the comp…" (ytr_Ugxma6KUt…)
- "the video asks, \"what if we programmed a robot to feel pain?\" then wouldn't the …" (ytc_UghyKvMqu…)
- "Republicans have decided we have no need to regulate AI. What could possibly go …" (ytc_Ugy5s4bEN…)
- "When did they start saying that machines/automation will replace workers? Even i…" (ytc_Ugw7fn9uM…)
- "The difference is there’s not a single job that automation can’t do. Including …" (ytr_UgzMOKcii…)
- "to be frank I don't trust the states to regulate Ai. They'd just screw the littl…" (ytc_UgxA9i7w0…)
- "if you go to the youtube page Answers in Genesis Canada, they do a whole series …" (ytc_UgzOzyyjF…)
Comment

> China will win this race and humanity will lose. Everyone for Ai and a robot future, will also lose. When the father of Ai was asked, what education people should take. He replied plumber 🧑🔧, thats the only job Ai won’t take from you tomorrow. And for you who think “but I’m in university now, becoming an economist.” Ai can do a much better job, plus it can be at different countries and places, working with different assignment, plus communicating with itself meanwhile. You are to slow.

youtube · AI Moral Status · 2025-08-21T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRIwHdiVLsTh55kCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDuG0Z2lma-PFwe-d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyfE4yuD20rFPks59J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzKC52L8dJSi7eVlTV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzOVLErpuLw5VNyFHN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7IKaaDtY641TJGE94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0eY4BzpUHdmBsdPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaJusK7e9s_fOCQLx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzQDY4FdGNPCvwYZNJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx2obZ2FjlsvA6N3lJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
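The raw LLM response is a JSON array with one record per comment, each carrying the four coded dimensions plus emotion. A minimal sketch of how such a payload could be parsed and indexed for the "look up by comment ID" view (the two records here are copied from the response above; the parsing code itself is illustrative, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment ID.
# These two records are taken verbatim from the response shown above.
raw = '''[
{"id":"ytc_Ugx7IKaaDtY641TJGE94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyRIwHdiVLsTh55kCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]'''

records = json.loads(raw)

# Build a dict keyed on comment ID so a single coding can be pulled up directly.
by_id = {r["id"]: r for r in records}

code = by_id["ytc_Ugx7IKaaDtY641TJGE94AaABAg"]
print(code["policy"], code["emotion"])  # ban outrage
```

The same index would back the "Coding Result" table: each row of the table is one key/value pair from the record for the selected comment ID.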