Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples (truncated previews):

- ytc_Ugwldno69…: I wouldnt worry about ai taking over or being a danger. Its more like ai will e…
- ytc_Ugwh0hcUM…: Please spare us the automaton who parrot's what you place in it's programing to …
- ytc_Ugz6wjYM-…: Friday, October 31, 2025 . . . Greetings, Everyone. From The Original Star Trek …
- ytc_UgyWsJ3qM…: Let's index UBI to automation. Did you and thousands of other people just lose …
- ytc_UgxRZpyc1…: Tesla autopilot doest make mistakes, its user error. We have some dumb phucks fl…
- ytc_UgyONKSPj…: If a brush or a camera were the same tools as AI, then why didn't he just "simpl…
- ytc_Ugx7yiwse…: Why do we stay calm while corporations plan to exterminate the majority of us? A…
- ytr_UgyGCnu8H…: @LC-mq8iq what happened to reading comprehension.. we are talking about pursuing…
Comment
So that’s why they’re discouraging women to have babies, couples to marry…they have the goal of diminishing the population. However, ultimately it seems like you end up with something wrong. All these people that can think only about machines and bots and computers and automation are missing the kind of things that could go wrong with all of this. The concept I’m speaking of is life itself, the essence of human beings and individuality. You just cannot reproduce everything about life on earth with robots, and that could become a problem. You can’t see into the future so you should stop predicting it. This is what you think will happen, but you don’t know that it’s not a fact.
youtube · AI Jobs · 2025-11-01T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzOwn2b4TvE7I5w94t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzsys-7o__T_KJVmWl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzkUadC_cT2IHBV5-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQQmWk0lh2ZhsR9Yp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlTJdIwGZXJojuohJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz8QC-iBASoko5AgC54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzFoxlhGBoUhVQ7n954AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzfsc0GqMKnBtpoPrJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgweZ4S1M8XyCK1qDgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyG-cGOIpK8eSOcHXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"confusion"}
]
```
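The lookup-by-ID view can be reproduced directly from a raw batch response like the one above. Below is a minimal sketch, assuming the model returns a JSON array of per-comment objects with the five keys shown; the `lookup` helper is hypothetical, not part of the actual pipeline, and the response is abbreviated to two entries for brevity.

```python
import json

# Abbreviated raw model output, in the same shape as the batch response above.
raw_response = """
[
  {"id": "ytc_UgyQQmWk0lh2ZhsR9Yp4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzOwn2b4TvE7I5w94t4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# The four coded dimensions displayed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw, comment_id):
    """Parse a raw batch response and return the coding for one comment ID,
    or None if that ID is not present in the batch."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return {dim: row.get(dim) for dim in DIMENSIONS}
    return None

coding = lookup(raw_response, "ytc_UgyQQmWk0lh2ZhsR9Yp4AaABAg")
print(coding)
# {'responsibility': 'government', 'reasoning': 'deontological', 'policy': 'regulate', 'emotion': 'fear'}
```

Parsing the whole array before searching keeps malformed output visible: if the model emits invalid JSON, `json.loads` raises immediately rather than silently returning a partial match.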