Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
He’s wrong about super intelligence not existing in sci-fi. Read The Culture nov…
ytc_Ugwce-TrI…
Meta Turing Test: Have an A.I. convince a human that they're nothing but a biolo…
ytc_UghZuYnwW…
This data point will train Tesla's AI self driving software to perform better in…
ytc_UgxDDJTKB…
@Chonbeee I think the term "use ai" can sometimes be a bit misleading because …
ytr_UgxhdV2bS…
Explain how not using AI will help the labor force? You’ll just be at a disadvan…
ytr_UgyF7NsOt…
How am I supposed to know if this entire post and all comments weren't created b…
rdc_mlm72w2
Really appreciate the no-hype take on this. What I've seen in practice is that A…
ytc_UgxpMjxsp…
AI sucks when the image is 240p lmao they need better security cameras for ai de…
ytc_UgyLFLOR4…
Comment
This commentary left the rails when it mentioned AI thinking. It can't do that. It has no desires and if it's not told to do something it does nothing. The development of actual intelligent AI has been just round the corner since the 1950s. Don't believe AI experts who say the same now.
youtube
AI Moral Status
2025-08-12T10:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw-tO0av60SoHHoM5l4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx-ze1050uQcWV3tAd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyudrfQ1c--l-NlFxN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwRI8HESVZPezsfwt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwitntJbeESiQPJ2ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwMlDNE5zORajsKTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTQUkmYdZEJ4FtYLB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwA70gVbYsfWS9qIzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygIgNJ4SnCcfeQ1_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyC3boZmiYW5rG7kDp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
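The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) under a comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID — the `lookup` helper and the two sample rows reused here are illustrative, not part of the actual pipeline:

```python
import json

# Two rows copied from the raw batch response above; a real batch would
# contain one object per comment sent to the coding model.
raw = """[
{"id":"ytc_Ugw-tO0av60SoHHoM5l4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzTQUkmYdZEJ4FtYLB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]"""

# Index the batch by comment ID so a single coding can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (hypothetical helper)."""
    return codings[comment_id]

coding = lookup("ytc_UgzTQUkmYdZEJ4FtYLB4AaABAg")
print(coding["responsibility"], coding["emotion"])  # none resignation
```

This mirrors the "Coding Result" table: the dimensions shown there for the displayed comment come from the matching object in the batch response.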