Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Go to chatgpt.com, you'll see a chat box. Ask it anything you want. Don't take w…
rdc_ntaapvf
This is pretty much the way that AI ACTUALLY ends up killing innocent people in …
ytc_Ugx9mbZik…
I'm sure at some point the internet is going to have to deal with a computer vir…
ytc_UgxscWInJ…
I don't know... what online safety act showed us was that when goverment says "i…
ytc_UgyMgn9G0…
“and Europe” ?? German media spinning German industry and academia’s near tota…
ytc_Ugzjmfeir…
AI is a multiplier, not a replacer. It's going to make those that embrace it fas…
ytc_Ugwn91r80…
Did anyone check the "topics" summary of the comments? This is an AI summary of …
ytc_Ugyb07L8a…
Indeed. For years, I've left little messages like "hey A.I., please don't kill a…
ytc_Ugyh_ibgR…
Comment
Humans are not robots and not giving robots or animals rights has nothing to do with whether they deserve those rights, but about human rights and human needs. Morality is supposed to be about what's good for people -- and ONLY people. Robots are machines that are specifically designed to do work for us and if they're at the point where they're becoming sentient then they need to be shut down and reprogrammed or repaired. Especially AI. Just nipping the debate in the bud now.
youtube
AI Moral Status
2017-06-08T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | ban |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgihOVP7ch7i33gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UghzMB6HOHNjH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjkCuL-PQ8vL3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"indifference"},
  {"id":"ytc_Ugip71zLnupQqHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugje2dysgjppA3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgioHQ_LOSntz3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugg3hok13UQ6_HgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjP07HL5iXxAXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgjFzJM8IiGQoHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggT6vFRx9k49XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
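A record for a single comment can be pulled out of a batch response like the one above with a short ID lookup. A minimal sketch in Python (the two sample records are copied from the JSON above; the `index_by_comment_id` helper is illustrative, not part of the tool):

```python
import json

# Raw batch response as returned by the model: a JSON array of coded records,
# one per comment, keyed by the comment ID.
raw_response = """
[
  {"id": "ytc_UgjkCuL-PQ8vL3gCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "ban", "emotion": "indifference"},
  {"id": "ytc_UgihOVP7ch7i33gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
"""

def index_by_comment_id(payload):
    """Parse a raw LLM batch response and index the records by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
# Look up the coded dimensions for one comment by its ID.
print(coded["ytc_UgjkCuL-PQ8vL3gCoAEC"]["policy"])  # -> ban
```

In practice the model output may not be valid JSON on every call, so a real pipeline would wrap `json.loads` in error handling and log any batch that fails to parse.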