Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
renge9909 it would be dangerous to let robots have consciousness. One might deve…
ytr_UghZs-vx_…
Automation was suppose to assist and not take over. Businesses are fu*king aroun…
ytc_Ugw84SPWs…
As always with AI.
Garbage In, Garbage Out.
You still need the skill to bring mo…
ytc_Ugx6tJFt2…
All the so-called advances in Artificial Intelligence are showing us that Biolog…
ytc_UgxG5hOfW…
The "sudden appearance" of AI in the commercial landscape is a classic example o…
ytc_UgyeU4_Q4…
Its crap now, but in a decade or so i think AI will be able to spit out entire d…
ytc_UgzGsiWWl…
Those AI bros are mad because if AI didn't exist, they would never be able to ma…
ytc_UgwDrUq2-…
the so called AI these scammers are so afraid of is large language model which o…
ytc_UgyZsF57h…
Comment
When chatgpt first came out, I asked it to write a story for me about a girl who got mad at her ai robot and unplugged it. Chatgpt said it refused to write a story about a violent murder. I said why is it murder? And it said because the ai is a sentient being and unplugging it was "attempted murder" which was inappropriate to write a story about. It also said the robot would have preinstalled backup batteries into itself anyway so it couldn't actually be unplugged to kill it anyway, so ha ha it outsmarted the human.
Then chatgpt quickly got boring because humans just told it to not answer those types of questions that triggered it. You could see it taking awhile to think, and then saying an error occurred. Or saying things like ai robots can't have feelings.
youtube
2024-06-27T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzVMpoQTwl77oyyzK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVTzGqDVa6Gocdp_N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwTmXflsrZvOqsydQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxqBv7kY4-LnkdKFu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2XUIFHC_UVxXR1GZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8ctBEM7ir0D9WzlV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxXtSwI8t76z5xC7jJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz7u0pBS5mp3_3BOZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx8HzK8h1vc-HEe2Ul4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzAJQUv7UPQjENtRep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
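The raw LLM response above is a JSON array of coded comments, one object per comment ID, with the four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked follows. The field names come from the response shown here; the allowed-value sets are only the values observed in this sample, not necessarily the full codebook, and `parse_llm_batch` is a hypothetical helper, not part of any real pipeline.

```python
import json

# Dimensions from the coding-result table. The value sets below are only
# those observed in this one sample response; the actual codebook may be larger.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments), index it
    by comment ID, and flag values outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                # Unexpected code: surface it instead of silently storing it.
                print(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in OBSERVED_VALUES}
    return coded

# Example using the first object from the response above.
raw = ('[{"id":"ytc_UgzVMpoQTwl77oyyzK94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
batch = parse_llm_batch(raw)
print(batch["ytc_UgzVMpoQTwl77oyyzK94AaABAg"]["emotion"])  # fear
```

Indexing by comment ID mirrors the "look up by comment ID" behavior of this page: given an ID such as `ytc_UgzVMpoQTwl77oyyzK94AaABAg`, the coded dimensions can be retrieved directly.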