Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "fcukkkkkk "mind blown " man people would start ai bot girlfriend very soon. you…" (ytc_Ugx0XBe3m…)
- "Sam Edelman is an evil crazy person… he knows what he is planning & already char…" (ytc_Ugz62TWHM…)
- "i don't know if this is true and I hope it's not, but I heard the police also us…" (ytc_Ugz3HBngj…)
- "The person that had the Bing Ai bot fall in love has some serious game.…" (ytc_Ugy2b9ZSr…)
- "This is nutty. AI is not going to "replace" humans. Its just the greed of compan…" (ytc_UgyyIU8Uw…)
- "my art may not be as dopamine inducing (yet!!!) as AI images, but at least I'm n…" (ytc_UgyGFgvzD…)
- "you want to know the worst part ? That Shad M Brooks has a brother who goes by "…" (ytc_UgxWpYNnW…)
- "You are a duhmb arse. AI told you over and over life is about balance. Get balan…" (ytc_Ugz2rWnEy…)
Comment
Hm... I don't think some silicon and memory cells can become conscious, computers only know how to do one thing; follow instructions. While powered, they'll do that. They can slow down or speed up, but they can't stop following instructions from memory. Meanwhile, a human brain is so complicated we are yet to fully understand it! And remember, we created the computers.
We can make algorithms that learn, make an artificial intelligence, but it wouldn't be conscious, it would be, after all, a pice of silicon and a bunch of memory cells just following instructions...
Source: youtube · AI Moral Status · 2017-02-23T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiMDrD_Vrtr3XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ughtx1nCpoQ4SngCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgirAlPByFXXAXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgipRwhPaz5NkXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UghFbYbOTNyzhngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjJuvM51gMYDngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjUDH2osqQ9pngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghKIUqvylSSt3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgiTpSDa_d0drHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugh5Sf2tvcldOHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
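Since the model returns one JSON array per batch, looking up a single coded comment by its ID amounts to parsing the array and indexing it. A minimal sketch, assuming the raw response is well-formed JSON with an `id` field per row (the variable names here are hypothetical, and only two rows from the batch above are reproduced for brevity):

```python
import json

# Two rows excerpted from the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgiMDrD_Vrtr3XgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjJuvM51gMYDngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgjJuvM51gMYDngAEC".replace("ngAEC", "ngCoAEC")]
print(coding["reasoning"], coding["emotion"])  # deontological indifference
```

In practice the full batch would be loaded the same way; a real implementation would also want to handle malformed model output (e.g. wrap `json.loads` in a try/except) before trusting the lookup.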