Raw LLM Responses
Inspect the exact model output for any coded comment. Samples can be looked up by comment ID; a set of random samples is listed below.
- "Artificial intelligence supersedes as a Vulcan, Spock. Don't 🖖 is an order that …" (ytc_UgwlbIT-s…)
- "6:25 LMFAO AI bro keyboard warriors fighting tooth and nail for companies that w…" (ytc_Ugy8oFds7…)
- "Interesting how divided the comments are, some people hate and I guess some don'…" (rdc_mvpr5jl)
- "Whenever they say “ai/robotics/remotely can’t ….*blank*….” They always forget to…" (ytc_UgyWkpgdA…)
- "There's a lot of smart people that make predictions that are completely wrong. L…" (ytc_UgwLCuM5Y…)
- "Do you think AI can be spliced with a human brain?hmm or maybe it’s mk ultra I’m…" (ytc_UgwqYpu3C…)
- "Mostly agree with Elon but AI is an exception. AI is raw, brute power. Something…" (ytc_UgwGV18e7…)
- "We live in a fake world. We dont know what real anymore. We are fooled and cheat…" (ytc_UgwPs_iDp…)
Comment
I'm just saying, the argument that "they are not programmed to feel pain" is bogus.
IF we are talking about an AI intelligent enough to make creative upgrades to itself. It could just decide that it needs to have the ability to simulate feelings so it can better cooperate with human needs.
Make said upgrade, then pandora's box is really opened. The machine will have to empathize even if it does take long for it to do so.
In doing so it will learn, couple million times faster than us.
youtube · AI Moral Status · 2020-08-27T04:4… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUNUaUyRx3aIlKu1V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rZKvgKFGyggcJCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-n-E8LayJ88lzLLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8d1lOlGg65sIDVuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgygZF5E3ttKUqK02ul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
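Since the raw response is a JSON array of per-comment codings, looking a comment up by its ID reduces to parsing the array and indexing by the `id` field. A minimal sketch, with two entries copied from the response above embedded inline for self-containment (the variable names and storage format are illustrative assumptions, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array, one coding object per comment.
# (Two entries reproduced verbatim from the response shown above.)
raw_response = """[
{"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

codings = json.loads(raw_response)

# Build an id -> coding index so any coded comment can be looked up by ID.
by_id = {entry["id"]: entry for entry in codings}

coding = by_id["ytc_UgxvpBSM5VZTNVScXLB4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

A dictionary keyed on `id` makes lookups O(1) per comment, which matters when a batch response covers thousands of coded comments rather than ten.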