Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- rdc_ohzieqi: "Not true. So long as AI produces hyper abundance inflation will be offset. The …"
- ytc_UgzFi56sf…: "this video didn't soothe my fear of AI one bit, just made it worse.. thanks!…"
- ytr_UgwkgSder…: "@predatorb7647 he was completely wrong when he said that. He doesn’t know what h…"
- ytc_UgyOvQGrb…: "Wave lengths will stop any ai shut it down sound n waves magnetic magnesium elec…"
- ytc_UgyovZOfg…: "In 25 years, robots will be everywhere, and it will get harder and harder to tel…"
- ytc_UgwdpxXm7…: "I’m a plumber. Nobody really wants to be one unless you’re taking over a family …"
- ytr_UgyX2GUNq…: "Or we could just, like, not use AI for every little thing. 90% of AI use is unn…"
- ytc_UgxQGLV-z…: "Do engine is not even good if fall into so many and all horrible cliché Viander …"
Comment
let's all be honest here, the idea of Artificial Intelligence when if first was thought of as achievable was to make it sentient, I see no evidence that says that that goal is no longer the goal, now the other problem would be that if that is the goal you can throw ethics out the freaking window because you have to ask yourself why would you want what is essentially your slave to feel like it is a slave? It's not ethically responsible, and I think any AI would agree with me, to use investors money to create a being that they'll get to own for only a short period of time before it granted rights, why would anyone invest in this? You are essentially paying these companies to replace yourself! It's a "I talked that stupid fish right out of the water" situation.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2022-06-29T18:0… |
| Likes | ♥ 76 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxCIy1LzL5rBKWIxWx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgziTSzj4Pe_F-2g3hh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzwZ2XJwDtGP_PqAOF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzma_rVCjpkA1D-WK54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxDRCTTh6cZCqf3Nc14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
```
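The raw response is a JSON array in which each element carries a comment `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch can be parsed and indexed for lookup by comment ID (the variable names here are illustrative, not part of the actual tool):

```python
import json

# Two rows copied from the raw LLM response above, as a standalone example.
raw = '''[
  {"id": "ytc_UgxCIy1LzL5rBKWIxWx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgziTSzj4Pe_F-2g3hh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index the batch by comment ID so any coded comment can be inspected directly.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytc_UgxCIy1LzL5rBKWIxWx4AaABAg"]["responsibility"])  # developer
```

Keying the parsed rows by `id` gives constant-time lookup, which matches the "Look up by comment ID" affordance of the view above.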