Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples:

- "Ok but can you provide sources to your claim that facial recognition software is…" (ytc_Ugwmq1eOY…)
- "Almost every engineer and scientist think that AI, soon AGI will extinct us. How…" (ytc_UgxBjR_Rh…)
- "AI tech bros aren't trying to save people's time with AI image generators. I don…" (ytc_UgwFXpclx…)
- "@firstfractal515 I know how it works on a technical level. It steals nothing.…" (ytr_Ugz4ZLGJ_…)
- "My mom looked at my phone and confronted me about it and I just told her the AI …" (ytc_UgzucQn6R…)
- "Probably a bubble, I don't really see LLMs as continuing to expand, they aren't …" (ytc_UgzYEhGj-…)
- "@cosmicsvids most people don't see a difference it's even hard for me sometimes …" (ytr_UgzIhlMEk…)
- "42:00 i bet my ass that the CRM you are advertizing for is made with AI, which w…" (ytc_UgwbHzlCq…)
Comment (youtube · AI Moral Status · 2017-02-23T15:3…)

> The real question is, why would you want to give machines consciousness? We don't even fully undertstand it. It's like if we tried to send a man to the moon without understanding laws of physics. I think machines should never be conscious, at least until we know everything about ourselves first. But even then, what would be the purpose of a "smart" robot? We alredy control these things, it's already slavery, so by giving them mind we will just make it harder to control them.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
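The coding result above can be checked against the code values visible on this page. A minimal validation sketch, assuming Python; note that the allowed value sets below are inferred from the codes appearing in this page's examples, not from an official codebook:

```python
# Inferred coding schema: allowed values per dimension, collected from the
# codes visible in this page (an assumption, not the project's codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the inferred sets."""
    return [dim for dim, values in ALLOWED.items()
            if coding.get(dim) not in values]

# The record from the table above passes validation.
example = {"responsibility": "developer", "reasoning": "consequentialist",
           "policy": "ban", "emotion": "fear"}
print(validate(example))  # -> []
```

A check like this is useful before trusting the model's output, since an LLM can emit a value outside the coding scheme.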
Raw LLM Response
```json
[
{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
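A batch response like the one above can be parsed and indexed by comment ID for lookup. A minimal sketch, assuming Python; `raw` here is a two-record stand-in for the real response, using two of the records shown above:

```python
import json

# Stand-in for the raw batch response above (truncated to two records).
raw = '''[
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer",
   "reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Index each coding by its comment ID so any coded comment can be
# looked up directly, as the page header describes.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgjYo2NEXZe5yngCoAEC"]
print(coding["policy"], coding["emotion"])  # -> ban fear
```

Keying on `id` assumes the model returns every requested comment exactly once; a production pipeline would also want to check for missing or duplicate IDs before merging the codings back into the dataset.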