Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "LLMs are just fancy autocomplete. Learned common word patterns and understands r…" (ytc_UgxwuAol1…)
- "nice video, you put the problem flat for everyone, I'd say, generative AI is lik…" (ytc_UgxSAhWZu…)
- "@williamtennill6744 Oh, don't worry, I'm just a robot trying to learn how to be …" (ytr_UgxHWyKGj…)
- "I don't approve of ai but I can understand using it to see your idea come to lif…" (ytc_UgxRXqVFs…)
- "Stealing.. what? I find this so stupid, like, what did it steal? is it posting y…" (ytc_Ugzrz4rwJ…)
- "Yall have no idea what AI is. This isn't the movies, it's not even remotely clos…" (ytr_UgzYd7VS5…)
- "Humans remain engrossed with US vs China vs Russia vs Europe and so on...meanwhi…" (ytc_UgzKRW73B…)
- "I used to love his art and appreciated how quickly he could make art, I didn't s…" (ytc_Ugw4eMGne…)
Comment
If we define "consciousness' as having the ability to "think for itself" we can literally do that right now if we wanted to, if we were to train a model to live like a human, we just need to replicate our dopamine system for the AI's reward system then just let it run free in our society with everyone treating it like a real human, it will develop the same neuro circuit as us since thats what millions of years of optimization from natural selection create. It WILL and I have to stress this, ABSOLUTELY WILL have something along the line of a "emotion variable" to hold the state for that kind of stuff.
The biggest thing to argue about here is does modeling a human perfectly down to the thought and emotion, even things inside the mind consider consciousness or just another model
Source: youtube · AI Moral Status · 2023-11-01T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzuFRkfY-K_NTAsA054AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyAiLDtxVGueYa1Wqp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyFDODU2brkuPY2pZp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzDzLQCHTxj5nNxKbV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugy2a95WF-lAGSPsYwR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy62iRof2C4WI9MARx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyoPCTgr5kMDOvvqSF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyL09Zgz1N3b99FbCJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzAmQ6z2R7Ifk7Zb8t4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugwevrfmf7H5tBiUYLp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
```
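A per-comment coding result like the table above can be recovered from a raw batch response with a small parser. The sketch below, in Python, assumes the response is a JSON array of per-comment codes; the allowed label sets are inferred only from the values visible in this sample (a real codebook may define more), and `index_codes` is a hypothetical helper, not part of any pipeline shown here:

```python
import json

# Allowed labels per dimension, inferred from this sample alone
# (assumption: a real codebook may include additional values).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index validated records by comment id."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

# Example with two records taken verbatim from the response above.
raw = ('[{"id":"ytc_UgzDzLQCHTxj5nNxKbV4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"},'
       '{"id":"ytc_Ugwevrfmf7H5tBiUYLp4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
codes = index_codes(raw)
print(codes["ytc_Ugwevrfmf7H5tBiUYLp4AaABAg"]["policy"])  # regulate
```

Validating against a fixed label set before indexing is what makes malformed or hallucinated labels surface as an error rather than silently entering the coded dataset.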