Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples

- `ytc_Ugye7X0I8…`: "The ai robots will be turned on the general public. 100% probability. Useless ea…"
- `ytc_Ugxsw7IE0…`: "Plot twist: We live in a simulation generated by AI. AI collected all the data a…"
- `ytr_Ugz0nMCsZ…`: "It did teach me that the gray area being discussed, when it comes to 'teaching o…"
- `ytc_UgzSXBExl…`: "If we get to a point of AI becoming conscious, we have to ask this question………Wh…"
- `ytr_UgzWYmoKH…`: "Wow, it seems like the dialog between the presenter and the AI robot left you sp…"
- `ytc_UgzeQcSOJ…`: "I love how we’re going out of our way to mess with AI’s abilities / I just watche…"
- `ytc_UgxncPAub…`: "Yes, It is incredibly different. / When data is shoved into these training algor…"
- `ytc_UgzkHNbBC…`: "When capacity becomes so massive that a machine perceives correlations between m…"
Comment
When it achieves self-consciousness. Embarrassment is a very complex emotion, its the best evidence we have of conscious experience.
I dont think it really makes sense to worry about AI consciousness though. There are a lot of scary things that AI could do, but becoming conscious enough to be perceived as humans isn't dangerous. There are already 8 billion things on the planet that seem human, the vast majority of which you can only ever meet online because you're not going to go to the other side of the planet to talk to them. If AI became conscious, there'd just be a few more humans you could only interact with via computers.
youtube · AI Moral Status · 2024-05-15T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxsN5mlkeF5T9UzFNR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxRFzYk9vIlcMo2bAx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxm6v_yOLHp3uXy-_l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxhUKlDq0i4EKIp3l94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz7EQRYJF6ZZnu6QT14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwpS4ISv5MsWtCA-P54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyL86HTOujGPgl8oiN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxSNdUXHDPLy4SsGaJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgynRxABJ8HPi5mYGm54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzqXYFa9avcdxSfUiJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
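The raw response is a JSON array with one object per coded comment, so looking up the coding for a given comment ID is a matter of parsing the batch and indexing by `id`. A minimal sketch, assuming the raw responses are stored as plain JSON strings (the `index_codings` helper is illustrative, not part of the tool itself; the sample below is truncated to two entries from the batch above):

```python
import json

# A raw LLM batch response, truncated to two entries from the example above.
raw_response = """
[
  {"id": "ytc_UgxsN5mlkeF5T9UzFNR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxm6v_yOLHp3uXy-_l4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugxm6v_yOLHp3uXy-_l4AaABAg"]["emotion"])  # -> resignation
```

Comment IDs prefixed `ytc_` (top-level comments) and `ytr_` (replies) can both be looked up this way, since the coder emits the same object shape for either.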