Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "Another interesting thought. Better drivers aka people that drive slower, don’t …" (ytc_Ugxg3LXjQ…)
- "The "AI" industry is definitely going to experience massive amounts of unemploym…" (ytc_Ugw7urMVB…)
- "AI does nothing humans have not done. Take art for example. All art today is pos…" (ytc_UgyMLUARr…)
- "What do they mean by "gone wrong?" That went exactly as I thought it would have.…" (ytc_UgzNMNU3a…)
- "@menyakababb9583 Umm.... What are you on about? Did you READ my message? I am DU…" (ytr_UgwFOqTph…)
- "We're glad you found Sophia impressive! If you're intrigued by her responses, re…" (ytr_Ugy6vUAzz…)
- "They never believed AI was going to replace 10s of thousands of developers. But …" (ytc_UgwJcSagg…)
- "That's actually one of the issues: this "box" is NOW big enough to contain all o…" (ytr_UgxPbs-or…)
Comment
It seems that Elon doesn't understand that the type of AI he refers to is not the AI people are developing; rather, he refers to a certain kind of conscious agent that would possess the computing power of the AI people are developing. The thing is that the ghost in the machine, often called the singularity, is something we cannot in principle create, no matter what tools and technology we have, because that would mean we would be masters of the universe in no time. Just consider: if we could produce a self-replicating living cell from crude matter, how far would that be from a conscious agent that emerged in the last few seconds of the 24 hours of evolution?
youtube
AI Governance
2023-04-18T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwJIQWH2jl7IbZayOZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyX2CIj0Kbr9uRX7d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwS52u1v0DRrAl2ec14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugza3zzRTM8QWStvFDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyT23MzcJ-yre4hLHV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1dMBVvhXPvC4BNEB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgziWOaTSjVb4GzmXaJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznkJbPzfZcxx9ldml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWHu3z4QKbrPWyLhB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxm1WtIYn8kbO6uN3F4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
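The batch response above is a JSON array with one coding object per comment, each carrying the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion). The page's "Look up by comment ID" feature can be sketched as parsing that array and indexing it by `id`. This is a minimal illustration, not the tool's actual implementation; the helper name `index_by_id` and the IDs `ytc_example1` / `ytc_example2` are made up for the example.

```python
import json

# A stand-in for a raw LLM batch response: a JSON array of
# per-comment coding objects (IDs here are illustrative).
RAW_RESPONSE = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index the rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_example1"]["emotion"])  # indifference
```

Indexing once and looking up by key keeps repeated inspections O(1) instead of re-scanning the array for every comment.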