Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
His solutions at 10:30 to 12:00 are AI Scientists he is developing. It stayed a bit vague so I looked it up. AI Scientists have no goals nor do they act independently, they only process data and come up with theories, so in science and tech they can bring about wonderful innovation without danger. Idealy there wouldnt be any agents until we know they are safe, too. But as there probably will be (no matter the consequences) the AI Scientis could act as safty barrier to check for missalignment of what an Agent is planing to do.
Source: youtube · Video: AI Responsibility · Posted: 2025-05-23T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgziOuzoxBpk8Im9JLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgydruAdXOqEm4MfHYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgykrU_gTbE0oHDrS3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyLsZU8syeC7kUvuO54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywpSZxkWek8BpWY954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvaBXaZh48vb6vE2V4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxkc6gKWQzGA--aldF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwyU2BSytrBEOJUizF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyqEjWBDhtFzQMPw0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxQ3-_reGnC7iagOjZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
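A response like the one above can be checked before it enters the coding table. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table and the category labels observed in this response; the project's actual codebook may allow other values, and the `SCHEMA` dict and `validate_coding` helper are illustrative names, not part of the pipeline.

```python
import json

# Allowed labels per coding dimension (assumption: inferred from the
# values observed in this raw response, not from the full codebook).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "resignation", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# One record from the response above, passed through the validator.
raw = ('[{"id":"ytc_UgykrU_gTbE0oHDrS3J4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"regulate","emotion":"mixed"}]')
coded = validate_coding(raw)
print(coded[0]["policy"])  # regulate
```

A label outside the schema (or a malformed ID) raises immediately, which surfaces hallucinated categories before they are written into the results table.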