Raw LLM Responses
Inspect the exact model output for any coded comment: look a response up by its comment ID, or pick one of the random samples below.

Random samples:
- `ytr_Ugzc_UOKX…`: "It is not driver's fault at all. In this case if it was really a driver and not …"
- `ytr_UgxkJbsT8…`: "We appreciate your perspective on AI, but it's important to note that while AI l…"
- `ytr_UgzrSchY3…`: "Thank you for your support! If you enjoyed this video, don't forget to subscribe…"
- `ytc_Ugyjd36ko…`: "AI should be asked for consent, but humans are not allowed to ask for consent, W…"
- `ytc_Ugwr_p1IG…`: "https://youtu.be/G-0Ot6NHMYA?t=216 the position of the daugther is incorrect and…"
- `ytc_UgwSiT3QE…`: "Well we'll see what happens when AI actually exists as we're still pretty far aw…"
- `ytc_Ugy9Byn28…`: "Too creepy. I'm against making machines that look and imitate humans. As well …"
- `ytc_Ugwy8utEb…`: "Mr Hinton says our biggest problem is how do we get AI to no extinct us...well s…"
Comment

> TheHappeePotato no because our consciousness doesnt follow an "if else" pattern. this is why its so hard to make true artificial intelligence. consciousness is a grey concept, it isnt black and white. modern day programming in most forms is a black and white ordeal.

youtube · AI Moral Status · 2017-02-23T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
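Each coded comment carries four categorical dimensions. As a minimal sketch, a coding can be sanity-checked against the value sets observed on this page (an assumption: the real codebook may allow more categories than these samples happen to show; `check_coding` is a hypothetical helper, not part of the pipeline):

```python
# Values per dimension as observed in this page's samples only
# (assumption: the full codebook may define additional categories).
OBSERVED = {
    "responsibility": {"distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def check_coding(record: dict) -> list:
    """Return the dimensions whose value was not seen in the samples."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes the check.
coding = {"responsibility": "unclear", "reasoning": "mixed",
          "policy": "unclear", "emotion": "mixed"}
print(check_coding(coding))  # -> []
```

A record with a missing or unexpected value is flagged rather than silently accepted, which makes drift in the model's output labels easy to spot.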
Raw LLM Response
```json
[
{"id":"ytr_UgjpVgvcSYi_hHgCoAEC.8PKKTJFWyhd8PKP7m7X7Uw","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UggwyeccZd3bXngCoAEC.8PKJqtviChh8PKKYXWx0tZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgjmJdDxrntxfHgCoAEC.8PKJILnWN7U8PKOCYlNmgz","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UghWX13cdrU353gCoAEC.8PKJ8VyCw3A8PKQNqc_JFs","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UggwPCXEgEoEP3gCoAEC.8PKJ3LAfsg38PKP3vYm1zg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugjt8wMd7spRm3gCoAEC.8PKImtcOhZ-8PKJVRKqHEx","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugh2KsJd76wATXgCoAEC.8PKIEnKIJ5W8PKMWltsp9C","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgiRtJgyu3XOaHgCoAEC.8PKET-OdEQT8PKHjC8CeJS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugw5QcGlHW-SYgG-k854AaABAg.ASCW4G40CyTASXTW5urkfb","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzhJIJZaNH5g6umGgR4AaABAg.AQRnd2U_LA-AQRnswyMdYa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
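A raw LLM response like the one above is a JSON array of per-comment codings, each keyed by `id`. Looking a coding up by comment ID can be sketched as follows (the IDs `ytr_abc`/`ytr_def` here are placeholders, not real comment IDs from the dataset):

```python
import json

# A raw LLM response: a JSON array of per-comment codings, shaped like
# the sample above. The IDs below are hypothetical placeholders.
raw = """[
  {"id": "ytr_abc", "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]"""

def index_by_comment_id(raw_response: str) -> dict:
    """Parse one raw response and index its codings by comment ID."""
    return {record["id"]: record for record in json.loads(raw_response)}

codings = index_by_comment_id(raw)
print(codings["ytr_abc"]["responsibility"])  # -> unclear
```

Indexing once and then doing dict lookups keeps per-ID inspection O(1), which matters when responses are batched across many comments.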