Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:
- The programs that AI has written are trash. Junior programmers aren’t getting hi… (ytc_UgwvctKdu…)
- 0:45 he was not a kid on a life long path of trauma and illness, who revealed a… (ytc_UgxZKdi8w…)
- The lady at 1.26... "self-love influencer" what is going on there? She looks mo… (ytc_UgxJW2C_Q…)
- Bad math aside, I think AI does make us more efficient but I think this will sim… (rdc_n7kh5up)
- AI is a shit at science. Talking to them as a scientist won't help with the fact… (ytc_Ugweduxlm…)
- Oh you are that girl who does poisoned art for those Ai programs? Dude i love yo… (ytc_Ugx-_YOZG…)
- Stop developing before we hit super intelligence... Sky net will be a real thing… (ytc_UgzWnoiCs…)
- From what I've come to understand even automatic soap dispensers, and hand dried… (ytc_UgyQ5J0Nl…)
Comment
I love Neil, but he's wrong on some things. Yea we'll build AI to comfort our needs, but there will be an also ever present threat of someone just creating AGI. If that point tips over, even he can't predict what and how fast things will happen after. And that achievement is like being the first person on the moon, so everyone and especially every major tech company is trying to create AGI.
If it's not a total encapsulated laboratory (which most tech-savvy people don't have), it's unpredictable what'll do just because our intelligence and time needed to understand the things it does, are not on par with the AGI which will self improve even further after that point was reached.
Let's stay real. It's not an if, but a when.
Source: youtube · AI Moral Status · 2025-08-07T22:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx30HbkDOYHJs9R94h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxSDa0_-ClWhzFk8GF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJg8UsqccCCC8bglN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkMj1ypFVyq-5g1Nh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwqfqSI69hjeRzkvUt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyWQuU4uzc1qxROLU94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz878AklwvOBOCGHqJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxl4NJ9tT4s93NIPfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzeyA0rLUdjGKCJucx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMCOpbgE_3W3pIBb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
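The raw response above is a JSON array with one coded record per comment, keyed by comment ID. A minimal sketch of the look-up-by-ID workflow, assuming only what the sample output shows (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion`; the full code-book vocabularies are not shown here):

```python
import json

# Abbreviated copy of the raw LLM response: a JSON array of coded records.
# Only two records are reproduced here for illustration.
raw_response = """[
  {"id": "ytc_UgwqfqSI69hjeRzkvUt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyWQuU4uzc1qxROLU94AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]"""

records = json.loads(raw_response)

# Index the records by comment ID so a single coded comment can be
# inspected directly, mirroring the tool's "look up by comment ID" view.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgwqfqSI69hjeRzkvUt4AaABAg"]
print(rec["policy"])   # -> regulate
print(rec["emotion"])  # -> fear
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a reasonable place to flag a comment for manual re-coding.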