Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Thank Gods one day all these React-Andy type people will be replaced by AI cause…
ytc_Ugz3J-0Jm…
The biggest flex someone can have is doing it with a very close friend instead o…
ytc_Ugya1GiWW…
Same, it’s just a friendlier way of searching StackOverflow or looking through G…
rdc_m81467c
If you haven't tried running code in the Ryne AI Python sandbox yet, you're miss…
ytc_Ugz8g1kFl…
It’s not about healthcare they showed this with the pandemic. We are obsolete. T…
ytc_UgxkjCXUC…
Up to the point she thinks computer cooling system consumes water. Your car’s co…
ytr_UgzMpe90K…
@kitomit2793 Oh it’s not a joke and we actually do think like this. If you don’t …
ytr_UgwRyOGTA…
The SMR ai got turned on in 2019 when someone said switch from dating advice to …
ytc_UgwNLktpz…
Comment
Neil I have a question for you. @27:00 you talk about being creative and finding things that AI cannot do as well as humans. There might be a few niche things that require a human touch or body interactions. Let's say the average IQ is 100 and Einstein say had 170. These are just guesses. Now imagine an AGI that has 6000 IQ. What can you really do better than a 6000 IQ AGI? It would be like explaining to ants what a black hole is. You can't even communicate on the same level with the ant. We become the ant with AGI.
youtube
AI Moral Status
2025-09-23T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwP7llphkOQwLmzMex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOqtsgTBKLnh6gO9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgywIOUOYLnviRivsYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBVuaXfKXF2clQiFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrW-rqNsIEvqf5j_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuUtI7zyoyStgL8IR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqH8w4OwVCWtC6liF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgznPwf24gjDEV0sJbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_Yqg3gID2gcjjPkF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwjBiXKPRmgf1Qg5MJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
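A raw batch response like the one above can be parsed and indexed by comment ID for the lookup view. The sketch below is a minimal example, assuming the allowed values for each dimension are limited to those visible in this dump (the real codebook may define more); any record with an unexpected value is flagged rather than dropped.

```python
import json

# Allowed codebook values per dimension (ASSUMPTION: inferred only from
# the values visible in the dump above; the real codebook may differ).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = ('[{"id":"ytc_UgwP7llphkOQwLmzMex4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')


def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    flagging any dimension value outside the expected codebook."""
    by_id = {}
    for rec in json.loads(raw_response):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[f"{dim}_flag"] = "unexpected value"
        by_id[rec["id"]] = rec
    return by_id


coded = index_by_id(raw)
print(coded["ytc_UgwP7llphkOQwLmzMex4AaABAg"]["emotion"])  # indifference
```

Indexing by ID mirrors the "Look up by comment ID" view: each coded record stays joined to its source comment through the `ytc_…`/`rdc_…` identifier.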