Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or start from one of the random samples below.

Random samples
- "It doesn't matter if hes using ai at least if he doesn't use it for brainrot vid…" (ytc_UgxPyeiOD…)
- "Its one of the oldest profesions in the world, and will continue to thrive AI do…" (ytc_Ugyifh9v-…)
- "How long have I been sleeping on this video what the fuck who thought it was a g…" (ytc_Ugygo2LBj…)
- "Self driving car learn driving in GTA5 # and this is not a joke #…" (ytc_UgzzymE-p…)
- "after asking for “not human kids” the ai creates a photo with black children. Ba…" (ytr_UgxmiNlVC…)
- "All it takes is one human like robot to suddenly escape and somehow take over th…" (ytc_UgwLJJwuN…)
- "I feel like this is going to make doctors similar to mechanics. Mechanics these …" (ytc_Ugz317LY9…)
- "You make a fair point, but making them autonomous I think actually helps prevent…" (rdc_ohsk2ob)
Comment
When it comes to Sora my assumption is that this is an interesting side effect of working towards a larger problem.
Lets start with saying that the dream is to have AI do all the work. To do that you need to be able to walk around in the world in your robot form factor and not fall into traffic. In order to do that in a cost effective way you invent a digital world that has perfect physics so that you can practice everything in. Sora 2 is a stepping stone on the way to ai being able to create worlds for itself to train real world tasks in. Maybe.
Source: youtube · AI Moral Status · 2025-10-30T21:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
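For orientation, a coded record like the one above can be sketched as a small Python dataclass. The dimension names come from the table, but the value sets listed are only those visible in the raw response below; the actual codebook may allow more categories, so treat this as an illustrative sketch rather than the project's schema.

```python
from dataclasses import dataclass

# Values observed in the sample raw response below (assumption: the real
# codebook may define additional categories for each dimension).
RESPONSIBILITY = {"developer", "ai_itself", "user", "none"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "industry_self", "none"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "approval"}


@dataclass
class CodedComment:
    id: str              # e.g. "ytc_..." for YouTube, "rdc_..." for Reddit
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        """Check each dimension against the value sets observed above."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```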
Raw LLM Response
[
{"id":"ytc_UgxnwHSSlGCuivTFszJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdLssxoriB_tmqhQB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxuDnfAUuhhHdwnjcN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzrQ8DTBT42E71OiXh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyZ6jC9iPewbul9Dw94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxlOMjrzxfH4J9Rfi94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwx8tuo7uUno_HpBlx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwAqXRJeAyO5U0o07Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRMg66zYDt84P8JlJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMsKMJXSf5w7PJ60R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
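As a rough illustration of the lookup-by-ID workflow described at the top of this page, the sketch below parses a raw response like the one above (assumed here to be saved to a JSON file; the file name and function names are hypothetical) and retrieves the coded dimensions for a single comment ID.

```python
import json
from pathlib import Path
from typing import Optional

def load_codings(path: Path) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of records) into a dict keyed by comment ID."""
    records = json.loads(path.read_text(encoding="utf-8"))
    return {rec["id"]: rec for rec in records}

def lookup(codings: dict[str, dict], comment_id: str) -> Optional[dict]:
    """Return the coded dimensions for one comment, or None if it was not in this batch."""
    return codings.get(comment_id)

if __name__ == "__main__":
    # Hypothetical file name; the raw response shown above would be stored wherever the pipeline keeps it.
    codings = load_codings(Path("raw_llm_response.json"))
    rec = lookup(codings, "ytc_Ugwx8tuo7uUno_HpBlx4AaABAg")
    if rec:
        print(f"responsibility={rec['responsibility']} reasoning={rec['reasoning']} "
              f"policy={rec['policy']} emotion={rec['emotion']}")
```

With the sample response above, this would print the same values shown in the Coding Result table (developer / consequentialist / industry_self / approval).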