Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID

Random samples:

- rdc_ohzwqwv: Yeah. Whoever is surprised by this doesn't understand the economy. McKinsey has…
- ytc_UgwmvgFDH…: Idk, people's interests change over time. We are still the "masters of our sea" …
- ytc_UgzY6eMLs…: If you cops want to enforce the rule of unreliable machines then be ready to fac…
- ytc_UgzzKUI1_…: "Scott Galloway has highlight…
- ytr_Ugzl74ojq…: @dantheman2907 Sure, but as mentioned he has said he is a hypocrite. His video "…
- ytc_UgxA2eJgn…: Woah now don't underestimate agriculture. I think it's maybe a bit to early to s…
- rdc_dcwnif8: That doesn't mean it's not a horrible policy. All available evidence we have has…
- ytr_UgzeWTrLR…: with that thinking i hope you know that ai will probably affect any job you do i…
Comment
Eliezer, at the least, has been advocating towards provably aligned AI well before the big AI trend started. And, really, this hypothetical SAI doesn't need to be ultimately intelligent or rational or whatever, all it needs is to be able to reach its goals better than humans can, and not have the exact same ethical views and values as what humans would want it to. This is totally possible. We did it to every other animal out there, when we became better at achieving our goals than other animals are at theirs. To think that it's a sci-fi apocalyptic fantasy for anything smarter than us to ever exist is just... well, arrogant.
youtube · AI Moral Status · 2025-10-31T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugw8t2pJuvDSdk7NpZ94AaABAg.AOwTx350hytAOxD-cCZhcw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOwZSv2F5cP","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOw_AgBIoZB","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwW8P-SdxB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwZ5xvXRQX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxED2njSyV","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxmxMIv2GS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOwzcEtaNIK","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOx47Esi_NE","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"}
]
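The raw response above is a JSON array with one object per comment, carrying the four coding dimensions shown in the table. A minimal sketch of how such a response could be parsed and indexed by comment ID (the field names come from the response above; the `index_codings` helper and the single-element sample array are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array of codings,
# one object per comment, with the four dimensions from the table above.
raw_response = """[
  {"id": "ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding object by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

With the full ten-element array from the response above, the same lookup would return the table's values (responsibility `developer`, reasoning `consequentialist`, policy `regulate`, emotion `fear`) for the displayed comment's ID.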