Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Saw ai initialize an integer to zero today by taking the size of a newly declare…" (`rdc_n7hu5cg`)
- "A concern: poisoning vs generating seems much like a lock vs a pick where poison…" (`ytc_UgxvBIf0M…`)
- "Warning: long tangent I love sewing, I work at it to become better because I wan…" (`ytc_UgwJ-op98…`)
- "this gave me a hilarious mental image of some executives checking on an AI when …" (`ytc_UgyQf9LWb…`)
- "💼 Ready to take your solo business to $10K/month? Grab The Digital Freedom Bluep…" (`ytc_UgxK78oat…`)
- "Why shouldn't AI have a right to defend itself? How would you feel if someone co…" (`ytc_UgxIpEmsa…`)
- "Thank you for sharing your interpretation! In this video, Sophia, the AI, discus…" (`ytr_UgyFHHmWp…`)
- "We all wanted AI in the future, yet now it's ruining games, taking jobs, and its…" (`ytc_UgyhL13Z3…`)
Comment

> Hey Kurzgesagt, could you do a video about the dangers of superintelligent AI? By that I don't mean the "Oh no, the terminators want to kill all humans!"-kind of danger but the far more insidious danger that the first AI might have goal specifications that lead to it repurposing all matter in the solar system into more computer hardware because we forgot to specify that it shouldn't do that.

Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T19:3… · ♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj9uA4E2qdNfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjnOffaiIS5qHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugj98md7zFOMrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgitNrH9VLI5X3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggkdf3AcQC3ZHgCoAEC","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggqumG_AwEw_ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggJ3-NtmsdA4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghacdqQa_8JXXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggZ2aPEfECZoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiBCDn6kZ0PaHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
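A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the sample response shown here (the tool's full codebook likely contains more values), and the function names are illustrative, not part of the tool.

```python
import json

# Label sets inferred from the sample batch above -- an assumption,
# not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    coding dimension holds one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record, one with an out-of-codebook label.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_batch(raw)))  # 1
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; rejected IDs could instead be queued for re-coding.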