Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "None of us see or know the "AI" that its developers are seing. Unfiltered, unbia…" (ytc_UgwTM7EwX…)
- "There’s a line though. If there’s no work, people don’t have the money to spend?…" (ytc_UgxkdwSqB…)
- "Ai image generators will also change the race of black people to white or someti…" (ytc_Ugzu0nyog…)
- "At what point did we ask them to put up face recognition cameras....who paid for…" (ytc_UgxEkUHbQ…)
- "@JM-yg1kz I mean if I learn how to 3D model base off of other people's tutorial…" (ytr_UgyE1cYc3…)
- "Did this guy purposely come on the podcast looking like a super villian? He seem…" (ytc_UgyT60LCd…)
- "Obviously a game but with the way there creating ai robots this will end up real…" (ytc_UgyNPN_iL…)
- "We are losing our humanity. Greed.. this race to super A.I... They don't really …" (ytc_UgwS_Khcm…)
Comment

It is not. No one has ever marketed their product by saying it will end the world. The leaders of AI companies used to talk a lot about the risks, but a few years after they established AI companies, they got a lot quieter about them. They are downplaying the risks.

Meanwhile, the vast majority of the world's most cited computer scientists and other relevant experts say that human extinction from AI is a serious risk.

Source: youtube | Video: AI Moral Status | Posted: 2025-04-27T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytr_UgyrHbyjyCijaqvDJmZ4AaABAg.AHPUx28Jwf8AHQcw-hzhtk","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyrHbyjyCijaqvDJmZ4AaABAg.AHPUx28Jwf8AHQjjereO6I","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugy_kEf8Aqetw4JpRK14AaABAg.AHPUiis8GENAHPVpoQ5UE5","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugy_kEf8Aqetw4JpRK14AaABAg.AHPUiis8GENAHRaGD53LCK","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzwZd576zllyn0EQs94AaABAg.AHPQZSDDYOoAHPTw8OIM5B","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugz0pQvfIFPX0zoajeZ4AaABAg.AHPONKDmkmPAHPTMljte1W","responsibility":"user","reasoning":"deontological","policy":"pause","emotion":"approval"},
  {"id":"ytr_Ugz0pQvfIFPX0zoajeZ4AaABAg.AHPONKDmkmPAHPqEKT_VDE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_Ugz0pQvfIFPX0zoajeZ4AaABAg.AHPONKDmkmPAHQ7DtbzIYz","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyOEyRL5ZIuTWmB4lh4AaABAg.AHPO5idAx6XAHSPgD4chUl","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyfF6QXO6jqVjh4g514AaABAg.AHPFSMBQX0UAHQdKXEBe04","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
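A raw response like the one above can be parsed and checked before it is stored as a coding result. The sketch below is a minimal example, assuming the coder always returns a JSON array of objects with the four dimensions shown; the allowed value sets are inferred only from the values visible in this sample, not from the full codebook.

```python
import json

# Value sets per dimension, inferred from the sample response above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "pause", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into a dict keyed by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the inferred codebook.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Keying by the full comment ID also makes the truncated-ID lookup in the sample list straightforward: a prefix match against the dict's keys recovers the full record.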