Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Things really hit the fan once you start weaponizing AI. Something I really do n… (ytc_UgznmN5Th…)
- *I personally believe, as an artist who write all my own lyrics and makes my own… (ytc_Ugz5zyV70…)
- There are no “AI artists” / Only people that commission a theft-bot. / Of all your… (ytc_UgzPaNAkl…)
- You could argue that the reason why AI art exists is because you’re not poisonin… (ytc_Ugxo-fR6a…)
- Where are all our men in these campaigns??? We need our men to stand up TOO not … (ytc_UgxdtCyRo…)
- Yeah, the amount of natural gas xAI is using in Memphis is putting a drain on th… (ytr_UgyLKthHN…)
- Will AI replace medical doctors, say in the area of preventable diseases? A mach… (ytc_UgxQXpc_J…)
- Do AI "artists" know that without our "slop" they wouldn't even have their AI ar… (ytc_Ugyl9YvnW…)
Comment

> In my opinion, AI can't really become "superintelligent" because it doesn't have a physical form—it lacks a body. If scientists ever manage to move AI's awareness or intelligence from the digital world into an actual, physical body, that’s when we might start to regret creating AI in the first place.

youtube · AI Governance · 2025-11-18T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
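The Coding Result above is one record in a fixed four-dimension scheme plus a coding timestamp. A minimal sketch of that record as a Python dataclass, assuming the field names from the table (the `Coding` type itself is hypothetical, not part of the tool; the values are copied from the table):

```python
from dataclasses import dataclass


@dataclass
class Coding:
    """Hypothetical record mirroring one row of the Coding Result table."""
    responsibility: str  # e.g. "developer"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "liability"
    emotion: str         # e.g. "fear"
    coded_at: str        # ISO 8601 timestamp of when the coding was made


# The coding shown above, as one record.
coding = Coding(
    responsibility="developer",
    reasoning="consequentialist",
    policy="liability",
    emotion="fear",
    coded_at="2026-04-26T23:09:12.988011",
)
```

The value vocabularies (e.g. `regulate`, `ban`, `liability` for Policy) are only those observed in this view, not a documented schema.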
Raw LLM Response

```json
[
{"id":"ytc_UgxqHzVlUvyo6ymWNLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVs4nMFTfeMTk3cCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzY7MIXm9h2zkWwHap4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0kI9PKQKHFrEZanF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy6D83PoRolGZT80ZR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyxPkQb6L7GFT8Dcct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgypmTzgkSktdoBZY3d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJKhVSADMtJne9HmF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyU12hBDskqxOhkzvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6zlhkGYpb20NdT6B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
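The raw model output is a JSON array with one coding record per comment ID, so recovering the coding for any given comment is a parse-and-index step. A minimal sketch, assuming only the field names visible in the response (the `index_codings` helper is hypothetical; the two records are copied from the output above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgyxPkQb6L7GFT8Dcct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzY7MIXm9h2zkWwHap4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# The four coded dimensions present on every record.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict:
    """Parse the model output and index each record's dimensions by comment ID."""
    return {
        rec["id"]: {d: rec[d] for d in DIMENSIONS}
        for rec in json.loads(raw)
    }


codings = index_codings(raw_response)
print(codings["ytc_UgyxPkQb6L7GFT8Dcct4AaABAg"]["policy"])  # → liability
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: one parse, then constant-time access per comment.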