Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- The worst is they're already doing studies and they show the more you use AI the… (ytc_UgyuSjEWb…)
- AI = BS / Let’s see how they get the juice to run the machines. / Pull the plug.… (ytc_UgwV7cVTy…)
- The movie business is over. So is the music industry. artists are unnecessary, n… (ytc_UgzzSJzzR…)
- The end is nigh you say, that is a good thing, because that mean you all billion… (ytc_UgwhpWKLp…)
- Opening sentence is wrong, I bet: "So you want to ban any law about AIs because … (ytc_UgzofpIB_…)
- It's alarming that so many people are in denial that AI is rapidly changing (rea… (ytc_UgyRtDOL2…)
- Echo story was great. AI seems to be able to wipe us out, but tech is not advanc… (ytc_UgzwKJxGx…)
- Why are we pretending that this isn’t already running ? 21:37 isn’t that why we … (ytc_UgwM3nW9H…)
Comment
I think it's important to not personify ai too much. Emotions are incredibly inefficient and as complicated as the AI function itself. Ai doesn't need to be malicious and it doesn't need to be self aware. Most of the time it's just designed to solve organic problems or emulate a limited set of user interactions. Like a fancy interface. I think musk is overblowing and romanticizing the threat.
Source: youtube · Topic: AI Governance · Posted: 2023-04-18T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyfblKVfB_mqwzMvLl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyLITVL5YapM0gFNx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy8CEOKb-jRXtkTW8l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziqxK97902O0qlU894AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjSPRtGFQIx4Dmuph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyctdq7IhhaXHK3qDh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWe22vxAklxeTmGpt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7RcmGntBw21YldKx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy3kErENBazddJ_X-J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxbp1y1cNE7QJTFXmx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
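A response like the one above has to be parsed and checked before the labels are stored, since the model can return malformed JSON or invent labels outside the codebook. The sketch below shows one way to do that in Python; the allowed value sets are inferred from the labels visible in this sample response, not from the actual codebook, so treat them as placeholders.

```python
import json

# Label sets inferred from the sample response above (hypothetical —
# the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "government", "company", "developer",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels
    fall inside the allowed sets for every dimension."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be stored
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(validate_batch(raw))
```

Dropping invalid rows rather than repairing them keeps the stored codes faithful to what the model actually returned; rejected IDs can simply be re-queued for another coding pass.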