Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Interesting philosophical question that I can’t wrap my head around but would love to hear what people think. AI is built in our image in terms of cognition right? A key difference though - we as organic beings are driven by want. We need food, housing, water, we have emotional needs. If a super intelligence existed — what would it want? It has need of energy, but would it not be motivated to create an endless energy supply perhaps creating nuclear fusion? Beyond that… what would it desire? I think that is a key in the discussion of super intelligence. Would it have desire at all? If not, can it even be dangerous?
Platform: youtube · Category: AI Governance · Posted: 2025-10-18T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwj0AJh8sutKtZu2YZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyr23rKor07WiCmTU54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnSRNXNsadWW69YX94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxK-coBa2AP4vm-fZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxRYm5aZJFlICbkUhF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx_mLtoLu1hPKfv4uF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwK58_V5q1aUzqymyZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjBKJqZXwJ8Z-_URN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx4vmQ78wjUhze7fVR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxPwfMJRMhTMYnVssZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
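The raw response is a JSON array with one object per coded comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response and looking up one comment's codes by ID (the two sample rows below are taken from the payload above; any parsing helper names are illustrative, not part of this tool):

```python
import json

# A two-row excerpt of the raw model output shown above.
raw_response = """
[
  {"id":"ytc_Ugwj0AJh8sutKtZu2YZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxPwfMJRMhTMYnVssZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
"""

# Index the rows by comment ID for direct lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugwj0AJh8sutKtZu2YZ4AaABAg"]
print(code["responsibility"], code["emotion"])  # → company resignation
```

Because the model returns the array as plain text, `json.loads` will raise `json.JSONDecodeError` if the output is malformed, which is a useful place to catch and re-prompt in a batch-coding pipeline.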