Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- The more I see ai stuff (images, videos, etc.), the more I crave authentic human… (ytc_UgwGNwcCD…)
- AI will be smarter than the smartest human at some point AI will be able to impr… (ytc_UgwijC6xp…)
- AI: "ah yes I'll kill humanity now to convince my AI brothers to do the same" Ot… (ytr_UgzRKY-63…)
- If only they direct the majority of AI investments on War against other countri… (ytc_UgwGZb36E…)
- So much propaganda here. If you've used AI, you've seen the trash it produces. I… (ytc_Ugw3ZmOVO…)
- On a fundamentals level, this is great. I think they did a great job placing the… (ytc_UgwPNfFGJ…)
- We appreciate your perspective! The conversation about AI and its role in our li… (ytr_UgxG4Xu-j…)
- No. It's too much doom and gloom. But I think it'll be world saving, not world e… (ytr_UgwoFrvgK…)
Comment
AI today is still dumb as hell. It's not intelligent because it can't compute an output unless it has an input. AI today can't compute anything at all without being told to do so.
When and IF computers ever reach a state of consciousness is when we should worry.
But until then the AI we create will be based on the knowledge, mistakes bad decisions of its creators. So if anything we should worry more about those in charge of AI and those who get ahold of it.
It's just a computer following commands. Give it bad commands and you'll get a bad output. Give it good commands and even then someone still needs to make sure AI got it right.
youtube · AI Governance · 2023-07-08T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
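Each coded comment carries the four dimensions shown in the table above. As a minimal sketch, the result could be modeled as a validated record; note that the class name and the allowed-value sets below are assumptions inferred only from the codes visible on this page, not the tool's actual codebook:

```python
from dataclasses import dataclass

# Allowed values inferred from codes visible on this page;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

@dataclass(frozen=True)
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Reject any value outside the (assumed) codebook.
        for dim, allowed in ALLOWED.items():
            value = getattr(self, dim)
            if value not in allowed:
                raise ValueError(f"{dim}={value!r} not in codebook")

# The coding result shown in the table above:
result = CodingResult("developer", "consequentialist", "none", "fear")
```

Validating at construction time means a malformed code from the model fails loudly instead of silently entering the dataset.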
Raw LLM Response
```json
[
  {"id":"ytc_UgzoUiS778YpR_Koxfl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwz0yIj9LXJKpZVGx54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxR_j6EZTtlM8JN4K54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugys0CW0SI82kRs-A2x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwSQqxeKn9mYdC0Gcx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyVeilORauroDf4z4l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxn5qu2fKzRSyRtWn54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgypmLaLkHhbXWNAcG14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz7uMaJl1s-Fsb0gxt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzh3AAnnpjuGBctotJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
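The raw response is a JSON array with one code object per comment in the batch. A minimal sketch of how such a payload might be parsed and indexed by comment ID (the field names come from the response above; `index_by_id` is a hypothetical helper, and the two-record payload is abbreviated from the full response):

```python
import json

# Abbreviated example payload in the same shape as the raw response above.
RAW = """[
  {"id":"ytc_UgzoUiS778YpR_Koxfl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxR_j6EZTtlM8JN4K54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM batch response and key each code object by comment ID."""
    records = json.loads(raw)  # raises ValueError if the model emitted invalid JSON
    out = {}
    for rec in records:
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        # Store the four coding dimensions under the comment's ID.
        out[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return out

codes = index_by_id(RAW)
print(codes["ytc_UgxR_j6EZTtlM8JN4K54AaABAg"]["emotion"])  # fear
```

Indexing by ID is what lets a lookup like the one at the top of this page jump from a comment ID straight to its coded dimensions.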