Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzC3ve21…`: "Yeah well, nobody during the opioid crisis also was shoving oxycontin into the p…"
- `ytr_UgwR3TnUy…`: "Haha yep. Relentless ads for gambling, AI slop generators, or trading apps. Ma…"
- `ytc_Ugy_1MGau…`: "AI can now be controlled by us on any platform we created some new software you…"
- `ytc_UgwU4DhLG…`: "This is episode is only missing one point, AI is getting smarter while (in the U…"
- `ytc_UgxTD22jA…`: "Sam Altman or any other AI creator doesn’t fear AI causing significant harm to t…"
- `ytr_UgykoPFeu…`: "@NahinSarker-pb2yr i know its fake lmao... im saying imagine in the future.. a…"
- `ytc_UgzNwJAxe…`: "So wrong, with the right prompts you can get something like this, you just halfa…"
- `ytr_UgwEwlLhE…`: "No it's not. Our use of language as chemists is always SUPER dependent on contex…"
Comment
Humans are conscious. AI is not and will never be conscious.
To be conscious is not about intelligence or thought. AI can do those things better than us. But being conscious is about volition and intentionality. They are the pre-cursors to thought.
We are conscious and that is all that matters
youtube · AI Governance · 2025-06-16T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
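A coding like the one above can be checked against the value sets for each dimension. A minimal sketch, where the allowed values are inferred only from the codings visible on this page and the real codebook may include more:

```python
# Allowed values per dimension, inferred from the codings shown on this page
# (assumption: the actual codebook may define additional values).
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself", "developer", "company"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate(coding: dict) -> list:
    """Return a list of problems; an empty list means the coding looks valid."""
    errors = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding from the table above passes:
print(validate({"responsibility": "none", "reasoning": "deontological",
                "policy": "none", "emotion": "approval"}))  # []
```

Running this on every row of a batch catches model outputs that drift outside the codebook before they enter analysis.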
Raw LLM Response
```json
[
{"id":"ytc_UgxBPOatUE649J6EnIN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjlHTvUmH8NGseLSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzjA6g3CIf9W5qpJTd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZeWybmm68Og5K_f14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzys_q2gQ6ncBuWAUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3DBCAztbHgNzlzOp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBwrP1dWuv5W72mvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz_k9GKdXP8izXVcZV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz_uVhuObLO7RLhlqZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzdkJilvcZ6F6OveZh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
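Because the model codes comments in batches, looking up a single comment's coding means parsing the raw array and indexing it by `id`. A minimal sketch, using two rows from the response above as example data:

```python
import json

# Raw model output as emitted by the coder (two example rows from the
# batch shown above; a real response carries the full batch).
raw = '''[
  {"id": "ytc_UgxBPOatUE649J6EnIN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwZeWybmm68Og5K_f14AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "ban", "emotion": "outrage"}
]'''

# Index the batch by comment ID so any coded comment can be looked up directly.
by_id = {row["id"]: row for row in json.loads(raw)}

row = by_id["ytc_UgwZeWybmm68Og5K_f14AaABAg"]
print(row["policy"])  # ban
```

This is the lookup the "Look up by comment ID" field performs: one dictionary keyed on the comment ID, built once per raw response.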