Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI doesn't need to be conscious, it just needs to be intelligent enough to carry…" — ytc_UgxZIlHVf…
- "It's just fallen angels, who are waiting for their torment sentencing, trying to…" — ytc_UgxexwroI…
- "It’s called creative destruction. Every generation believes that new technology …" — ytc_Ugyyfh9pw…
- "So, until now I thought politics/ UN / cabal/apocalypse - reset / WWIII / govern…" — ytc_UgwQU6AEu…
- "4th wall break here: This video is scripted and even my comment could be AI-gen…" — ytc_Ugzf-oEot…
- "Why in the hell would you turn nuclear weapons over to AI when humanity has alre…" — ytc_UgziDPyV3…
- "Social media blurred things. AI removes the friction entirely. And without frict…" — rdc_oh1kgks
- "This is that one scene in Elysium where Matt Damon tried to talk with the robot …" — ytc_Ugz9V9bZ_…
Comment
The first step is less about asking if AI should be used and more about your desired outcomes. It depends on what skills you are trying to learn and what goal you are trying to achieve. LLMs can be helpful if your goal is to figure out where to START learning something, but I would caution against it for finding facts.
youtube · Viral AI Reaction · 2025-09-05T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgyXGl6bIzVEjwhWY5N4AaABAg.AMg0w8jeVvEAMgtFYnrHAo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyXGl6bIzVEjwhWY5N4AaABAg.AMg0w8jeVvEAMlGthSxxs7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyXGl6bIzVEjwhWY5N4AaABAg.AMg0w8jeVvEAMmiuEU0oE9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxCpfBkbkep-FnoKx94AaABAg.AMf_-x9fGrEAMguBm02yAX","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugw-IRJwYpEwbNwLYnt4AaABAg.AMfL53P3lSAAMmiWHHl6M8","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwSLzA3uvCiXZ1ocG94AaABAg.AMfGbbbaePzAMgsvLlTrfD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyT1Ob_LCTb-hxaH594AaABAg.AMf58XA_LLrAMfdWlq-cWe","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyT1Ob_LCTb-hxaH594AaABAg.AMf58XA_LLrAMhesohuXvt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyXdcHnCV8eHlWWOAt4AaABAg.AMf4DGazUSYAMfy8H9Cklc","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_UgysmdyIyKmKSxZYpPx4AaABAg.AMf2kI0juQ0AMgu514HIED","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
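
The raw response above is a JSON array with one record per coded comment, each carrying an `id` plus four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be parsed and sanity-checked before the codes are stored; the allowed value sets below are assumptions inferred only from the codes visible on this page, not a documented codebook:

```python
import json

# Assumed value sets, inferred from the records shown above — the real
# codebook may allow more. "policy" is left unchecked because only
# "none" appears in the visible data.
ALLOWED = {
    "responsibility": {"none", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "emotion": {"indifference", "mixed", "approval", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected {field}={rec.get(field)!r}"
                )
    return records

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # mixed
```

A strict check like this catches the common failure mode where the model returns a coherent-looking JSON array but drifts outside the label set, so bad codes fail loudly instead of silently entering the dataset.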