Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I have learned a lot how to detect an AI generated or assisted videos on Ytube. … (ytc_UgyFSaYlN…)
- Certain parts of the us are now enacting laws like this. Anything the government… (ytc_Ugxt72jGL…)
- "I panicked" you are a bot! Even the best ai can not stop larping as a human no … (ytc_UgwquvARt…)
- that AI babysitter is partially true. the most time consuming task in vibe codin… (ytc_UgzNAe88Y…)
- If you want a deeper dive on this, check out Robert miles AI safety (also featur… (ytc_UgwGd1Xcv…)
- I was skeptical about AI, but Pneumatic Workflow maintains necessary human contr… (ytc_Ugw_RyyF4…)
- Say what you will about AI being used in war and weaponry, the AI drones that Uk… (ytc_Ugz781chg…)
- So if we create a simulated world to mimic our world and fill it with AI, then g… (ytc_Ugxz8U1BS…)
Comment (source: youtube, posted 2013-10-10T15:1…)

> The most accurate thing that can be said about A.I. is that we're five years away from A.I. and we always will be. It's just not something that can be achieved through software. It's far more likely we'll give up on trying to create A.I. and either become brain-bots ourselves through cyborg type machine upgrades. Or we'll use cloned human brains enslaved by a reward system that injects them with endorphins when they do what we want.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxaj0qSxj3vDQlTGVd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrKIEsMAXBfMPcpgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWn5MR2FOw_MAycW94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZIp8bU550INSwa9d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzyJ5EOqeiQVxhpM-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4SRQt-sSjsYTNA0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4ccxwKaV3Gg63AqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf9Cghtmcxj_l0aeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq1HwO7cdbe8WhnnN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwJJzfgi3Iw4Kz1WI94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
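The raw response above is a plain JSON array, one object per comment, keyed by the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated is below; the allowed label sets are inferred only from the values visible in this dump (the real codebook may permit more), and the `parse_batch` helper is hypothetical, not part of the tool shown here.

```python
import json

# Label sets observed in this dump; assumption: the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}, skipping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        if not cid.startswith("ytc_"):
            continue  # not a YouTube comment ID
        # Keep the row only if every dimension carries a known label.
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Validating against an explicit label set like this is a common guard when coding with LLMs, since a model can occasionally emit an off-codebook label that would otherwise contaminate downstream counts.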