Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwwYmm8I…: "I just wish that they don't have robots and AI because that takes away from the …"
- ytc_UgywsXM19…: "Freaky? Dawg you dont wanna see janitor ai or literally any other ai chat sites…"
- ytr_UgzU4fmva…: "Robert Blackford because they have far better reaction times, they don't get dis…"
- ytc_UgzbcaHCM…: "Just be like me, fucking shameless. Who gives a fuck if they deepfake ypur head …"
- ytc_UgxfJFYxJ…: "Are you going to show videos of all the people teslas self-driving cars have kil…"
- ytc_UgxZvyhu5…: "I already know that AI will come for my job soon. The company I work for already…"
- ytc_Ugy-3STLS…: "I think it will be up to the people to push and demand real people to have jobs …"
- ytc_Ugw2_Le55…: "Yes, since it is programmed to do so but how well depends entirely on what skill…"
Comment
This is a kind of brainwashing. I particularly noticed when the host said if AI killed us during the night, and we all woke up dead--"that would be kind."
I've yet to think of a name for it, but it reminds me of the film Cries and Whispers, it's a velvet collapse.
youtube · AI Governance · 2025-06-06T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwiLXYUzt_krrC3CSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKrKiWKHhLp15lLAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmpVaJAhJ2sJrChsl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_SsFcuE-PO3CNDqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzElq51atImiBIFduh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZ1h6jIA69jpnLpo14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6YkbNtTueIn-rzMl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzsUyQF1jYeG0RGq7x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTNPzHHbIGFGMeTPl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdsAZrcVt6UXwXSad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
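The raw response above is a JSON array of per-comment records, one object per coded comment, keyed by the same dimensions shown in the Coding Result table. A minimal sketch of how such a batch response could be parsed and sanity-checked follows; the allowed value sets here are assumptions inferred from the values visible on this page, not a definitive schema.

```python
import json

# Assumed code sets, inferred from the values visible in the responses above.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a batch coding response and index records by comment id,
    rejecting any value outside the assumed code sets."""
    records = {}
    for row in json.loads(raw):
        rid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{rid}: unexpected {dim}={row.get(dim)!r}")
        records[rid] = {dim: row[dim] for dim in ALLOWED}
    return records

# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgwiLXYUzt_krrC3CSF4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgwiLXYUzt_krrC3CSF4AaABAg"]["emotion"])  # fear
```

Validating against an explicit code set catches the common failure mode of LLM coders drifting outside the prompt's label vocabulary, rather than silently storing an off-schema value.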