Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Downvoting not because this is fundamentally wrong. But beacause it's a low effo…" — ytc_Ugw7H8T6O…
- "His brain was already cooked if he was reaching out to AI for medical advice.…" — ytc_UgwcAtNJ-…
- "That was the point of the ad, the implication that humans make spelling errors w…" — ytr_UgxUNstKW…
- "As a CS major who has worked with AI learning, there is ethical ways to do this.…" — ytc_UgyE9EEbG…
- "People who are scared of AI taking over programming have never actually studied …" — ytc_UgwsQkq-l…
- "Is this autopilot or FSD? I wish news networks would actually do their job and t…" — ytc_UgwMR50D8…
- "An art discord I'm in has a channel dedicated to ai generated characters that fo…" — ytr_UgwKB7-Jy…
- "AI is not conscious. They don't know how to constuct consciousness. It reacts to…" — ytc_Ugw7n5J_z…
Comment
“I just listened to the interview with the chap from Google who helped create artificial intelligence, and honestly, it was incredible. Hearing from someone at the very centre of such a revolutionary technology was both inspiring and eye-opening. The way he explained how AI has evolved — and where it’s heading — was deeply insightful. You could really sense the depth of his understanding and the vision behind what he’s built.
It wasn’t just about technology; it was about the human mind, creativity, and the possibilities that come with pushing the boundaries of intelligence. I came away from it genuinely amazed, with a renewed appreciation for how far we’ve come and how profoundly AI is going to shape the world in the years ahead.”
Source: youtube · AI Governance · 2025-10-14T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwAVqXMTV0ZnrSLws54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw0-qAZ5PLKcI3dmq14AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugxia6tJM2rmd6whSx54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwHtDi4lBo0gp-g3MF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw5F9qoMwKoTtdQYgh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwA-bz1NiiS30XS-il4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxk62yWkQ7pM1GhYd14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-kRgCaoekLf7lcRZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx4LIqXXbSYv-HmEtR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwbtz84vKD4bHPhCq14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
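Since the raw model response is a JSON array in which each object carries a comment `id` plus the four coding dimensions, looking up a coding by comment ID amounts to parsing the array and indexing it. A minimal sketch, assuming the field names shown in the response above (the `index_by_comment_id` helper is illustrative, not part of the tool; the example array uses two entries from the response, truncated for brevity):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment ID.
raw_response = """[
  {"id": "ytc_Ugw-kRgCaoekLf7lcRZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwbtz84vKD4bHPhCq14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw response and map each comment ID to its coding row."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugwbtz84vKD4bHPhCq14AaABAg"]["policy"])  # -> regulate
```

In practice the parse step would also need to tolerate malformed model output (truncated arrays, stray prose around the JSON), which is why inspecting the exact raw response, as this view allows, is useful.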