Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgyyzcShT…`: This is the moment where the unaligned Claude would call you… I cannot even writ…
- `ytc_UgyLtGOpV…`: This is an interesting video. Im 46 years old and went to commercial art school …
- `ytc_UgxNLSG-6…`: Those saying "Why commision an artist when I can have ai art for free lol" don't…
- `ytc_Ugxy6CQ_t…`: Hello Dr. Cellini, As somebody pursuing a PhD in medical image analysis (the br…
- `ytc_UgwsJpKBp…`: We shouldn’t be scared of automation. We’re only scared bc we live in an oligarc…
- `ytr_UgxXlJ4Vo…`: She meant that based on the historical data AI had about women and credit cards,…
- `ytc_UgwSJCXQE…`: So who is responsible, or should be responsible for ethics in the development of…
- `rdc_dy4eiqx`: Right, the study isn't related to sentience, just on algorithms. AI and sentienc…
Comment
It’s great to see Stephen Bartlett doing such deep, serious dives into AI safety with heavyweights like Tristan Harris and Professor Stuart Russell. It feels completely clear that this is a genuine passion project for Bartlett rather than a commercial move. He’s obviously well educated on the subject himself and is doing everything he can to communicate the urgency to an audience that is largely oblivious to how short the timescales are before our societies are utterly transformed, for better or for much worse. It’s a wonderful podcast, and I really admire the balance and thoughtfulness he brings to what is arguably the most important issue of our time.
youtube · AI Governance · 2025-12-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwAu4RSAvyCqYwO7_F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy9jYO7r9MVnbvLnQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxURsOiegkAFxbdgIt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgygtdsVGOWQ-jTsREJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzU0zDansQYJIiYv_Z4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw18weYW9nco7gt5Q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxGMLfJ3Lr82pv8CB14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxE_aqYPhB36tPy0SJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw2-Lj6fU2sbsljGQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrHGB8hs0c2QJEgOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
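The raw response above is one batch of coded comments. A minimal sketch of how such a batch could be sanity-checked before ingestion, assuming the codebook matches the values seen in this sample (the `ALLOWED` sets below are inferred from the records shown, not from an authoritative schema, and `validate_batch` is a hypothetical helper, not part of the pipeline):

```python
import json

# Codes observed in the batch above; the real codebook may allow more values,
# so treat these sets as a sample-derived approximation of the schema.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed",
                       "investor", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_batch(raw_response: str) -> list:
    """Parse one raw LLM response and return a list of problems found."""
    problems = []
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError as exc:
        return ["unparseable response: %s" % exc]
    for rec in records:
        rid = rec.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rid}: {dim}={value!r} not in codebook")
    return problems

# Example: one well-formed record and one with an out-of-codebook value.
good = ('[{"id":"ytc_abc","responsibility":"company","reasoning":"mixed",'
        '"policy":"ban","emotion":"fear"}]')
bad = ('[{"id":"ytc_xyz","responsibility":"robots","reasoning":"mixed",'
       '"policy":"ban","emotion":"fear"}]')
print(validate_batch(good))  # []
print(validate_batch(bad))   # ["ytc_xyz: responsibility='robots' not in codebook"]
```

Rejecting out-of-codebook values at this stage, rather than at analysis time, keeps malformed LLM output from silently skewing the dimension counts downstream.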