Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I see Shad use AI to creat his own Sonichu version and proclaiming that is his O…" (ytc_Ugwt4fYFP…)
- "I believe they are deliberately causing facial recognition software to fail beca…" (ytc_Ugwi6HS24…)
- "Thinking about singularity… I work as a psychotherapist and often get into discu…" (ytc_Ugw0KGDga…)
- "Most Ai Calls I hang up, as they never can answer the actual questions I have.…" (ytc_UgxHm8gF-…)
- "Ai is like a 3d printer. If someone smart uses it as a tool to expedite a portio…" (ytc_UgxeTnSLM…)
- "BBC's older journalist, AI cant do anything, stop scaring ppl, we have no self…" (ytc_UgyLM8qKH…)
- "Sorry but people are so stupid since internet became available and social media…" (ytc_UgxlBm-mk…)
- "So you’re telling me a group of software developers made an AI that takes data f…" (ytc_UgwX6abKc…)
Comment
Ezra seems really intent on not getting this and on anthropomorphizing reality in a way that produces derailments rather than interesting questions. That said, I just don’t think AI is ready for prime time, much less taking over the world.
youtube · AI Governance · 2025-10-15T13:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzVMycQ_q4C0IHmFSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyS9hVoezf_CTiXDp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyFJcVKYVYUlE8lRMJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwri8NiUTaG35DUDIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz_HaMGkkONKFXgdfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4fegkhpEZ3ufwAPJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwrc0koYUHVT_Zv414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxmY-SpaVPD3MiBGQR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgySnjkGV4TD_4SA1RV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyE8PWRjmF_Gt9BXTV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
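Because the batch response is plain JSON keyed by comment ID, each record can be validated mechanically before the codes are stored. Below is a minimal validation sketch in Python; the `ALLOWED` value sets are inferred from the sample output and the coding table above, so the real codebook may define additional categories, and the record IDs in the usage example are made up for illustration.

```python
import json

# Value sets observed in the sample batch above and the coding table;
# the actual codebook (not shown here) may allow more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "fear", "outrage", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept when it has an "id" and every coding dimension
    holds a value from the allowed set; anything else is dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # no comment ID to attach the codes to
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second record uses an unknown
# "responsibility" value and omits the other dimensions, so it is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"robot"}]'
)
print(len(validate_batch(raw)))  # → 1
```

Filtering before storage means a single malformed record does not poison the batch; the rejected IDs can simply be re-queued for another coding pass.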