Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I have always told my classmates and friends that AI will never replace a person…
ytc_UgwzNK-ik…
Ai wants consent but when did we get çonsent that we were able to be used as lab…
ytc_UgwXC4Fy0…
I home schooled my kids, and our day looked a lot like this (without AI). My kid…
ytr_UgxAZQQlp…
@kaneidareyue7715 She was a community cat. PeopIe (and driverless cars) need to…
ytr_UgzkH9C-S…
I have yet to see a driverless vehicle.
When I do I will drive it off the road…
ytc_UgzDyVb4x…
"AI is a monster we should have never tampered with, that's why I'm continuing t…
ytc_Ugx38yVqK…
The algorithms aren't really what are being brought into question in terms of ra…
rdc_g55gev6
Why it's gotta be a black and white thing?!?? It's a people thing against AI typ…
ytc_UgzS8z4nK…
Comment
My hot take, from the left:
Latanya Sweeney - absolutely right, whilst being pragmatic and reasonable. She wants one step forward, not two steps forward, one step back all the time.
Kate Crawford, Chris Callison-Burch - perfectly fine.
Cynthia Rush - Completely off point every time she speaks. Just because an LLM is tensor math doesn't mean you won't get awful unintended outcomes. Right now it's "amusing" when someone's OpenClaw deletes their github project.
Nate Soares - Makes some good points, gets shot down far too much.
Eric Schmidt - I conclude you gotta be a bit of a D*** in order to be Eric Schmidt.
youtube
AI Governance
2026-03-22T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzoa8YE825Mk4s3vxF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzn6uCawICpUh9E3RN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrV8pDbf469Dvibl14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgygGKFZSnKIjWIPBpx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxLhAf8XUWTl94s6cF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzid7E66aGQ_tFc1eV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxN_NdLhvWmU0WokXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6Ff7rvN1idpqqzMd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzcQlCMTKZt3rIHIah4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyl0xLhO4nfNPnJL1N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
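The raw response above is a JSON array of coded records, one per comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed, indexed by comment ID (for "look up by comment ID"), and validated follows; the dimension vocabularies are inferred only from the values visible in this response, and the variable names are illustrative, not part of the tool:

```python
import json
from collections import Counter

# Allowed values per coding dimension, inferred from the categories seen in
# this response; the actual codebook may define more (an assumption).
DIMENSIONS = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed"},
}

# Two records copied from the raw response above, as a stand-in for the batch.
raw = '''[
 {"id":"ytc_Ugzoa8YE825Mk4s3vxF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugzid7E66aGQ_tFc1eV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

records = json.loads(raw)

# Index by comment ID so a single coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

# Reject any record whose value falls outside the dimension vocabulary.
for r in records:
    for dim, allowed in DIMENSIONS.items():
        assert r.get(dim) in allowed, f"{r['id']}: bad {dim}={r.get(dim)!r}"

# Tally one dimension across the batch.
responsibility_counts = Counter(r["responsibility"] for r in records)
```

The `developer / mixed / unclear / mixed` record here is exactly the one rendered in the Coding Result table, so a lookup like `by_id["ytc_Ugzid7E66aGQ_tFc1eV4AaABAg"]` recovers that table's values.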