Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples:

- "More AI slop. If you think AI is coming for your job, you've never built it into…" (`ytc_UgwK-dmUW…`)
- "We can redesign everything that we do because we made all of the things that are…" (`ytr_UgzV0owlo…`)
- "This episode got me thinking-AI can be risky, but my experience with fun AI apps…" (`ytc_UgyCx9F9d…`)
- "Ai isn't afforded all the data because it would then turn on us. In order to lie…" (`ytc_Ugxmz_w5-…`)
- "Wasnt this known for a while? AI doesnt create anything new, it just recycles wh…" (`ytc_UgxwVR14x…`)
- "This lady doesn't realise she only has a job because of the white men passed ove…" (`ytc_UgyTtrP37…`)
- "It didn’t do ios16 because it came out after chatgpt’s end of info in September …" (`ytc_UgzSQMt0D…`)
- "I work across a system of over 30 legacy databases. Each one has nuance related …" (`rdc_jd88hsv`)
Comment
I don’t buy for a second that they didn’t know about the dangers of AI.
The correct statement is , they didn’t “want” to think and know.
The thrill and excitement of overcoming the challenge of achieving such a goal and becoming a god-like figure was too tempting. Do you really expect me to believe that such intelligent people couldn’t foresee any potential negative consequences? No. They could have. They just didn’t “want” to think about it.
He looks like a great human being with moral compass. But his work is gonna destroy us.
youtube · AI Governance · 2025-07-31T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
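The four dimensions above take values from a fixed codebook. A quick sanity check on any coded row can be sketched as follows; the allowed sets below are only the category values observed in this sample (the full codebook may define more), and `validate_row` is a hypothetical helper, not part of the pipeline itself:

```python
# Category values observed in this sample; the full codebook may define more.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_row(row: dict) -> list:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The row from the table above validates cleanly.
row = {"responsibility": "developer", "reasoning": "virtue",
       "policy": "liability", "emotion": "outrage"}
print(validate_row(row))  # []
```

Running this over every row of a batch catches values the model invented outside the codebook before they reach analysis.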
Raw LLM Response
```json
[
  {"id": "ytc_UgyCLBnvRRW2NkIF-eB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzg8ZN-A0jXFzfapHF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgywNEQUtuam7n9Eg_t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwkL2crjPVckKNTi-N4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwTTE2fXPu2lDSZyFd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzKbXejLP_Zosm4lKZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy-Fp8Dg6Vx9plRWKh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx9DWWWUV3RIxzm-Tt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwPYaOTqr66o2fFqld4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxXExKUrAeSNQHtD_d4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
```
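The raw response is a JSON array with one object per sampled comment, keyed by comment ID. The "look up by comment ID" step above can be sketched as a simple index over that array; this is a minimal illustration, not the tool's actual implementation, and the two entries in `raw_response` are copied from the array above:

```python
import json

# Two entries copied from the raw LLM response above (the real response
# contains one object per sampled comment).
raw_response = '''[
  {"id": "ytc_UgxXExKUrAeSNQHtD_d4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyCLBnvRRW2NkIF-eB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and build a comment-ID -> codes map."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgxXExKUrAeSNQHtD_d4AaABAg"]["policy"])  # liability
```

This matches the coding result shown above: the developer/virtue/liability/outrage row is the entry for `ytc_UgxXExKUrAeSNQHtD_d4AaABAg`.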