Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "i got every one right, it literally can't be easier to stop AI from humans…" (ytc_Ugyix6EbW…)
- "We can and will have robots that drive the car as well as self driving cars beca…" (ytc_UgyOvmZsi…)
- "While I don't use AI to create music, I have been using it for my next music vid…" (ytc_Ugz6IF1UC…)
- "People keep urging me to experiment more with AI tools. They don't seem to reali…" (ytc_Ugy9xb--V…)
- "Not only are they screwing up the neighborhoods. Probably going to end up costin…" (ytc_UgxomlBOj…)
- "ChatGPT is the orchestra that knows how to play lots of instruments / I'm the con…" (ytc_UgxE83-no…)
- "@seanwigginsIt's not that strange actually. It is not William that decides when…" (ytr_UgwS4q0iq…)
- "the capitalists are going to fire all of us chasing profit whether or not it pro…" (rdc_jf6zzpm)
Comment

> @matthew_berman Yes, but why agree with a pause in your video? We have had DECADES to think about the consequences of computer AI. We don't need to pause now that it is finally moving forward where we can actually live the future we have been waiting for.
> Think Matthew, where was the OPEN LETTER TO STOP AI DRIVING??? It was never there. Did AI have some accidents while driving... yes it did. Did 1000 CEOs and Tech People run screaming to stop them from making cars that drive by AI? No... Where was all this "concern" when actual cars were crashing? The reality is they just want to use a pause to catch up so they can do it too.

youtube · AI Governance · 2023-03-30T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsY-cBEF5A","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsbW4K7MBw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrn4m0trjv9nruHw0PaSq","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nrle7iFSFV","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nt0NShWS_s","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntJ7Qv2sCu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugw6EtfFGqbU3EKNXFx4AaABAg.8ebBLFhnP-u9TQaU28JdPc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxyULC5OslX0G74cJx4AaABAg.8eZkIXf7xt38e_xmX9IADA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
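Because the raw model output is a plain JSON array, the lookup-by-comment-ID view can be reproduced in a few lines. A minimal sketch, assuming the response validates as JSON with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two embedded records are copied from the response, and `index_by_id` is an illustrative helper name, not part of the tool:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsY-cBEF5A",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and build an id -> coded-record table."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
rec = codes["ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO"]
print(rec["emotion"])  # approval
```

Keying on `id` keeps lookups O(1) per comment, which is all the inspection view needs; if the model ever emits malformed JSON, `json.loads` raises and the batch can be flagged for re-coding.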