Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Metaphorically, this is like strip-mining. Sure, you got the resources, but now …" (ytc_UgwF6LrUm…)
- "The problem with this dilemna is the self driving card would NEVER drive so clos…" (ytc_Ughblm0pN…)
- "I will not use ai, I don't want those skeletal fingers getting anywhere close to…" (ytc_Ugwta7sLW…)
- "We don't need robots to do the jobs we hate. Just need to restructure society an…" (ytc_UgwiF5u2D…)
- "Start building your business on the side now! You could do so much better on you…" (ytr_UgwdO6Iss…)
- "My god it's difficult to hear this person talk, zero social skills, a robot can …" (ytc_UgwfInkLr…)
- "The reason we're short on compute these days is real-time inference for people w…" (rdc_oh3h3dn)
- "@Spe-chann For copyright, I don't think so, you can copyright a drawing specifi…" (ytr_UgznQgjJP…)
Comment

> My question is why do we suddenly need AI so badly? The talk about risks is what this debate is about but what are these magic rewards that AI can provide that we are unable to do for ourselves. Progress over the last 150 years has been bizarrely rapid and accelerating and yet now we need AI to solve every problem and cant function without it. Why again?
>
> Btw the thing Yann said as a positive of having a AI assistant to work for you and be subservient to you sounds both horrifying to me because of the fact that I dont want help on how to think and also kind of morally wrong like why should this entity be forced to serve me. Yuck

Source: youtube · Topic: AI Governance · Posted: 2023-07-06T20:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugxs32GfFAuVJqXtsER4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyP32EFA3Y5ktq3NCR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxpm-nkEA4Jlj1DWUZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3O-lecstqLqiaL5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx5LT0M-B6vvyirP9Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgywSt7QVnzDLLJwsnZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzR72iHwgV5RJqN_6F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx1ltpClDN2cUZQHmJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwB-pGM8x1G4L7K-sB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugymf1lykKqLfaW0dVN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
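The raw response is a JSON array with one record per comment ID, which is what makes lookup by comment ID possible. A minimal sketch of how such a response could be parsed and indexed (the `parse_coding_response` name and the skip-malformed-records policy are assumptions for illustration, not the tool's actual implementation):

```python
import json

# The four coded dimensions each record is expected to carry,
# matching the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Assumes the model returned a JSON array of objects, each with an
    "id" field plus the four coding dimensions.
    """
    by_id = {}
    for rec in json.loads(raw):
        # Skip malformed records rather than crashing the inspector.
        if not all(k in rec for k in ("id",) + DIMENSIONS):
            continue
        by_id[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return by_id

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_abc"]["policy"])  # liability
```

Indexing by ID up front keeps per-comment lookups O(1), which suits an inspector that jumps between random samples and specific comment IDs.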