Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Seems like the general public has always been more wise than the average inventor of this kind of technology. How can anyone trust that they have any best interests in mind after listening to the existential blindness this person is admitting to. He just couldn’t conceive of the harm possible if they were incorrect about the way in which their inventions functioned. That’s extremely upsetting. And at the end of his conversation he just admits that he’s created something that will make humanity obsolete. And there’s no way to stop it now. All because he just “suddenly discovered” that he was wrong about the machine learning. But whether or not his jokes were funny. Cute. Absolute hubris.
Source: youtube · Topic: AI Governance · Posted: 2023-05-10T00:0… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0qmuita12sozuOld4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgydBFmpiwqt_fmqaiV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymhTekOnRpdOEdKft4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxpmtj0TS1YrNnBVzp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyW7Pl8yOpEtQJOLQR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzU4TtPOomA81vHOlZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzYY8kuUJT7v8jT8Wp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxkQbgz1LiJ9osPmNN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy_uxJ1tjcWkynBnal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxf0kZVYUZBLz7Rryh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
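The batch response above is a JSON array with one object per comment, keyed by `id`. To recover the coding for a single comment (as the "Coding Result" table does), the array can be parsed and indexed by comment ID. A minimal sketch in Python, assuming the response format shown; the `index_codings` helper and the two illustrative entries are hypothetical, not part of the tool:

```python
import json

# Illustrative batch response in the format shown above (truncated to two
# entries; IDs and codes are copied from the real response for clarity).
raw_response = """[
  {"id": "ytc_Ugx0qmuita12sozuOld4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgydBFmpiwqt_fmqaiV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index each row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)

# Look up one comment's coding by ID, as the inspector does.
coding = codings["ytc_Ugx0qmuita12sozuOld4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Indexing by ID makes each lookup O(1), which matters when cross-referencing many comments against a long batch response.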