Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by ID, or sampled at random from the list below.

Random samples
- `ytc_UgylC4Tov…` — "Lex sounds naive. One can understand the motivations of a Yann LeCun who has de…"
- `ytc_UgxbojNuz…` — "1:23:00 If we get AI to solve that problem then it may just decide to eliminate …"
- `ytc_UgzwlYxI1…` — "Still don't get whats the overall benefit to humanity overall. A law needs to be…"
- `ytc_UgyMb7sFG…` — "5 jobs? Cmon dude. In the medical field alone there are dozens of jobs AI wont e…"
- `ytc_Ugy12vT-9…` — "....you are not born able to draw! I am so sick and tired of this "you simply ha…"
- `ytc_Ugzncca0W…` — "Bro, everytime i use c. Ai /talkie ai, i just straight up roleplay, like straigh…"
- `ytc_UgyY-H5J_…` — "For one group, AI has replaced human contact, with some even marrying their AI "…"
- `ytc_UgwzCexuq…` — "And we are all paying the higher electric bills for AI, disgusting. Say goodbye …"
Comment
AI can be used to do very complex things, but the humans operating them will tend to oversimplify the data and equate distraction as bad, the best innovation comes from thinking independently, this will end up worsening the education system and creating identical students who are dependant on the band to communicate their doubts. The biggest problem in good schools is the students feeling scared/ashamed to clarify their doubts openly and accept that they are confused and need to solve their problem, due to the unessary pressure to score high marks instead of learning (as people equate high marks with better learning). Once they grow up, they will not be able to function as natural human beings, will be more like primitive robots
Source: youtube · AI Governance · 2019-10-02T03:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgyYlHZDfjUV9Ct0KEF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugyk-AAYcFrU4NrZq6N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwaNP58QIHg8YLBMp54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxXOzc_OcETF1iF97h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxxOxR3gbh6reeqdL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugwli3lZi9GW4O1x6Dd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxW6pfE5E0eLAL68QR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxUD6Tu0_-3DN4Ah-h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugw7Uc6gudSRJR0Wf7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwDjlr8Njte9alCukZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"}]
```
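A raw batch response like the one above can be validated before the coded dimensions are stored. Below is a minimal sketch: the `SCHEMA` value sets are inferred only from the values visible in this dump, not from the project's actual codebook, and `parse_raw_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension — inferred from the responses shown
# in this dump (assumption: the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"government", "developer", "company",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only schema-conformant records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Usage with a one-record batch (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_raw_response(raw)[0]["emotion"])  # mixed
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; the dropped IDs could instead be queued for re-coding.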