Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Some quotes of professor Hinton from the book Genius Makers by NYT journalist Cade Metz: “If I have a robot surgeon, it needs to understand an awful lot about medicine and about manipulating things. I don’t see why my robot surgeon needs to know about baseball scores. Why would it need general-purpose knowledge? I would have thought you would make your machines to help us,” “If I want a machine to dig a ditch right, I’d rather have a backhoe than an android. You don’t want an android digging a ditch. If I want a machine to dispense some money, I want an ATM. One thing I believe is that we probably don’t want general-purpose androids.” “The progress is being made by tackling individual problems — getting a robot to fix things or understanding a sentence so that you can translate — rather than people building general AI,” he said. At the same time, he didn’t see an end to the progress across the field, and it was now out of his hands. He hoped for one last success with capsules, but the larger community, backed by the world’s biggest companies, was racing in other directions. Asked if we should worry about the threat of superintelligence, he said this didn’t make much sense in the near term. “I think we are much better off than Demis thinks,” he said. But he also said it was a perfectly reasonable worry if you looked into the distant future. (Note: the book Genius Makers was published in 2021, so what he said here was before his realization that AGI and superintelligence may be reached much sooner than previously estimated)
youtube AI Governance 2023-09-12T12:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgzbKulsqmqnVWwMSHR4AaABAg.9v5ipgp2iTx9v5uu39BXpD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzK9k4hAU0kT3OuYqB4AaABAg.9umxiug8sjK9uqLTsLCF0c","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugx2Cf5WSEBZCIxmt6l4AaABAg.9ujjTfg_rXL9ulPiMVqwpS","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzoWp_VUj3Ot3nZG914AaABAg.9uXWw5l5Qgy9uZIs03wwOA","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzoWp_VUj3Ot3nZG914AaABAg.9uXWw5l5Qgy9uZKEXyZbfs","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzoWp_VUj3Ot3nZG914AaABAg.9uXWw5l5Qgy9uZLXd1Pwsc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx5l1HO_7jVU66445B4AaABAg.9u5fkrn_dz09u9WQcWl1y6","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxmd-DgZCmV5Dp38Wt4AaABAg.9tp_9J4UzHv9tp_hBia1m2","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx9cKosGmQfk4IwBHp4AaABAg.9tnUaNc7BNQ9tr5pdO-sUY","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx9cKosGmQfk4IwBHp4AaABAg.9tnUaNc7BNQ9tr7LcA1XfC","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
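A raw response like the one above is only usable once it is parsed and checked for completeness. The sketch below shows one way to do that in Python; the dimension names (responsibility, reasoning, policy, emotion) come from the coding result above, while the `RAW` string is a shortened hypothetical example, not the full response.

```python
import json

# Dimensions every coded record must carry, per the coding table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Hypothetical one-record response for illustration.
RAW = ('[{"id": "ytr_example", "responsibility": "none", '
       '"reasoning": "deontological", "policy": "none", "emotion": "approval"}]')

def parse_codes(raw: str) -> list[dict]:
    """Parse the model output; reject any record missing an id or a dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"incomplete record: {rec}")
    return records

codes = parse_codes(RAW)
print(codes[0]["emotion"])  # prints: approval
```

Validating each record before storing it means a truncated or malformed model response fails loudly rather than silently producing partial codes.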