Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dr. Hinton keeps missing one of the big examples of loss of income. Let me summarize: Here in Silicon Valley, California, we have the San Francisco Bay. The area is divided into three parts: (1) everything North of the Bay, (2) everything East of the Bay, and (3) everything on the peninsula between the Bay and the Pacific Ocean. This is the West Bay. In the West Bay, there are an estimated 20,000 people who drive Uber/Lyft/etc. in order to earn an income. Waymo is now authorized to run automated taxi services over more than half of that area. This alone is a serious problem for an area where many are on the edge of being able to afford their rents.

I'm going to send this in an email right now. It is about noon in England, where I believe Geoff is this week.

P.S. As an electrical engineer, I have been suggesting to people that we monitor energy usage of large AI systems--what we call foundational models--in order to evaluate their cost to society. This may fit with Dr. Hinton's suggestion of "taxing the AIs".
youtube AI Jobs 2025-09-10T10:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzwYHCVo275So7Yztt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzG4iFqzn_BbzHC8n94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwO1XUtAhqemipKM3J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwb1ivPJTnEJLyM2Lt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzUlzE1hQ-YkimvKTB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz4eAqnUkzqL92o2ZF4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxbgHQG8kmWYJMcCXl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwAqK2GtxjUyw_7QwN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz7fAqPaESjShXjqDR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugy61B3Ay89TpyHRKId4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
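To turn a raw response like the one above back into per-comment codes, it can be parsed as JSON and indexed by comment id. The following is a minimal Python sketch; the variable names are illustrative, and for brevity only the first two entries of the array are inlined (the full response has ten objects with the same shape).

```python
import json

# Raw LLM response, copied verbatim from the page above
# (first two array entries shown; the full response contains ten).
raw_response = '''
[
  {"id": "ytc_UgzwYHCVo275So7Yztt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzG4iFqzn_BbzHC8n94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

# Index the coded entries by comment id for fast lookup.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
coded = codes["ytc_UgzwYHCVo275So7Yztt4AaABAg"]
print(coded["responsibility"], coded["reasoning"],
      coded["policy"], coded["emotion"])
# Prints: none consequentialist none indifference
```

The printed values match the Coding Result table above, which is a quick consistency check between the parsed raw output and the stored codes.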