Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The arms war is pushing this risk to a peak. We cant slow down or china will com…" (ytc_UgzZtL98o…)
- "One question that comes to my mind is, do these guidelines open the door to laws…" (ytc_Ugyzt6EPS…)
- "1. Your claim is fundamentally false. The so called „AIs” work nothing like anim…" (ytr_Ugy8NVPc6…)
- "We as a stupid race a brainless bunch of want to be know it all's DESERVE WHAT W…" (ytc_Ugzu1hoYS…)
- "The current AI tuned by reinforcement learning with human feedback has more semb…" (rdc_jsx9psv)
- "Trains aren't automated at all. It's just that they are on tracks and can't go e…" (ytr_UgzaZw5HT…)
- "And this is what happens when corrupt politicians fail to apply necessary regula…" (ytc_UgxQQU9dB…)
- "Teach the LLMs to sue each other, virus upon virus. Make them implode under leg…" (ytr_UgzEbNrgZ…)
Comment
Dr. Hinton keeps missing one of the big examples of loss of income. Let me summarize:
Here in Silicon Valley, California, we have the San Francisco Bay. The area is divided into three parts: (1) everything North of the Bay, (2) everything East of the Bay, and (3) everything on the peninsula between the Bay and the Pacific Ocean. This is the West Bay.
In the West Bay, there are an estimated 20,000 people who drive Uber/Lyft/etc. in order to earn an income. Waymo is now authorized to run automated taxi services over more than half of that area.
This alone is a serious problem for an area where many are on the edge of being able to afford their rents.
I'm going to send this in an email right now. It is about noon in England, where I believe Geoff is this week.
P.S. As an electrical engineer, I have been suggesting to people that we monitor energy usage of large AI systems--what we call foundational models--in order to evaluate their cost to society. This may fit with Dr. Hinton's suggestion of "taxing the AIs".
Source: youtube · AI Jobs · 2025-09-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzwYHCVo275So7Yztt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzG4iFqzn_BbzHC8n94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwO1XUtAhqemipKM3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwb1ivPJTnEJLyM2Lt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUlzE1hQ-YkimvKTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4eAqnUkzqL92o2ZF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbgHQG8kmWYJMcCXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAqK2GtxjUyw_7QwN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7fAqPaESjShXjqDR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy61B3Ay89TpyHRKId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
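For readers scripting against these dumps, here is a minimal sketch of how a raw batch response like the one above can be parsed and then looked up by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown; the `index_by_comment_id` helper and the two-record sample are purely illustrative, not part of the tool.

```python
import json

# Illustrative excerpt in the same shape as the raw LLM response above:
# a JSON array of per-comment records, each carrying the comment ID
# plus the four coding dimensions.
raw_response = """[
  {"id": "ytc_UgzwYHCVo275So7Yztt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz4eAqnUkzqL92o2ZF4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch response and index the codes by comment ID.

    Hypothetical helper: returns {comment_id: {dimension: value, ...}}.
    """
    records = json.loads(response_text)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugz4eAqnUkzqL92o2ZF4AaABAg"]["policy"])  # → regulate
```

With an index like this, the "Look up by comment ID" view reduces to a single dictionary access per comment.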