Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Are u kidding? How bout take jnto account that it went fucking viral. Everyone s…
ytr_Ugx0DzVkI…
Until there is broader and authentic conversation about what the the displaced m…
ytc_UgyrPA_jk…
Thank you Courtney Thomas Jr.! This is such an important topic and I wish some p…
ytc_UgxWv1JYz…
My guess is AI is used to automate mass labor, jobs die off, people riot and pan…
ytc_UgzcUi-37…
Question, he asked what do we think would happen if a neuron in your brain was r…
ytc_UgwRAUFbz…
For human Nothing is more certain than death, nothing is more uncertain than the…
ytc_Ugy1HFAli…
People have no historical understanding, the main focus of GND are the two words…
rdc_fnx7snd
Don’t listen to Elon musk or anyone about AI besides
Google/Deep Mind
Microsof…
ytr_Ugz12rTum…
Comment
Pretty sure this is going to happen regardless of various fears. Given enough time, people will find ways to make the existing ones more efficient or work from scratch - if not the companies doing it now, then another company or even a community of hobbyists and/or unemployed (and potentially bored) engineers, programmers, etc.
And I don't think it's necessarily a bad thing, provided manual override exists and takes priority over automated systems. Some things could also be left completely mechanical (such as an emergency brake, or some mechanism to shut off the vehicle). In instances where manual controls were being used, I'd think it would be much the same as it is right now as far as who is at fault/responsible/etc.
Speaking of that, essentially the same questions could be asked right now of manually controlled vehicles. If someone jumps out in front of a car and gets hit, if they jump out and the driver swerves and damages property or hits another vehicle or injures someone else, etc. Another set of questions could be asked regarding if part of the vehicle malfunctions at the right time to cause an accident - is that the driver's fault if they didn't happen to have it checked recently? The manufacturer? The person who programmed or designed anything that should have provided a warning?
youtube
AI Harm Incident
2014-05-26T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UggClE0QGTufbHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghVv_MI-gLHDHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjsxqmpUtf7YXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg28UmkRD2tpngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugie-lUnu0GZ63gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggrRnRLKHDtQngCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggs_LmUfAdFeXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjxPGfWudcBB3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghiDZg6vHR7nngCoAEC","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjmBLzPv7AehXgCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]
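The raw response above is a JSON array of per-comment codes, one object per comment ID, with four coded dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before indexing by comment ID — the allowed values below are only those observed in this response; the full codebook may well define additional categories (assumption), and `validate_coding` is a hypothetical helper name:

```python
import json

# Allowed dimension values as observed in the raw response above.
# The actual codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def validate_coding(raw: str) -> dict:
    """Parse an LLM coding response and index it by comment ID,
    rejecting rows whose values fall outside the observed codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example row shaped like the response above (hypothetical ID):
example = ('[{"id":"ytc_EXAMPLE","responsibility":"none",'
           '"reasoning":"consequentialist","policy":"none",'
           '"emotion":"approval"}]')
print(validate_coding(example)["ytc_EXAMPLE"]["emotion"])  # approval
```

Indexing by ID makes the "Look up by comment ID" view above a single dictionary access; a malformed or out-of-vocabulary row fails loudly instead of silently entering the coded dataset.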