Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is the best and worst thing that will ever happen to humanity. Short term gai…
ytc_UgzHq4Ncg…
The AI-generated music would not be copyrightable. However, if you actually wro…
ytr_UgwvI4rj_…
If you don't see a lot of support for AI art, you just don't look at twitter.A l…
ytc_UgxNThFXk…
Unfortunately AI is inevitable, but for as long as we cling to capitalism and re…
ytc_Ugw8ypyMY…
High regulation is a horrible idea. Whichever country regulates the most, is who…
ytc_UgxCsSHsS…
If AI has done anything so far, it has increased job security for developers. AI…
ytc_UgzvJjEdE…
You're absolutely right! The balance between efficiency and the human touch is c…
ytr_Ugw78IczR…
Ai works when you use it for appropriate things. But most decision makers are so…
ytc_UgyqkCNNQ…
Comment
Miles Eaton by the time that a situation like this will even be a thing we will likely have computer tech that could at least give a rough estimate as to whether some1 will live or die. given a couple minutes at most a powerful enough consumer could give a rough estimate by using simple physics to calculate the likelyhood that some1 will live or die depending on the speed of the vehicle, weight of the vehicle, angle of impact, etc. and comparing it to the average human's body weight, height, size, etc. i have a intel i7 4770 cpu (which is a mid end cpu) and i have no doubt that it would take my cpu a couple minutes at most to calculate that as long as the input data is accurate. my cpu is made for general use and gaming as well as productivity and is not specifically made to run calculations like that so with the right cpu even today it wouldnt be too difficult. now combine that with moore's law which states that every 2 years the number of transistors we could cram into a cpu will double and the fact that self driving cars wont become mainstream for at minimum 10 years and more likely 20-30 years and id say by that point that it shouldnt be a issue whatsoever. by that point id say there is a good possibility that rather than calculating it based off the average person technology it could analyze the actual people and calculate it based off their individual bodytype/weight/whatever.
lastly if u dont believe me about computers being powerful enough by that point then listen to this 1. it is estimated that we will hit a technological singularity by the year 2045... that means that computers then will be so powerful that AI tech has the potential to be on par with the human brain!!! ur literally having something as powerful as the human brain control ur ****ing car. if u dont believe me about the singularity btw look it up
youtube
AI Harm Incident
2015-12-20T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87fLE84qk5k","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87XK_wn12JA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87XONe3EM_1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87y50F6qLMu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugh_9XnDJVggxngCoAEC.87Wc9DkkXtp87Wchi_GyLJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgicExh_IjSyAXgCoAEC.87WYVOIx-8F87ZTNZ0VNmJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UghGhqPWHO9c13gCoAEC.87WWBZuvBJo87WWVV6CX64","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugis53FXvmFe9XgCoAEC.87WLQIbZmmb87ZV4jiwVBr","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugis53FXvmFe9XgCoAEC.87WLQIbZmmb87ZvXkThznz","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87WHAQUJkyb","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
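The raw response above is a JSON array with one object per comment, carrying the comment ID plus the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a response, assuming this shape; the allowed value sets below are only those observed in this sample, not necessarily the full codebook:

```python
import json

# Value sets observed in this one response; the real codebook
# may define additional categories (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID,
    rejecting any value outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in OBSERVED}
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytr_example.1","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = parse_coding_response(raw)
print(coded["ytr_example.1"]["emotion"])  # approval
```

Indexing by ID makes the "look up by comment ID" flow above a plain dictionary access, and the value check catches a model drifting outside the coding scheme before the result is stored.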