Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The next civil war will be due to AI. No jobs=no money=can't eat or pay bills=pi…" (ytc_UgyNwPo46…)
- "I am concerned that they are doing this without regulation. The driverless taxi …" (ytc_UgwG7ENst…)
- "when he say smarter than humans what exactly does he mean like can AI outsmart …" (ytc_UgzQHooUj…)
- "Thing you'll find out is alot of those are self reported Cuban statistics, and b…" (rdc_f9eu458)
- "i got chatgpt to write full essays with 1% chance of ai writing 🤷♂️ i’m just hi…" (ytc_UgxyYrKCC…)
- ""you'll own nothing and you will be happy" that's what AI and universal income i…" (ytc_Ugx9B48_0…)
- "It will replace 4 out of 5 devs. The fifth dev will do the job of all 5 using AI…" (ytc_UgwUIz6gu…)
- "Well, this wasted his time making this video. Dunno if he’s deliberately trying…" (ytc_UgwFFXKWe…)
Comment
Isn't starting from the premise that the car couldn't stop in time flawed in the first place?
A self-driving car would be programmed to always keep enough distance from the truck in order to stop in time in case the vehicle came to a stop immediately for whatever reason. An object falling off would never approach you faster than a braking vehicle.
This is solely based on my intuition, but isn't that true?
youtube
AI Harm Incident
2016-01-10T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
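The raw response above is a JSON array of records, one per comment, with a fixed set of coding dimensions. A minimal sketch of parsing and validating such a response is below; the allowed values are inferred from the sample output shown here (the full codebook may define additional categories), and `validate_coding` is a hypothetical helper, not part of any library.

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# This is an assumption inferred from the data, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record's fields."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

sample = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
print(len(validate_coding(sample)))  # 1
```

Validating each batch this way catches malformed or out-of-vocabulary codes before they reach the results table.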