Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The water and energy resources used are a detriment to our environment as well. D…" (ytc_UgynonR7V…)
- "at this point , ai is like google for me. super helpful and saves a ton of time…" (rdc_l56p0de)
- "No wonder i told Meta AI to make Goku ssj 2 form and it did not get it right. Sa…" (ytc_UgxDhc3aV…)
- "Hmm... A Unified Basic Income (UBI) is a very interesting concept, and I'm prett…" (ytc_UgyP_VO0u…)
- "AI devs: How is it so bad? Also Devs: Code it to be predatory and work for mega…" (ytc_Ugy8pI1y-…)
- "It's a bit pointless given that AI already poisones itself. But points for doing…" (ytc_Ugx5yi_XX…)
- "Well, the devil is known as the 'father of all lies' and a 'great deceiver' so w…" (ytc_UgwwRnx5n…)
- "This shit is as hilarious as it is stupid, meta-data exists for nearly every pie…" (ytc_Ugzn9I5sT…)
Comment
This example uses human error to portray a problem with a self-driving car: driving too close to the vehicle in front of it.
A self-driving car would be programmed to keep enough distance and drive at a safe speed so that it has enough time to brake when an object suddenly stops.
If you replace this example with a person suddenly jumping in front of the car, the outcome is the same, because the car reacts much faster than a person in that position and can watch multiple points at once.
Safety on the road would increase even more as self-driving cars evolve. I would imagine that at some point the car could track moving objects in its surroundings and estimate where each object is going, and/or all self-driving vehicles would be linked together to warn other vehicles of objects moving in their paths.
youtube · AI Harm Incident · 2015-12-18T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
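Each coded comment is scored on the same four dimensions shown above. A minimal sketch of validating one record, using only the label values visible on this page (the actual codebook may define additional categories, so `ALLOWED` here is an assumption):

```python
# Label sets inferred only from the values visible on this page;
# the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return the names of any dimensions with missing or unrecognised values."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

coding = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(validate(coding))  # [] — the record shown in the table is valid
```

A record that omits a dimension, or uses a value outside the set, is flagged by name, which makes batch QA of model output straightforward.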
Raw LLM Response
```json
[
  {"id":"ytc_Ugi_GtOLGK5NvngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UginTf9w9kAfAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggPMgtotN-UAHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgggUF_2Qb4HPHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UggYd0QeaEjoT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjFNMbehN7Mp3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiV6VqZ6rqWDngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgisBoqYfIeKMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiL9TsF2gC4CngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiMNk6vckjqtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
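The "look up by comment ID" view can be reproduced directly from a raw response like this one: parse the JSON array and key each record by its `id`. A minimal sketch, assuming the response is well-formed JSON in the shape shown above (the two sample records are taken from that array; the `index_by_comment_id` helper is hypothetical, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UggYd0QeaEjoT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggPMgtotN-UAHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
"""

def index_by_comment_id(payload: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UggPMgtotN-UAHgCoAEC"]["policy"])  # regulate
```

Because each batch response carries its own IDs, the same lookup works across batches by merging the per-batch dictionaries.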