Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.

Random samples
- "I mean.. if you think about it this way... the AI innovated people to do it in t…" (`ytc_Ugy2y-De5…`)
- "I'm pretty sure coding has an AI issue too where people only use AI to make code…" (`ytc_UgzfOlM6A…`)
- "I am hard pressed to find non-ai art on my DA feed. It is disheartening.…" (`ytc_UgyaAfmRP…`)
- "Of course it still take far longer to replaced plumbers. Imagine what a humanoid…" (`ytc_Ugz34Jro6…`)
- "This is why governments need to create additional jobs by investing in infrastru…" (`rdc_gkpn7q5`)
- "Moltbook, is the most new and interesting phenomenon, a facebook for AI, humans …" (`ytc_UgwL1kjet…`)
- "A.I. killing people, it's a feature not a bug. See the documentary made in 1984…" (`ytc_Ugzz84wo-…`)
- "One of my friends used an AI to make a Studio Ghibli-esque rendition of a group …" (`ytc_UgwNutAqQ…`)
Comment
> +Jim while yes that is true the effects would be similar to the umbrella effect with vaccines. with vaccines if u vaccinate the majority of the public then the people who are not vaccinated will also be protected from the disease since the disease is less likely to spread due to most people being vaccinated. if the majority of the cars on the road would be self driving cars then every1 would be protected since the cars would communicate to prevent accidents and since they know how to react in a situation by analyzing everything and communicating with eachother it would actually help prevent more deaths than u would think since the human driver would only have to react to whatever is infront of him and not worry about the other cars since they would understand what to do and how to avoid it anyways
>
> we could go from having tens of thousands of deaths due to accidents a year to less than 100 per year within the next 20-30 years if self driving cars are done right
Source: youtube · AI Harm Incident · posted 2015-12-09T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
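The four coded dimensions lend themselves to a small typed record. A minimal sketch in Python; the `Coding` class and the allowed-value sets are assumptions inferred only from the values that appear in this section, and the actual codebook may define more categories:

```python
from dataclasses import dataclass

# Allowed values inferred from the codings shown on this page;
# the full codebook may define additional categories (assumption).
RESPONSIBILITY = {"none", "ai_itself", "user"}
REASONING = {"consequentialist", "deontological", "mixed"}
POLICY = {"none", "regulate"}
EMOTION = {"approval", "outrage", "resignation", "indifference"}

@dataclass(frozen=True)
class Coding:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Reject any value outside the known category sets.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"bad reasoning: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"bad policy: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"bad emotion: {self.emotion}")

# The coding result shown in the table above:
c = Coding("ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87fLE84qk5k",
           "none", "consequentialist", "none", "approval")
```

Validating at construction time means a malformed model output fails loudly at ingest rather than silently skewing downstream counts.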
Raw LLM Response
```json
[
  {"id":"ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87fLE84qk5k","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87XK_wn12JA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87XONe3EM_1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Uggozw99vhiuyngCoAEC.87WpntlJt8i87y50F6qLMu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugh_9XnDJVggxngCoAEC.87Wc9DkkXtp87Wchi_GyLJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgicExh_IjSyAXgCoAEC.87WYVOIx-8F87ZTNZ0VNmJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghGhqPWHO9c13gCoAEC.87WWBZuvBJo87WWVV6CX64","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugis53FXvmFe9XgCoAEC.87WLQIbZmmb87ZV4jiwVBr","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugis53FXvmFe9XgCoAEC.87WLQIbZmmb87ZvXkThznz","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87WHAQUJkyb","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
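A batch response in this shape can be parsed and indexed by comment ID in a few lines. A minimal sketch, assuming the field names shown in the JSON above; `index_codings` is a hypothetical helper, and `raw` is truncated to two of the entries for brevity:

```python
import json

# Two entries from the batch response above (truncated for brevity).
raw = """
[
  {"id": "ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87fLE84qk5k",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_UghGhqPWHO9c13gCoAEC.87WWBZuvBJo87WWVV6CX64",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index each coding by its comment ID."""
    by_id = {}
    for entry in json.loads(raw_json):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing keys: {missing}")
        by_id[entry["id"]] = {k: entry[k] for k in REQUIRED_KEYS - {"id"}}
    return by_id

codings = index_codings(raw)
print(codings["ytr_UghGhqPWHO9c13gCoAEC.87WWBZuvBJo87WWVV6CX64"]["policy"])
# prints "regulate"
```

Checking for missing keys before indexing catches the common failure mode where the model drops a field from one entry of an otherwise well-formed batch.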