Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I only recently found out my health problems were due to a rare genetic disorder…" (ytc_Ugy0QW4ig…)
- "How did Bernie let them pass no regulation for 10 years without raising the flag…" (ytc_Ugy1CBdZC…)
- "Without real human art, there wouldn't be any AI art I've genuinely seen people …" (ytc_UgwKCmSZT…)
- "Why are we lowering ourselves on the food chain without a kill switch? Ai probab…" (ytc_UgxQf1gGA…)
- "I was told that tech was going to bring about a "golden age" by liberals for dec…" (ytc_Ugz0V_9BW…)
- "This is not true; you're buying into misinformation. Companies have to adhere t…" (rdc_ksmh881)
- "if those deepfakes can be used to weaken democracy, is it because deepfakes are …" (ytc_Ugx-RxYS0…)
- "Thank goodness that our connection to Source and Spirituality will forever be ou…" (ytc_Ugzu-ID4E…)
Comment
This question has been answered in the past. Self-Driving cars will always have enough space to brake, they will only go as fast as depending on the space that is in front of them, or how far they can see. As synchronized machines, they don't need to get close to other cars, because it won't make much difference to wait 1 sec. to have a few extra meters of free space. Yet, people like to make things complicated, asking the wrong question, and try to fix a problem that doesn't exist.
youtube · AI Harm Incident · 2015-12-08T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugj-WH6OpZhDSHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQavChndvc5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghZ2CeGeDq4y3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg8HfmGm2p6hngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UggesFpy1EznlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiRW9mWll7FTHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgifUAfLDoDb23gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg_yjdSah1yH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjkwzfB0yQ1NngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_9XnDJVggxngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
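The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a response might be matched back to a comment ID, with an "unclear" fallback when an ID is absent from the model's output (the `lookup` helper, the sample payload, and the fallback behavior are illustrative assumptions, not the tool's actual implementation):

```python
import json

# Hypothetical sample mirroring the raw LLM response format shown above:
# a JSON array of per-comment coding records.
raw = """[
 {"id": "ytc_UgjQavChndvc5ngCoAEC", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
 {"id": "ytc_UghZ2CeGeDq4y3gCoAEC", "responsibility": "user",
  "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment ID, or an all-'unclear'
    record when the ID is missing from the model's response."""
    codes = {row["id"]: row for row in json.loads(raw_response)}
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    return codes.get(comment_id, fallback)

print(lookup(raw, "ytc_UghZ2CeGeDq4y3gCoAEC")["emotion"])  # fear
print(lookup(raw, "ytc_missing")["responsibility"])        # unclear
```

Under this assumption, a record that displays "unclear" on every dimension would simply mean the ID was not found in the parsed response.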