# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- I'm still wondering if superintelligence is possible. That aside, this conversat… (`ytc_UgzWGnkC3…`)
- No one man, nation or company controls all the AI in the world, nor should they.… (`ytc_UgwbS9XhW…`)
- TLDL: Get into birding so that... The apps and algorithm driving me further and … (`ytc_UgzGChNAS…`)
- @Liam1991 Meanwhile, you do realize that DARPA, the Pentagon, and every other ma… (`ytr_UgzWWOZf1…`)
- I like my skin, so I always remember to thank all A.I. brothers from another mot… (`ytc_Ugym1rryQ…`)
- I hate shit like this. May this boy rest in peace, but the fact his parents ar… (`rdc_naumsmr`)
- That robot strikes in 0.5 seconds defending itself from the hit where the percen… (`ytc_UgzvCqNKK…`)
- The AI is apparently uninformed. In Israel, there is one main religion and that … (`ytc_UgxgblNZI…`)
## Comment
Why would a driving algorithm allow itself to be boxed in anyway? We already know that situation is one of the most dangerous to be in and many motorists, myself included, avoid being boxed in.
Also, on major motorways and highways wouldn't we eventually create a larger grid system for self driving cars? While those who are on motorcycles or driving manually may be excluded I imagine all self driving vehicles would "log in" to a larger grid with individual destinations, etc. considered. To minimize traffic and the risk of accidents a larger highway system could easily adjust and align most cars on the road to be in the least dangerous configurations, along with having emergency contingency strategies - just like human drivers.
I feel eventually we will have both smart cars and smart roads.
youtube · AI Harm Incident · 2015-12-08T17:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjjWOUDi8MzcHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQvTuYsrqOtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgghF14lWrWg93gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghSobsLJzKwTngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghTsPIeRMcNT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjlPAxVCSrTmHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugju10Xr0tXdF3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjc3KGPZNZyqngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
```
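A batch response in this shape can be sanity-checked before it is stored. The sketch below is a minimal validator; the allowed values for each dimension are assumptions inferred from the sample output above, not the tool's actual codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: these vocabularies are
# inferred from the sample response above; the real codebook may
# contain additional categories not visible here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-vocabulary rows."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={value!r}")
    return rows

rows = validate_batch(
    '[{"id":"ytc_example","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)
print(len(rows))  # 1
```

Rejecting off-vocabulary values at ingestion time keeps a single hallucinated label from silently skewing downstream tallies; a stricter variant could use a JSON Schema instead of hand-rolled sets.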