Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “Don’t call your government. That unfortunately never turns out the way you want.…” — ytc_UgxgDoPFz…
- “Ultimately, AI under the control of an individual human is either as good as or …” — ytc_UgwKxC7JJ…
- “Can’t wait to update my CV and add resilience under education. I bet openai will…” — ytc_UgwqzRZzX…
- “The way of technology, though, is that these controversies ultimately do not mat…” — ytc_UgwSyNaDw…
- “solution: if the tesla cars has a light indicating self driving is on, this coul…” — ytc_UgyxCuwEq…
- “> The “skill issue” crowd genuinely thinks that finding semi-reliable deriva…” — rdc_n9ig45a
- “There aren’t enough AI specialist positions for everyone. A truck driver isn’t a…” — ytc_Ugy2nMz5M…
- “Ultimately, it’s a 50/50 gamble on A.I. I do believe that A.I centers should be …” — ytc_UgweDHt-a…
Comment

> @josephmartinez1267 If? I very much doubt they're are using AI to learn because AI has a greater fail rate than hard coded algorithms. Failure here can not be tolerated. And there's no need for AI because the units travel fixed paths and have limited functions. Traditional software can handle this in its sleep. While AI introduces uncertainty and greater risk. Any programmer worth their salt would be well advised to stay away from AI in this application.

youtube · AI Harm Incident · 2025-02-12T01:2… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxsDDSoKC7J8mWwk0V4AaABAg.AROx4mMDgVwARRkENYAhse","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzKujrFDLf-NF9p41x4AaABAg.AROu5Q4rdA-AROyoqNfh46","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugxaw7nW-S9m6y6eBGN4AaABAg.9rdWKQ8pfd89rlltrUuDGM","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyxawlktYEKSX5CWll4AaABAg.ACh6zuCjn22AEirOu3D259","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzoPMpOlJEnsk2ol2B4AaABAg.ACahcxaeibNACsnoxFSrlT","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzoPMpOlJEnsk2ol2B4AaABAg.ACahcxaeibNAEYIdBJmySD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzoPMpOlJEnsk2ol2B4AaABAg.ACahcxaeibNAEYWmBfGDAw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzEc7OThzwBxEbZ2aJ4AaABAg.ACR6edwtllIACc2T_8Mh3B","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz7zvP9_HTpi0Qkub14AaABAg.ACQ2mFlT69GAEQKussF4t8","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz7zvP9_HTpi0Qkub14AaABAg.ACQ2mFlT69GAEQ_WHP6RDY","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
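The raw model output is a JSON array with one object per comment, carrying the four coded dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID — the short IDs `ytr_abc`/`ytr_def` below are hypothetical placeholders, not real comment IDs from this dataset:

```python
import json

# Hypothetical raw model output in the same schema as the response above
raw_response = """
[
  {"id": "ytr_abc", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

# Parse the array and index each coding row by its comment ID
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for a single comment
row = codings["ytr_abc"]
print(row["responsibility"], row["emotion"])  # developer fear
```

Indexing by ID is what makes the "look up by comment ID" view above possible: each coded comment maps directly to the exact JSON object the model emitted for it.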