Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why did millionaire investors bet their fortunes on Builder AI and OceanGate’s s…" (`ytc_Ugw3L3DaT…`)
- "The cgi is good but the cgi masters forgot to make impressions in the sand😂 from…" (`ytc_Ugymqamud…`)
- "AI is here to stay, Don't kid yourselves. They simply must slow everything down.…" (`ytr_UgxfdzLCQ…`)
- "The great revolution that is coming is not a technological one -like ai- it is o…" (`ytr_Ugy4vT1AZ…`)
- "My FUTURE POV: counseling, teaching and healthcare are EASY targets for advanced…" (`ytc_Ugx4cBtIa…`)
- "Why can’t ai make political figures pregnant with cigars in their bums Answer …" (`ytc_UgxNkrqSM…`)
- "The elites and all government main agenda is depopulation. What if they use AI t…" (`ytc_UgxyxnOcw…`)
- "Either way, I don't think I fall into those normal categories about how society …" (`ytc_UgwM-pnyA…`)
Comment
I work in a machine shop running a 5-axis mill to make medical grade implants. For all the money they put into this amazing machine, it still has an error tolerance. The programs made on hypermill (another expensive piece of software) can have problems. And this is on a precision machine that, for all intents and purposes, runs in a completely controlled environment. Imagining a machine that has to navigate a constantly changing outside world in unknown and dynamic conditions? It's not a surprise to me that there are times when the Autopilot is more dangerous than a person driving. If anything, I'm surprised there haven't been more problems like this. I love machines, I love automation, I do not trust vehicles to drive themselves on the road with people.
youtube · AI Harm Incident · 2022-09-05T04:5… · ♥ 462
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
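Each coded comment carries values on the four dimensions in the table above. The value sets below are inferred only from the codes visible on this page (the full codebook may contain more categories), and the function name is illustrative. A minimal sketch of a validity check for a coding result:

```python
# Allowed values per dimension, inferred from the codes visible on this page;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def invalid_fields(code: dict) -> list:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if code.get(dim) not in allowed]

# The coding result shown above passes the check.
code = {"responsibility": "ai_itself", "reasoning": "consequentialist",
        "policy": "none", "emotion": "fear"}
print(invalid_fields(code))  # → []
```

A check like this is useful because model output occasionally drifts outside the codebook; flagging such rows before analysis is cheaper than discovering them later.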
Raw LLM Response
```json
[
  {"id":"ytc_UgwBpAMugJFs56h7T5F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy8nG_qlMITcCzVnNN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyBlFeeAEBseE5VbuB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzRpRwK9qSerMyCerh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyR26KVaZ16mbw1Zh14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYy20RnDHPVC2cLKd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgynoJQ1rDqEg7CfyrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxGFRO40uyLs743S5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnbgRmoLULVYiPPPh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxUlvxor9_DHJI4YHR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
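The "look up by comment ID" view above amounts to parsing the model's JSON array and indexing it by the `id` field. A minimal sketch, using an excerpt of the response above (function and variable names are illustrative, not from the tool itself):

```python
import json

# Raw model output: a JSON array of per-comment codes (excerpt from above).
raw_response = """
[
 {"id":"ytc_UgwBpAMugJFs56h7T5F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzRpRwK9qSerMyCerh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzRpRwK9qSerMyCerh4AaABAg"]["emotion"])  # → fear
```

Keying on the full comment ID is what lets the truncated IDs shown in the sample list resolve to exactly one record.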