Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All this is true until your AI hallucinates and one small buried mistake turns into an expensive mistake, a lethal mistake. AI is a tool, not an employee. If you saw how easily AI makes mistakes, how negligible those mistakes look, and how confident it is in those mistakes, you'd be afraid to let any AI do engineering jobs, because those mistakes aren't about profit margins; they are the difference between life and death. I don't want AI as a doctor or an engineer, not until I'm sure those problems are ironed out. Kinda like having self-driving cars: everything works fine until it doesn't, and when it doesn't, it's your life on the line. These self-driving cars are actually the easier of AIs to make, and we're still years (maybe decades) away from that actually being a thing for most people.
youtube AI Jobs 2025-09-09T06:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxDvt2cwhLKgJhLeKN4AaABAg.AMpMw_3pbx7AMpVf5hyl4A","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxDvt2cwhLKgJhLeKN4AaABAg.AMpMw_3pbx7AMpj_rYLS99","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxDvt2cwhLKgJhLeKN4AaABAg.AMpMw_3pbx7AMqEmWv-DO-","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgynuJxodUACQudFqWZ4AaABAg.AMpMln_IKskAMpvOLMcwKG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwIp9vsCCltcmGgnYp4AaABAg.AMpMcJde7jWAMpN8jxOM6v","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwIp9vsCCltcmGgnYp4AaABAg.AMpMcJde7jWAMpO5VeC6hF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwIp9vsCCltcmGgnYp4AaABAg.AMpMcJde7jWAMpOghBXFhh","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwIp9vsCCltcmGgnYp4AaABAg.AMpMcJde7jWAMpQvdzmGGc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwjZh0YccRo_JTglDl4AaABAg.AMpLLLCT5biAMpMpSelweR","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxZgwswcYYlnwi3Tx54AaABAg.AMjTfc5IfQ_AOB-qBdEszg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
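The raw response above is a JSON array of per-comment codings, one record per comment id. A minimal sketch of how such a batch could be validated before use, assuming the label sets below (inferred from this single batch, so almost certainly incomplete):

```python
import json

# Label sets per dimension. These are assumptions drawn only from the
# values observed in the batch above, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

A check like this catches the common failure mode where the model invents a label outside the codebook, so the bad record can be re-queued rather than silently stored.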