Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, if you installed Ollama and ran a model locally, e.g. Qwen, then you'd have an AI agent locally for free. Personally, I'd rather keep the agent in "Ask" mode, that is, ask it for suggestions without allowing it to generate and modify files directly, unless necessary.
youtube · AI Jobs · 2026-01-20T08:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyCpa0XYDvFDDHeATF4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz_zjG99hwBDzipKUt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwaO50VVydUKUQpWyd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwY-kF9BJIPtOKWkGN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzYt45LxutQXyx6UpF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxKZDfOiuPiFjAcmVx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxHRajQYSpDNt-JZwt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzdBXoQbfcGzNmgMdx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzAUqkqjFCe1FA90fd4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwLlR-WA7Q0RRhQatx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
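The raw response is a JSON array of per-comment codings keyed by comment id. A minimal sketch of recovering one comment's coding from such a payload (the helper `coding_for` is hypothetical, not part of the pipeline; the sample below uses a shortened copy of the record shown above):

```python
import json

# Shortened copy of the raw LLM response: a JSON array of coding records.
raw_response = (
    '[{"id":"ytc_UgwaO50VVydUKUQpWyd4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"industry_self","emotion":"approval"}]'
)

def coding_for(raw: str, comment_id: str):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

result = coding_for(raw_response, "ytc_UgwaO50VVydUKUQpWyd4AaABAg")
print(result["responsibility"], result["emotion"])  # user approval
```

Matching on the `id` field is what ties each JSON record back to the comment and the per-dimension table shown in the coding result.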