Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can definitely pull it off with some of the plug and play hardware acceleration options available for the Pi 5. Its tps wouldn't be stellar but could still spit out a few concise sentences a minute if you asked with a battery bank. The [Hailo AI acceleration module](https://hailo.ai/products/ai-accelerators/hailo-8-m2-ai-acceleration-module/) would be a lot of help and keep things small. But that's assuming the course material is general enough to already exist in a generally available model already out there like deepseek r1. If you're patient enough you could do it without hardware acceleration but it might be a waste of time in an exam environment. I haven't seen one try to run a model without acceleration.
Source: reddit · AI Surveillance · timestamp 1749535504 · ♥ 2
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          approval
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mww7v0t", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "rdc_mwww4a6", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_mwyyot0", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mwx47px", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mwuxo2g", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]
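A minimal sketch of how a raw response like the one above can be mapped back to the per-comment coding table shown earlier. This assumes only what the output itself shows: a JSON array of records, each carrying an `id` plus the four coding dimensions. The function name `coding_for` and the inlined sample record are illustrative, not part of the tool.

```python
import json

# Sample raw model output in the same shape as the response above:
# a JSON array of coded comments (one record shown for brevity).
raw = (
    '[{"id":"rdc_mwyyot0","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"approval"}]'
)

# The four coding dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coded dimensions
    for one comment id (raises KeyError if the id is absent)."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]
    return {dim: record[dim] for dim in DIMENSIONS}

print(coding_for(raw, "rdc_mwyyot0"))
# {'responsibility': 'none', 'reasoning': 'unclear', 'policy': 'unclear', 'emotion': 'approval'}
```

Keying the parsed records by `id` is what lets a batch response (five comments coded in one call, as above) be split back out to the individual comment pages.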