Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes, but it requires resourcefulness. SETI@home attempted to recoup idle CPU power, as do distributed computing projects like BOINC. At my university, a colleague of mine got his hands on a 3D printer and printed himself a whole lab. He used the 3D printer for the gears, case, etc., but a lot of the other material was salvaged: old G4 Macs processed the geometry, used Arduinos drove the stepper motors, and old Microsoft Kinect controllers served as 3D scanners. But there will be limitations: a 2006 Mac Pro, for example, isn't really useful for machine learning, even if most of the compute happens on the GPU. Its PCIe interface is slow and not entirely backward-compatible, and its CPUs are missing modern instructions like AVX. The ECC RAM is ideal for a file server, but these machines idle at around 200 watts.
reddit Viral AI Reaction 1776626398.0 ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_oh39v71","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"rdc_oh3dj4k","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},{"id":"rdc_oh4og86","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"rdc_ohbn43p","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"rdc_ohfanfg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
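A minimal sketch of how the raw response above can be checked and looked up, assuming it is a valid JSON array of per-comment codings keyed by `id` (which entry corresponds to the comment shown is not stated in the record, so the id used below is only illustrative):

```python
import json

# The raw LLM response, verbatim from the record above.
raw = (
    '[{"id":"rdc_oh39v71","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_oh3dj4k","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"rdc_oh4og86","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_ohbn43p","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_ohfanfg","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"}]'
)

codings = json.loads(raw)

# Index codings by comment id for lookup (assumes ids are unique in the batch).
by_id = {c["id"]: c for c in codings}

# Illustrative lookup: this entry matches the coded dimensions shown above
# (reasoning "unclear", emotion "approval"), but the id-to-comment mapping
# is an assumption, not confirmed by the record.
coding = by_id["rdc_ohbn43p"]
print(coding["reasoning"], coding["emotion"])
```

This kind of round-trip parse is also a cheap validity check on the model output: `json.loads` fails loudly if the model emitted malformed JSON.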