Credit to Minisforum for how far they have pushed the form factor. The EliteMini AI370 measures roughly 128 × 126 × 52 mm — a footprint genuinely comparable to Apple’s Mac mini — while housing AMD’s Ryzen AI 9 HX 370, a current-generation Zen 5 “Strix Point” APU with a dedicated Neural Processing Unit and a Radeon 890M integrated GPU. Packing Strix Point into that footprint is engineering worth acknowledging, and NotebookCheck’s review captured the achievement in its subtitle: the mini PC “sets new standards.” Tom’s Guide’s piece framed it as “the Windows Mac mini,” a comparison that positions the product exactly where Minisforum clearly wants it: in the conversation with premium integrated-chassis compute.
The product’s marketing leans, centrally, on the AI framing. “AI mini PC.” “Ryzen AI.” “On-device AI workloads.” That framing is where the rest of this article lives.
The 32GB ceiling
The EliteMini AI370 ships with 32GB of LPDDR5X-7500 memory, soldered to the mainboard. Soldered — not in SO-DIMM slots, not on LPCAMM modules. Fixed. There is no upgrade path. There is no 64GB SKU currently in production. A buyer who wants 64GB of memory on a current-generation Strix Point mini-PC with an NPU cannot buy it from Minisforum; they can only buy the 32GB product the company has chosen to offer.
In most other product categories this would be an inconvenience. On a product marketed as an AI platform, it is a structural limitation that changes what the product can actually do.
Large language models — the thing the phrase “on-device AI” commonly refers to in 2026 consumer marketing — do not fit inside 32GB of system memory at useful sizes. A competent quantised 70-billion-parameter model at 4-bit precision needs on the order of 40GB of memory just to load the weights, before any context window or KV cache. A more capable 8-bit variant of the same 70B model needs north of 70GB. A 32GB mini-PC cannot run a 70B model at all. It can run 7B and 13B models comfortably, and can fit a 30B-class model with tight quantisation — but the frontier of what buyers currently associate with “AI workloads” in 2026 is above that ceiling, not below it.
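The arithmetic above can be sketched concretely. The snippet below is a back-of-envelope estimator, not a measurement: the 10% overhead factor and the 70B-class model shape (80 layers, 8 grouped-query KV heads, head dimension 128) are illustrative assumptions, not any specific product's or model's published specs.

```python
# Back-of-envelope memory arithmetic for local LLM inference.
# All figures are illustrative estimates; overhead and model shape
# are assumed values, not vendor or model-card measurements.

def weight_gb(params_b: float, bits: float, overhead: float = 0.10) -> float:
    """Approximate GB to hold quantised weights in RAM.

    params_b: parameter count in billions
    bits:     effective bits per weight after quantisation
    overhead: assumed 10% for quantisation scales, higher-precision
              embeddings, and runtime buffers
    """
    return params_b * 1e9 * (bits / 8) * (1 + overhead) / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    """Approximate GB for the KV cache at a given context length
    (fp16 elements, batch size 1)."""
    # 2 tensors (K and V) per layer, per KV head, per cached position
    return (2 * n_layers * n_kv_heads * head_dim
            * context_len * bytes_per_elem) / 1e9

for params, bits in [(7, 4), (13, 4), (30, 4), (70, 4), (70, 8)]:
    gb = weight_gb(params, bits)
    verdict = "fits" if gb < 32 else "does NOT fit"
    print(f"{params:>3}B weights @ {bits}-bit ~ {gb:5.1f} GB -> {verdict} in 32 GB")

# Assumed 70B-class shape: 80 layers, 8 KV heads (GQA), head_dim 128.
print(f"KV cache @ 8k context ~ {kv_cache_gb(80, 8, 128, 8192):.1f} GB on top of weights")
```

Under these assumptions a 70B model at 4-bit lands around 38GB before any context is cached — already past the 32GB ceiling — while 7B, 13B, and quantised 30B-class models fit with room to spare, matching the breakdown in the paragraph above.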
Minisforum’s AI-branded product cannot run the AI models the AI branding implies.
The reviewer who couldn’t get an answer
The 4-month review by Ivan Voras on Substack is the longitudinal piece that carries the most weight on the ownership experience. Voras is not a casual hobbyist; his writing history reads as that of an engineer with specific, technical expectations of hardware. His account documents: the machine shipped after a delay that ran into months; the chassis is plastic at a price point where competitors ship aluminium; the soldered memory is disclosed only in the fine print; fan noise is louder than the marketing would suggest for a “mini” product.
The detail that matters most for this article’s argument is one that Voras notes in passing. He attempted to contact Minisforum through their official support channels for guidance on how to access advanced BIOS settings — settings that an engineer working with Strix Point would legitimately want to modify for performance tuning on a product he had paid for. He did not receive a response. Not a “we can’t share that.” Not a “here is the documentation.” Silence.
A reviewer who writes publicly on Substack with a technical readership, asking a vendor for a BIOS documentation pointer on a $1,200+ product that the vendor is actively selling, does not represent an unreasonable customer. If that reviewer cannot get a response, the rest of the buyer base — less visible, less articulate, less connected — has no realistic path to the same information.
What the “AI” branding buys you, and what it doesn’t
It is worth being specific about what the AI branding does deliver. The Ryzen AI 9 HX 370’s NPU exists. It accelerates specific Windows-side AI features — Studio Effects, local transcription in certain apps, some on-device image processing in Adobe products, and a growing but still narrow set of developer-facing inference APIs through DirectML and ONNX Runtime. For those workloads, the platform works as described. A buyer whose AI use-case is confined to Windows-native Studio Effects and narrow-scope inference tasks will get value from the chip.
What it does not deliver is the larger interpretation of “AI mini PC” that the phrase invites in 2026: running general-purpose large language models locally, replacing API calls to cloud LLMs with on-device inference, or serving as a prosumer inference appliance. None of those are possible on 32GB of memory, regardless of how good the NPU is. Marketing a product in a category whose defining workload it cannot run is the problem the AI370’s naming creates.
The lasting critique
The EliteMini AI370 is a good mini-PC that would have been less marketing-tangled if it had simply been named the EliteMini HX370. Sold on its chassis, its CPU performance, its integrated graphics, and its Mac mini-rivalling form factor, it holds up well against the competitors Minisforum clearly wants it measured against. Sold on its “AI” credentials, against a category where the reference workload needs anywhere from a quarter more to well over double the memory the product has, it fails the implicit contract its own product name creates.
The silence around the advanced BIOS request is the tell. A vendor that is confident in its product’s positioning responds to technical reviewers; a vendor whose product is marketed above its actual capability envelope leaves those requests unanswered, because the answer would require engaging with the question of why the memory ceiling is where it is. Minisforum has not engaged with that question publicly. Until they do, the honest description of the AI370 is “a good 32GB Strix Point mini-PC marketed as something it is not,” and the buyers paying for the marketing premium are paying for a sentence the product cannot back up.