Don't Buy a New Computer in 2026! (Even for AI Use)



Summary

➡ The author advises against buying new computers for AI purposes due to misleading specifications and high costs. He suggests that older computers are sufficient for most tasks, including AI, and recommends using cloud options for AI needs. He also warns that new hardware can be buggy and unstable, and that the high demand for AI has inflated prices, especially for memory. Lastly, he emphasizes that AI should be run on a separate, standalone computer, not your main machine.

Transcript

I bought a new computer last year, which is my current daily driver, and just recently I also bought an AI computer so I can run local AI models. Lately, the biggest driver for buying a new computer is to see what new things you can do with AI. Because I have to test things for you so I can make these videos, I have to bite the bullet and buy these things now. Sometimes it works out and sometimes it doesn’t. Generally, we buy a new computer expecting to use it for some particular application or new expected features, like AI in my case.

But I found that many of the expectations for improvement and performance are wrong and often extremely costly. What I discovered is that with all the major changes going on in chip technologies and AI solutions, plus the changing marketplace for computer components like memory and graphics cards, this may not be the time to purchase high-end hardware, even if I were interested in updating my computer. I gave some advice on what to buy in the last couple of years, and in retrospect, all of those computers would now be bad choices. It's getting worse in 2026. I fell for the hype and I don't want you to.

We've been fed specs that are actually useless for most everyday things, and even if you're interested in AI, many of these recent computer choices truly suck. And if you want an AI machine, a dedicated one like the one I got, maybe it's best to wait and see, and use a cloud option for now. I'll talk about that too. What I would suggest to tide you over is actually to buy used, at least in 2026, and then wait it out and see if things get better in the next year or two. I'll tell you why you need to heed my advice so you don't waste money, and what you should do instead.

Stay right there. The biggest waste of the last two years has been the so-called Copilot Plus PC, which is basically a Microsoft-approved hardware configuration that can run Windows 11 with a TPM security chip and supposedly handle local AI inference using Copilot. One of the main requirements was a minimum of 16 GB of RAM and an NPU rated at a minimum of 40 TOPS. TOPS is trillions of operations per second. Now, I use Linux, and for those of you focusing on Linux, that NPU is not used at all. So that's 40 TOPS of nonsense. One of the particular models pushed by Microsoft was the new Snapdragon Copilot Plus PC from Qualcomm.

Avoid this. It is not Linux compatible yet, and it has lots of incompatibilities. The only reason to buy a newer computer in the last two to three years was supposedly that you would get a better AI experience, and we've been fed a lie so far, because the reality of how AI is used is different from what they imagined. A shocking bombshell message for you: from a raw processing point of view, there is no real observable difference from a similar model from three years ago when performing normal computer tasks. There may be a big advantage in some types of computers for specialized tasks like video editing and gaming, but I'll go into that later.

What is an NPU? NPU stands for Neural Processing Unit, which conjures images of a scary AI chip that will take control of your device. But that's not really what an NPU is, even though it is highlighted as important for Microsoft Windows Copilot. An NPU is basically a math coprocessor: a finely tuned chip made for doing matrix multiplication, which is an important element in AI inference. Except while the hardware is there, the support for it is spotty. Only Windows Copilot puts it to some use for minor AI features, and really, it is at a very immature stage.
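To make the "math coprocessor" point concrete, here is a toy sketch (pure Python, tiny made-up sizes) of the one operation an NPU is built to accelerate. A neural-network layer is, at its core, a matrix multiply of activations against weights; an NPU's TOPS rating just counts how many of these multiply-accumulate steps it can do per second.

```python
# Toy sketch: the matrix multiplication at the heart of AI inference.
# Sizes here are made up for illustration; real models use matrices
# with thousands of rows and columns per layer.

def matmul(a, b):
    """Multiply an m-by-k matrix by a k-by-n matrix (lists of rows)."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

activations = [[1.0, 2.0, 3.0, 4.0]]  # one token's hidden state (toy size)
# Identity weights for the demo, so the input passes through unchanged:
weights = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

output = matmul(activations, weights)
print(output)  # → [[1.0, 2.0, 3.0, 4.0]]
```

An NPU, a GPU, and a CPU's vector units all run this same operation; they differ mainly in how fast and how power-efficiently they grind through it, and in whether your software actually routes the work to them.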

I will tell you right now that Linux, and local AI tools like LM Studio or Ollama, do not use the NPU. Currently, the bulk of AI models rely on VRAM (video memory) and the GPU. So the push for people to buy computers with an NPU has truly been a waste of money and time. Frankly, computers that are five years old without an NPU are just fine. And if you followed my recent videos on Microsoft Windows, you will know that I don't recommend ever using Windows 11 with Copilot for any sort of personal use anyway.

Use Linux. If you're going to use AI, then I have a completely different approach, and your base computer with sufficient memory, like 8GB, running Linux is going to be more than enough. Currently, no popular application really utilizes the NPU for any AI purpose outside of Microsoft itself, and it has zero function under Linux. What applications are important to you? Before you start deciding on what new computer to get, you really have to determine the applications that are important to you right now. For example, if you're into gaming, do you really need a brand new, super expensive NVIDIA 5090 desktop? You're competing with every AI user for those NVIDIA 5090s, and the target price for an AI user is more like $30,000 for machines with multiple 5090s.

So the prices are artificially inflated, and that has nothing to do with gaming. But for gaming, do you really need to spend $5,000 just for the NVIDIA card when you can get a used graphics card for a fraction of that? The competition is for NVIDIA cards with lots of VRAM, so the 5090 is at the top of the list with 32GB of VRAM. Why? Because it can load a 32GB AI model directly, without using system RAM, and that means super fast AI inference. If your goal is to learn to use AI, then the answer is to go cloud.
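The VRAM arithmetic is worth spelling out. A rough sketch of why a quantized ~32B-parameter model is attractive on a 32GB card: model size is roughly parameter count times bits per weight, plus some working overhead. The 20% overhead figure below is my own assumption for illustration, not a measured number.

```python
def model_size_gb(params_billion: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    """Rough model footprint: weights plus ~20% for KV cache and
    buffers (the 20% is an assumed ballpark, not a measurement)."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

# A 32B-parameter model at 4-bit quantization vs. a card with 32 GB of VRAM:
size = model_size_gb(32, 4)
print(f"~{size:.1f} GB -> {'fits in' if size <= 32 else 'spills out of'} 32 GB of VRAM")
```

At 4-bit quantization the model comes in around 19 GB and runs entirely in VRAM, which is the fast case; the same model at 8 bits would already spill into system RAM and slow to a crawl.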

Don't compete for the super expensive hardware. Yes, there are privacy advantages to using local AI like I'm doing, but there are safe cloud alternatives I'll mention later. Memory prices have gone through the roof. I just bought an AMD Strix Halo machine, a Beelink GTR9 Pro with the Ryzen AI Max+ 395. If you can remember that long name, it is one of the hottest devices out there for AI because it has 128GB of unified memory, and it is the cheapest. Unified memory is the big deal today because instead of buying NVIDIA 5090s, you can just use standard RAM, like this machine's 128GB, and AMD and Apple can actually split that RAM into video memory and regular RAM.

And here's the sad part. This Beelink cost $2,000 in late 2025. Now, because of the AI agent demand, it is $3,000 with tax, a 50% increase. The reason for this is that the AI companies are buying up all the RAM, particularly the high-speed LPDDR5 types. Many computer makers project depressed sales in 2026, because who would buy a computer that's now 50% more expensive than last year? New hardware is buggy. The big deal with computers like the AMD Strix Halo with 128GB of RAM or more is that they allow loading large AI models locally.

On a 128GB machine, in theory, I'm supposed to be able to load models as large as 96GB. This beats the super expensive NVIDIA 5090, which only has 32GB of VRAM. In theory. In practice, it's crashing; it's unstable. Unified memory is more reliable on the Apple Mac Studio for AI, but it is a real crash risk on AMD. So in reality, I'm not using more than 50GB of VRAM. And Apple Mac Studio pricing is stupidly out of this world. The fact is that if you want stability, you'll have to stick with the NVIDIA 5090, which is an incredibly expensive option, as I already said.

$5,000 a card, and you will need more than one. AI on your main machine? Not doable. Originally, my imagined use of AI was to run Ollama with a local model on the machine I bought two years ago, a Lenovo Legion 5 with 64GB of RAM and an NVIDIA 4070 with 8GB of video RAM, a gaming laptop. Well, it turns out that this is not the practical way to really use AI. The new big deal is the use of AI agents, and the hot topic since February 2026 is OpenClaw. I've been using OpenClaw heavily for the last month, and frankly, the only safe way to use OpenClaw is on a standalone, separate computer.

So my thought was to use Ollama and then OpenClaw on that two-year-old computer. Wrong. Not enough video RAM. NPU useless. As it turns out, OpenClaw doesn't use that much CPU horsepower, so even a five-year-old computer will handle OpenClaw itself just fine. So I've dedicated my Lenovo Legion 5 to OpenClaw use. My AI model runs on Ollama on the Beelink Strix Halo, but this is not a realistic solution financially. If I weren't testing this as a privacy solution, all I would need is Ollama's cloud models together with the OpenClaw machine. No extra computer needed.

Ollama cloud will get you going for $20 a month. Fixed cost. And an older computer would achieve the same thing. Maybe a couple of years from now, when the cost of memory goes down, you can consider buying a new computer as an AI server. In my particular case, I bought a new Lenovo ThinkPad X1 Carbon Gen 13 as my daily driver because it was a thin and light machine running the new Intel Lunar Lake architecture, referred to as Series 2. Now, this was before the price increases in memory, and this particular model was well priced. The biggest advantage of this upgrade was that I can actually do video editing on a thin and light laptop without an NVIDIA card.

This was a first, and I'm loving this new capability, except I feel cheated, because in a single year, the newer Series 3 chips are significantly faster. If I had waited, I would have gotten more bang for the buck. The reason is that Intel and AMD are copying some of the features of Apple's M-series silicon and are gaining massive improvements in power draw. So the jumps in performance are more significant than in a typical year because of architectural changes. This will likely continue for a bit next year as well. I just dove into this too early. Unfortunately, the extra speed would now be outweighed by the extra cost of memory, which is now one-third the cost of a new machine.

The newer version of the Lenovo X1 Carbon I bought is not yet available as of the time of this video, but it is likely to be priced at double what I paid. So how can I recommend that you buy a new machine? What to do instead? Instead of buying a new computer, you need to study what your goals are, and frankly, you'll be better off buying an older computer and holding off on any new computer purchase in 2026, and perhaps for part of 2027. Just to give you a general heads up on computer comparisons, let me focus on Intel machines, which are more common in laptops.

The old Intel chip design used CPUs numbered like the i7-14000 series, which were common in 2023-2024, with the 13000 series the prior year, the 12000 series the year before that, and so on. So basically, a five-year-old computer would be using an Intel i7 or i5 12000-series chip. Now let me tell you something interesting. The newer Lunar Lake chips are not more powerful than the 14000-series chips. They're actually a little slower, and their biggest advantage is lower power draw, so the batteries last longer. So to be honest with you, for raw horsepower, a four-year-old consumer computer at the high end would have roughly the same performance as a Lunar Lake computer today.
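The numbering scheme above can be turned into a rough rule of thumb for dating a used laptop. This is only an approximation (launch years vary between desktop and mobile parts, and the scheme changed with the newer Core Ultra naming), but for the i5/i7 12000-14000 chips the video discusses, it holds well enough:

```python
# Rough rule of thumb for dating old-scheme Intel Core chips:
# gen 12 ≈ 2022 models, gen 13 ≈ 2023, gen 14 ≈ 2024 (approximate;
# desktop and mobile launch dates vary by several months).

def intel_gen_year(model: str) -> int:
    """'i7-12700' -> approximate retail year, assuming gen N ≈ 2010 + N."""
    generation = int(model.split("-")[1][:2])
    return 2010 + generation

print(intel_gen_year("i7-12700"))  # → 2022
print(intel_gen_year("i5-14600"))  # → 2024
```

So when an eBay listing says "i5-12500", you can read that as a roughly 2022-era machine, which is exactly the sweet spot the video recommends.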

And a five-year-old computer would be a bit slower than that. There hasn't been the big jump in performance you would expect. The biggest jumps in performance relate to the iGPU, the integrated graphics processor, and that's why I can do video editing without NVIDIA. And old computers with NVIDIA cards already performed well in gaming, which is good enough for most people. So the only real disadvantages are that old computers run hotter, use more power, and are heavier. What used computers should you buy? For general use, particularly for compatibility with Linux or for running an extra computer with OpenClaw, I still recommend the same computer in 2026.

Get a used Lenovo ThinkPad X1 Carbon. You can get them for $300 to $400 on eBay. I actually got myself another one recently, since they're great backup computers. Find ones using the Intel i5 or i7 12000-series chips. They will do well for you. The reason these are priced so well is that they are popular corporate laptops sold on three-year leases, and after three years, they get dumped on the market. These are super expensive laptops brand new, and you will find them with tons of memory. But even eight gigabytes is good enough for normal use. You can buy older gaming computers too, as they are pretty powerful, with potentially large amounts of memory and older NVIDIA cards like a 3050.

I had an old Dell XPS 15 with an NVIDIA 3050. This was also around $500 used, and I got one of these used recently as well. These computers run new applications like OpenClaw very well. Use cloud AI. When I started to do a lot of work with AI, I had different objectives. If I were a high-end programmer doing high-productivity work, I'd probably use Anthropic's Claude. This is the hot cloud AI for coding, and many people still use ChatGPT (OpenAI), Grok (xAI), and Gemini (Google). Depending on your focus, these cloud options can be dangerous, because you are potentially sending private data to a cloud AI.

Would you have any of these models do your tax returns, for example? Aside from the privacy considerations, there are other differences. Grok stands out as having built-in web search, so the model is not a stale source of information limited to what it was trained on. Grok requires no special setup to have more current information. For the other cloud AI products, web search requires hooking up search to them manually, which is not necessarily a task within reach of average users. But again, none of these are the safest privacy options. The safer privacy option is to use Ollama (ollama.com).

While we know Ollama as a provider of open source local models that you can run on a local machine, which is what I'm doing, Ollama also provides the same models, and larger versions, in the cloud. And unlike the other AI providers that charge per million tokens, Ollama offers fixed-cost plans: $20 a month for Pro and $100 a month for Max. For a single user, a $20-a-month subscription is sufficient for testing out OpenClaw and doing most AI work safely. I was spending a lot more than that on xAI, more than $50 a month, and at heavier usage it would have been $500 a month.
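For anyone curious what "running a model on Ollama" looks like in practice, here is a minimal sketch of a request to a local Ollama server. The `/api/generate` endpoint and the `model`/`prompt`/`stream` fields follow Ollama's documented REST API; the model name is just an example of one you might have pulled locally.

```python
import json

# Minimal sketch of a request to a local Ollama server.
# Endpoint and field names follow Ollama's REST API; the model
# name is an example, use whatever you have pulled locally.
payload = {
    "model": "llama3.2",
    "prompt": "Summarize why NPUs mostly go unused on Linux.",
    "stream": False,   # return one complete response instead of chunks
}

# To actually send it, you need `ollama serve` running on this machine:
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())

print(json.dumps(payload, indent=2))
```

The same request shape works whether the model behind it is a small local one or one of Ollama's larger cloud-hosted versions, which is what makes it easy to start on the $20 plan and move to local hardware later.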

So today, instead of buying a new computer, temporarily use these cloud options. Ollama is a safe option, since your queries don't get collected for training and the models are all open source. Future prediction: I'm not an expert at predicting what will happen in hardware. I can predict software moves better, but I'll give you my two cents. Currently, it is too expensive to be considering running local AI. The machines are also buggy, but a couple of years from now, this will be realistic. I already run a local AI using the AMD Strix Halo, but I'm still awaiting fixes from AMD to make it more stable instead of constantly crashing.

Yes, I expect that it will be fixed, but it might take a year. The Apple Mac Studio as an AI option is incredibly expensive: plan on $10K. An NVIDIA DGX Spark is $5-6K. A desktop with three 5090s? $25K. So how can I talk about these as options? By the way, there's that new Brax OpenSlate project on Indiegogo, an Android-Linux tablet. That's inexpensive and should perform most tasks you need, even OpenClaw, with added privacy-safe features at a reasonable price in spite of the memory prices. In the meantime, beyond basic uses, we have to be in a holding pattern.

Do not buy a new computer with AI in mind. Focus on buying used computers and you will not waste money. Folks, privacy is of course the main focus of this channel, and I teach you technology so you understand the risks technology poses to your life. We have people who discuss these issues on my platform, BraxMe. To support this channel, we have some products in our store that provide the toolkit to retain your privacy. They are awesome products. We have BraxMail, an email service with unlimited aliases and identity protection. We have Brax Virtual Phone, anonymous phone numbers.

We have BytzVPN for anonymizing your IP address and location and evading privacy-invading laws. We have the de-Googled phones, phones free from Big Tech tracking. The successful Brax 3 phone is open for pre-order right now at Braxtech.net. And the new Brax OpenSlate Linux tablet is also a new project you can check out on Braxtech.net. Big thanks to everyone supporting us on Patreon, Locals, and YouTube memberships. You keep this channel alive. See you next time. Thank you for watching.

See more of Rob Braxman Tech on their Public Channel and the MPN Rob Braxman Tech channel.
