
Intel XeSS receives some minor improvements — tech still lacks a native Frame Generation feature

Two days ago, on July 18, Intel released version 1.3.1 of the Intel XeSS SDK on GitHub. The release notes list only "Bug Fixes & Stability Improvements," so there isn't much meaningful information to glean from analyzing this version. By contrast, the XeSS SDK 1.3.0 release back on April 4 introduced new AI models, new Ultra and Native image quality presets, and an increased resolution scale on every existing preset.

Overall, Intel XeSS still looks relatively young compared to Nvidia's DLSS or AMD's FSR. While those solutions have both leveled up to offer Frame Generation alongside their image reconstruction functionality, Intel XeSS still provides no native Frame Generation capabilities. Instead, XeSS users rely on games that support both Intel XeSS and AMD FSR 3 Frame Generation, which allows XeSS reconstruction to be paired with AMD's open-source, cross-platform Frame Generation solution.

On the note of open source, don't let the Intel XeSS SDK's availability on GitHub fool you: XeSS still is not open source. The complete files needed to run Intel XeSS aren't present on GitHub and are still being kept private by Intel, well over a year after the XeSS release.

One silver lining for Intel XeSS compared to AMD FSR 3 is that XeSS can, at least, leverage AI for its image-upscaling workloads. This seems to give XeSS slightly better image quality than AMD FSR and edges it closer to the upscaling market leader, Nvidia DLSS. However, if AMD FSR begins utilizing the onboard AI hardware of AMD GPUs, that gap could quickly be closed.

Another positive for Intel XeSS is that its support across PC gaming is improving at a steady pace; it now seems to be standard in the newly common PlayStation-to-PC ports, among other AAA and AA titles. While only 105 games are highlighted on Intel's official XeSS Enabled Games page, a SteamDB search reveals that XeSS is present in 228 games on Steam, though this number also counts demos and some benchmarking software.
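For context on what a preset's resolution scale means in practice, the internal render resolution is simply the output resolution divided by the preset's scale factor on each axis. The sketch below is illustrative only; the 1.7x factor is an assumption for a Quality-style preset, not a figure taken from Intel's release notes.

```python
def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for an upscaler preset.

    The upscaler reconstructs an out_w x out_h image from a frame
    rendered at 1/scale of the output resolution on each axis.
    """
    return round(out_w / scale), round(out_h / scale)

# Example: a hypothetical 1.7x "Quality"-style preset at 4K output.
print(render_resolution(3840, 2160, 1.7))  # -> (2259, 1271)
```

Raising a preset's scale factor, as XeSS 1.3 did across the board, therefore lowers the internal render resolution and shifts more of the work onto the reconstruction model.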


Neural network learns to make maps with Minecraft — code available on GitHub

A fundamental limitation of modern artificial intelligence and neural networks is that they aren't good at spatial mapping or navigation without an existing map. However, TechXplore reports that a combination of a predictive coding algorithm and Minecraft gameplay successfully "taught" a neural network how to create spatial maps and then use those maps to predict the following frames of video, yielding a mean-squared error of 0.094% between the predicted image and the actual image.

The project demonstrates genuine spatial awareness in AI, something still missing from the impossible architecture and other strange glitches that come with tools like OpenAI's Sora.

These findings come from a paper published in the journal Nature Machine Intelligence, "Automated construction of cognitive maps with visual predictive coding," by James Gornet and Matt Thomson of the California Institute of Technology (Caltech). The paper, released to the public just yesterday, explains how this was achieved in exhaustive detail and even shares the code on GitHub and Zenodo.

Matt Thomson, one of the two researchers who worked on the project, spoke to TechXplore and provided a few noteworthy quotes about the process and what led them to undertake it.

Per Thomson: "There's this sense that even state-of-the-art AI models are still not truly intelligent. They don't problem-solve like we do; they can't prove unproven math results or generate new ideas. We think it's because they can't navigate in conceptual space; solving complex problems is like moving through a space of concepts, like navigating. AIs are doing more like rote memorization— you give it an input, and it gives you a response. But it's not able to synthesize disparate ideas."

James Gornet, the graduate student who led the project, championed the use of Minecraft and studied neuroscience, machine learning, math, statistics, and biology under the Department of Computational and Neural Systems (CNS) at Caltech. He did not provide a quote about the process, but Thomson says that CNS is uniquely suited for Gornet's work and that "we're hoping to learn about the brain in turn," not just advance AI.
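To make the headline metric concrete: mean-squared error between a predicted frame and the actual frame is the average of the squared per-pixel differences, and with pixel values normalized to the 0-1 range it can be quoted as a percentage. This is a minimal illustration of the metric itself, not the paper's code; the tiny example frames are made up.

```python
def mse(predicted, actual):
    """Mean-squared error between two equal-length sequences of
    pixel values (assumed normalized to the 0..1 range)."""
    if len(predicted) != len(actual):
        raise ValueError("frames must have the same number of pixels")
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted)

# Two tiny hypothetical "frames" that agree closely:
pred = [0.10, 0.52, 0.98, 0.40]
act  = [0.12, 0.50, 1.00, 0.41]
print(f"MSE: {mse(pred, act) * 100:.3f}%")  # prints the error as a percentage
```

A figure like 0.094% therefore means the predicted frames were, on average, extremely close to what the game actually rendered next.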


Microsoft will soon implement Nvidia GeForce Now alongside Xbox Cloud Gaming for its first-party titles listed on Xbox.com

Just yesterday, Microsoft announced through the Xbox Twitter account that GeForce Now is being added to Xbox.com's "Play With Cloud" dialog, providing an option besides Xbox Cloud Gaming for Xbox's first-party published games. Since this has yet to be fully implemented, the full extent of the change can't yet be tested. However, from what we know about both services, Xbox Cloud Gaming (formerly Project xCloud) can't provide image quality or input responsiveness on par with GeForce Now, so GeForce Now is the better choice for cloud PC gamers who own Xbox games.

For those unfamiliar with either service: Xbox Cloud Gaming is limited to 1080p and 60 FPS, which is fine by console standards but still inhibited by cloud streaming. Meanwhile, the premium version of GeForce Now can go as high as 240 Hz, implements Nvidia Reflex, and is powered by an RTX 4080 per user. Nvidia estimates its end-to-end latency to be 66% lower than a local Xbox Series X, though that number relies heavily on your Internet connection, a factor that is never entirely in the player's control.

"Play your games the way you want, where you want! Starting today, we've enabled GeForce NOW integration which allows you to launch supported games on GeForce NOW via https://t.co/Nf3xumC9vw game pages: https://t.co/rNVwXNU6gw pic.twitter.com/TBrfsiDoCe" (Xbox, July 17, 2024)

On the official Xbox support page linked in Xbox's Tweet, it's noted that this Play With Cloud functionality doesn't just cover first-party Xbox games purchased directly through a Microsoft platform. For example, if you own those games on a Steam account, you should still be able to Play With Cloud through Xbox Cloud Gaming, Nvidia GeForce Now, and even the lesser-known Boosteroid, as long as your accounts are correctly linked to one another.

It is a nice gesture from Microsoft, though it raises some long-term concerns about Xbox Cloud Gaming. The most-liked reply to the original posting voices the immediate assumption that Xbox Cloud Gaming (referred to by its original xCloud codename) won't ever get a bitrate increase. Since GeForce Now is far and away the better solution and Microsoft is now directly pushing it as an alternative to its own Cloud Gaming service, that may not be an incorrect assumption. However, Microsoft still cares about its own Cloud Gaming solution.

After all, just last month, Xbox Cloud Gaming also found its way to some of Amazon's Fire Stick streaming devices, so this move emphasizes how dedicated Microsoft is to expanding the Xbox ecosystem. Once you buy a first-party Xbox game, it likely doesn't matter much to Microsoft which cloud you play it from; what matters is that you bought (or are renting) the game. Its own solution is bundled with Game Pass as the entry-level pick, which may be fine.


Meteor Lake-powered mini-PC arrives with an external expansion slot to connect GPUs — Beelink GTi14 sports a latchable PCIe x8 slot and integrated 145W power supply

The GTi14 is a new mini PC from Beelink, now boasting Intel's latest Meteor Lake mobile processors. The most marked improvement of Meteor Lake over past Intel laptop CPUs is the inclusion of actually-pretty-good Intel Arc graphics, which are broadly competitive with the current best iGPU offerings from AMD, which has historically led by a significant margin in iGPU performance. Performance-wise, this mini PC should be a compelling pick, especially if you also care about other features of Intel's latest architectures, like NPU support.

Beelink GTi14 Core Specifications
CPU Option 1: Intel Core Ultra 7 155H with 16 cores (6P+8E+2LPE) up to 4.80 GHz
CPU Option 2: Intel Core Ultra 9 185H with 16 cores (6P+8E+2LPE) up to 5.10 GHz
GPU: Intel Arc Graphics for Meteor Lake with 8 Intel Xe cores
RAM: 32 GB DDR5-5600 RAM
Storage: 1 TB NVMe Gen 4 SSD
I/O: 1 PCIe x8 slot; 1 Thunderbolt 4 port; 5 USB 3.2 Gen2 Type-A ports; 1 USB 3.2 Gen2 Type-C port; 1 DisplayPort 1.4a port; 1 HDMI* port; 1 3.5mm audio port; 1 SD card slot
Etc.: Built-in dual speakers and 4-mic array; Wi-Fi 7 and Bluetooth 5.4 support; fingerprint sensor
*Unspecified HDMI, but most likely HDMI 2.0.

But what else does this mini PC have to offer? What sticks out to us most is a latchable PCIe x8 slot on the bottom of the unit, which should allow for reasonably strong external GPU performance with all but the beefiest graphics cards. The promotional renders on the original product pages (Core Ultra 7/Core Ultra 9) show this slot being used with an RTX GPU in a dedicated docking station with a similar design language, but at this time, we are unable to confirm whether this dock is included, or whether PCIe x8 GPU docking stations actually exist. A similar situation arose with the ASRock DeskMate X600, which has a comparable x8 ribbon cable seemingly purpose-designed for a market of PCIe x8 GPU docks that doesn't seem to exist.

We have emailed Beelink to confirm these details and will update the article when we receive an answer. In any case, it's nice to see PCIe x8 support becoming more common on modern mini PCs, even if it seems like most users would be better off going for the more widely supported OCuLink.

Besides that, most of the features on offer here seem pretty nice. If the device performs as advertised, the cooling should hold up well, and quietly, at only 32 decibels. The Cinebench and Geekbench scores on each model's product page also align with what we should expect from these Intel Core Ultra CPUs.


AMD’s new Zen 5 flagship gets benchmarked — Ryzen 9 9950X Engineering Sample isn’t as impressive in Blender at maximum power settings

Since our last AMD Ryzen 9 9950X ES leaked benchmarks story, AnandTech forum member Igor_Kavinski has continued posting new Engineering Sample benchmarks in his original thread. The enthusiast has now undertaken 253W and "Unlimited" Package Power Tracking (PPT) testing, in addition to the 90W, 120W, 160W, and 230W PPT results we covered previously. We've also included our own Ryzen 9 7950X benchmarking results for a quick comparison with the newer chip.

In our last article on these benchmarks, we noted that the Ryzen 9 9950X seems to boast significant efficiency improvements over the Ryzen 9 7950X, not just higher performance in general. In particular, the Ryzen 9 9950X seems capable of outperforming the Ryzen 9 7950X even when operating at a lower maximum wattage, and it remained fairly competitive with the 170W Ryzen 9 7950X at wattages as low as 120W.

AMD Ryzen 9 9950X ES Blender Benchmark Scores
Scores listed as "Monster" / "Junkshop" / "Classroom" / Overall:
Ryzen 9 9950X "Unlimited" PPT: 367.6 / 231.4 / 180.1 / 779.1
Ryzen 9 9950X 253W PPT: 366 / 230 / 179 / 775
Ryzen 9 9950X 230W PPT: 353.4 / 226.1 / 171.3 / 750.8
Ryzen 9 9950X 160W PPT: 319.7 / 205.8 / 152.5 / 678
Ryzen 9 7950X 170W PPT: 289.7 / 172.8 / 136.7 / 599.2
Ryzen 9 9950X 120W PPT: 268.7 / 177.5 / 129.8 / 576
Ryzen 9 9950X 90W PPT: 227.5 / 150.6 / 108.8 / 486.9
Ryzen 9 9950X 60W PPT: 153.2 / 101.8 / 72.7 / 327.7

*Author's Note: My prior article on this topic referred to these power targets as "TDP" instead of "PPT". The two are mostly the same thing, but whereas TDP (Thermal Design Power) refers to the CPU's own power target, PPT (Package Power Tracking) covers all power delivered to the CPU socket; adjusting PPT changes the maximum wattage to the socket, so the real TDP is somewhat lower. "Unlimited PPT" allows as much wattage as the CPU and socket can support.

Isolating the two new benchmark results, it's immediately clear that pushing the power limits to their absolute maximum yields only minor gains. The 253W result, with a 5.5 GHz overclock, still maintains impressive temperatures of 61C or less thanks to the liquid cooling setup used with this ES. Fully removing the power limits, however, kicks temps up to 80C under liquid cooling while achieving only the most marginal of performance improvements.

In other words, the most impressive results here are still the ones at around 170W and below, where the new chip matches or beats its predecessor. It is fully within expectations for a successor to outperform its predecessor at the same or higher power targets, but the efficiency gains remain the most impressive aspect of this story.

That said, it's still nice that the Ryzen 9 9950X could be pushed this far with (apparently) a standard liquid cooling setup, though we don't know whether an AIO or a custom loop was used. Apparently, no CPU delidding was needed to achieve these results; in fact, doing so would likely have upset AMD, since this is an Engineering Sample that must eventually be returned, per Igor_Kavinski's secondhand reports. (Note that while Igor posts these benchmarks, an unnamed source is actually running them and sending them to him.)
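Using the overall Blender scores from the table, a quick performance-per-watt calculation makes the diminishing returns explicit (the "Unlimited" run is excluded since its actual wattage isn't reported):

```python
# Overall Blender scores for the Ryzen 9 9950X ES at each tested PPT,
# taken from the benchmark table above.
scores_by_ppt = {60: 327.7, 90: 486.9, 120: 576.0, 160: 678.0, 230: 750.8, 253: 775.0}

for watts, score in sorted(scores_by_ppt.items()):
    print(f"{watts:>3}W PPT: {score / watts:.2f} points per watt")
```

At 60W the chip delivers roughly 5.46 points per watt versus about 3.06 at 253W, i.e. nearly 80% more work per watt at the low end, which is the efficiency story in a nutshell.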


Google’s Gemini AI caught scanning Google Drive hosted PDF files without permission — user complains feature can’t be disabled

As part of the tech industry's wider push for AI, whether we want it or not, it seems that Google's Gemini AI service is now reading private Drive documents without express user permission, per a report from Kevin Bankston on Twitter, embedded below. While Bankston goes on to discuss reasons why this may be glitched for users like him in particular, the utter lack of control over his sensitive, private information is unacceptable for a company of Google's stature, and it does not bode well for privacy amid AI's often-forced rollout.

"Just pulled up my tax return in @Google Docs–and unbidden, Gemini summarized it. So…Gemini is automatically ingesting even the private docs I open in Google Docs? WTF, guys. I didn't ask for this. Now I have to go find new settings I was never told about to turn this crap off." (Kevin Bankston, July 10, 2024)

So, what exactly is going on here? Neither Google support nor the Gemini AI itself quite seems to know, but Bankston has some theories, laid out in much more detail in the full thread. Contrary to the initial posting, this is technically happening within the larger umbrella of Google Drive and not Google Docs specifically, though it seems likely the issue could apply to Docs as well.

But what caused the issue? According to Google's Gemini AI, the privacy settings used to control Gemini should be openly available, but they aren't, which means the AI is either "hallucinating (lying)" or some internal systems on Google's servers are outright malfunctioning. Either way, it's not a great look, even if this private data supposedly isn't used to train the Gemini AI.

What's more, Bankston did eventually find the settings toggle in question, only to discover that Gemini summaries in Gmail, Drive, and Docs were already disabled. Additionally, the toggle was in an entirely different place than either of the web pages to which Gemini's bot initially pointed.

For Bankston, the issue seems localized to Google Drive, and it only happens after he presses the Gemini button on at least one document. Gemini will then automatically trigger for all future files of the matching type (in this case, PDF) opened within Google Drive. He theorizes that the behavior may stem from his enabling Google Workspace Labs back in 2023, which could be overriding the intended Gemini AI settings.

Even if this issue is isolated to Google Workspace Labs users, it's quite a severe downside to having helped Google test its latest and greatest tech. User consent still matters on a granular basis, particularly with potentially sensitive information, and Google has utterly failed at least one segment of its user base by failing to stay true to that principle.


Warframe devs report 80% of game crashes happen on Intel’s overclockable Core i9 chips — Core i7 K-series CPUs also have high crash rates

The water keeps getting hotter for Intel amidst ongoing 13th and 14th Gen CPU crashing issues. On July 9, Warframe's developers added fuel to the fire by publishing player crash statistics and revealing that one of their own staff members was experiencing the same issue. As the Warframe team notes, the staff member in question wasn't overclocking and was even using a new PC, but even the most basic in-game tasks resulted in hard crashes.

Surprisingly, stress testing similar machines across the Warframe dev team did not reproduce the problem. Spurred by an Intel report, however, the team applied a BIOS update to the affected member's machine, which at least seemed to help. So there's a non-zero chance that the statistics shared below could be reduced by some prudent BIOS updates from 13th and 14th Gen Intel CPU users, but based on what we already know, there's likely more to this story.

Statistics on Intel 13th and 14th Gen crashes from Warframe players, provided by a developer on Warframe's forums. (Image credit: Digital Extremes)

While the Warframe development team was able to fix its own in-house Intel crashing issue, that's not the story being reported by others. Just yesterday, we reported on Alderon Games, developer of Path of Titans, experiencing a 100% crash rate with seemingly any server or development PC using a 13th or 14th Gen Intel CPU. Alderon even noted that the CPUs seem to be deteriorating over time, and it has pivoted its server backend entirely to AMD systems, deeming Intel's latest desktop releases "defective."

These issues aren't exclusive to Warframe or Path of Titans, either. Epic Games has noted that Fortnite, arguably the biggest multiplatform multiplayer game ever, frequently crashes on 13th- and 14th-Gen Intel CPUs.

At the time of writing, not even Intel seems to entirely understand what's going on. However, at least one issue (Turbo Boost at unsafe temperatures) can be attributed to poor power (and thus thermal) management on Intel's CPUs. That is a plausible cause of system stability issues and may even explain the steady deterioration reported by Alderon Games. It also makes overclocking 13th and 14th Gen Intel CPUs look risky, especially if you aren't going all the way with liquid cooling.

Hopefully, these issues won't plague Intel users for much longer, and the appropriate updates, plus notice to apply them, will eventually take this problem out of the equation. Ahead of Intel's Arrow Lake desktop launch later this year, though, this certainly isn't a good look for Team Blue, which shouldn't be getting lazy with its top-dog spot in the market.


AMD Ryzen 9 9950X Engineering Sample gets a full suite of Blender benchmarks at various TDPs, showcasing major efficiency improvements

Starting on July 7, AnandTech forum member Igor_Kavinski began posting Ryzen 9 9950X Engineering Sample Blender benchmark results courtesy of an unnamed source, starting at a super-slim 60W TDP. Over the course of the following week, 90W, 120W, 160W, and finally, max-capacity 230W TDP results were also posted. Together, the results give us a comprehensive idea of how power efficiency will improve with next-gen Zen 5 AMD CPUs.

Before proceeding, note that the newer Ryzen 9 9950X will obviously outperform the older chip when given a more generous power budget. We didn't test our Ryzen 9 7950X at a 230W TDP, but reports from other users in the thread point toward a ~20% performance improvement even in that scenario. The interesting results here start at 170W and below.

AMD Ryzen 9 9950X vs Ryzen 9 7950X Blender Benchmarks
Scores listed as "Monster" / "Junkshop" / "Classroom" / Overall:
Ryzen 9 9950X 230W TDP: 353.4 / 226.1 / 171.3 / 750.8
Ryzen 9 9950X 160W TDP: 319.7 / 205.8 / 152.5 / 678
Ryzen 9 7950X 170W TDP: 289.7 / 172.8 / 136.7 / 599.2
Ryzen 9 9950X 120W TDP: 268.7 / 177.5 / 129.8 / 576
Ryzen 9 9950X 90W TDP: 227.5 / 150.6 / 108.8 / 486.9
Ryzen 9 9950X 60W TDP: 153.2 / 101.8 / 72.7 / 327.7

*Note: All benchmark results listed above use AMD's Precision Boost Overdrive for a small performance boost. Additionally, the Ryzen 9 9950X ES is liquid-cooled.

At 170W, the Ryzen 9 7950X achieves a cumulative Blender score of 599.2. The Ryzen 9 9950X scores 678 at 160W, outperforming its predecessor by roughly 13% while operating at a more standard CPU TDP.

The performance differential between the Ryzen 9 9950X and Ryzen 9 7950X starts to narrow when the newer chip is limited to 120W, but it still lands within about 4% of its predecessor's performance despite running with a 50W deficit.

These Engineering Sample benchmarks aren't the only insight we've received into AMD's upcoming Ryzen 9000 Series of CPUs. Earlier this week, Ryzen 9 9900X Geekbench results appeared that suggest the new architecture will take the crown in single-core performance, far outstripping the last-gen Ryzen 9 7950X3D and even the Intel Core i9-14900K.

Overall, these emerging benchmarks look quite favorable for the future of AMD desktop platform users, though pre-release benchmarks like these always warrant some salt. Beyond the raw performance gains, the power efficiency gains also bode well for the eventual arrival of Zen 5 laptop chips and should generally be welcome for anyone trying to limit power consumption. Even the 60W TDP results make this CPU look pretty usable, since those scores align with an Intel Core i9-10980XE in Blender's benchmark database.
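As a sanity check on the head-to-head comparison, the relative gaps can be computed straight from the overall scores in the table:

```python
# Overall Blender scores from the benchmark table above.
r7950x_170w = 599.2
r9950x_160w = 678.0
r9950x_120w = 576.0

# 9950X at 160W vs. 7950X at 170W: faster on 10W less power.
gain = (r9950x_160w / r7950x_170w - 1) * 100
print(f"9950X @160W vs 7950X @170W: +{gain:.1f}%")   # about +13%

# 9950X at 120W vs. 7950X at 170W: slightly slower on 50W less power.
deficit = (1 - r9950x_120w / r7950x_170w) * 100
print(f"9950X @120W vs 7950X @170W: -{deficit:.1f}%")  # about -4%
```

Matching or nearly matching a 170W predecessor from a 120W budget is the clearest single illustration of the Zen 5 efficiency gain these leaks describe.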


Software engineer achieved free plane Wi-Fi at a terrible cost — PySkyWiFi bypasses firewall but is limited to plain text endeavors

Earlier this week, software engineer Robert Heaton posted the entire story behind his open-source PySkyWiFi project: how he achieved free Wi-Fi on an airplane by painstakingly subverting the existing firewall. Be prepared, though: even if you duplicate this yourself using his existing work, or recreate it on your own, the final results aren't pretty, and you'll most likely want to pay for your plane Wi-Fi the regular way anyway.

The process started when he realized that his airmiles account page, which wasn't blocked by the in-flight firewall, was still connected to the broader Internet, and that this gap could be exploited: data written into the account from the plane can be read by a machine on the ground, and vice versa. He also had two laptops on hand: one connected to the standard paid Wi-Fi to stand in for a ground station (he claims his wife could have handled this from home, had she been willing to follow his instructions), and the other used to develop and test PySkyWiFi prototypes.

The final form of PySkyWiFi can load full web pages, but the earlier prototypes focused on plain-text endeavors like "instant" messaging, stock prices, and even football scores. It was all done with Python, which is where the "Py" in the final name originates.

Robert Heaton's simplified "PySkyWiFi" graphic (Image credit: Robert Heaton)

So, how would PySkyWiFi perform if you grabbed it from its official GitHub page? Per Robert's extended blog post, you can expect a complete PySkyWiFi setup to run at speeds of "several bytes per second." That's right: several bytes, not kilobytes. Those plain-text prototypes make a lot of sense considering the bandwidth constraints of carefully freeloading through a firewall, so don't expect to watch videos or consume much of anything beyond plain-text documents. Even those won't load promptly.

As Robert says, you probably shouldn't do any of this. Even with most of the work done ahead of time for those who wish to follow in his footsteps, it's far less headache and trouble to simply pay the obligatory plane Wi-Fi fee and move on with your day. You can also…sleep on the flight, bring a book, or do any number of potentially constructive activities while not connected to the Internet.

Still, it's cool that widespread, free plane Wi-Fi is technically possible through existing firewalls. It just almost certainly isn't worth doing for anything besides a quick giggle at the novelty, before you realize you need pages to load quickly more than you need that five dollars or whatever they're charging you. It is hard to beat the price of "free", though!
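To put "several bytes per second" in perspective, here is a quick back-of-the-envelope calculation. The 10 B/s rate is our own optimistic assumption within Heaton's "several bytes per second" figure, and the 2 MB page size is an illustrative round number:

```python
def transfer_time_hours(size_bytes: int, rate_bytes_per_sec: float) -> float:
    """Hours needed to move size_bytes at a fixed byte rate."""
    return size_bytes / rate_bytes_per_sec / 3600

# A typical ~2 MB web page at an assumed 10 B/s:
page = 2 * 1024 * 1024
print(f"{transfer_time_hours(page, 10):.1f} hours")  # roughly 58 hours
```

That is longer than most long-haul flights for a single page, which is exactly why the plain-text prototypes were the practical sweet spot.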


Minisforum’s latest Mini PC flaunts an AMD Ryzen 9 7945HX and an RX 7600M XT in a slim-looking case

Today Minisforum debuted the AtomMan G7 PT mini PC with an early bird price of $1,199 (down $300 from its $1,499 MSRP), which gets you a complete system with 32GB of RAM and a 1TB NVMe SSD. A barebones version, without these components or an OS, is also available for an early bird price of $999 (MSRP $1,249).

The Minisforum AtomMan G7 PT is housed in a tall, slim enclosure reminiscent of "thin clients" and other slimline desktop PC builds of the past, though those were typically known for being very low-power. The AtomMan G7 PT instead boasts a Zen 4-based AMD Ryzen 9 7945HX CPU and a decently powerful RX 7600M XT discrete GPU, which should allow for genuinely good performance, but you're paying extra to get that kind of performance in such a slim form factor. The side logo lighting can also be turned on or off.

Minisforum AtomMan G7 PT Core Specs
CPU: AMD Ryzen 9 7945HX; Zen 4 CPU with 16 cores and 32 threads up to 5.4 GHz
GPU: AMD Radeon RX 7600M XT discrete graphics
RAM: 32GB of DDR5-5200* RAM (up to 96GB supported)
Storage: 1TB NVMe SSD (another slot present; up to 4TB supported)
Front I/O: 3.5mm audio port; 1 USB 3.2 Gen2 Type-A port; 1 USB 3.2 Gen2 Type-C port
Rear I/O: Line Out port; Mic In port; 2.5G Ethernet port; 3 USB 3.2 Gen2 Type-A ports; 1 USB 3.2 Gen2 Type-C port w/ Data and DisplayPort Alt modes; 1 HDMI 2.1 port; 1 DisplayPort 2.0 port
Wireless Technologies: Wi-Fi 7 support
*MT/s spec based on maximum supported speed; the speed of the included 32GB of DDR5 RAM is unclear.

Beyond having fairly good specs for the mini PC category, the AtomMan G7 PT will feature "cold wave ultra cooling," capable of providing up to 205W of cooling capacity with its four fans, eight heatpipes, and active RAM/SSD heatsinks, according to the website. In theory, this should be enough for both the RX 7600M XT (max 120W TDP) and the Ryzen 9 7945HX (max 75W, base 55W TDP) to stretch their legs a little, though the CPU seems more likely to be constrained by this setup. AMD EXPO memory overclocking is explicitly listed as not supported "at the moment", which may mean there's no thermal headroom for memory overclocking.

In any case, the Minisforum AtomMan G7 PT does pack a lot of power into an appealing SFF package. However, users will likely want to do some deeper CPU and GPU TDP and/or clock tweaking to achieve the ideal balance of performance in this smaller form factor.
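A rough power-budget check from the figures above shows why the thermal margin is tight. This treats the quoted maximum TDPs as simultaneous sustained package power, which is a simplification:

```python
COOLING_CAPACITY_W = 205   # claimed "cold wave" cooling capacity
GPU_MAX_W = 120            # RX 7600M XT maximum TDP
CPU_MAX_W = 75             # Ryzen 9 7945HX maximum TDP (base 55W)

combined = GPU_MAX_W + CPU_MAX_W
headroom = COOLING_CAPACITY_W - combined
print(f"Combined draw: {combined}W, headroom: {headroom}W")  # 195W, 10W spare
```

With only about 10W of nominal headroom at both chips' maximums, it's easy to see why the CPU would likely be the first component throttled and why EXPO memory overclocking might be off the table.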