
Could RISC-V become a force in HPC? We talk to the experts • The Register


Analysis The RISC-V architecture looks set to become more prevalent in the high performance computing (HPC) sector, and could even become the dominant architecture, at least according to some technical experts in the field.

Meanwhile, the European High Performance Computing Joint Undertaking (EuroHPC JU) has just announced a project aimed at the development of HPC hardware and software based on RISC-V, with plans to deploy future exascale and post-exascale supercomputers based on this technology.

RISC-V has been around for at least a decade as an open source instruction set architecture (ISA), while actual silicon implementations of the ISA have been coming to market over the past several years.

Among the attractions of this approach is that the architecture is not only free to use but can also be extended, meaning that application-specific functions can be added to a RISC-V CPU design and accessed by adding custom instructions to the standard RISC-V set.
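To make that extensibility concrete, here is a minimal sketch of how such a custom instruction might be invoked from C. The RISC-V spec reserves "custom" major opcodes for vendor extensions, and the GNU assembler's .insn directive can emit instructions the compiler does not otherwise know about; the funct3/funct7 values and the instruction's semantics below are invented for illustration, and the code only does something useful on a core or simulator that implements them.

```c
/* Minimal sketch (not any vendor's production code) of calling a
 * hypothetical custom RISC-V instruction from C via the GNU
 * assembler's .insn directive. The opcode encoding and semantics
 * are invented for illustration. */
#include <stdint.h>

static inline uint64_t my_custom_op(uint64_t a, uint64_t b)
{
    uint64_t result;
    /* .insn r <major opcode>, <funct3>, <funct7>, rd, rs1, rs2
     * CUSTOM_0 is the major opcode (0x0b) reserved for custom
     * extensions. */
    __asm__ volatile(".insn r CUSTOM_0, 0x0, 0x00, %0, %1, %2"
                     : "=r"(result)
                     : "r"(a), "r"(b));
    return result;
}
```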

This extensibility could prove to be a driving factor for broader adoption of RISC-V in the HPC sector, according to Aaron Potler, Distinguished Engineer at Dell Technologies.

“There’s synergy and growing strength in the RISC-V community in HPC,” Potler said, “and so RISC-V really does have a very, very good chance to become more prevalent on HPC.”

Potler was speaking in a Dell HPC Community online event, outlining perspectives from Dell’s Office of the Chief Technology and Innovation Officer.

However, he conceded that to date, RISC-V has not really made much of a mark in the HPC sector, largely because it wasn’t initially designed with that purpose in mind, but that there is “some targeting now to HPC” because of the business model it represents.

He made a comparison of sorts with Linux, which, like RISC-V, started off as a small project but grew and grew in popularity because of its open nature (it was also free to download and run, as Potler acknowledged).

“Nobody would have thought then that Linux would run on some high end computer. When in 1993, the TOP500 list came out, there was only one Linux system on it. Nowadays, all the systems on the TOP500 list run Linux. Every single one of them. It’s been that way for a few years now,” he said.

If Linux wasn’t initially targeting the HPC market but was adopted there because of its inherent advantages, perhaps the same could happen with RISC-V, given enough advantages of its own, such as being an open standard.

“If that’s what the industry wants, then the community is going to make it work, it’s gonna make it happen,” Potler said.

He also made a comparison with the Arm architecture, which eventually propelled Fujitsu’s Fugaku supercomputer to the number one slot in the TOP500 rankings, and which notably accomplished this by extending the instruction set to support the 512-bit Scalable Vector Extension (SVE) units in the A64FX processor.

“So why wouldn’t a RISC-V-based system be number one on the TOP500 someday?” he asked.

There has already been work done on RISC-V instructions and architecture extensions relevant to HPC, Potler claimed, especially those for vector processing and floating-point operations.
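As a flavour of that vector work, here is a sketch of a SAXPY kernel written with the RISC-V Vector (RVV) intrinsics. It assumes a toolchain that ships <riscv_vector.h> (for example, a recent GCC or Clang targeting rv64gcv) and RVV-capable hardware or an emulator such as QEMU.

```c
/* SAXPY (y = a*x + y) using the RISC-V Vector (RVV) intrinsics --
 * a sketch of the kind of vector-extension work described above. */
#include <stddef.h>
#include <riscv_vector.h>

void saxpy(size_t n, float a, const float *x, float *y)
{
    while (n > 0) {
        size_t vl = __riscv_vsetvl_e32m8(n);   /* elements this pass */
        vfloat32m8_t vx = __riscv_vle32_v_f32m8(x, vl);
        vfloat32m8_t vy = __riscv_vle32_v_f32m8(y, vl);
        vy = __riscv_vfmacc_vf_f32m8(vy, a, vx, vl);  /* vy += a*vx */
        __riscv_vse32_v_f32m8(y, vy, vl);
        n -= vl; x += vl; y += vl;
    }
}
```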

All of this means that RISC-V has potential, but could it really make headway in the HPC sector, which once boasted systems with a variety of processor architectures but is now dominated almost entirely by x86 and Arm?

“RISC-V does have the potential to become the architecture of choice for the HPC market,” said Omdia chief analyst Roy Illsley. “I think Intel is losing its control of the overall market and the HPC segment is becoming more specialized.”

Illsley pointed out that RISC-V’s open-source nature means that any chipmaker can produce RISC-V-based designs without paying royalties or licensing fees, and that the architecture is supported by many silicon makers as well as by open-source operating systems.

Manoj Sukumaran, Principal Analyst for Datacenter Compute & Networking at Omdia, agreed, saying that the biggest advantage for RISC-V is that its non-proprietary architecture lines up well with the technology sovereignty goals of various countries. “HPC capacity is a strategic advantage to any country and it is an inevitable part of a country’s scientific and economic progress. No country wants to be in a situation like China or Russia, and this is fueling RISC-V adoption,” he claimed.

RISC-V is also a “very efficient and compelling instruction set architecture” and the provision to customize it for specific computing needs with additional instructions makes it agile as well, according to Sukumaran.

The drive for sovereignty, or at least greater self-reliance, could be one motive behind the call from the EuroHPC JU for a partnership framework to develop HPC hardware and software based on RISC-V as part of an EU-wide ecosystem.

This is expected to be followed up by an ambitious plan of action for building and deploying exascale and post-exascale supercomputers based on this technology, according to the EuroHPC JU.

It stated in its announcement that the European Chips Act identified RISC-V as one of the next-generation technologies where investment should be directed in order to preserve and strengthen EU leadership in research and innovation. This will also reinforce the EU’s capacity for the design, manufacturing and packaging of advanced chips, and the ability to turn them into manufactured products.

High-performance RISC-V designs already exist from chip companies such as SiFive and Ventana, but these are typically either designs that a customer can take and have manufactured by a foundry company such as TSMC, or available as a chiplet that can be combined with others to build a custom system-on-chip (SoC) package, which is Ventana’s approach.

Creating a CPU design with custom instructions to accelerate specific functions would likely be beyond the resources of most HPC sites, though perhaps not those of a large user group or forum. However, a chiplet approach could de-risk the project somewhat, according to IDC Senior Research Director for Europe Andrew Buss.

“Rather than trying to do a single massive CPU, you can assemble a SoC from chiplets, getting your CPU cores from somewhere and an I/O hub and other functions from elsewhere,” he said, although he added that this requires standardized interfaces to link the chiplets together.

But while RISC-V has potential, the software ecosystem is more important, according to Buss. “It doesn’t matter what the underlying microarchitecture is, so long as there is a sufficient software ecosystem of applications and tools to support it,” he said.

Potler agreed with this point, saying that “One of the most critical parts for HPC success is the software ecosystem. Because we’ve all worked on architectures where the software came in second, and it was a very frustrating time, right?”

Developer tools, especially compilers, need to be “solid, they need to scale, and they need to understand the ISA very well to generate good code,” he said.

This also plays a part in defining custom instructions, as it calls for a profiler or other performance analysis tools to identify time-consuming sequences of code in the applications in use, and to gauge whether specialized instructions could accelerate them.

“So if I take these instructions out, I need a simulator that can simulate this [new] instruction. If I put it in here and take the other instructions out, the first question is, are the answers correct? Then the other thing would be: does it run enough to make it worthwhile?”

Another important factor is whether the compiler can recognize those sequences of code in the application and replace them with the custom instruction to boost performance, Potler said.

“You also see that extensions to the instruction set architecture will provide performance benefits to current and future HPC applications, whatever they may be,” he added.
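A toy version of the “are the answers correct?” check Potler describes might look like the following: run a scalar reference implementation and a software model of the proposed instruction over many test vectors and count mismatches. In a real flow the candidate instruction would execute inside an ISA simulator such as Spike or QEMU; the fused “dot3” operation modelled here is hypothetical.

```c
/* Golden-model correctness check for a hypothetical custom
 * instruction: compare a scalar reference against a software model
 * of the proposed operation over many random inputs. */
#include <stdio.h>
#include <stdint.h>

/* Reference: the hot code sequence a profiler might have identified. */
static uint32_t ref_dot3(const uint32_t a[3], const uint32_t b[3])
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

/* Software model of the hypothetical fused "dot3" instruction. */
static uint32_t model_dot3(const uint32_t a[3], const uint32_t b[3])
{
    uint32_t acc = 0;
    for (int i = 0; i < 3; i++)
        acc += a[i] * b[i];
    return acc;
}

int main(void)
{
    uint32_t a[3], b[3], seed = 1;
    unsigned mismatches = 0;

    for (int t = 0; t < 1000000; t++) {
        for (int i = 0; i < 3; i++) {   /* cheap LCG test vectors */
            a[i] = (seed = seed * 1664525u + 1013904223u);
            b[i] = (seed = seed * 1664525u + 1013904223u);
        }
        if (ref_dot3(a, b) != model_dot3(a, b))
            mismatches++;
    }
    printf("%u mismatches in 1000000 trials\n", mismatches);
    return mismatches != 0;
}
```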

However, Buss warned that even if there is a great deal of interest in RISC-V, it will take time before users at HPC sites see the benefits.

“There’s nothing stopping RISC-V, but it takes time to develop the performance and power to the required level,” he said, pointing out that it took the Arm architecture over a decade to get to the point where it could be competitive in this space.

There was also the setback of Intel pulling its support for the RISC-V architecture last month, after earlier becoming a premier member of RISC-V International, the governing body for the standard, and pledging to offer validation services for RISC-V IP cores optimized for manufacturing in Intel fabs.®


Irish orgs part of EU-wide push to build €19.92m digital skills project


The Digital4Business consortium, which has several Irish members, aims to launch its first MSc programmes in January 2024.

The National College of Ireland (NCI) helped to launch a new pan-European digital and entrepreneurial skills project that aims to provide a steady pipeline of talent to SMEs in the region.

NCI is one of 15 partners from seven European countries that are taking part in the consortium leading the project, which is called Digital4Business.

Digital4Business is a four-year initiative that will see various EU institutions and businesses work together to devise and deliver a market-led postgraduate programme to help SMEs access a pipeline of digital talent.

Other programmes launching in the coming months and years will concentrate on key topics such as cloud, data analytics, AI, cybersecurity, blockchain, IoT and quantum computing.

Overall, the project will cost €19.92m. The programmes that will result from it will offer both industry and academic accreditation. The consortium will be focused on the practical application of advanced digital skills within European companies.

The initiative is being funded by the European Commission’s Digital Europe programme, which focuses on the digital transformation of Europe’s society and economy. The funding award to Digital4Business is one of the largest awards the programme has made to date.

Digital4Business was officially launched at an event in the IFSC in Dublin today (21 March).

The project began in December 2022. The consortium aims to launch the first part-time and full-time MSc programmes in January 2024.

Speaking by video link at the event, Minister for Further and Higher Education, Research, Innovation and Science, Simon Harris, TD, highlighted the project’s relevance as part of the European Year of Skills.

“2023 is European Year of Skills – the focus is on helping people get the right education to be prepared for quality jobs, and to address specific skills shortages that businesses are experiencing – particularly SMEs. Digital4Business directly serves this mission.”

Dara Calleary, TD, Minister of State for Trade Promotion and Digital Transformation, attended the event in person.

“Digital4Business’ focus on the practical application of advanced digital skills within companies, and especially, within our small and medium businesses, is of great importance. This type of talent development is essential to ensure that the skills and the expertise are in place for businesses to maximise their digital potential – to take advantage of the opportunities digital presents and to assist them in maintaining their competitive edge,” he said.

As well as NCI, the other Irish partners involved in Digital4Business are IT company Terawe, Skillnet Ireland and Digital Technology Skills Limited. Digital agency Matrix Internet, co-headquartered in Ireland and Belgium, is also involved.


Cults, prophecies and helpless villagers galore: Diablo 4 is back to its moody goth best | Games


With a click of the right-mouse button, my musclebound barbarian sinks his axe into the ground behind him, sweeps it forward and creates a shock wave that obliterates everything in its path. Ahead, a horde of undead creatures is repulsed by the blast, zombies flayed by the force of the air, skeletons scattered across the ground, wraiths dissipating into spectral dust. The room’s furnishings fly with them, chairs, candlesticks and barrels smashing into the far wall. The ground itself is scarred by the attack, a conical depression left in the floor as if struck by a meteorite airburst.

I’ve performed this attack countless times over the last weekend, and it never fails to light up my brain like Blackpool in September. The Diablo series represents video gaming in its purest and perhaps most reductive form and has exploited these feedback loops to enormous success in the last 25 years, reworking the complex rulesets of role-playing games into something less cerebral and more sensory. While there’s an argument to be had about how intellectually nourishing these games may be, Diablo 4 has a lot of seductive power. Clicking monsters to death in this game feels dangerously good.

[Image: Diablo 4 returns to being the moody goth kid of its RPG social group. Photograph: Blizzard]

Yet having spent 48 hours with the game during its beta phase, it’s clear there’s more to this than mindless monster-bashing. Diablo 4 sees the series return from a long hiatus after a third game that proved controversial in more ways than one. Partly because of this, it looks both backward and forward, addressing some criticisms of Diablo 3 while striving to compete in a world that has changed dramatically since 2012.

After a mixed reception to the colourful visuals of Diablo 3, Diablo 4 returns to being the moody goth kid of its RPG social group: pale-faced, clad in black and obsessed with death. The opening area, named Fractured Peaks, is an oppressive place where muddy, monster-ravaged villages cling to the edges of a snowy mountain range, with warrens of caves and dungeons concealed beneath the frozen surface. Said dungeons revel in their own dinginess. Painted in abundant dark shades, much like FromSoftware’s Bloodborne, the blackened walls and floors are slick with decaying viscera and often writhe with strange tendrils that grasp at you from the stonework.

Diablo 4’s appeal to the past isn’t purely stylistic. As your character accrues power across the game’s dark fantasy adventure, you must choose how to channel that power, picking skills and abilities that complement one another to make your chosen warrior an unstoppable destructive force. Diablo 4 ditches the previous game’s overly streamlined approach, returning to a more traditional skill tree that shows your character’s entire power trip at a glance.

[Image: Players now carve their way through a huge open world. Photograph: Blizzard]

I tested two of the five available character classes in the open beta – the barbarian and the sorcerer. What became obvious during my time with them is how intuitive character progression is. My sorceress, for example, offered an array of elemental powers to choose from. I could have made her an incandescent pyromancer, or a weaponised Elsa who froze her enemies to death. Instead, I focused on electrical abilities, Emperor Palpatine-ing my way through dungeons by zapping demons with bouncing bolts of lightning. This wasn’t the limit of my options, either. Diablo 4 let me further tailor these attacks to produce a collectible item known as “Crackling Energy”. As I plucked these orbs of static electricity from fallen foes, they’d discharge automatically when I approached new enemies. Hence, my sorceress could fry whole groups of demons before casting her first spell – a delightful sensation.

Structurally, Diablo 4 is different, as players now carve their way through a huge open world. For the beta, only the Fractured Peaks area was available to explore, but this nonetheless represents a sizeable and impressively freeform area. Although there is a central story to follow, it’s easy to get side-tracked into some offshoot adventure, helping a villager find her missing husband in some shadowy forest or delving into optional dungeons with foreboding names such as the Black Asylum. These secondary activities are tied together by “Renown”, a currency that, when accrued, periodically rewards players with extra gold, skill points, and other bonuses.

The looser structure creates a more coherent world, but it doesn’t radically change how Diablo plays. Instead, the open world exists mainly to facilitate Diablo 4’s new status as a persistent online game. Diablo 4 has extensive multiplayer features: other players wander freely around the game world, able to fight alongside you periodically as they explore individually, or to join clans and embark on quests together. This ever-present multiplayer element could prove controversial, but interaction with other players isn’t mandatory, and you can happily plunder dungeons and pursue the central storyline solo.


[Image: Cults. Prophecies. More helpless villagers. Photograph: Blizzard]

While the story has always been a part of Diablo, its role is small compared with other RPGs – largely an excuse for players to mash monsters by the million. But Diablo 4 makes a more concerted effort to grab the player’s attention, breaking up the action with more elaborate cutscenes and dialogue that dwell on individual characters, and taking more time to explore the game’s pseudo-Christian lore. These sequences bring with them all the flair you’d expect from Blizzard, and an impressive cast that includes veteran voice actors such as Troy Baker and Jennifer Hale, alongside Hollywood names like Ralph Ineson.

Broadly, it’s a typical fantasy adventure, a grand battle between good and evil. There are cults. There are prophecies. There are more helpless villagers than you can shake a pitchfork at. But there is also an attempt at more nuanced characterisation. The main antagonist – the demonic goddess Lilith – is not wholly villainous, while the fallen angel Inarius, a central figure in the religion of the game’s long-suffering humans, is not wholly good. There’s enough of interest to be audible above the sound of battle, and it helps that the game takes itself seriously, avoiding the temptation to lace the narrative with knowing side-glances and ironic gags.

Some questions remain. While Diablo’s character progression is slick and intuitive, will it offer the same level of flexibility as other ARPGs, most notably Path of Exile, which stepped in during Diablo’s long absence? Moreover, what does this new multiplayer structure mean for Blizzard’s long-term monetisation plans – will we eventually be asked to pay for a subscription? It seems inevitable that the game will continue to evolve after launch; the question is what form that evolution will take. This is a game that could change shape substantially in the coming years. In its current form at least, Diablo 4 seems like a worthy ascendant to the throne of destruction.


Nvidia hooks TSMC and friends on GPU accelerated chip design • The Register


GTC Nvidia’s latest gambit? Entrenching itself as a key part of the semiconductor manufacturing supply chain.

At GTC this week, the chipmaker unveiled cuLitho, a software library designed to accelerate computational lithography workloads used by the likes of TSMC, ASML, and Synopsys, using its GPUs.

The idea behind the platform is to offload and parallelize the complex and computationally expensive process of generating photomasks used by lithography machines to etch nanoscale features, like transistors or wires, into silicon wafers.

“Each chip design is made up of about 100 layers and in total contains trillions of polygons or patterns. Each of these 100 layers are encoded separately into a photomask — a stencil for the design if you will — and, using a rather expensive camera, are successively printed onto silicon,” Vivek Singh, VP of Nvidia’s advanced technology group, explained during a press conference on Monday.

Originally, photomasks were just a negative of the shape engineers were trying to etch into the silicon, but as transistors have shrunk, photomasks have become more complex in order to counteract the effects of optical distortion, which, left unchecked, can blur features beyond recognition. This correction process is called optical proximity correction (OPC), and it has more recently evolved into inverse lithography technology (ILT); in the latter case, the photomasks look nothing like the features they are designed to print.
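To get a feel for where the computational cost comes from: the core numerical operation in this kind of simulation is convolving the mask pattern with optical kernels to predict the printed (“aerial”) image, then adjusting the mask and repeating. The toy sketch below is not cuLitho’s actual algorithm; it uses a single Gaussian kernel as a stand-in point-spread function, where production OPC/ILT tools apply sums of many rigorously derived kernels over grids that run to billions of pixels.

```c
/* Toy aerial-image simulation: convolve a mask pattern with an
 * optical kernel to see how a sharp feature blurs in printing --
 * the blur is what OPC must pre-compensate for. Illustrative only. */
#include <stdio.h>
#include <math.h>

#define N 64   /* mask grid size (real masks: billions of pixels) */
#define K 7    /* kernel width */

static double mask[N][N];    /* 1.0 where the mask is open */
static double aerial[N][N];  /* simulated image intensity */
static double psf[K][K];     /* stand-in point-spread function */

int main(void)
{
    /* Build and normalize a Gaussian kernel. */
    double sum = 0.0;
    for (int i = 0; i < K; i++)
        for (int j = 0; j < K; j++) {
            double dx = i - K / 2, dy = j - K / 2;
            sum += psf[i][j] = exp(-(dx * dx + dy * dy) / 4.0);
        }
    for (int i = 0; i < K; i++)
        for (int j = 0; j < K; j++)
            psf[i][j] /= sum;

    /* One rectangular feature on the mask. */
    for (int i = 24; i < 40; i++)
        for (int j = 28; j < 36; j++)
            mask[i][j] = 1.0;

    /* Dense 2D convolution: O(N^2 * K^2), the part GPUs parallelize. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            for (int ki = 0; ki < K; ki++)
                for (int kj = 0; kj < K; kj++) {
                    int mi = i + ki - K / 2, mj = j + kj - K / 2;
                    if (mi >= 0 && mi < N && mj >= 0 && mj < N)
                        aerial[i][j] += psf[ki][kj] * mask[mi][mj];
                }

    printf("intensity at centre %.3f, at feature edge %.3f\n",
           aerial[32][32], aerial[32][28]);
    return 0;
}
```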

And the more ornate these photomasks get, the more computational horsepower is required to produce them. However, using GPUs, Nvidia believes it can not only speed up this process, but reduce the power consumption required. The company claims that cuLitho running on its GPUs is roughly 40x faster than existing computational lithography platforms running on general purpose CPUs.

“It’ll help the semiconductor industry continue the pace of innovation that we’ve all come to rely on, and it’ll improve the time to market for all kinds of chips in the future,” Singh claimed.

However, at least in the near term, Nvidia’s expectations seem a little more grounded. The company expects fabs using cuLitho could produce 3-5x more photomasks a day while using 9 percent less power, which, if true, should help to boost foundries’ already thin margins.

And with the likes of ASML, Synopsys, and TSMC lining up to integrate Nvidia’s GPUs and libraries into their software platforms and fabs, we won’t have to wait long to see these claims put to the test.

TSMC is already investigating Nvidia’s GPUs and cuLitho to accelerate ILT photomasks, while ASML and Synopsys are working to integrate support for GPU acceleration using cuLitho in their computational lithography software platforms.

And while Nvidia’s execs would doubtless love to sell the company’s latest and most expensive GPU architectures to these companies, Singh noted that the library is compatible with GPUs going back to the Volta generation, which made its debut in 2017.

While Nvidia is using GPUs to accelerate these workloads, it’s worth noting that cuLitho isn’t using machine learning or AI to optimize semiconductor design just yet. But it’s no secret that Nvidia is also working on that particular problem.

“Much of this has to do with accelerating the underlying primitive operations of computational lithography,” Singh said. “But I will say that AI is very much in the works in cuLitho.”

As our sister site The Next Platform reported last summer, Nvidia has been working on ways to accelerate computational lithography workloads for some time now. In a research paper published in July, engineers at the company used AI to design equivalent circuits 25 percent smaller than those created using traditional EDA platforms.

Nvidia is hardly the only company investigating the use of machine learning to accelerate circuit design. Synopsys and Cadence have both implemented AI technologies in their portfolios, while Google researchers developed a deep-learning model called PRIME to create smaller and faster accelerator designs, having previously used reinforcement learning models to design portions of Google’s tensor processing unit (TPU).

With that said, the addressable market for something like cuLitho isn’t that big, and thanks to efforts by the US Commerce Department to stifle China’s fledgling semiconductor industry, it is only getting smaller.

cuLitho will almost certainly be subject to US export controls governing the sale of advanced semiconductor manufacturing equipment and software to countries of concern, which for the moment means China. Pressed on this point, Singh said the library would be “available wherever this end-to-end OPC software is available,” but declined to comment further on US trade restrictions. ®

 
