

Will optics replace copper interconnects? We asked Ayar Labs • The Register

Voice Of EU



Science fiction is littered with fantastic visions of computing. One of the more pervasive is the idea that one day computers will run on light. After all, what’s faster than the speed of light?

But it turns out Star Trek’s glowing circuit boards might be closer to reality than you think, Ayar Labs CTO Mark Wade tells The Register. While fiber optic communications have been around for half a century, we’ve only recently started applying the technology at the board level. Despite this, Wade expects that within the next decade, optical waveguides will begin supplanting the copper traces on PCBs as shipments of optical I/O products take off.

Driving this transition are a number of factors and emerging technologies that demand ever-higher bandwidths across longer distances without sacrificing latency or power.

If this sounds familiar, these are the same challenges that drove telecommunication giants like Bell to replace thousands of miles of copper telephone cables with fiber optics in the 1970s.

As a general rule, the higher the bandwidth, the shorter the distance a signal can travel without amplifiers or repeaters, which extend reach at the expense of latency. And this is hardly unique to telecommunications networks.

The same laws of physics apply to interconnect technologies like PCIe. As it doubles its effective bandwidth with each subsequent generation, the physical distance the signal can travel shrinks.
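That doubling compounds quickly. As a rough back-of-the-envelope sketch (the ~0.25 GB/s-per-lane Gen 1 baseline and the clean per-generation doubling are approximations; real figures vary slightly with each generation's encoding overhead):

```python
# Rough model: PCIe effective per-lane bandwidth approximately doubles
# each generation, starting from a Gen 1 baseline of about 0.25 GB/s.
def pcie_bandwidth_gbps(generation: int, lanes: int = 16) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    per_lane = 0.25 * 2 ** (generation - 1)  # GB/s per lane, approximate
    return per_lane * lanes

for gen in range(1, 6):
    print(f"PCIe {gen}.0 x16: ~{pcie_bandwidth_gbps(gen):.0f} GB/s")
# A Gen 5 x16 link lands at roughly 64 GB/s each way -- bandwidth that
# copper traces can only carry over ever-shorter distances.
```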

“In a lot of cases, long distances are now defined as anything more than a few meters,” Wade said. “As the PCIe bandwidths are going higher and higher, you can no longer escape the server board without putting a retimer on the board” to boost the signal.

“Even if you can get the bandwidth from point A to point B, the question is with how much power and with how much latency,” he adds.

Ayar Labs’ optical I/O chiplets

This is exactly the problem that Ayar Labs is trying to solve. The silicon photonics startup has developed a chiplet that takes electrical signals from chips and converts them into a high-bandwidth optical signal.

And because the technology uses a chiplet architecture, it’s intended to be packaged alongside compute tiles from other chipmakers using open standards like the Universal Chiplet Interconnect Express (UCIe), which is currently in development.

The underlying technology has helped the company raise nearly $200 million from tech giants like Intel and Nvidia, and secure several high-profile partnerships, including one to bring optical I/O capabilities to Hewlett Packard Enterprise’s high-performance Slingshot interconnect fabric.

Near-term applications

While Wade firmly believes that optical communication at the system level is inevitable, he notes there are several applications for optical interconnects in the near term. These include high-performance computing and composable infrastructure.

“Our claim is that the electrical I/O problem is going to become so severe that computing applications are going to start to get throttled by their ability to shift bandwidth around,” he said. “For us, that’s AI and machine learning scale out.”

These HPC environments often require specialized interconnect technologies to avoid bottlenecks. Nvidia’s NVLink is one example. It enables high-speed communication between up to four GPUs.

Another area of opportunity for optical I/O, Wade says, is the kind of rack-level composable infrastructure promised by Compute Express Link’s (CXL) latest specs.

CXL defines a common, cache-coherent interface based on PCIe for interconnecting CPUs, memory, accelerators, and other peripherals

The CXL 1.0 and CXL 2.0 specs promise to unlock a variety of memory pooling and tiered memory functionality. However, the open standard’s third iteration, expected to be ratified later this year, will extend these capabilities beyond the rack level.

It’s at this stage of CXL’s development that Wade says optical’s advantages will be on full display.

“Even at the CXL 2.0 level, you’re very limited to the degree in which you can scale out, because the moment you hit something like a retimer, you start to incur latencies,” that make memory pooling impractical, he said.

However, for at least the first generation of CXL products, Wade expects most, if not all, will be electrical. “There’s a lot of software stack work that has to get done to really enable these kind of disaggregated systems” before CXL will be ready for optical I/O, he said.

But as the applications for optical I/O become more prevalent, Wade predicts the supply chain economics will make the technology even more attractive from a cost perspective. “It’s our belief that we’re gonna see an optical I/O transformation start to hit throughout almost every market vertical that’s building computing systems.”

Challenges aplenty

Of course, getting there won’t be without its challenges, and one of the biggest is convincing customers the technology is not only more performant and economically viable but mature enough for production environments.

This is specifically why Ayar Labs is focused on optical interconnects as opposed to co-packaged optics. One of the reasons that co-packaged optics haven’t taken off is their splash radius in the event of failure is significantly larger. If the optics fail on a co-packaged optical switch, the entire appliance goes down. And many of these same concerns apply to optical I/O.

“Whenever you have a heavily commoditized, standardized, risk-averse application space, that is not a place to try to deploy a new technology,” Wade said. However, “if you have a high-value application that highly benefits from increases in hardware performance, then you’re obviously going to take more risk.”

By focusing its attention on HPC environments, Ayar believes it can refine its designs and establish a supply chain for components, all while racking up the substantial field-operating hours necessary to sell to more mainstream markets.

Sci-Fi optical computers still more than a decade away

For customers that are ready and willing to risk deploying nascent technologies, optical I/O is already here.

“The customer that we’re delivering to right now has already replaced their board-level links with our optical I/O,” Wade said. “Every socket-to-socket link is an optical I/O link, and that’s even at the board level.”

As the technology matures, the question then becomes whether the optical waveguides will ever get integrated into the PCB — à la Star Trek.

“Will we see the optical waveguides getting integrated into the boards? I do think we’ll see some of that emerge actually within the next decade,” he said. “As the volume of optical I/O solutions start to get massive, it’ll make it more attractive for some of these solutions.”

Once you start shrinking beyond the board level, the future of optical I/O gets a bit murkier. The next logical step, Wade says, would be using optics to connect the individual dies that make up the chip.

However, he doesn’t expect this to happen anytime soon. “As you go into the millimeter scale, electrical I/O has, I think, a healthy roadmap in front of it,” he said. “Beyond 10-15 years, we might see… optical communication start to enter the millimeter scale regime.” ®



.NET 6 comes to Ubuntu 22.04 • The Register




Ubuntu and Microsoft have brought .NET 6 to the Ubuntu repositories, meaning that you can install it without adding any extra sources to the OS.

The announcement means that Ubuntu 22.04 is catching up with the Red Hat Linux family. As per Microsoft’s online docs, you could already do this on Fedora 36 as well as the more business-like variants: RHEL 8, CentOS Stream 8 and 9, and via scl-utils on RHEL 7.

Microsoft’s blog post about the news also mentions the ability to install the runtime, or the full SDK, into Ubuntu containers. Canonical also has new versions of these. It describes Ubuntu ROCKs as “new, ultra-small OCI-compliant appliance images, without a shell or package manager,” smaller than existing Ubuntu container images thanks to a new tool called chisel.

.NET 6 is Microsoft’s cross-platform toolchain for building apps that run on Windows, Linux, macOS, and mobile OSes. Essentially, it’s Microsoft’s answer to Oracle’s JVM – the increasingly inaccurately named Java Virtual Machine, which now supports multiple languages, including Clojure, Kotlin, Scala, and Groovy.

Microsoft’s own list of .NET languages is relatively short – C#, F#, and Visual Basic – although there are many others from outside the company. The list arguably should include PowerShell, but that already has its own Linux version.

Since 2014 or so, .NET primarily means what was formerly called .NET Core. According to Microsoft’s own diagram, that means the .NET Common Language Runtime, the bit which allows “managed code” to execute, and Microsoft’s web app framework ASP.NET.

There are three separate packages: dotnet-sdk-6.0, the SDK; dotnet-runtime-6.0, the CLR runtime; and aspnetcore-runtime-6.0, the runtime for ASP.NET. All three can be installed at once via the dotnet6 metapackage.
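On Ubuntu 22.04 that boils down to a plain apt transaction (package names as given above; the `dotnet --list-sdks` check at the end is just one way to verify the install):

```shell
# Install the full .NET 6 toolchain from the Ubuntu archive
sudo apt update
sudo apt install dotnet6          # metapackage: SDK plus both runtimes

# Or pick individual pieces instead:
# sudo apt install dotnet-sdk-6.0        # the SDK
# sudo apt install dotnet-runtime-6.0    # the CLR runtime
# sudo apt install aspnetcore-runtime-6.0  # the ASP.NET runtime

dotnet --list-sdks                # confirm the SDK is visible
```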

The notable bits of .NET that aren’t included in Core are the venerable Windows Forms framework and the slightly more modern Windows Presentation Foundation, WPF.

Compare and contrast: .NET Framework versus .NET Core

Diagram showing .NET Core design


So don’t get excited and think that the inclusion of .NET in Ubuntu means that graphical .NET apps, such as Windows Store apps, can now be built and run natively on Linux. Limit your expectations to server-side stuff. This is mainly a way to deploy console-based C# and ASP.NET apps into Ubuntu servers and Ubuntu containers.

When we asked Canonical about this, a spokesperson responded: “WPF is not currently supported in .NET 6 on Ubuntu. So, you’re correct that .NET 6 on Ubuntu is aimed at developers building text/server apps rather than graphical/GUI apps.”

We’ve also asked Microsoft if they have any additional information or details, and will update when they respond.

There are cross-platform graphical frameworks for .NET, including the open-source Avalonia as well as Uno, which got on board in .NET 5. There is also Microsoft’s own Multi-platform App UI, or MAUI, which evolved out of Xamarin Forms.

The origins of .NET lie in Microsoft’s 1996 acquisition of Colusa Software for its OmniWare tool, which Colusa billed as “a universal substrate for web programming.” As Microsoft faced off against the US Department of Justice and European Commission, and the possibility of being broken into separate apps and OS divisions, it came up with Next Generation Windows Services, which then turned into .NET: a way to use Microsoft tools to build apps for any OS.

There is still controversy over exactly how open .NET really is, as exemplified by the aptly named isdotnetopen site. ®




How cognitive science can be used to bring AI forward




Dr Marie Postma spoke about misconceptions around AI as well as its relationship with human consciousness.

AI and robots are getting ‘smarter’ all the time. From Irish-made care robot Stevie to Spot the robot dog from Boston Dynamics, these tech helpers are popping up everywhere with a wide range of uses.

The tech beneath the hardware is getting smarter too. Earlier this year, researchers at MIT developed a simpler way to teach robots new skills after only a few physical demonstrations. And just this week, Google revealed how it’s combining large language models with its parent company’s Everyday Robots to help them better understand humans.

However, the advances in these areas have led to recent discussions around the idea of sentient AI. While this idea has been largely rebuffed by the AI community, an understanding of the relationship between cognitive science and AI is an important one.

Dr Marie Postma is head of the department of cognitive science and artificial intelligence at Tilburg School of Humanities and Digital Sciences in the Netherlands.

The department is mainly financed by three education programmes and has around 100 staff and between 900 and 1,000 students.

‘Technology is not the problem; people are the problem’

The team focuses on different research themes that combine cognitive science and AI, such as computational linguistics with a big focus on deep learning solutions, autonomous agents and robotics, and human-AI interaction, which is mainly focused on VR and its use in education.

Postma was a speaker at the latest edition of the Schools for Female Leadership in the Digital Age in Prague, run by Huawei’s European Leadership Academy.

Postma spoke to the 29 students about cognitive science and machine learning, starting with the history of AI and bringing it up to the modern-day challenges, such as how we can model trust in robots and the role empathy could play in AI.

“We have research where we are designing first-person games where people can experience the world from the perspective of an animal – not a very cuddly animal, it’s actually a beaver. That’s intentional,” she told me later that day.

Sentient AI

Her talk brought about a lot of discussion around AI and consciousness, a timely discussion following the news that Blake Lemoine, a Google engineer, published an interview with the AI chatbot LaMDA and claimed that it had become sentient.

Postma said much of the media coverage around this story had muddied the waters. “The way it was described in the media was more focused on the Turing test – interacting with an AI system that comes across as being human-like,” she said.

“But then at some point they mention consciousness, and consciousness is really a different story.”

Postma said that most people who research consciousness would agree that it’s based on a number of factors. Firstly, it’s about having a perceptual basis: the ability to perceive both the world around us and what’s happening inside us, and being self-aware.

Secondly, the purpose of consciousness is being able to interpret yourself as someone who has feelings, needs, actionability in the world and a need to stay alive. “AI systems are not worried about staying alive, at least the way we construct them now, they don’t reflect on their battery life and think ‘oh no, I should go plug myself in’.”

Possibilities and limitations

While AI and robots don’t have consciousness, their ability to be programmed to a point where they can understand humans can be highly beneficial.

For example, Postma’s department has been conducting research that concerns brain-computer interaction, with a focus on motor imagery. “[This is] trying to create systems where the user, by focusing on their brain signal, can move objects in virtual reality or on computer screens using [electroencephalography].”

This has a lot of potential applications in the medical world for people who suffer from paralysis or in the advancements of prosthetic limbs.

Last year, researchers at Stanford University successfully implanted a brain-computer interface (BCI) capable of interpreting thoughts of handwriting in a 65-year-old man paralysed below the neck due to a spinal cord injury.

However, Postma said there is still a long way to go with this technology and it’s not just about the AI itself. “The issue with that is there are users who are able to do that and others who are not, and we don’t really know what the reasons are,” she said.

“There is some research that suggests that being able to do spatial rotation might be one of the factors, but what we’re trying to discover is how we can actually train users so that they can use BCI.”

And in the interest of quelling any lingering fears around sentient AI, she also said people should not worry about this kind of technology being able to read their thoughts because the BCI is very rudimentary. “For the motor imagery BCI, it’s typically about directions, you know, right, left, etc.”

Other misconceptions about AI

Aside from exactly how smart the robots around us really are, one of the biggest falsehoods that Postma wants to correct is that the technology itself is not necessarily what causes the problems that surround it.

“What I repeat everywhere I go, is that the technology is not the problem, people are the problem. They’re the ones who create the technology solutions and use them in a certain way and who regulate them or don’t regulate them in a certain way,” she said.

“The bias in some AI solutions is not there because some AI solutions are biased, they’re biased because the data that’s used to create the solutions is biased so there is human bias going in.”

However, while bias in AI has been a major discussion topic for several years, Postma has an optimistic view on this, saying that these biased systems are actually helping to uncover biased data that would have previously been hidden behind human walls.

“It becomes explicit because all the rules are there, all the predictive features are there, even for deep learning architecture, we have techniques to simplify them and to uncover where the decision is made.”

While Postma is a major advocate for all the good AI can do, she is also concerned about how certain AI and data is used, particularly in how it can influence human decisions in politics.

“What Cambridge Analytica did – just because you can, doesn’t mean you should. And I don’t think they’re the only company that are doing that,” she said.

“I’m [also] concerned about algorithms that make things addictive, whether it’s social media or gaming, that really try to satisfy the user. I’m concerned about what it’s doing to kids.”

10 things you need to know direct to your inbox every weekday. Sign up for the Daily Brief, Silicon Republic’s digest of essential sci-tech news.




‘I’m buying Manchester United’: Elon Musk ‘joke’ tweet charges debate over struggling club’s future | Elon Musk




Tesla billionaire Elon Musk briefly electrified the debate about the future of Manchester United by claiming on Twitter that he is buying the struggling Premier League club – before saying that the post was part of a “long-running joke”.

He did not make clear his views on new coach Eric ten Hag’s controversial insistence on passing out from the back, or whether unhappy star striker Cristiano Ronaldo should be allowed to leave, but he did say that if he were to buy a sports team “it would be Man U. They were my fav team as a kid”.

With the team rooted to the bottom of the league after a humiliating 4-0 away defeat to Brentford, the outspoken entrepreneur’s tweet offered hope – however briefly – to fans who want to see the back of current owners, the Florida-based Glazer family.

Also, I’m buying Manchester United ur welcome

— Elon Musk (@elonmusk) August 17, 2022

Musk has a history of making irreverent tweets, and he later clarified the post by saying he was not buying sports teams.

No, this is a long-running joke on Twitter. I’m not buying any sports teams.

— Elon Musk (@elonmusk) August 17, 2022

Buying United, one of the biggest football clubs in the world, would have cost Musk at least £2bn, according to its current stock market valuation.

Manchester United’s recent on-pitch woes have led to increased fan protests against the Glazers, who bought the club in a heavily leveraged deal in 2005 for £790m ($955.51m).

The anti-Glazer movement gained momentum last year after United were involved in a failed attempt to form a breakaway European Super League.

But a takeover by Musk would have been a case of out of the frying pan and into the fire for the club, given the billionaire’s tendency for off-the-cuff remarks and falling foul of market regulators.

Many were quick to point out that Musk had also promised to buy Twitter for $44bn before the deal collapsed in July, and has also boasted about colonising Mars and boosting birthrates on Earth.

That’s what you said about Twitter.

— Sema (@_SemaHernandez_) August 17, 2022

Fans responded with a mixture of bafflement and optimism given the lowly status of a club used to occupying the top places in the league rather than the bottom.

Manchester United did not immediately respond to a request for comment.



