Infinite lives: the company saving old arcade machines

On a rural industrial estate five miles outside Honiton, under the flight path of a nearby aerodrome, sits a rather nondescript warehouse. Only one feature marks it out: in front is a graveyard of stripped arcade cabinets, slowly rotting in the cold and damp.

I am here to visit Play Leisure, a company that restores and sells old arcade games. It has a compelling TikTok account where it shares new discoveries – a recent post showed off a Deadstorm Pirates machine with its enormous sit-in cabinet and giant cinematic display. I’ve dragged my friend and fellow arcade fanatic Joao Sanches along, and now I’m feeling nervous and responsible because, walking up to the unmarked entrance, I’ve no idea if they will have anything interesting in stock after our 90-minute drive.

But peering inside, I spot it immediately, sat there in the cramped reception area amid piles of cardboard boxes: a pristine 1992 Street Fighter II pinball machine, the backboard sporting a wild illustration of Ryu kicking Ken, each special feature on the playfield named after famous Street Fighter attacks. I almost gasp.

Matt Conridge, the owner of Play Leisure, has always been interested in arcade machines. “Like a lot of us in our 30s and 40s, it comes from back when I was a kid,” he explains as he comes to greet us. “I used to visit arcades at seaside resorts – places like Dawlish and Lynmouth.”

‘It comes from back when I was a kid’ … Matt Conridge, the owner of Play Leisure. Photograph: Joao Diniz Sanches

Three years ago, Conridge was running a video game bar in Bideford, north Devon, when Covid hit. Facing disaster, he decided to close up and use his contacts in the arcade scene to pivot into a new project: restoration. He rented a warehouse, employed a small team of specialist engineers and started buying up all the old coin-ops he could get his hands on. The plan was to repair them and sell them on to private collectors and retro theme bars after the pandemic.

“Back then, we were only buying small quantities so it usually came from collectors. Now we take them on an industrial scale,” says Conridge. “At the moment, with what’s happening in the economy, arcades are cutting costs, getting rid of some of the lower performing machines that cost them more to run than they make in revenue. We get clearances from arcades, play centres, trampoline parks … ”

Another problem is that older coin-ops require specialist engineers to maintain them. “A lot of the people who used to build and service these machines have retired,” says Conridge. “That knowledge is dying.”

Matt takes us through to the main warehouse space, where we’re momentarily stunned again. Crammed into a space about the size of a tennis court are 200-odd arcade machines from throughout gaming history. The first thing I spot is the twin cabinet version of Sega’s brilliant 1995 racing game Manx TT Super Bike, which allowed players to sit on reproduction motorcycles and compete against each other along narrow country lanes. Nearby there’s Konami’s thrilling Silent Scope 2: Fatal Judgement, complete with its authentic sniper rifle controller, and further back in this electronic labyrinth is a twin cab of Final Furlong, the crazy Namco horse racing game that you control by sitting on a plastic horse and jumping up and down.

I’m taken back to the first time I visited Japan in 2000 to attend the Tokyo Game Show. I walked into an arcade in Akihabara and saw salarymen on their lunch hour, dozens of them in rows playing this game, grimacing with effort in the darkness.

The warehouse has about 200 arcade machines from throughout gaming history. Photograph: Joao Diniz Sanches

The machines arrive in huge shipping containers and Conridge is never quite sure what games he’ll find or what condition they will be in. “The problem is, arcade operators don’t generate any more money by keeping machine internals clean,” he says. “If you open it up and start cleaning the inside you may end up causing issues. We’ve opened them and found coins, tools … We found a porno mag in the back of a machine once. We’ve just got one from Blackpool, a crane machine that dispensed sweets – it’s been left for a few years and the sweets have fallen inside and rotted, then the flies got in there … ugh.”

Will they clean that? “No,” laughs Conridge. “We’ll sell it off and let someone else deal with it.”

Conridge is, however, conscientious about whom he sells brittle older machines to. “There are some retro machines that we advise people not to buy unless they’re technically minded,” he says. “There’s a pinball machine, a 1966 electromechanical model we’re just about to put on sale, and we’ll refuse to sell that to nine out of 10 people who contact us because we know it won’t be suitable for them. These machines are like classic cars: they are specialist pieces of equipment and need constant care. If I sell it to someone who just wants a working machine, they’ll be fed up after five minutes – we’ve got to choose the right customer for it. Someone who is able to tinker.”

It’s not just ancient pinball machines that are problematic. The big video arcade games of the 1990s – the technical peak of the industry – often used proprietary hardware that is simply impossible to replace or reproduce. “The Sega Model arcade boards used custom Lockheed Martin chips, which you just can’t source,” explains Chris, the lead engineer. “We have to decide whether to harvest parts from less interesting games and use them to resupply classics like Sega Rally.” Around the outskirts of the warehouse space, there are shelves groaning under the weight of esoteric parts, haphazardly piled or collected in boxes.

Lining the warehouse are shelves of esoteric parts. Photograph: Joao Diniz Sanches

Adding to the value of these machines now is the fact that arcades historically dumped old units when they stopped being profitable. “Ten to 15 years ago companies just didn’t foresee that there would be any interest from collectors,” says Conridge. “We just sold an Addams Family pinball machine for £10,000 – that would have been chucked in a skip 15 years ago. People didn’t expect anyone would want them.”

This was especially true of larger speciality machines, such as rhythm action games, with their bulky floor pads and complicated controllers, and driving games with their realistic race car cabinets. Not only did they take up valuable floor space, they were expensive to maintain. Their growing rarity represents an interesting challenge for Play Leisure, because games like Dance Mania and Guitar Hero are exactly the sorts of machines that the new wave of retro gaming bars – such as the NQ64 chain, which has just taken on £2m of funding – is looking for: not only are they fun to play in a bar environment, they’re fun to watch, too. “Dance Mania is now a £3k machine,” Conridge says.

When cabinets arrive, their condition is assessed. For Conridge there is a delicate balance between restoration and preservation. He shows me a Point Blank machine that’s just come in: Namco’s entertaining light gun shooter, which was also popular on the PlayStation, is currently a hit with buyers. He will aim to repair these machines whatever state they arrive in – even though the guns themselves, with their delicate recoil mechanism, are often busted beyond repair (“they get really smashed by kids in the arcade”).

On this cabinet, the lavishly illustrated decals on the sides are peeling off: do they change the artwork for a modern reproduction? “If we do, it will look better but it won’t be original,” says Conridge. “It’s a challenge. We don’t tend to sell perfect-looking machines. When we went into arcades as children, the machines would have cigarette burns – that’s how you remember them. There’s a certain charm to that.”

‘I almost gasp’ … the classic arcade game Street Fighter II. Photograph: Joao Diniz Sanches

Some arcade cabinets are not economically viable to repair, but that doesn’t mean they’re unsellable. “We sell quite a lot of project machines,” he says. “For a collector working in their garage, that’s fine. We had a Star Wars 1982 Atari machine come in about 14 months ago. We put it on TikTok and Facebook – someone rang and they were desperate for it. It was nice to save this original machine from being scrapped.”

If they can’t be repaired, they’re stripped for parts: circuit boards, cathode ray monitors, joysticks, motors. Almost none of these are manufactured any more, so they’re all saved. Even completely stripped cabinets can have value: people often use them as a shell for their own arcade machines, using a PC and LED monitor. “Our customers can be really creative,” says Conridge. “We have people turning them into cocktail cabinets, stands for DVD players and games consoles. It’s nice because they’re not ending up in a landfill site – they’re getting another life.”

Conridge reckons half his machines go to retro bars and modern arcades. The rest are bought by private collectors. There’s a highly active arcade-collecting community, based around Discord servers and forums such as UKVAC, and Covid brought in a lot of new customers who started building gaming dens in the midst of lockdown.

Besides retro pinball tables and 1990s hits, the big sellers are attached to film or TV licences. Play Leisure has sold three Star Wars Battle Pods, really big immersive machines, for £10,000 each. An Aerosmith-branded arcade game named Revolution X will sell for £1,500, an X-Files pinball table for £3,500. There’s an odd market too for old coin-pushing machines, mostly thanks to the TV quiz show Tipping Point and the growing popularity of TikTok accounts that specialise in coin-pushing live streams.

‘It’s nice because they’re not ending up in a landfill site – they’re getting another life.’ Photograph: Joao Diniz Sanches

Joao and I spend the whole day here, snaking between the machines, peering into their exposed innards. We photograph everything. A long time ago we worked together on the video game magazine Edge, often reporting on arcade shows – these machines, which are now antiques, were the newest, hottest tech when we started our careers.

And before that, as a kid, I hung out in arcades in the 1980s. Donkey Kong, Defender, Space Harrier, Out Run; a pocket full of 10 pence coins, a whole day to waste. It is bittersweet to see the machines here, their CRT monitors cracked or missing, light gun holsters worn and split.

It is good that these things are being saved. To many of us, these are more than just disposable commercial products: they are works of art containing within them the experiences of thousands of players, my own included.

Microsoft’s Activision Blizzard acquisition will harm UK gamers, says watchdog

The UK’s competition regulator has ruled that Microsoft’s $68.7bn (£59.6bn) deal to buy Activision Blizzard, the video game publisher behind hits including Call of Duty, will result in higher prices and less competition for UK gamers.

The Competition and Markets Authority (CMA), which launched an in-depth investigation in September after raising a host of concerns about the biggest takeover in tech history, said the deal would weaken the global rivalry between Microsoft’s Xbox and Sony’s PlayStation consoles.

“Our job is to make sure that UK gamers are not caught in the crossfire of global deals that, over time, could damage competition and result in higher prices, fewer choices, or less innovation,” said Martin Coleman, the chair of the independent panel of experts conducting the investigation. “We have provisionally found that this may be the case here.”

The CMA said possible remedies to address competition issues included selling or spinning off the business that makes Call of Duty, or the entire Activision arm of the combined Activision Blizzard.

However, the watchdog acknowledged that a spin-off into a standalone operation would mean the new business “may not have sufficient assets and resources to operate as an independent entity”.

While the CMA did not completely rule out measures short of a divestiture – for example a “behavioural remedy” such as an iron-clad licence to guarantee distribution of Call of Duty to Sony – it said a structural solution such as a partial sale, spin-off or completely blocking the deal was its preferred option.

“We are of the initial view that any behavioural remedy in this case is likely to present material effectiveness risks,” it said. “At this stage, the CMA considers that certain divestitures and/or prohibition are, in principle, feasible remedies in this case.”

The CMA said there was a risk under the deal that Microsoft could try to make Call of Duty, Activision’s flagship game and one of the most popular and profitable global franchises of all time, exclusively available to Xbox console owners.

Last year, Microsoft attempted to allay competition concerns, saying it would offer its rival Sony a 10-year licence to ensure the title stayed on its PlayStation consoles.

However, following its $7.5bn acquisition in 2020 of ZeniMax, the parent of studios behind games including The Elder Scrolls, Fallout and Doom, Microsoft moved to make some titles exclusive to its own devices.

The company had previously assured European regulators that it had no incentive to make such a move.

“Microsoft would find it commercially beneficial to make Activision’s games exclusive to its own consoles, or only available on PlayStation under materially worse conditions,” the CMA said. “This strategy, of buying gaming studios and making their content exclusive to Microsoft’s platforms, has been used by Microsoft following several previous acquisitions of games studios.”

The CMA said the end result could be that gamers would see “higher prices, reduced range, lower quality, and worse service in gaming consoles over time”.

Microsoft said that it believed its 10-year guarantee to continue to offer Call of Duty to rivals on equal terms would be enough to allay competition concerns.

“We are committed to offering effective and easily enforceable solutions that address the CMA’s concerns,” said Rima Alaily, the corporate vice-president and deputy general counsel at Microsoft. “Our commitment to grant long-term 100% equal access to Call of Duty to Sony, Nintendo, Steam and others preserves the deal’s benefits to gamers and developers and increases competition in the market.”

The CMA’s ruling is of critical importance as it comes before the publication of official findings of investigations conducted by the European Commission and the US Federal Trade Commission, which in December launched legal action to block the deal.

“We hope between now and April we will be able to help the CMA better understand our industry,” said a spokesperson for Activision Blizzard. “To ensure they can achieve their stated mandate to promote an environment where people can be confident they are getting great choices and fair deals, where competitive, fair-dealing business can innovate and thrive, and where the whole UK economy can grow productively and sustainably.”

Microsoft’s all-cash offer for Activision Blizzard, which also publishes global hits such as World of Warcraft and Candy Crush, dwarfs its previous biggest deal, the $26bn takeover of LinkedIn in 2016.

The purchase would result in the Xbox maker becoming the world’s third-biggest gaming company by revenue behind China’s Tencent and Japan’s Sony, the maker of PlayStation games consoles. It is also the biggest deal in tech history, eclipsing the $67bn paid by Dell to buy the digital storage company EMC in 2015.

Could RISC-V become a force in HPC? We talk to the experts

Analysis: The RISC-V architecture looks set to become more prevalent in the high performance computing (HPC) sector, and could even become the dominant architecture, at least according to some technical experts in the field.

Meanwhile, the European High Performance Computing Joint Undertaking (EuroHPC JU) has just announced a project aimed at the development of HPC hardware and software based on RISC-V, with plans to deploy future exascale and post-exascale supercomputers based on this technology.

RISC-V has been around for at least a decade as an open source instruction set architecture (ISA), while actual silicon implementations of the ISA have been coming to market over the past several years.

Among the attractions of this approach are that the architecture is not only free to use, but can also be extended, meaning that application-specific functions can be added to a RISC-V CPU design, and accessed by adding custom instructions to the standard RISC-V set.
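
To make that concrete, here is a minimal sketch of what a custom instruction looks like from the software side. It assumes a hypothetical core that wires a new two-operand operation into the custom-0 opcode slot the RISC-V specification reserves for vendor extensions; the operation and its encoding fields are invented for illustration, and the snippet only assembles with a RISC-V toolchain, since the GNU assembler’s .insn directive is used to emit an instruction the compiler itself knows nothing about.

```c
/* Minimal sketch (hypothetical): exposing a custom RISC-V instruction to C.
 * Assumes a core implementing a two-operand operation on the custom-0 opcode
 * (0x0b), the encoding space reserved for vendor extensions. Requires a
 * RISC-V GCC/Clang toolchain; .insn tells the GNU assembler how to encode an
 * instruction it has no mnemonic for. */
#include <stdint.h>

static inline uint64_t custom_op(uint64_t a, uint64_t b)
{
    uint64_t result;
    /* .insn r <opcode>, <funct3>, <funct7>, rd, rs1, rs2 */
    __asm__ volatile(".insn r 0x0b, 0x0, 0x0, %0, %1, %2"
                     : "=r"(result)
                     : "r"(a), "r"(b));
    return result;
}

int main(void)
{
    /* On stock silicon this traps with an illegal-instruction fault; it only
     * produces a result on hardware or a simulator implementing the extension. */
    return (int)custom_op(2, 3);
}
```

The flip side is that such an instruction is useless until compilers, simulators and libraries know about it – which is where the software-ecosystem concerns discussed further down come in.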

This extensibility could prove to be a driving factor for broader adoption of RISC-V in the HPC sector, according to Aaron Potler, Distinguished Engineer at Dell Technologies.

“There’s synergy and growing strength in the RISC-V community in HPC,” Potler said, “and so RISC-V really does have a very, very good chance to become more prevalent on HPC.”

Potler was speaking at a Dell HPC Community online event, outlining perspectives from Dell’s Office of the Chief Technology and Innovation Officer.

However, he conceded that to date, RISC-V has not really made much of a mark in the HPC sector, largely because it wasn’t initially designed with that purpose in mind, but that there is “some targeting now to HPC” because of the business model it represents.

He made a comparison of sorts with Linux, which like RISC-V, started off as a small project, but which grew and grew in popularity because of its open nature (it was also free to download and run, as Potler acknowledged).

“Nobody would have thought then that Linux would run on some high end computer. When, in 1993, the TOP500 list came out, there was only one Linux system on it. Nowadays, all the systems on the TOP500 list run Linux. Every single one of them. It’s been that way for a few years now,” he said.

If Linux wasn’t initially targeting the HPC market, but was adopted for it because of its inherent advantages, perhaps the same could happen with RISC-V, if there are enough advantages, such as it being an open standard.

“If that’s what the industry wants, then the community is going to make it work, it’s gonna make it happen,” Potler said.

He also made a comparison with the Arm architecture, which eventually propelled Fujitsu’s Fugaku supercomputer to the number one slot in the TOP500 rankings, and which notably accomplished this by extending the instruction set to support the 512-bit Scalable Vector Extension (SVE) units in the A64FX processor.

“So why wouldn’t a RISC-V-based system be number one on the TOP500 someday?” he asked.

There has already been work done on RISC-V instructions and architecture extensions relating to HPC, Potler claimed, especially those for vector processing and floating point operations.

All of this means that RISC-V has potential, but could it really make headway in the HPC sector, which once boasted systems with a variety of processor architectures but is now dominated almost entirely by x86 and Arm?

“RISC-V does have the potential to become the architecture of choice for the HPC market,” said Omdia chief analyst Roy Illsley. “I think Intel is losing its control of the overall market and the HPC segment is becoming more specialized.”

Illsley pointed out that RISC-V’s open-source nature means that any chipmaker can produce RISC-V-based designs without paying royalties or licensing fees, and that it is supported by many silicon makers as well as by open-source operating systems.

Manoj Sukumaran, Principal Analyst for Datacenter Compute & Networking at Omdia, agreed, saying that the biggest advantage for RISC-V is that its non-proprietary architecture lines up well with the technology sovereignty goals of various countries. “HPC capacity is a strategic advantage to any country and it is an inevitable part of a country’s scientific and economic progress. No country wants to be in a situation like China or Russia and this is fueling RISC-V adoption,” he claimed.

RISC-V is also a “very efficient and compelling instruction set architecture” and the provision to customize it for specific computing needs with additional instructions makes it agile as well, according to Sukumaran.

The drive for sovereignty, or at least greater self-reliance, could be one motive behind the call from the EuroHPC JU for a partnership framework to develop HPC hardware and software based on RISC-V as part of an EU-wide ecosystem.

This is expected to be followed up by an ambitious plan of action for building and deploying exascale and post-exascale supercomputers based on this technology, according to the EuroHPC JU.

It stated in its announcement that the European Chips Act identified RISC-V as one of the next-generation technologies where investment should be directed in order to preserve and strengthen EU leadership in research and innovation. This will also reinforce the EU’s capacity for the design, manufacturing and packaging of advanced chips, and the ability to turn them into manufactured products.

High-performance RISC-V designs already exist from chip companies such as SiFive and Ventana, but these are typically either designs that a customer can take and have manufactured by a foundry company such as TSMC, or available as a chiplet that can be combined with others to build a custom system-on-chip (SoC) package, which is Ventana’s approach.

Creating a CPU design with custom instructions to accelerate specific functions would likely be beyond the resources of most HPC sites, but perhaps not a large user group or forum. However, a chiplet approach could de-risk the project somewhat, according to IDC Senior Research Director for Europe, Andrew Buss.

“Rather than trying to do a single massive CPU, you can assemble a SoC from chiplets, getting your CPU cores from somewhere and an I/O hub and other functions from elsewhere,” he said, although he added that this requires standardized interfaces to link the chiplets together.

But while RISC-V has potential, the software ecosystem is more important, according to Buss. “It doesn’t matter what the underlying microarchitecture is, so long as there is a sufficient software ecosystem of applications and tools to support it,” he said.

Potler agreed with this point, saying that “One of the most critical parts for HPC success is the software ecosystem. Because we’ve all worked on architectures where the software came in second, and it was a very frustrating time, right?”

Developer tools, especially compilers, need to be “solid, they need to scale, and they need to understand the ISA very well to generate good code,” he said.

This also plays a part in defining custom instructions, as this calls for a profiler or other performance analysis tools to identify time-consuming sequences of code in the applications in use and to gauge whether specialized instructions could accelerate them.

“So if I take these instructions out, I need a simulator that can simulate this [new] instruction. If I put it in here and take the other instructions out, the first question is, are the answers correct? Then the other thing would be: does it run enough to make it worthwhile?”
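
A rough sketch of what that checking step might look like in practice, with every name invented for illustration: model the proposed instruction in plain C, replay it against the existing instruction sequence over test inputs, and count how often the sequence is actually executed – stand-ins for Potler’s “are the answers correct?” and “does it run enough?” questions. A real flow would run this inside an ISA simulator over profiled workloads rather than a handful of hand-picked values.

```c
/* Illustrative sketch (all names hypothetical): checking a proposed custom
 * multiply-accumulate instruction against the instruction sequence it would
 * replace. */
#include <stdint.h>
#include <stdio.h>

/* Reference: the existing sequence of instructions (multiply, then add). */
static int64_t ref_madd(int64_t acc, int32_t a, int32_t b)
{
    return acc + (int64_t)a * b;
}

/* Software model of the proposed single custom instruction; it must match
 * the reference bit-for-bit to answer "are the answers correct?". */
static int64_t model_madd(int64_t acc, int32_t a, int32_t b)
{
    return acc + (int64_t)a * b;
}

int main(void)
{
    const int32_t a[] = {1, -7, 300, 12345};
    const int32_t b[] = {9, 42, -11, 6789};
    int64_t acc_ref = 0, acc_new = 0;
    long mismatches = 0, dynamic_uses = 0;

    for (int i = 0; i < 4; i++) {
        acc_ref = ref_madd(acc_ref, a[i], b[i]);
        acc_new = model_madd(acc_new, a[i], b[i]);
        dynamic_uses++;                 /* proxy for "does it run enough?" */
        if (acc_ref != acc_new)
            mismatches++;
    }

    printf("mismatches=%ld, dynamic uses=%ld\n", mismatches, dynamic_uses);
    return mismatches != 0;
}
```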

Another important factor is whether the compiler could recognize these sequences of code in the application and replace them with the custom instruction to boost performance, Potler said.

“You also see that extensions to the instruction set architecture will provide performance benefits to current and future HPC applications, whatever they may be,” he added.

However, Buss warned that even if there is a great deal of interest in RISC-V, it will take time to get there for users at HPC sites.

“There’s nothing stopping RISC-V, but it takes time to develop the performance and power to the required level,” he said, pointing out that it took the Arm architecture over a decade to get to the point where it could be competitive in this space.

There was also the setback of Intel pulling its support for the RISC-V architecture last month, after earlier becoming a premier member of RISC-V International, the governing body for the standard, and pledging to offer validation services for RISC-V IP cores optimized for manufacturing in Intel fabs.

How to improve the consumer offboarding experience

We often think about the start and middle points of the consumer experience but how often do we think about the end? In this article, adapted from his book Endineering, Joe Macleod, veteran product developer, explains how businesses can productively and meaningfully disengage with consumers.

Businesses often fail to engage in purposeful and proactive methods to end consumer product or service lifecycles. The consequence is a failed approach to endings that is damaging customer relationships, businesses and the environment.

What if the end isn’t all bad? What if there is actually much to be gained at the end? I’ve been working on endings in the consumer lifecycle for over a decade – researching, publishing books, speaking around the world at conferences, and working with some of the world’s biggest companies.

Here are some suggestions on how to achieve positive offboarding experiences for customers.

Consciously connected

The consumer experience should feel similar at both the beginning and the end of the consumer lifecycle.

Currently, many offboarding experiences are delivered without care or interest from the provider. Further still, offboarding is sometimes delivered by entirely different groups, for example municipal organisations such as waste management, or health and safety representatives.

The same narrative voice should offboard the consumer from the experience, with similar principles and tone of voice as when they were being onboarded.

Emotional triggers

The emotional richness delivered at onboarding helps consumers to engage. These feelings should be matched at offboarding, inspiring engagement and interest from all parties.

Being present as a brand both emotionally and actively is important at the end. Currently many brands seem to struggle with appearing authentic.

Emotional triggers should offer an opportunity for the consumer to reflect personally on the experience gained with the brand.

Endineering by Joe Macleod. Image: Joe Macleod

Measurable and actionable

Consumers should have a clear, measurable understanding of the impact of their consumption at offboarding. This information should be delivered in a way that enables the consumer to reflect upon their involvement in consumerism and be empowered to do something about it.

Businesses and governments around the world need to build and agree upon common measuring systems that are easily understood by the consumer.

This would establish a shared language for the consumer and the provider to communicate about the status of lingering assets, whether these are digital, service or physical product endings.

Identify and bond consumer and provider

Society needs to attach personal identity to consumerism. Consumers should be recognised as perpetrators of their past consumer activity.

Currently, the physical fallout of consumption is too easily relinquished, shipped overseas or left in the atmosphere for the most vulnerable in the world and future generations to grapple with.

However, the consumer shouldn’t be abandoned to deal with this responsibility alone. It should be shared with the provider, tied to the neutralising of assets.

Businesses need to move beyond relationships limited to a ‘good usage experience’ and start to be proud partners with consumers working towards a healthier conclusion.

Neutralising the negative consequences of consumption

Following on from the previous point, neutralising the assets of consumption should be the joint responsibility of both consumer and provider. People understand how some products, vegetable matter for example, are neutralised through organic decay. Other assets, like recycled plastics, appear to have smooth, accessible routes to offboarding courtesy of municipal recycling bins and collections.

But it’s what happens afterwards that is less visible. Plastic often gets shipped to vulnerable countries where people who are unprotected by safety laws process the material. Although the plastic material might eventually be neutralised, the consequences have knock-on effects.

Businesses, consumers and wider society need to see the issue of neutralising assets as an integral consumer experience.

For example, one simple improvement would be changing what is communicated at the end of product life. Rather than saying a product is ‘recyclable’, provide details such as: ‘This product is dismantled by x method, then gets recycled by x process, at this place in x country. This process is completed within this amount of time and costs this amount of carbon, which is then offset’.

Timely and attentive

Businesses need to intervene at the end of the lifecycle with an active and attentive attitude. If the consumer experience is left to linger on beyond a planned ending, the assets become outdated, obsolete and risk falling out of control into the wider environment. This has become normal in recent decades, thus promoting indifference about unused products, accounts and subscriptions.

Businesses should redefine timeframes and styles of engagement with the consumer. In the short term, they will need to engage actively with the consumer to put an end to unused assets that linger in the physical, digital and service landscapes. This will seem counterintuitive to a business culture that has, in the past, benefitted from overlooking endings. But, in the long term, businesses that get this right will benefit from deeper, more loyal partnerships based on trusted re-engagement over years.

Strategic approaches will become more sophisticated, not only with regard to the consumer experience and long-term impact, but also as a means of collaboration to improve consumerism.

By Joe Macleod

Joe Macleod has experience in product development across various industries including leading e-communications and digital companies. Now he trains business influencers, policy makers, designers, product developers and individuals across diverse industries about the need for ‘good endings’ and how to achieve them. His book, Endineering: Designing consumption lifecycles that end as well as they begin, is available from online booksellers and www.andend.co.
