Facial-recog algos increasingly surveilling lawyers at home • The Register

In brief Contract lawyers are increasingly working under the thumb of facial-recognition software as they continue to work from home during the COVID-19 pandemic.

The technology is hit-and-miss, judging from interviews with more than two dozen American attorneys conducted by the Washington Post. To make sure these contract lawyers, who take on short-term gigs, are working as expected and are handling sensitive information appropriately, their every move is followed by webcams.

The monitoring software is mandated by their employers, and is used to control access to the legal documents that need to be reviewed. If the system thinks someone else is looking at the files on the computer, or equipment has been set up to record information from the screen, the user is booted out.

For some of the legal eagles, especially those with darker skin, this working environment is beyond tedious. The algorithms can’t reliably recognize their faces, or are thrown off by the lighting in their room, the quality of the webcam, or small facial movements. These cause the monitoring software to think an unauthorized person is present, or some other infraction has taken place, and an alert is generated.

One lawyer said twisted knots in her hair were mistaken for “unauthorized recording devices,” and she was often kicked out of the system – she said she had to log in more than 25 times on some days.

Many said they felt dehumanized and hated feeling like they were “treated like a robot.” Others, however, said they didn’t mind being monitored so much and were actually more productive because of it. We’ve more about this kind of surveillance tech here.

AI skin cancer algorithm databases short on patients with darker skin

Public datasets used to train and test AI skin cancer algorithms lack racial diversity, which could lead to models that perform less accurately when analyzing darker skin tones.

A paper published this week in The Lancet Digital Health and presented at the National Cancer Research Institute found that 21 open-source skin cancer datasets predominantly contained images of fair skin.

There were 106,950 images in total, and only 2,436 of them carried a skin-type label at all. Among those 2,436 images, just ten showed people with brown skin, and only one was marked as dark brown or black skin.
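To get a sense of how sparse that labelling is, a quick audit of a dataset's metadata makes the gap obvious. The sketch below is a hypothetical Python example: the CSV filename and the skin_type column are assumptions, since the 21 open-source datasets in the study each use their own schema.

import pandas as pd

# Hypothetical metadata file and column name; real datasets use their own schemas
meta = pd.read_csv("skin_lesion_metadata.csv")

total = len(meta)
labelled = meta["skin_type"].notna().sum()

print(f"{labelled} of {total} images ({labelled / total:.1%}) carry a skin-type label")
print(meta["skin_type"].value_counts(dropna=False))  # how the labels are distributed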

“We found that for the majority of datasets, lots of important information about the images and patients in these datasets wasn’t reported,” said David Wen, co-author of the study and a dermatologist from the University of Oxford. “Research has shown that programs trained on images taken from people with lighter skin types only might not be as accurate for people with darker skin, and vice versa.”

Although these datasets are geared toward academic research, it's difficult to tell whether any commercial medical systems have been affected by their limitations.

“Evaluating whether or which commercial algorithms have been developed from the datasets was beyond the scope of our review,” he told The Register. “This is a relevant question and may indeed form the basis for future work.”

Enter Cohere: can it talk the talk?

GPT-3 isn’t the only large commercial language model in town. Customers now have more choice than ever after startup Cohere launched its AI text-generation API and announced a multi-year deal to run its models on Google’s TPUs.

Such contracts are lucrative for cloud providers: Cohere will pay Google large sums for compute resources, and in turn Google will help Cohere sell its API, according to TechCrunch. It’s a win-win.

Developers only have to add a few lines of code to their applications to access Cohere’s models via the API. They can also fine-tune the models on their own datasets for all sorts of tasks, such as generating or summarizing text.
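For a sense of what those few lines of code might look like, here is a minimal sketch using Cohere's Python SDK. The model name, prompt, and parameters are illustrative assumptions, and the SDK's interface has changed across versions, so treat this as a sketch rather than copy-paste-ready code.

import cohere  # official Python SDK: pip install cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder key; create a real one in Cohere's dashboard

# Ask a hosted generation model to continue a prompt; model name is illustrative
response = co.generate(
    model="command",
    prompt="Summarise in one sentence: The quarterly report shows revenue grew 12 percent.",
    max_tokens=50,
)

print(response.generations[0].text)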

“Until now, high-quality NLP models have been the sole domain of large companies,” Cohere’s co-founder and CEO, Aidan Gomez, said. “Through this partnership we’re giving developers access to one of the most important technologies to emerge from the modern AI revolution.”

Other commercial models include Nvidia’s Megatron and AI21 Labs’ Jurassic-1.

OpenAI’s GPT-3 API is now generally available

OpenAI announced that its GPT-3 API is now generally available: users in supported countries can sign up and immediately play around with the model.

“Our progress with safeguards makes it possible to remove the waitlist for GPT-3,” it said this week.

“Tens of thousands of developers are already taking advantage of powerful AI models through our platform. We believe that by opening access to these models via an easy-to-use API, more developers will find creative ways to apply AI to a large number of useful applications and open problems.”

Previously, developers had to wait until they were approved by the company before they could use the tool. Although OpenAI said it has changed some of its user restrictions, developers cannot use the AI text generation model for certain applications and in some cases may be required to implement a content filter.

Things like general purpose chatbots that can spew hate speech or NSFW text are definitely banned.
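For developers coming off the waitlist, a basic completion request looked roughly like the sketch below. The engine name and parameters are assumptions for illustration (the openai library's interface has since changed), and any real application still has to comply with OpenAI's usage policies.

import os
import openai  # pip install openai; this sketch uses the pre-1.0 Completions interface

openai.api_key = os.environ["OPENAI_API_KEY"]  # keep keys out of source code

# Engine name and sampling settings are illustrative, not recommendations
completion = openai.Completion.create(
    engine="davinci",
    prompt="Write a one-sentence summary of what an API is.",
    max_tokens=40,
    temperature=0.7,
)

print(completion.choices[0].text.strip())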

What it’s like to be an ‘Amazombian’ constantly watched by AI cameras

One man went undercover at an Amazon fulfillment center in Montreal, and said its AI cameras were “the most insidious form of surveillance” for workers.

Mostafa Henaway, a community organizer at the Immigrant Workers Centre, an organization that fights for immigrant rights, and a PhD candidate at Concordia University, decided to work as an “Amazombian” for a month. He described what it was like to work the graveyard shift, from 1:20am until 12pm, on weekdays.

Workers have to strap a device to their arm, which tells them what tasks they should do for the day and logs their working hours. AI cameras, installed during the COVID-19 pandemic to make sure co-workers stayed six feet away from each other, scan their every move. Even supervisors can’t escape their glare.

“The artificial cameras only ensured our obedience,” he wrote in The Breach, a Canadian news outlet.

“Every six minutes, the AI cameras analyze every worker and the distance between them, generating a report at the end of the shift. The use of big data artificial intelligence shows that even management is not themselves in control—they are simply there to enforce algorithms and predetermined tasks.”
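As a purely illustrative example of the kind of check such a system performs (not Amazon's actual implementation), the sketch below flags any pair of workers whose estimated floor positions are closer than six feet.

from itertools import combinations
import math

# Hypothetical (x, y) floor positions in feet, as might be estimated from a camera feed
positions = {"worker_a": (0.0, 0.0), "worker_b": (4.0, 3.0), "worker_c": (20.0, 5.0)}

MIN_DISTANCE_FT = 6.0

violations = []
for (name1, p1), (name2, p2) in combinations(positions.items(), 2):
    distance = math.dist(p1, p2)  # straight-line distance between the two workers
    if distance < MIN_DISTANCE_FT:
        violations.append((name1, name2, round(distance, 1)))

print(violations)  # here: [('worker_a', 'worker_b', 5.0)]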

But hey, at least the guy responsible for it all got his joyride in space. ®

Tech neck: what are smartphones doing to our bodies? | Life and style

Name: Tech neck.

Age: Two years old.

Appearance: The next stage of human evolution.

This sounds exciting! Are we all going to be cyborgs soon? Not exactly.

Then what on earth is tech neck? That’s easy. It’s the hunch you develop from staring at your phone too much.

That’s less exciting. And less deniable. The Australian Chiropractors Association claims that our compulsive use of mobile devices is changing the shape of our bodies.

How? Let’s say you hold your phone at an angle that makes you lower your head by 60 degrees. That adds approximately 27kg (60lbs) of weight through your spine. Now, imagine doing that for several hours every day. That’s one messed up back.

Hang on, you said that tech neck is only two years old. Phones are older than that, and “text neck” was identified as an ailment in 2011, but the pandemic made things so much worse.

All in the angle … tilting the head forward adds pressure (posed by model). Photograph: Михаил Руденко/Getty Images/iStockphoto

It did? For month after month you were starved of normal human contact, and had to communicate with the rest of the world through your phone. And when you weren’t doing that, you spent your time doom-scrolling in horror through a barrage of some of the worst news in modern history.

That sounds just like me. Me too. And guess what? All that bad news was a pain in the neck.

Well, on the plus side phones have only harmed us in one way. Or two, if you count “phone thumb”, a condition where your thumb can become inflamed from prolonged texting.

OK, fine, two ways. Or three, if you factor in the claim that the blue light emitted by phones can interfere with melatonin production. Or four, if you count the eye strain you get from prolonged use. And a couple of years ago it was suggested that humans are growing bone spurs at the base of their skulls to counter all the terrible phone-related posture.

Please, stop! Do you want to know the good news?

Yes! Anything! The posture problem is easy to correct. You can do a simple stretch, where you interlock your fingers behind your head and hold your elbows against a wall.

That’s promising. Or you could try holding your phone at eye level, to reduce the pressure on your spine. Or make an extra effort to stay active throughout the day.

This is good. I can do this. Then again, there is a better way to combat tech neck.

This sounds ominous. You could always try not using your phone as much.

Never! The humps are worth it! Suit yourself.

Do say: “The best way to avoid tech neck is to put your phone down.”

Don’t say: “You know, in a minute, after you’ve watched all those TikToks.”

VMware demos ‘bare-metal’ performance from virtualized GPUs • The Register

The future of high-performance computing will be virtualized, VMware’s Uday Kurkure has told The Register.

Kurkure, the lead engineer for VMware’s performance engineering team, has spent the past five years working on ways to virtualize machine-learning workloads running on accelerators. Earlier this month his team reported “near or better than bare-metal performance” for Bidirectional Encoder Representations from Transformers (BERT) and Mask R-CNN — two popular machine-learning workloads — running on virtualized GPUs (vGPU) connected using Nvidia’s NVLink interconnect.

NVLink enables compute and memory resources to be shared across up to four GPUs over a high-bandwidth mesh fabric operating at 6.25GB/s per lane compared to PCIe 4.0’s 2.5GB/s. The interconnect enabled Kurkure’s team to pool 160GB of GPU memory from the Dell PowerEdge system’s four 40GB Nvidia A100 SXM GPUs.
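The pooled-memory figure is simple arithmetic on the numbers above; the short sketch below just restates them.

# Numbers from the Dell PowerEdge test rig described above
gpus = 4
memory_per_gpu_gb = 40
pooled_memory_gb = gpus * memory_per_gpu_gb  # 160 GB of GPU memory visible to one job

nvlink_gb_per_s_per_lane = 6.25
pcie4_gb_per_s_per_lane = 2.5
print(f"Pooled GPU memory: {pooled_memory_gb} GB")
print(f"NVLink per-lane advantage over PCIe 4.0: {nvlink_gb_per_s_per_lane / pcie4_gb_per_s_per_lane:.1f}x")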

“As the machine learning models get bigger and bigger, they don’t fit into the graphics memory of a single chip, so you need to use multiple GPUs,” he explained.

Support for NVLink in VMware’s vSphere is a relatively new addition. By toggling NVLink on and off in vSphere between tests, Kurkure was able to determine how large an impact the interconnect had on performance.

And in what should be a surprise to no one, the large ML workloads ran faster, scaling linearly with additional GPUs, when NVLink was enabled.

Testing showed Mask R-CNN training running 15 percent faster in a twin GPU, NVLink configuration, and 18 percent faster when using all four A100s. The performance delta was even greater in the BERT natural language processing model, where the NVLink-enabled system performed 243 percent faster when running on all four GPUs.

What’s more, Kurkure says the virtualized GPUs were able to achieve the same or better performance compared to running the same workloads on bare metal.

“Now with NVLink being supported in vSphere, customers have the flexibility where they can combine multiple GPUs on the same host using NVLink so they can support bigger models, without a significant communication overhead,” Kurkure said.

HPC, enterprise implications

Based on the results of these tests, Kurkure expects most HPC workloads will be virtualized moving forward. The HPC community is always running into performance bottlenecks that leave systems underutilized, he added, arguing that virtualization enables users to make much more efficient use of their systems.

Kurkure’s team was able to achieve performance comparable to bare metal while using just a fraction of the dual-socket system’s CPU resources.

“We were only using 16 logical cores out of 128 available,” he said. “You could use that CPU resources for other jobs without affecting your machine-learning intensive graphics modules. This is going to improve your utilization, and bring down the cost of your datacenter.”

Toggling NVLink on and off between GPUs also adds platform flexibility, allowing multiple isolated AI/ML workloads to be spread across the GPUs simultaneously.

“One of the key takeaways of this testing was that because of the improved utilization offered by vGPUs connected over a NVLink mesh network, VMware was able to achieve bare-metal-like performance while freeing idle resources for other workloads,” Kurkure said.

VMware expects these results to improve resource utilization in several applications, including investment banking, pharmaceutical research, 3D CAD, and auto manufacturing. 3D CAD is a particularly high-demand area for HPC virtualization, according to Kurkure, who cited several customers looking to implement machine learning to assist with the design process.

And while it’s possible to run many of these workloads on GPUs in the cloud, he argued that cost and/or intellectual property rules may prevent customers from doing so.

vGPU vs MIG

It is worth noting that VMware’s tests were conducted using Nvidia’s vGPU Manager in vSphere, as opposed to the hardware-level partitioning offered by multi-instance GPU (MIG) on the A100. MIG essentially allows the A100 to behave like up to seven less-powerful GPUs.

By comparison, vGPUs are defined in the hypervisor and are time-sliced. You can think of this as multitasking where the GPU rapidly cycles through each vGPU workload until they’re completed.

The benefit of vGPUs is users can scale well beyond seven GPU instances at the cost of potential overheads associated with rapid context switching, Kurkure explained. However, at least in his testing, the use of vGPUs didn’t appear to have a negative impact on performance compared to running on bare metal with the GPUs passed through to the VM.
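As a purely conceptual illustration of that time-slicing (not VMware's scheduler), the sketch below cycles round-robin through queued vGPU workloads, giving each one a fixed slice of work before switching to the next, until every job finishes.

from collections import deque

# Each entry is (workload_name, remaining_work_units); the numbers are made up
workloads = deque([("vgpu_a", 5), ("vgpu_b", 3), ("vgpu_c", 8)])
SLICE = 2  # work units executed per turn before the scheduler switches context

while workloads:
    name, remaining = workloads.popleft()
    done = min(SLICE, remaining)
    remaining -= done
    print(f"{name}: ran {done} units, {remaining} left")
    if remaining > 0:
        workloads.append((name, remaining))  # context switch: job goes to the back of the queue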

Whether MIG would change this dynamic remains to be seen and is the subject of another ongoing investigation by Kurkure’s team. “It’s not clear when you should be using vGPU and when we should be running in MIG mode,” he said.

More to come

With vGPUs over NVLink validated for scale-up workloads, VMware is now exploring how these workloads scale across multiple systems and racks using RDMA over Converged Ethernet (RoCE). Here, he says, networking becomes a major consideration.

“The natural extension of this is scale out,” he said. “So, we’ll have a number of hosts connected by RoCE.”

Kurkure’s team is also investigating how virtualized GPUs perform with even larger AI/ML models, such as GPT-3, as well as how these architectures can be applied to telco workloads running at the edge. ®

The Irish start-up tackling employee mental wellbeing

Pause offers coaching, audit, supervision and training services in a bid to deliver measurable mental wellbeing improvements for organisations.

A new Irish start-up called Pause aims to help employers implement good mental wellbeing practices in the workplace following a tough couple of years for workers.

The company is led by Báirbre Meehan, who has been in senior leadership roles for 25 years and is a trained executive coach with a master’s in business and executive coaching.

Meehan realised that there was a gap in the market when it came to managing employee mental wellbeing, which was only widened by the stresses of the pandemic.

She undertook a research project into mental wellbeing after seeing first-hand the impact that mental health issues were having on employee performance. For five years, she worked with GPs, psychotherapists and word-of-mouth referrals to support and monitor mental wellbeing improvements in more than 100 people.

Her research found that short-term coaching intervention led to a 70pc improvement in collective employee mental wellbeing, with positive mental wellbeing maintained at six-month and two-year review stages.

Meehan used what she found out to develop Pause. She is now launching the company at a pivotal time for employer-employee relations, as workplaces continue reopening and companies negotiate hybrid and remote work policies with staff.

Recent Pause research, carried out in 2021, revealed that senior HR leaders are finding it increasingly difficult to support employee mental wellbeing due to the distance involved in hybrid and remote working arrangements.

New ways of working have made it harder to identify employees who are struggling with their mental wellbeing, and harder to convince them to seek support, according to the findings.

‘People are finding it difficult to cope’

Meehan acknowledged that the pandemic had a “significant impact on people’s stress levels, which were already high before the pandemic, but are now at an all-time high”.

“The pace of life and working life has escalated to such an extent that people are finding it difficult to cope. The phased return to the workplace is causing a large amount of anxiety for varying reasons,” she said.

She added that people are finding it hard to draw boundaries between work and home, pointing to the introduction of the right to disconnect in Ireland last year to help people switch off and achieve a better work-life balance.

“In addition, the global pandemic caused people to re-evaluate their attitudes to work-life balance,” Meehan said.

“This makes employee retention and attraction a critical issue for organisations, and one they are struggling to manage. This is a really complex area, but Pause has developed a provable and measurable system of improving employee mental wellbeing, which has a clear positive impact on business results and employee retention.”

Meehan was the 2021 winner of the Empower Start pitching competition for women entrepreneurs based on her work with Pause. This was a Dragon’s Den-style competition delivered through the innovation hubs at Galway-Mayo Institute of Technology, IT Sligo and Letterkenny IT, which recently amalgamated to form Atlantic Technological University (ATU).

Pause is based at ATU Sligo’s innovation centre. The team currently includes Meehan and two other coaches, one of whom is a psychotherapist based in the UK.

Meehan plans to employ and train more coaches in the Pause method over the coming years.
