Digital ethics: What do leaders need to think about?

From privacy and surveillance to fairness and transparency, Avanade Ireland’s Graham Healy discusses what leaders need to think about when it comes to digital ethics.

As digital transformation accelerates, there are plenty of issues for leaders to contend with, from considering a remote workforce to a decentralised data management system.

However, there are also ethical issues to consider when it comes to digitalisation, including data privacy, transparency and accessibility.

According to Graham Healy, the areas on which leaders need to focus their attention depend on several factors, including the business they’re in. Healy is the country manager for Avanade in Ireland, a joint venture between Microsoft and Accenture that delivers digital, IT and advisory services to clients all over the world.

“We expect a retail company in Europe working to personalise their customers’ experience to be more concerned about privacy than other firms, while a bank working on an AI system for customer financial advice would have considerations around fairness and transparency,” he said.

“In addition, government agencies have additional societal considerations such as equality and accessibility.”

Digital ethics trends

While challenges differ from sector to sector, Healy said there are some common themes emerging, especially in the area of remote working. With the mass shift to working from home over the last year, he said leaders must ensure they do not overstep when it comes to privacy and surveillance, even if tools allow it.

“In other words, with rights come responsibilities, which is even more evident in the virtual world,” he said.

Healy also highlighted the broader, more deep-seated digital ethics issues around how technology is designed. “We have already seen examples of bias coming from artificial intelligence applications in many areas of life, whether it be finance, healthcare, law enforcement and so forth,” he said.

There have been several examples in recent years highlighting bias in AI, including an MIT image library that was withdrawn after researchers discovered that it contained racist and misogynistic terms that could be used in the training of machine learning models. Plenty of stakeholders have been attempting to address these issues, including the EU, UNESCO and individual companies.

However, Healy also highlighted a more optimistic trend within digital ethics, which is that people want to work for, shop with and invest in companies which they believe align with their personal values. “Companies that really take their corporate values to heart are seeing a competitive advantage and stronger brand value,” he said.

Many areas of digital ethics explore the ‘what’ and the ‘how’ of topics such as data privacy and AI bias. But Healy said the societal impacts around those ethical areas also need to be addressed.

“We don’t pay enough attention to the psychological and emotional impacts of the technology we use,” he said.

“For example, the stress people feel when they feel like they’re under constant surveillance, the divisiveness of misleading online content that has been magnified by social media for the sake of engagement, and the unbelievably toxic attacks on women and minorities online.”

Healy also said the tech industry as a whole needs to examine its environmental impact as an ethical issue.

“Tech companies have prospered in an industry focused on growth, with customers eager for the latest gadgets and features,” he said.

“But that emphasis on growth and innovation often comes at the expense of quality, durability, and eventually the environment. As an industry, we’re only just starting to have honest conversations about this.”

What can leaders do?

Healy said business and tech leaders need to examine their company’s corporate values to help guide decisions when it comes to digital ethics.

“Often, it’s not a matter of ‘whether’ but ‘how’ to move forward with a digital initiative. And here we see that digital ethics has more to do with decisions, actions and behaviours than with the technology itself.”

Healy said that training and awareness is also key to building a digitally ethical culture. “This training should be reinforced through frequent conversations too, whether in planning sessions, design-thinking workshops, team check-ins or individual performance reviews. Digital ethics should also be a key design principle when it comes to any company strategy, initiative or technology development.

“Next, they should empower champions throughout the organisation to get involved. Beyond just tech, there are people from HR, marketing, sales, finance etc who have helpful skills and interest in these topics.”

Healy also recommended that leaders tie digital ethics to business benefits as often as possible, which could include making products more accessible, more diverse or more trusted. These ties could lead to greater adoption, especially given the trend of customers gravitating towards products, workplaces and companies that align with their own values.

“Demonstrating that attention to digital ethics can have this kind of positive outcome on the business is likely going to be their best path to get more buy-in and support.”

NFTs not annoying enough? Now they come with wallet-emptying malware • The Register

In brief Whether non-fungible tokens are a flash in the pan or here to stay, malware operators have been keen to weaponise the technology.

An investigation was triggered after a number of cryptowallets belonging to customers of OpenSea, the largest NFT exchange, got mysteriously emptied. Researchers at security shop Check Point found a nasty form of NFT was in circulation, one that came with its own malware package.

People were receiving free NFTs from an unknown benefactor, but when they accepted the gift, the attackers got access to their wallet information in OpenSea’s storage systems. The code generated a pop-up that, if clicked, allowed wallets to be emptied.

After the issue was disclosed, OpenSea had a fix sorted within an hour – we wish others took such prompt action – and the platform appears to be secured. But beware of “free” gifts, particularly where money is involved.

Crime doesn’t pay? Really?

A US Treasury report has said that in the last three years ransomware operators using over 60 different variants have siphoned off $5.3bn in Bitcoin payments.

The Financial Crimes Enforcement Network report [PDF], first spotted by The Record, said that ransoms taken in the first six months of this year amounted to $590m, up from $416m for the whole of 2020, and the problem is getting worse, according to the agency’s analysis of 2,184 Suspicious Activity Reports (SARs) filed over the past ten years.

“If current trends continue, SARs filed in 2021 are projected to have a higher ransomware-related transaction value than SARs filed in the previous 10 years combined, which would represent a continuing trend of substantial increases in reported year-over-year ransomware activity,” the Treasury team warned.

Arming robots with sniper rifles, not worrying at all

US-based Ghost Robotics showed off an unusual new gadget this week at a meeting of the Association of the United States Army – a sniper rifle robot.

The robotics firm already has unarmed robot dogs acting as sentries at Tyndall Air Force Base, but the new model mounts a 6.5mm sniper rifle with a range of up to 1,200 metres (3,937 feet) and both day and night vision cameras. The manufacturers were at pains to point out that the robot is not autonomous in any way: a human always controls the trigger, and the robot simply gets into position to keep its operator safe.

The robot caused something of a storm, and Ghost Robotics CEO Jiren Parikh attributed this to the emotional connection robot dogs evoke and decades of movies about killer robots.

US warns critical water systems under attack

American online watchdogs at the Cybersecurity and Infrastructure Security Agency (CISA) have issued a security advisory following a spate of attacks against water and waste management facilities.

CISA said it had recorded five attacks against water systems since 2019 – mostly ransomware, but also a former employee at a Kansas-based water company who tried to tamper with drinking water quality using credentials that should have been revoked when they left the biz.

For ransomware operators, such businesses are tempting targets. Since water is such an essential service, it’s no doubt assumed that such utilities would be more likely to pay up than risk widespread disruption and panic.

Ukrainian cops cuff botnet suspect

The Security Service of Ukraine announced this week that they had arrested a man accused of running a massive botnet and charging for its use.

The man, a resident of the Ivano-Frankivsk region in the west of the country, is said to have been running a botnet made up of over 100,000 infected systems. His opsec wasn’t great: he used Telegram to tout for customers and, police say, made use of “electronic payment systems banned in Ukraine.”

A search of the suspect’s premises revealed computer equipment used to operate the botnet, along with data stolen from infected machines. Police say the suspect was also a representative of the legitimate Russian payment service Webmoney, which is, however, under sanctions from the Ukrainian government.

How scientists in Ireland are using technology to predict the climate

Scientists at ICHEC have used supercomputing to predict Ireland’s weather patterns for the rest of the century.

In August, the Intergovernmental Panel on Climate Change (IPCC) spelled out the intensity of the climate crisis affecting every region of the world because of human activity, and Ireland is no exception.

Scientists at the Irish Centre for High-End Computing (ICHEC), based at NUI Galway, have been using advanced technology to create climate models and simulations that indicate the impact of the climate crisis on Ireland by mid-century.

Their work has raised some concerning predictions for Ireland’s weather patterns in the coming decades, including more heatwaves, less snow, and increasingly unpredictable rainfall patterns – even by Irish standards.

Temperatures are set to increase by between 1 and 1.6 degrees Celsius relative to levels experienced between 1991 and 2000, with the east seeing the sharpest rise. Heatwaves, especially in the south-east of the country, are expected to become more frequent.

The simulations also found that the number of days Ireland experiences frost and ice will be slashed by half, as will the amount of snow that falls in winter. Rainfall will be more variable with longer dry and wet periods, and surface winds will become weaker.
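Projections like these are expressed as anomalies: the average temperature of a simulated future period minus the average of the 1991-2000 reference period. Here is a minimal sketch of that calculation in Python, with made-up annual temperatures standing in for real EC-Earth output and a hypothetical 2041-2050 window:

from statistics import mean

# Made-up annual mean temperatures in degrees Celsius; real values would come from model output.
baseline_1991_2000 = [9.4, 9.6, 9.3, 9.7, 9.5, 9.6, 9.4, 9.8, 9.5, 9.6]
projected_2041_2050 = [10.6, 10.9, 10.7, 11.0, 10.8, 10.9, 10.7, 11.1, 10.9, 11.0]

# Warming anomaly = mean of the projected decade minus mean of the reference decade.
anomaly = mean(projected_2041_2050) - mean(baseline_1991_2000)
print(f"Projected warming relative to 1991-2000: {anomaly:.2f} °C")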

‘Dramatic changes’

While the report suggests that a warming climate may be good for farming in Ireland – a significant contributor to the economy – it will also be accompanied by the rise of pests that can have potentially devastating effects on agriculture.

Reduced wind strength and unpredictable weather will have an impact on Ireland’s growing renewable energy infrastructure, which relies heavily on specific climate conditions to reach targets.

“A mean warming of two or three degrees Celsius does not seem like much, given that temperatures can vary by a lot more than that just from day to day,” said ICHEC climate scientists Dr Paul Nolan and Dr Enda O’Brien.

“However, even that amount of warming is likely to lead to widespread and even dramatic changes in ice cover – especially in the Arctic – to sea levels, and in the natural world of plants and animals.”

Ireland’s contribution

With machine learning and supercomputing, scientists are able to use historical climate data and observations to improve predictions of Earth’s future climate – and the impacts of the climate crisis.

Ireland is part of a consortium of several northern European countries that contribute to the IPCC reports by running global climate models that feed into the report’s assessment.

As part of the consortium, Nolan has conducted many centuries’ worth of global climate simulations using the EC-Earth climate model, which represents the most relevant physical processes that operate in the atmosphere, oceans, land surface and sea ice.

The simulations range from historical data – so the model can be compared to real climate records – to the end of the 21st century, with the aim of providing a comprehensive picture of climate trends and what the future could hold. The ICHEC research is funded and supported by the Environmental Protection Agency, Met Éireann and the Marine Institute in Galway.

“The level of detail and consistency achieved gives confidence in these projections and allows an ever more persuasive evidence-based consensus to emerge that humans are forcing rapid climate change in well-understood ways,” Nolan and O’Brien wrote in the Irish Times this week.

“How to respond to that consensus now is a matter primarily for governments, since they can have the most impact, as well as for individuals.”

Apple’s plan to scan images will allow governments into smartphones | John Naughton

For centuries, cryptography was the exclusive preserve of the state. Then, in 1976, Whitfield Diffie and Martin Hellman came up with a practical method for establishing a shared secret key over an authenticated (but not confidential) communications channel without using a prior shared secret. The following year, three MIT scholars – Ron Rivest, Adi Shamir and Leonard Adleman – came up with the RSA algorithm (named after their initials) for implementing it. It was the beginning of public-key cryptography – at least in the public domain.
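The core trick is that each party combines a private number with public values so that both arrive at the same secret, while an eavesdropper who sees only the public traffic cannot. Below is a toy Diffie-Hellman exchange in Python, with deliberately tiny, insecure numbers chosen purely for illustration (real deployments use moduli of 2,048 bits or more, or elliptic curves):

# Toy Diffie-Hellman key exchange; the numbers are illustrative and far too small to be secure.
p = 23   # public prime modulus
g = 5    # public generator

alice_secret = 6    # Alice's private exponent (never transmitted)
bob_secret = 15     # Bob's private exponent (never transmitted)

A = pow(g, alice_secret, p)   # Alice sends A = g^a mod p over the open channel
B = pow(g, bob_secret, p)     # Bob sends B = g^b mod p

shared_by_alice = pow(B, alice_secret, p)   # Alice computes B^a mod p
shared_by_bob = pow(A, bob_secret, p)       # Bob computes A^b mod p

assert shared_by_alice == shared_by_bob     # both equal g^(a*b) mod p, which never crossed the wire
print("shared secret:", shared_by_alice)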

From the very beginning, state authorities were not amused by this development. They were even less amused when in 1991 Phil Zimmermann created Pretty Good Privacy (PGP) software for signing, encrypting and decrypting texts, emails, files and other things. PGP raised the spectre of ordinary citizens – or at any rate the more geeky of them – being able to wrap their electronic communications in an envelope that not even the most powerful state could open. In fact, the US government was so enraged by Zimmermann’s work that it defined PGP as a munition, which meant that it was a crime to export it to Warsaw Pact countries. (The cold war was still relatively hot then.)

In the four decades since then, there’s been a conflict between the desire of citizens to have communications that are unreadable by state and other agencies and the desire of those agencies to be able to read them. The aftermath of 9/11, which gave states carte blanche to snoop on everything people did online, and the explosion in online communication via the internet and (since 2007) smartphones, have intensified the conflict. During the Clinton years, US authorities tried (and failed) to ensure that all electronic devices should have a secret backdoor, while the Snowden revelations in 2013 put pressure on internet companies to offer end-to-end encryption for their users’ communications that would make them unreadable by either security services or the tech companies themselves. The result was a kind of standoff: between tech companies facilitating unreadable communications and law enforcement and security agencies unable to access evidence to which they had a legitimate entitlement.

In August, Apple opened a chink in the industry’s armour, announcing that it would be adding new features to its iOS operating system that were designed to combat child sexual exploitation and the distribution of abuse imagery. The most controversial measure scans photos on an iPhone, compares them with a database of known child sexual abuse material (CSAM) and notifies Apple if a match is found. The technology is known as client-side scanning or CSS.
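Mechanically, this kind of client-side scanning amounts to fingerprinting each image on the device and checking the fingerprint against a list of known-bad hashes before anything is encrypted or uploaded. The Python sketch below is illustrative only: the known_csam_hashes set, the perceptual_hash helper, the notify_provider call and the reporting threshold are hypothetical stand-ins, not Apple’s NeuralHash or its actual matching protocol.

import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known illegal images, shipped to the device
# in hashed form (placeholder values, illustrative only).
known_csam_hashes = {"placeholder-hash-1", "placeholder-hash-2"}

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in fingerprint. A real CSS system uses a perceptual hash that survives
    # resizing and re-encoding; the plain SHA-256 used here would not.
    return hashlib.sha256(image_bytes).hexdigest()

def notify_provider(match_count: int) -> None:
    # Hypothetical reporting hook: in a real system this would flag the account for review.
    print(f"{match_count} matches exceeded the threshold; account flagged")

def scan_before_upload(photo_dir: str, report_threshold: int = 10) -> None:
    # All matching happens on-device, on the plaintext images, before any encryption.
    matches = sum(
        1 for path in Path(photo_dir).glob("*.jpg")
        if perceptual_hash(path.read_bytes()) in known_csam_hashes
    )
    if matches >= report_threshold:
        notify_provider(matches)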

Powerful forces in government and the tech industry are now lobbying hard for CSS to become mandatory on all smartphones. Their argument is that instead of weakening encryption or providing law enforcement with backdoor keys, CSS would enable on-device analysis of data in the clear (ie before it becomes encrypted by an app such as WhatsApp or iMessage). If targeted information were detected, its existence and, potentially, its source would be revealed to the agencies; otherwise, little or no information would leave the client device.

CSS evangelists claim that it’s a win-win proposition: providing a solution to the encryption v public safety debate by offering privacy (unimpeded end-to-end encryption) and the ability to successfully investigate serious crime. What’s not to like? Plenty, says an academic paper by some of the world’s leading computer security experts published last week.

The thrust of the CSS lobbying is that the scanning software should be installed on all smartphones, rather than covertly on the devices of suspects or, by court order, on those of ex-offenders. Such universal deployment would threaten the security of law-abiding citizens as well as lawbreakers. And even though CSS still allows end-to-end encryption, that is moot if the message has already been scanned for targeted content before being dispatched. Similarly, while Apple’s implementation of the technology only scans images, it doesn’t take much to imagine political regimes scanning text for names, memes, political views and so on.

In reality, CSS is a technology for what in the security world is called “bulk interception”. Because it would give government agencies access to private content, it should really be treated like wiretapping and regulated accordingly. And in jurisdictions where bulk interception is already prohibited, bulk CSS should be prohibited as well.

In the longer view of the evolution of digital technology, though, CSS is just the latest step in the inexorable intrusion of surveillance devices into our lives. The trend that started with reading our emails, moved on to logging our searches and our browsing clickstreams, mining our online activity to create profiles for targeting advertising at us and using facial recognition to allow us into our offices now continues by breaching the home with “smart” devices relaying everything back to motherships in the “cloud” and, if CSS were to be sanctioned, penetrating right into our pockets, purses and handbags. That leaves only one remaining barrier: the human skull. But, rest assured, Elon Musk undoubtedly has a plan for that too.

What I’ve been reading

Wheels within wheels
I’m not an indoor cyclist but if I were, The Counterintuitive Mechanics of Peloton Addiction, a confessional blogpost by Anne Helen Petersen, might give me pause.

Get out of here
The Last Days of Intervention is a long and thoughtful essay in Foreign Affairs by Rory Stewart, one of the few British politicians who always talked sense about Afghanistan.

The insider
Blowing the Whistle on Facebook Is Just the First Step is a bracing piece by Maria Farrell in the Conversationalist about the Facebook whistleblower.
