
‘I was just really scared’: Apple AirTags lead to stalking complaints

In early January, Brooks Nader, a 26-year-old Sports Illustrated swimsuit model, was walking home alone from a night out in New York when she received a disturbing iPhone notification telling her she was carrying an “unknown accessory”.

“This item has been moving with you for a while,” the alert read. “The owner can see its location.”

That’s when she knew “something wasn’t right”, Nader told the NBC news program Today. Nader discovered that somebody had slipped an Apple AirTag into her coat pocket while she was sitting in a restaurant earlier. Unbeknown to her, the device tracked her location for four hours before Apple’s abuse prevention system triggered the notification to her phone.

AirTags are wireless, quarter-sized Bluetooth devices that retail for $29 each. Apple launched the product in April 2021 as tracking tools that users can pair with the company’s Find My app to help locate lost belongings, like backpacks or car keys.

Yet AirTags have proven easy to abuse: police in New York, Maryland, Idaho, Colorado, Georgia, Michigan, Texas and elsewhere, both within the US and internationally, have reported instances of AirTags being used to stalk individuals and to target cars for theft.

Last week, the New Jersey Regional Operations & Intelligence Center issued a warning to police that AirTags posed an “inherent threat to law enforcement, as criminals could use them to identify officers’ sensitive locations” and personal routines.

AirTags have abuse-mitigation features, including pop-up alerts like the one Nader received and an alarm that beeps at 60 decibels (roughly conversational volume) once the AirTag has been away from its owner for anywhere from eight to 24 hours.

Near the end of 2021, the company released a new Android app called Tracker Detect, designed to help Android users discover suspicious AirTags near them. Yet the app must be proactively downloaded and kept active to be effective, and it is only compatible with Android 9 or higher.
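For readers curious what such scanner apps look for, the sketch below checks a Bluetooth advertisement's manufacturer data for an Apple Find My-style payload. The byte layout follows publicly reverse-engineered research (such as the OpenHaystack project) and is an assumption, not Apple documentation; real detection logic, like Tracker Detect's, is considerably more involved.

```python
# Sketch: flag BLE advertisements that resemble Find My network broadcasts,
# the mechanism AirTags use while separated from their owner. The constants
# below come from publicly reverse-engineered research and may differ
# across firmware versions -- treat them as illustrative assumptions.

APPLE_COMPANY_ID = 0x004C    # Bluetooth SIG company identifier for Apple
FINDMY_PAYLOAD_TYPE = 0x12   # type byte observed on separated-tracker broadcasts

def looks_like_findmy(manufacturer_data: dict[int, bytes]) -> bool:
    """Return True if manufacturer data resembles a Find My advertisement."""
    payload = manufacturer_data.get(APPLE_COMPANY_ID)
    if not payload or len(payload) < 2:
        return False
    # The first payload byte is the message type.
    return payload[0] == FINDMY_PAYLOAD_TYPE

# Synthetic advertisements, shaped the way a BLE scanner library might
# surface them (company ID -> raw payload bytes):
adv = {0x004C: bytes([0x12, 0x19]) + bytes(27)}
print(looks_like_findmy(adv))                 # True
print(looks_like_findmy({0x00E0: b"\x01"}))   # False: non-Apple data
```

A real scanner would additionally track how long the same device stays nearby, which is how the iPhone alerts described above decide an unknown tracker is "moving with you".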

One outcome of these anti-stalking mechanisms is that more people are realizing they are being tracked. On 14 January, police in Montgomery county, Maryland, responded to a call from a person who was followed home from a movie theater after an AirTag was planted on their car. Around the same time, two California women called 911 after receiving notifications that their whereabouts were being tracked while out shopping. A 30 December report from the New York Times cites seven women who believe AirTags were used to surveil them. On social media, posts from mainly women sharing their own experiences of being tracked by AirTags have drawn attention to the issue, with one TikTok video from November 2021 receiving more than 31m views.

If you suspect you’re being tracked, the conventional wisdom is not to head home, but rather to call – or go to – the police. However, law enforcement responses to incidents of AirTag stalking have thus far been inconsistent, and help is not always guaranteed.

When Arizona’s Kimberly Scroop went to local police after receiving an iPhone notification that she was being tracked in September last year, “they were not interested in taking a report, they didn’t take my name or phone number,” she says. “They said if I noticed someone following me, to call the police then.”

Scroop went home and made a TikTok video about her experience being tracked, thinking she should “make as much noise as possible, so there was some public record of it” online in case anything bad happened to her. “I was having a mini panic attack, just really scared,” she says in the post that has now been viewed more than 5.5m times.

In New York, Jackie’s Law – passed in 2014 to allow police to charge people using GPS tracking devices to stalk victims even if the victims have not pressed charges – contributed to police in West Seneca’s decision to subpoena Apple for information about a case involving an AirTag attached to a victim’s car bumper. Nonetheless, Nader claims she was unable to file a report after being tracked in Tribeca, New York City, as police told her no crime had been committed.

In an official statement, Apple says it will cooperate with police “to provide any available information” about unknown AirTags people discover on their person or property. “We take customer safety very seriously and are committed to AirTags’ privacy and security,” says a spokesperson.

Ultimately, their built-in anti-stalking mechanisms and the fact that they can be easily disabled once discovered render AirTags less dangerous than other forms of stalkerware. “If you really are nefarious and evil and you really want to find someone, there are things that are much better than an AirTag,” says Jon Callas, director of technology projects at the Electronic Frontier Foundation, pointing to dedicated trackers in the $100 to $300 range.

Indeed, stalking affects an estimated 7.5 million people in the United States each year, and one in four victims report being stalked through some form of technology, according to the Stalking Prevention Awareness & Resource Center. And it’s on the rise: a 2021 international study by the security company Norton found the number of devices reporting stalkerware daily “increased markedly by 63% between September 2020 and May 2021” with the 30-day average increasing from 48,000 to 78,000 detections. There are thousands of different stalkerware variants, such as Cerberus, GPS tracking devices and Tile, a Bluetooth-enabled AirTag competitor that announced a partnership with Amazon last spring.

To Callas, the conversation around AirTags is drawing much-needed attention to the potential for technology to be misused; he hopes more people will consider the safety risks of tracking devices, regardless of how innocent they seem. “If you make a generalized technology that helps you find your lost keys, it can help you find anything,” he says, “and that includes people”.


Exploring the Enigma of Fermilab’s Quest for the Fifth Force in Nature

By Raza H. Qadri (ALI)
Entrepreneur, Science Enthusiast & Contributor, ‘The Voice of EU’

In the cryptic world of particle physics, a compelling narrative unfolds, weaving together the threads of human curiosity and the fabric of the cosmos itself.

Within the confines of Fermilab, a particle accelerator facility nestled near Chicago, scientists stand on the brink of an extraordinary revelation – the potential unearthing of a fifth force of nature. In this journey of exploration, sub-atomic particles known as muons have emerged as the protagonists, exhibiting behavior that challenges our fundamental understanding of the Universe.

Muons, akin to the heavyweight cousins of electrons, have stirred the scientific community by deviating from the script of the Standard Model of particle physics. This deviation hints at the existence of a hitherto unknown force, an elusive influence that compels these sub-atomic entities to defy the laws of nature as we know them. The preliminary findings, initially introduced by Fermilab researchers in 2021, have now been fortified with an influx of data, driving us ever closer to the threshold of discovery.

The complex choreography of this exploration unfolds within the ‘g minus two’ (g-2) experiment, where muons traverse a circular path while being accelerated to nearly the speed of light. Yet, despite the significance of these findings, the journey to conclusive proof is fraught with challenges.

Uncertainties embedded within the Standard Model’s predictions have injected a measure of complexity, necessitating a reevaluation of the acquired data. However, the scientific community remains resolute, fueled by the belief that within the next two years, the veil shrouding this enigmatic fifth force may finally be lifted.
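For context, the quantity at stake is the muon’s anomalous magnetic moment, a_mu = (g-2)/2. The sketch below reproduces the headline significance arithmetic using publicly reported figures – the 2023 Fermilab measurement and the 2020 Theory Initiative prediction, both in units of 1e-11. Treat the numbers as assumptions for illustration, not the collaborations’ own analysis, which weighs the theoretical uncertainties far more carefully.

```python
import math

# Illustrative significance arithmetic for the muon anomaly a_mu = (g-2)/2.
# Values are publicly reported figures in units of 1e-11 and are used here
# only as assumptions for the sketch.
a_mu_exp, err_exp = 116_592_059, 22   # Fermilab world average (2023)
a_mu_th,  err_th  = 116_591_810, 43   # Standard Model prediction (2020)

diff = a_mu_exp - a_mu_th
combined_err = math.hypot(err_exp, err_th)  # add uncertainties in quadrature
sigma = diff / combined_err

print(f"discrepancy: {diff}e-11 at about {sigma:.1f} sigma")
```

The result lands above the conventional five-sigma discovery threshold, which is why resolving the Standard Model’s own uncertainties, as described above, matters so much before a discovery can be claimed.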

The stakes are high, and Fermilab is not alone in this cosmic pursuit. A parallel endeavor unfolds at Europe’s iconic Large Hadron Collider (LHC), where researchers endeavor to detect any fractures in the façade of the Standard Model.

Dr. Mitesh Patel of Imperial College London, an instrumental figure in this pursuit, underscores the gravity of such revelations, declaring that the exposure of experimental anomalies could herald a seismic paradigm shift in our comprehension of the cosmos.

The echoes of this quest resonate through history, evoking the timeless wisdom of Albert Einstein. His theories of relativity redefined our understanding of space, time, and gravity, serving as the bedrock of modern physics. Einstein once mused,

“The most beautiful thing we can experience is the mysterious. It is the source of all true art and science.”

As we tread the precipice of the unknown, we heed his words, embracing the mysteries that beckon us toward revelation.

The implications of this potential fifth force ripple through the very fabric of our understanding. This force, if confirmed, could unravel the riddles of cosmic phenomena such as dark energy and dark matter. These enigmatic entities have confounded scientists for decades, eluding explanation within the framework of the Standard Model. A new force could catalyze a renaissance in particle physics, transcending the boundaries of convention and opening doorways to uncharted dimensions.


The quest to comprehend the intricate dance of sub-atomic particles is a symphony of curiosity and exploration, an endeavor that aligns with humanity’s innate drive to uncover the mysteries of existence. Amidst the microcosmic ballet of muons, we are drawn to the profound wisdom of Einstein, who once stated, “The important thing is not to stop questioning. Curiosity has its own reason for existing.” In our pursuit of the fifth force, we honor his legacy by venturing into uncharted realms, driven by our insatiable thirst for knowledge.

Exploring The Fifth Force in Nature

As the data accumulates, the scientific community is poised on the precipice of a profound discovery. The dance of muons and their potential interaction with this newfound force serves as a testament to humanity’s relentless quest for insight, a journey that propels us ever closer to decoding the enigmas woven into the very tapestry of reality.


Here are my top picks that delve into the topic of Particle Physics:

1. “The Fifth Force: A Quest to Understand Dark Energy, the Most Mysterious Force in the Universe” by Dr. Christophe Galfard

2. “Beyond the Standard Model of Elementary Particle Physics” by Linda S. Sparke

3. “Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe” by Lisa Randall

4. “Particle Physics Brick by Brick: Atomic and Subatomic Physics Explained… in LEGO” by Dr. Ben Still

5. “Lost in Math: How Beauty Leads Physics Astray” by Sabine Hossenfelder

6. “The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory” by Brian Greene

These books provide a wide range of insights into particle physics, forces of nature, dark energy, and the intriguing mysteries of the cosmos.





Excellent Opportunity For Investors In Liquid Cooling For Datacenters

The increasing power consumption and heat generation of processors and other datacenter equipment have brought liquid cooling into the spotlight. The growing interest in this technology is further evidenced by recent investments made in the field.

One notable development is the acquisition of CoolIT Systems, a long-standing player in the liquid cooling market, by global investment company KKR. The deal, reportedly valued at $270 million, is aimed at enabling CoolIT to expand its operations and serve a wider range of global customers in the datacenter market. This market encompasses enterprise, high-performance computing (HPC), and cloud service provider segments.

KKR’s investment in CoolIT indicates its anticipation of a profitable return. However, their statements regarding the acquisition also reflect a recognition of the challenges facing the datacenter industry in terms of sustainability. Kyle Matter, Managing Director at KKR, emphasized the increasing data and computing needs and their potential environmental impact. He expressed a belief that liquid cooling plays a crucial role in reducing the emissions footprint of the digital economy.

Projections suggest that liquid cooling will witness significant growth, potentially capturing up to 26% of the datacenter thermal management market by 2026. This is driven by the deployment of more high-performance infrastructure. CoolIT, which is soon to be acquired, has already demonstrated its growth potential by securing a spot on the Financial Times’ list of fastest-growing US companies this year, ranking at number 218.

Alan Priestley, a former technical marketing manager at Intel and currently a VP analyst at Gartner, highlighted the necessity for many companies to invest in liquid cooling to address the challenges associated with managing high-performance servers. As processors become more powerful, liquid cooling offers an effective solution to address heat dissipation concerns and optimize server performance in datacenters.

According to Priestley, CPUs currently consume around 250W to 300W of power, while GPUs range from 300W to 500W. For servers handling demanding workloads such as AI training, those equipped with up to eight GPUs can draw as much as 7-10kW per node.

Priestley further explained that datacenters are striving to increase rack densities by incorporating more memory per node and higher-performance networking. Accommodating such heightened performance requirements necessitates increased power consumption.
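As a rough illustration of Priestley’s figures, the sketch below totals power for a hypothetical node and rack. The layout (two CPUs, eight GPUs, four nodes per rack) and the overhead figure are assumptions made for the example, not vendor data; only the per-component wattage bands come from the article.

```python
# Back-of-the-envelope arithmetic using the component figures quoted above
# (CPUs around 250-300W, GPUs 300-500W, 7-10kW for an eight-GPU node).
# The node and rack layout are hypothetical, chosen only to show why these
# densities push past what air-cooled racks comfortably handle.

def node_power_watts(cpus: int, gpus: int, cpu_w: float = 300,
                     gpu_w: float = 500, overhead_w: float = 2500) -> float:
    """Estimate node draw: CPUs + GPUs + memory/storage/network/fan overhead."""
    return cpus * cpu_w + gpus * gpu_w + overhead_w

node_w = node_power_watts(cpus=2, gpus=8)   # 600 + 4000 + 2500 = 7100 W
rack_kw = 4 * node_w / 1000                 # four such nodes in one rack
print(f"node: {node_w / 1000:.1f} kW, rack: {rack_kw:.1f} kW")
```

Even this modest hypothetical rack draws well over 20kW, which is the kind of density that makes direct liquid cooling attractive.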

Andrew Buss, a senior research director at IDC, concurred with this assessment. He emphasized that as chip or package power densities continue to rise, liquid cooling becomes a more efficient and preferred option.

Buss highlighted that support for direct liquid cooling loops is now being integrated into many modern datacenter facilities and colocation providers. He pointed out that companies like Atos/Bull have embraced direct contact liquid cooling loops for their power-dense high-performance computing (HPC) servers. This allows them to fit six AMD Epyc sockets with maximum memory, NVMe storage, and 100Gbps networking into a compact 1U chassis, all cooled by a custom cooling manifold.

The growing demand for higher performance and power-intensive applications is driving the need for efficient cooling solutions like liquid cooling in datacenters. By adopting liquid cooling technologies, datacenters can effectively manage the increasing power requirements of advanced processors and GPUs while maintaining optimal performance and mitigating potential heat-related issues.

According to Moises Levy, an expert in datacenter power and cooling research at Omdia, the global adoption of liquid cooling is expected to continue increasing.

Levy suggests that while liquid cooling has reached or is nearing a tipping point for specific applications with compute-intensive workloads, its widespread adoption in the broader datacenter market is still on the horizon. He highlights that direct-to-chip and immersion cooling technologies are likely to be the primary disruptors, projected to have the highest compound annual growth rate (CAGR) in the coming years.

Direct liquid cooling, the approach CoolIT supports, involves circulating a coolant, typically water, through cold plates attached directly to components such as processors. This type of system is relatively easy to retrofit within existing rack infrastructure.

On the other hand, immersion cooling submerges the entire server node in a non-conductive dielectric fluid coolant. Specialized racks, some of which position the nodes vertically instead of horizontally, are typically required for this type of system. Immersion cooling tends to be favored for new-build server rooms.

As liquid cooling technologies continue to advance, their increasing adoption is expected to bring significant benefits to datacenters in terms of improved efficiency and enhanced cooling capabilities.

European cloud operator OVHcloud has developed a unique system that combines two cooling approaches for optimal performance. Their innovative solution involves utilizing water blocks attached to the CPU and GPU while employing immersion cooling with a dielectric fluid for the remaining components.

While OVHcloud currently reserves this system for their cloud infrastructure handling intensive workloads like AI, gaming, and high-performance compute (HPC) applications, they have indicated potential future expansion.

In a similar vein, GlobalConnect, a leading data center colocation provider, plans to offer immersion-based cooling as an option to all their customers. Teaming up with immersion cooling specialist GRC, GlobalConnect announced their system deployment in February. They aim to gradually introduce this advanced cooling technology across all 16 of their data centers located in Denmark, Norway, Sweden, Germany, and Finland, based on customer demand.

The question arises: can liquid cooling help achieve sustainability objectives? OVHcloud shared that its combined system is significantly more efficient than traditional air cooling. The company claims that in tests its cooling system achieved a partial power usage effectiveness (pPUE) rating of 1.004, a figure that measures only the energy consumed by the cooling system relative to the IT load.
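Expressed as arithmetic, a cooling pPUE relates cooling overhead to the IT load it serves. The sketch below is illustrative: the wattages are invented, and only the 1.004 target comes from OVHcloud’s public claim.

```python
# Partial PUE (pPUE) for a cooling subsystem: energy delivered to the IT
# load plus the cooling system, divided by the IT load alone. A value of
# exactly 1.0 would mean the cooling consumed no energy at all.

def cooling_ppue(it_power_w: float, cooling_power_w: float) -> float:
    """pPUE = (IT + cooling) / IT."""
    return (it_power_w + cooling_power_w) / it_power_w

# Invented example: a 1 MW IT load cooled with only 4 kW of pump and fan
# power reaches the 1.004 figure OVHcloud cites.
print(round(cooling_ppue(1_000_000, 4_000), 3))  # 1.004
```

By contrast, compressor-based air cooling can add tens of percent of overhead, which is the efficiency gap the article describes.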

However, Buss, an industry expert, urged caution in adopting liquid cooling and emphasized the need for careful consideration, particularly in waste heat management. He highlighted that implementing “liquid cooling done right” can certainly contribute to enhanced efficiency and environmental sustainability by reducing reliance on compressor-based cooling and leveraging heat-exchanger technology to maintain optimal cooling loop temperatures.

Nevertheless, Buss emphasized the importance of proper implementation, as simply discharging the heat into the environment, such as a lake or river, can have detrimental effects. Therefore, the design of the ultimate heat path should be carefully planned to maximize reuse opportunities whenever feasible.

The European Union (EU) has recently expressed its desire to see more cities utilizing waste heat from data centers to heat residential homes. However, challenges arise because the heat generated is often not at a sufficiently high temperature, necessitating additional energy consumption to address this limitation. Despite these obstacles, some data center operators, like QTS in the Groningen region of the Netherlands, have ventured into exploring such initiatives.

In the previous year, the United States Department of Energy made investments in projects aimed at reducing energy consumption for cooling purposes in data centers, albeit with a relatively modest funding amount of $42 million. Additionally, we highlighted the swift adoption of liquid cooling by Chinese data centers as a response to new government regulations.

Among the liquid cooling vendors that secured investments was Iceotope, a UK-based company that received £30 million ($35.7 million at the time) in a funding round led by a Singapore-based private equity provider, with a focus on penetrating the Asian market.

Intel also forged a partnership with Green Revolution Cooling to explore liquid immersion technology. However, the chip giant recently decided to halt its plans for a $700 million research and development lab dedicated to cooling technology in Oregon, as part of its cost-cutting measures.


Unlocking Efficiency & Performance: The Evolution of Datacenters

Introduction:

Datacenters play a critical role in the digital age, serving as the backbone of our increasingly connected world. These centralized facilities house an extensive network of servers, storage systems, and networking equipment that enable the storage, processing, and distribution of vast amounts of data. As technology advances and data demands continue to surge, datacenters are evolving to meet the challenges of efficiency, scalability, and performance.

1. The Rise of Hyperscale Datacenters:

Hyperscale datacenters have emerged as the powerhouses of the digital infrastructure landscape. These massive facilities are designed to handle the most demanding workloads, supporting cloud services, AI, machine learning, and big data analytics. With their extensive computing power and storage capabilities, hyperscale datacenters are fueling innovation and driving digital transformation across industries.

2. The Shift to Edge Computing:

As data-driven applications proliferate, the need for low-latency and real-time processing has become paramount. This has led to the rise of edge computing, a decentralized computing model that brings data processing closer to the source of data generation. Edge datacenters are strategically located in proximity to users and devices, enabling faster response times and reducing the burden on network infrastructure. This trend is particularly crucial for applications requiring real-time data analysis, such as autonomous vehicles, IoT devices, and augmented reality.

3. Green Datacenters: Driving Sustainability:

With the increasing energy consumption of datacenters, the industry is actively pursuing greener and more sustainable solutions. Datacenters are exploring innovative approaches to reduce their carbon footprint, optimize power usage, and increase energy efficiency. These initiatives include adopting renewable energy sources, implementing advanced cooling techniques, and optimizing server utilization through virtualization and consolidation. Green datacenters not only contribute to environmental conservation but also help organizations meet their sustainability goals.

4. Security and Data Privacy:

Data security and privacy have become paramount concerns in the digital era. Datacenters house vast amounts of sensitive information, making them attractive targets for cyber threats. As a result, datacenters are continuously enhancing their security measures, implementing robust firewalls, encryption protocols, and intrusion detection systems. Compliance with data protection regulations such as GDPR and CCPA is also a top priority for datacenters, ensuring the privacy and confidentiality of user data.

5. The Emergence of Liquid Cooling:

The ever-increasing power density of modern servers has led to significant heat dissipation challenges. To overcome this, datacenters are turning to liquid cooling as an efficient solution. Liquid cooling systems, such as direct-to-chip and immersion cooling, offer superior thermal management, enabling higher performance and energy efficiency. By efficiently dissipating heat, liquid cooling minimizes the risk of thermal throttling and extends the lifespan of critical hardware components.

Technology of Today & Tomorrow

Datacenters are at the forefront of the digital revolution, enabling seamless connectivity, storage, and processing of data. As technology advances, datacenters are continuously evolving to meet the escalating demands for efficiency, scalability, and sustainability. From hyperscale datacenters to edge computing, green initiatives, security enhancements, and liquid cooling solutions, the datacenter industry is shaping the future of our digital landscape. By embracing these advancements, organizations can unlock the full potential of their data and drive innovation in the digital age.



Open Source Software (OSS) Supply Chain, Security Risks And Countermeasures

OSS Security Risks And Countermeasures

The software development landscape increasingly hinges on open source components, significantly aiding continuous integration, DevOps practices, and daily updates. Last year, Synopsys discovered that 97% of codebases in 2022 incorporated open source, with specific sectors like computer hardware, cybersecurity, energy, and the Internet of Things (IoT) reaching 100% OSS integration.

While leveraging open source enhances efficiency, cost-effectiveness, and developer productivity, it inadvertently paves a path for threat actors seeking to exploit the software supply chain. Enterprises often lack visibility into their software contents due to complex involvement from multiple sources, raising concerns highlighted in VMware’s report last year. Issues include reliance on communities to patch vulnerabilities and associated security risks.

Raza Qadri, founder of Vibertron Technologies, emphasizes OSS’s pivotal role in critical infrastructure but underscores the shock experienced by developers and executives regarding their applications’ OSS contribution. Notably, Qadri cites that 95% of vulnerabilities surface in “transitive main dependencies,” indirectly added open source packages.
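The distinction Qadri draws can be made concrete with a toy dependency walk: direct dependencies are the packages an application names itself, while transitive ones arrive through those packages’ own requirements, often without the developer ever seeing them. The package names below are invented for illustration.

```python
from collections import deque

# Sketch of why transitive dependencies dominate exposure: walk a toy
# dependency graph and separate the packages an app declares directly
# from those pulled in indirectly.

def resolve(graph: dict[str, list[str]], root: str):
    """Return (direct, transitive) dependency sets reachable from root."""
    direct = set(graph.get(root, []))
    seen, queue = set(direct), deque(direct)
    while queue:                      # breadth-first walk of the graph
        pkg = queue.popleft()
        for dep in graph.get(pkg, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return direct, seen - direct

deps = {
    "app": ["web-framework", "logger"],
    "web-framework": ["http-core", "template-lib"],
    "http-core": ["tls-lib"],
    "logger": ["format-lib"],
}
direct, transitive = resolve(deps, "app")
print(sorted(direct))      # ['logger', 'web-framework']
print(sorted(transitive))  # ['format-lib', 'http-core', 'template-lib', 'tls-lib']
```

Even in this tiny graph the transitive set outnumbers the direct one, which is the visibility gap that software bills of materials aim to close.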

Qadri also acknowledges developers’ long-standing use of open source. However, recent years have witnessed heightened awareness, not just among developers but also among attackers. Malware attacks targeting the software supply chain have surged, as demonstrated in significant breaches like SolarWinds, Kaseya, and the Log4j exploit.

Log4j’s widespread use exemplifies the consolidation of risk linked to extensively employed components. This popular Java-based logging tool’s vulnerabilities showcase the systemic dependency on widely used software components, posing significant threats if exploited by attackers.

Moreover, injection of malware into repositories like GitHub, PyPI, and NPM has emerged as a growing threat. Cybercriminals generate malicious versions of popular code to deceive developers, exploiting vulnerabilities when components are downloaded, often without the developers’ knowledge.

Despite OSS’s security risks, its transparency and visibility compared to commercial software offer certain advantages. Qadri points out the swift response to Log4j vulnerabilities as an example, highlighting OSS’s collaborative nature.

Efforts to fortify software supply chain security are underway, buoyed by multi-vendor frameworks, vulnerability tracking tools, and cybersecurity products. However, additional steps, such as enforcing recalls for defective OSS components and implementing component-level firewalls akin to packet-level firewalls, are necessary to fortify defenses and mitigate malicious attacks.

Qadri underscores the need for a holistic approach involving software bills of materials (SBOMs) coupled with firewall-like capabilities to ensure a comprehensive understanding of software contents and preemptive measures against malicious threats.

As the software supply chain faces ongoing vulnerabilities and attacks, concerted efforts are imperative to bolster security measures, safeguard against threats, and fortify the foundational aspects of open source components.



By John Elf | Science, Technology & Business contributor VoiceOfEU.com Digital


