
NSA asks Congress to reauthorize warrantless data collection • The Register

In brief A US intelligence boss has asked Congress to reauthorize a controversial set of powers that give snoops warrantless authorization to surveil electronic communications in the name of fighting terrorism and so forth.

NSA director General Paul Nakasone told the Privacy and Civil Liberties Oversight Board yesterday that American spies would “lose critical insights into the most significant threats to our nation” if Section 702 of the Foreign Intelligence Surveillance Act (FISA) is allowed to lapse on December 31.

In his speech, Nakasone called Section 702 “irreplaceable,” and to justify that claim he recounted several instances of the FBI and NSA using the law together to stop terrorist plots and online attacks.

Section 702 was added to the Foreign Intelligence Surveillance Act in 2008, and has long been a bone of contention between civil liberties groups arguing it’s a gross privacy violation, and those who say that, if you’re not a terrorist, surely a little harmless observation by Uncle Sam is okay.

The NSA has long held that Section 702 saved American lives and protected the nation and its allies, though documents declassified in 2019 showed that it was frequently used against US persons, despite the law specifically being designed to only apply to foreign targets.

Despite those restrictions, the FBI was found to have used the database of electronic communications gathered from US telecom and tech companies under S.702 to search for records of US persons who were caught up in data gathering sweeps.

When asked about the use of Section 702-gathered data to surveil US persons during hearings over its previous renewal in 2017, the NSA refused to provide figures. “Seems like baloney to me … It’s the greatest intelligence service on the planet. You’d think they’d be able to know that,” House Representative Jim Jordan (R-OH) said during the hearings. 

“Section 702 cannot be used to target Americans anywhere in the world or any person inside the United States regardless of nationality. No exceptions,” Nakasone said. 

The records beg to differ, and this time they’re known about before reauthorization hearings. Whether that’ll change the outcome is another thing altogether. 

Avoid this Pokémon

South Korean security firm AhnLab says it has discovered a malware-spreading campaign that tries to trick netizens into downloading a remote access trojan – a backdoor for remote control, in other words – disguised as a beta version of a new Pokémon card game.

This Pokémon-themed malware is hiding in the tall grass, having been subtly tweaked to bypass security tools, the researchers warned. We’re told the trojan uses various legit remote-control tools, such as NetSupport Manager, AnyDesk, and TeamViewer, to provide the backdoor access. The bundled config files contain hard-coded command-and-control server IP addresses, and the malware gains persistence by dropping a shortcut into the Windows startup folder and hiding its files under a concealed AppData path.

Once installed, AhnLab said, the attacker can make use of any of the features the remote-control software includes, potentially giving them total control over an infected system.
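
For defenders, the simplest tell described above is an unexpected shortcut in the per-user startup folder. Below is a minimal detection sketch, assuming a Windows host; the folder path is the standard one, but the list of tool names to flag is an illustrative assumption rather than anything from AhnLab’s write-up.

```python
# Hypothetical sketch: flag startup-folder shortcuts whose names mention
# common remote-control tools. It inspects shortcut file names only
# (resolving .lnk targets would need an extra library such as pywin32).
import os
from pathlib import Path

REMOTE_TOOLS = ("netsupport", "anydesk", "teamviewer")  # illustrative list

def startup_dir() -> Path:
    # Standard per-user startup folder on Windows
    return Path(os.environ["APPDATA"]) / r"Microsoft\Windows\Start Menu\Programs\Startup"

def suspicious_shortcuts():
    hits = []
    for shortcut in startup_dir().glob("*.lnk"):
        if any(tool in shortcut.name.lower() for tool in REMOTE_TOOLS):
            hits.append(shortcut)
    return hits

if __name__ == "__main__":
    for path in suspicious_shortcuts():
        print(f"Review startup shortcut: {path}")
```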

Nothing in this malware campaign is particularly innovative or exceptionally dangerous, but its Pokémon-themed lure stands out, even though the idea of using a children’s game to trick kids into downloading malware isn’t new.

Federal parks agency fails password security audit … badly

The US Department of the Interior’s mission is to protect America’s natural resources, but it might have a hard time doing so if its systems remain as insecure as a recent Office of the Inspector General (OIG) report found them to be.

There’s no better way to relay the conclusions than the report itself: “We found that the Department’s management practices and password complexity requirements were not sufficient to prevent potential unauthorized access to its systems and data,” the OIG said [PDF].

Several of the bad practices found in DOI systems were the same ones that enabled the 2021 Colonial Pipeline ransomware attack, the OIG said.

Inspectors were able to crack 21 percent of the agency’s passwords (18,174 in total), 16 percent of which they figured out within the first 90 minutes of testing. Of the accounts they managed to break into, 288 had elevated privileges and 362 belonged to senior US government employees.

In addition, the OIG said multifactor authentication wasn’t consistently implemented at the DOI and password complexity requirements were “outdated and ineffective … allow[ing] unrelated staff to use the same inherently weak passwords—meaning there was not a rule in place to prevent this practice.”
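
A rule of the kind the OIG describes is straightforward to express in code. The sketch below is a hypothetical lab-style audit, not the department’s actual tooling: it flags passwords shorter than an assumed minimum length and any password shared by more than one account; the threshold and sample data are illustrative only.

```python
# Hypothetical password-policy audit: flag short passwords and passwords
# reused across unrelated accounts.
from collections import defaultdict

MIN_LENGTH = 16  # assumed policy minimum, not the DOI's actual requirement

def audit(credentials):
    """credentials: dict mapping account name -> password recovered in the audit."""
    findings = defaultdict(list)
    users_by_password = defaultdict(list)
    for account, password in credentials.items():
        if len(password) < MIN_LENGTH:
            findings[account].append("below minimum length")
        users_by_password[password].append(account)
    for accounts in users_by_password.values():
        if len(accounts) > 1:
            for account in accounts:
                findings[account].append("password shared with another account")
    return dict(findings)

print(audit({
    "alice": "Password-1234",
    "bob": "Password-1234",
    "carol": "correct horse battery staple",
}))
```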

The DOI also wasn’t deactivating unused accounts or enforcing password age limits, leaving more than 6,000 additional accounts vulnerable to attack, inspectors found. 

The Inspector General made eight recommendations to the DOI, including moving away from the bypassable MFA methods it currently uses and strengthening its password complexity requirements.

More broadly, the OIG seems to want the DOI to develop a security posture that’s less fly-by-night crypto fintech startup, and more federal government agency with an $18.1 billion budget. ®


Exploring the Enigma of Fermilab’s Quest for the Fifth Force in Nature

By Raza H. Qadri (ALI)
Entrepreneur, Science Enthusiast & Contributor, ‘THE VOICE OF EU’

In the cryptic world of particle physics, a compelling narrative unfolds, weaving together the threads of human curiosity and the fabric of the cosmos itself.

Within the confines of Fermilab, a particle accelerator facility nestled near Chicago, scientists stand on the brink of an extraordinary revelation – the potential unearthing of a fifth force of nature. In this journey of exploration, sub-atomic particles known as muons have emerged as the protagonists, exhibiting behavior that challenges our fundamental understanding of the Universe.

Muons, akin to the heavyweight cousins of electrons, have stirred the scientific community by deviating from the script of the Standard Model of particle physics. This deviation hints at the existence of a hitherto unknown force, an elusive influence that compels these sub-atomic entities to defy the laws of nature as we know them. The preliminary findings, initially introduced by Fermilab researchers in 2021, have now been fortified with an influx of data, driving us ever closer to the threshold of discovery.

The complex choreography of this exploration unfolds within the ‘g minus two’ (g-2) experiment, where muons circulate around a storage ring at nearly the speed of light. Yet, despite the significance of these findings, the journey to conclusive proof is fraught with challenges.
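
For the technically inclined, the experiment’s name comes from the quantity it measures: the muon’s anomalous magnetic moment, extracted from how fast the muon’s spin precesses relative to its momentum in the ring’s magnetic field. A standard textbook summary of the relations involved (not Fermilab’s specific numbers) looks like this:

```latex
% Dirac theory alone predicts a gyromagnetic factor g = 2 for the muon,
% so the physically interesting piece is the anomaly:
a_\mu = \frac{g_\mu - 2}{2}

% In the storage ring, at the "magic" muon momentum where electric-field
% effects cancel, the spin precesses relative to the momentum at a rate
\vec{\omega}_a = - a_\mu \, \frac{q}{m_\mu} \, \vec{B}
% Measuring \omega_a and the magnetic field B therefore yields a_\mu,
% which is then compared against the Standard Model prediction.
```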

Uncertainties embedded within the Standard Model’s predictions have injected a measure of complexity, necessitating a reevaluation of the acquired data. However, the scientific community remains resolute, fueled by the belief that within the next two years, the veil shrouding this enigmatic fifth force may finally be lifted.

The stakes are high, and Fermilab is not alone in this cosmic pursuit. A parallel effort unfolds at Europe’s iconic Large Hadron Collider (LHC), where researchers endeavor to detect any fractures in the façade of the Standard Model.

Dr. Mitesh Patel of Imperial College London, an instrumental figure in this pursuit, underscores the gravity of such revelations, declaring that the exposure of experimental anomalies could herald a seismic paradigm shift in our comprehension of the cosmos.

The echoes of this quest resonate through history, evoking the timeless wisdom of Albert Einstein. His theories of relativity redefined our understanding of space, time, and gravity, serving as the bedrock of modern physics. Einstein once mused,

“The most beautiful thing we can experience is the mysterious. It is the source of all true art and science.”

As we tread the precipice of the unknown, we heed his words, embracing the mysteries that beckon us toward revelation.

The implications of this potential fifth force ripple through the very fabric of our understanding. This force, if confirmed, could unravel the riddles of cosmic phenomena such as dark energy and dark matter. These enigmatic entities have confounded scientists for decades, eluding explanation within the framework of the Standard Model. A new force could catalyze a renaissance in particle physics, transcending the boundaries of convention and opening doorways to uncharted dimensions.

[Video: Fermilab’s quest for the fifth force in nature]

The quest to comprehend the intricate dance of sub-atomic particles is a symphony of curiosity and exploration, an endeavor that aligns with humanity’s innate drive to uncover the mysteries of existence. Amidst the microcosmic ballet of muons, we are drawn to the profound wisdom of Einstein, who once stated, “The important thing is not to stop questioning. Curiosity has its own reason for existing.” In our pursuit of the fifth force, we honor his legacy by venturing into uncharted realms, driven by our insatiable thirst for knowledge.

Exploring The Fifth Force in Nature

As the data accumulates, the scientific community is poised on the precipice of a profound discovery. The dance of muons and their potential interaction with this newfound force serves as a testament to humanity’s relentless quest for insight, a journey that propels us ever closer to decoding the enigmas woven into the very tapestry of reality.


Here are my top picks that delve into the topic of Particle Physics:

1. “The Fifth Force: A Quest to Understand Dark Energy, the Most Mysterious Force in the Universe” by Dr. Christophe Galfard

2. “Beyond the Standard Model of Elementary Particle Physics” by Linda S. Sparke

3. “Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe” by Lisa Randall

4. “Particle Physics Brick by Brick: Atomic and Subatomic Physics Explained… in LEGO” by Dr. Ben Still

5. “Lost in Math: How Beauty Leads Physics Astray” by Sabine Hossenfelder

6. “The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory” by Brian Greene

These books provide a wide range of insights into particle physics, forces of nature, dark energy, and the intriguing mysteries of the cosmos.



Excellent Opportunity For Investors In Liquid Cooling For Datacenters

The increasing power consumption and heat generation of processors and other datacenter equipment have brought liquid cooling into the spotlight. The growing interest in this technology is further evidenced by recent investments made in the field.

One notable development is the acquisition of CoolIT Systems, a long-standing player in the liquid cooling market, by global investment company KKR. The deal, reportedly valued at $270 million, is aimed at enabling CoolIT to expand its operations and serve a wider range of global customers in the datacenter market. This market encompasses enterprise, high-performance computing (HPC), and cloud service provider segments.

KKR’s investment in CoolIT indicates its anticipation of a profitable return. However, its statements regarding the acquisition also reflect a recognition of the challenges facing the datacenter industry in terms of sustainability. Kyle Matter, Managing Director at KKR, emphasized the increasing data and computing needs and their potential environmental impact. He expressed a belief that liquid cooling plays a crucial role in reducing the emissions footprint of the digital economy.

Projections suggest that liquid cooling will witness significant growth, potentially capturing up to 26% of the datacenter thermal management market by 2026. This is driven by the deployment of more high-performance infrastructure. CoolIT, which is soon to be acquired, has already demonstrated its growth potential by securing a spot on the Financial Times’ list of fastest-growing US companies this year, ranking at number 218.

Alan Priestley, a former technical marketing manager at Intel and currently a VP analyst at Gartner, highlighted the necessity for many companies to invest in liquid cooling to address the challenges associated with managing high-performance servers. As processors become more powerful, liquid cooling offers an effective solution to address heat dissipation concerns and optimize server performance in datacenters.

According to Priestley, CPUs currently consume around 250W to 300W of power, while GPUs range from 300W to 500W. Servers handling demanding workloads such as AI training, equipped with up to eight GPUs, can draw as much as 7-10kW per node.
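
As a rough sanity check on those figures, a back-of-the-envelope estimate is sketched below; the component counts and the overhead factor for memory, networking, storage and power-supply losses are assumptions for illustration, not Priestley’s numbers.

```python
# Back-of-the-envelope node power estimate using the ranges quoted above.
def node_power_watts(n_cpus=2, cpu_w=300, n_gpus=8, gpu_w=500, overhead=1.25):
    """overhead is an assumed ~25% uplift for memory, NICs, storage and PSU losses."""
    return (n_cpus * cpu_w + n_gpus * gpu_w) * overhead

print(f"{node_power_watts() / 1000:.1f} kW per node")  # ~5.8 kW; loaded AI nodes reach 7-10 kW
```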

Priestley further explained that datacenters are striving to increase rack densities by incorporating more memory per node and higher-performance networking. Accommodating such heightened performance requirements necessitates increased power consumption.

Andrew Buss, a senior research director at IDC, concurred with this assessment. He emphasized that as chip or package power densities continue to rise, liquid cooling becomes a more efficient and preferred option.

Buss highlighted that support for direct liquid cooling loops is now being integrated into many modern datacenter facilities and colocation providers. He pointed out that companies like Atos/Bull have embraced direct contact liquid cooling loops for their power-dense high-performance computing (HPC) servers. This allows them to fit six AMD Epyc sockets with maximum memory, NVMe storage, and 100Gbps networking into a compact 1U chassis, all cooled by a custom cooling manifold.

The growing demand for higher performance and power-intensive applications is driving the need for efficient cooling solutions like liquid cooling in datacenters. By adopting liquid cooling technologies, datacenters can effectively manage the increasing power requirements of advanced processors and GPUs while maintaining optimal performance and mitigating potential heat-related issues.

According to Moises Levy, an expert in datacenter power and cooling research at Omdia, the global adoption of liquid cooling is expected to continue increasing.

Levy suggests that while liquid cooling has reached or is nearing a tipping point for specific applications with compute-intensive workloads, its widespread adoption in the broader datacenter market is still on the horizon. He highlights that direct-to-chip and immersion cooling technologies are likely to be the primary disruptors, projected to have the highest compound annual growth rate (CAGR) in the coming years.

Direct liquid cooling, supported by CoolIT, involves circulating a coolant, typically water, through cold plates directly attached to components like processors. This type of system is relatively easier to implement within existing rack infrastructure.

On the other hand, immersion cooling submerges the entire server node in a non-conductive dielectric fluid coolant. Specialized racks, some of which position the nodes vertically instead of horizontally, are typically required for this type of system. Immersion cooling tends to be favored for new-build server rooms.

As liquid cooling technologies continue to advance, their increasing adoption is expected to bring significant benefits to datacenters in terms of improved efficiency and enhanced cooling capabilities.

European cloud operator OVHcloud has developed a unique system that combines two cooling approaches for optimal performance. Their innovative solution involves utilizing water blocks attached to the CPU and GPU while employing immersion cooling with a dielectric fluid for the remaining components.

While OVHcloud currently reserves this system for their cloud infrastructure handling intensive workloads like AI, gaming, and high-performance compute (HPC) applications, they have indicated potential future expansion.

In a similar vein, GlobalConnect, a leading data center colocation provider, plans to offer immersion-based cooling as an option to all their customers. Teaming up with immersion cooling specialist GRC, GlobalConnect announced their system deployment in February. They aim to gradually introduce this advanced cooling technology across all 16 of their data centers located in Denmark, Norway, Sweden, Germany, and Finland, based on customer demand.

The question arises: can liquid cooling help achieve sustainability objectives? OVHcloud says its combined system is significantly more efficient than traditional air cooling methods, claiming that in tests it achieved a partial power usage effectiveness (pPUE) of 1.004, a metric that isolates the energy consumed by the cooling system.
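
To put that number in context, partial PUE for cooling is conventionally the ratio of IT load plus cooling load to IT load alone, so a pPUE of 1.004 implies a cooling overhead of roughly 0.4 percent of the IT power. A quick sketch of the arithmetic, with an assumed round-number IT load:

```python
# pPUE (cooling) = (IT load + cooling load) / IT load
it_load_kw = 1000.0                      # assumed IT load for illustration
ppue = 1.004                             # figure claimed by OVHcloud
cooling_kw = (ppue - 1.0) * it_load_kw   # 4 kW of cooling per 1,000 kW of IT
print(f"Cooling overhead: {cooling_kw:.1f} kW ({(ppue - 1.0) * 100:.1f}% of IT load)")
```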

However, Buss urged caution in adopting liquid cooling, particularly around waste heat management. He highlighted that “liquid cooling done right” can certainly contribute to enhanced efficiency and environmental sustainability by reducing reliance on compressor-based cooling and leveraging heat-exchanger technology to maintain optimal cooling loop temperatures.

Nevertheless, Buss emphasized the importance of proper implementation, as simply discharging the heat into the environment, such as a lake or river, can have detrimental effects. Therefore, the design of the ultimate heat path should be carefully planned to maximize reuse opportunities whenever feasible.

The European Union (EU) has recently expressed its desire to see more cities utilizing waste heat from data centers to heat residential homes. However, challenges arise because the heat generated is often not at a sufficiently high temperature, necessitating additional energy consumption to address this limitation. Despite these obstacles, some data center operators, like QTS in the Groningen region of the Netherlands, have ventured into exploring such initiatives.

In the previous year, the United States Department of Energy made investments in projects aimed at reducing energy consumption for cooling purposes in data centers, albeit with a relatively modest funding amount of $42 million. Additionally, we highlighted the swift adoption of liquid cooling by Chinese data centers as a response to new government regulations.

Among the liquid cooling vendors that secured investments was Iceotope, a UK-based company that received £30 million ($35.7 million at the time) in a funding round led by a Singapore-based private equity provider, with a focus on penetrating the Asian market.

Intel also forged a partnership with Green Revolution Cooling to explore liquid immersion technology. However, the chip giant recently decided to halt its plans for a $700 million research and development lab dedicated to cooling technology in Oregon, as part of its cost-cutting measures.


Unlocking Efficiency & Performance: The Evolution of Datacenters

Introduction:

Datacenters play a critical role in the digital age, serving as the backbone of our increasingly connected world. These centralized facilities house an extensive network of servers, storage systems, and networking equipment that enable the storage, processing, and distribution of vast amounts of data. As technology advances and data demands continue to surge, datacenters are evolving to meet the challenges of efficiency, scalability, and performance.

1. The Rise of Hyperscale Datacenters:

Hyperscale datacenters have emerged as the powerhouses of the digital infrastructure landscape. These massive facilities are designed to handle the most demanding workloads, supporting cloud services, AI, machine learning, and big data analytics. With their extensive computing power and storage capabilities, hyperscale datacenters are fueling innovation and driving digital transformation across industries.

2. The Shift to Edge Computing:

As data-driven applications proliferate, the need for low-latency and real-time processing has become paramount. This has led to the rise of edge computing, a decentralized computing model that brings data processing closer to the source of data generation. Edge datacenters are strategically located in proximity to users and devices, enabling faster response times and reducing the burden on network infrastructure. This trend is particularly crucial for applications requiring real-time data analysis, such as autonomous vehicles, IoT devices, and augmented reality.

3. Green Datacenters: Driving Sustainability:

With the increasing energy consumption of datacenters, the industry is actively pursuing greener and more sustainable solutions. Datacenters are exploring innovative approaches to reduce their carbon footprint, optimize power usage, and increase energy efficiency. These initiatives include adopting renewable energy sources, implementing advanced cooling techniques, and optimizing server utilization through virtualization and consolidation. Green datacenters not only contribute to environmental conservation but also help organizations meet their sustainability goals.

4. Security and Data Privacy:

Data security and privacy have become paramount concerns in the digital era. Datacenters house vast amounts of sensitive information, making them attractive targets for cyber threats. As a result, datacenters are continuously enhancing their security measures, implementing robust firewalls, encryption protocols, and intrusion detection systems. Compliance with data protection regulations such as GDPR and CCPA is also a top priority for datacenters, ensuring the privacy and confidentiality of user data.

5. The Emergence of Liquid Cooling:

The ever-increasing power density of modern servers has led to significant heat dissipation challenges. To overcome this, datacenters are turning to liquid cooling as an efficient solution. Liquid cooling systems, such as direct-to-chip and immersion cooling, offer superior thermal management, enabling higher performance and energy efficiency. By efficiently dissipating heat, liquid cooling minimizes the risk of thermal throttling and extends the lifespan of critical hardware components.

Technology of Today & Tomorrow

Datacenters are at the forefront of the digital revolution, enabling seamless connectivity, storage, and processing of data. As technology advances, datacenters are continuously evolving to meet the escalating demands for efficiency, scalability, and sustainability. From hyperscale datacenters to edge computing, green initiatives, security enhancements, and liquid cooling solutions, the datacenter industry is shaping the future of our digital landscape. By embracing these advancements, organizations can unlock the full potential of their data and drive innovation in the digital age.



Open Source Software (OSS) Supply Chain, Security Risks And Countermeasures

OSS Security Risks And Countermeasures

The software development landscape increasingly hinges on open source components, significantly aiding continuous integration, DevOps practices, and daily updates. Last year, Synopsys discovered that 97% of codebases in 2022 incorporated open source, with specific sectors like computer hardware, cybersecurity, energy, and the Internet of Things (IoT) reaching 100% OSS integration.

While leveraging open source enhances efficiency, cost-effectiveness, and developer productivity, it inadvertently paves a path for threat actors seeking to exploit the software supply chain. Enterprises often lack visibility into their software contents because components arrive from so many different sources, a concern highlighted in VMware’s report last year, which also flagged the reliance on communities to patch vulnerabilities and the security risks that come with it.

Raza Qadri, founder of Vibertron Technologies, emphasizes OSS’s pivotal role in critical infrastructure but notes how shocked developers and executives often are when they see how much of their applications is built from OSS. Notably, Qadri cites that 95% of vulnerabilities surface in “transitive main dependencies,” that is, open source packages added indirectly by other components.

Qadri also acknowledges developers’ long-standing use of open source. However, recent years have witnessed heightened awareness, not just among developers but also among attackers. Malware attacks targeting the software supply chain have surged, as demonstrated in significant breaches like SolarWinds, Kaseya, and the Log4j exploit.

Log4j’s widespread use exemplifies how risk consolidates around extensively employed components: vulnerabilities in the popular Java-based logging tool threatened a huge swath of systems at once precisely because so much software depends on it.

Moreover, the injection of malware into repositories like GitHub, PyPI, and NPM has emerged as a growing threat. Cybercriminals publish malicious look-alike versions of popular code to deceive developers, and the rogue code runs once the component is downloaded and installed, often without the developers’ knowledge.
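
One common flavour of this attack is typosquatting: publishing a package whose name is a keystroke away from a popular one. A minimal illustration of how such near-miss names can be caught, with an assumed allowlist and similarity threshold (both illustrative, not any registry’s real policy):

```python
# Hypothetical typosquatting check: flag a requested package name that is
# suspiciously close to, but not identical to, a well-known package.
from difflib import SequenceMatcher

KNOWN_PACKAGES = {"requests", "urllib3", "numpy", "pandas", "cryptography"}

def looks_like_typosquat(name, threshold=0.85):
    if name in KNOWN_PACKAGES:
        return None  # exact match to a known package is fine
    for known in KNOWN_PACKAGES:
        if SequenceMatcher(None, name, known).ratio() >= threshold:
            return known  # close to a known name but not identical: suspicious
    return None

print(looks_like_typosquat("requets"))   # -> "requests"
print(looks_like_typosquat("requests"))  # -> None
```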

Despite OSS’s security risks, its transparency and visibility compared to commercial software offer certain advantages. Qadri points out the swift response to Log4j vulnerabilities as an example, highlighting OSS’s collaborative nature.

Efforts to fortify software supply chain security are underway, buoyed by multi-vendor frameworks, vulnerability tracking tools, and cybersecurity products. However, additional steps, such as enforcing recalls for defective OSS components and implementing component-level firewalls akin to packet-level firewalls, are necessary to shore up defenses and mitigate malicious attacks.

Qadri underscores the need for a holistic approach that pairs software bills of materials (SBOMs) with firewall-like capabilities, giving organizations a comprehensive view of what their software contains and a way to preemptively block malicious components.
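
What that combination might look like in practice: an SBOM enumerates every component in a build, and a gatekeeper refuses anything on a deny list before it ships. The sketch below uses a simplified CycloneDX-style component list; the deny-list entry and sample SBOM are purely illustrative.

```python
# Minimal SBOM gate: parse a CycloneDX-style component list and report any
# component/version pair that appears on a deny list.
import json

DENY_LIST = {("log4j-core", "2.14.1")}  # illustrative known-bad component

def blocked_components(sbom_json):
    sbom = json.loads(sbom_json)
    hits = []
    for component in sbom.get("components", []):
        key = (component.get("name"), component.get("version"))
        if key in DENY_LIST:
            hits.append(f"{key[0]}@{key[1]}")
    return hits

example_sbom = json.dumps({"components": [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "jackson-databind", "version": "2.15.2"},
]})
print(blocked_components(example_sbom))  # -> ['log4j-core@2.14.1']
```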

As the software supply chain faces ongoing vulnerabilities and attacks, concerted efforts are imperative to bolster security measures, safeguard against threats, and fortify the foundational aspects of open source components.


By John Elf | Science, Technology & Business contributor VoiceOfEU.com Digital
