
Technology

‘Inconceivable’: why has Australia’s history been left to rot? | National Archives

Voice Of EU


Historians are calling it an international embarrassment for Australia and saying it is “inconceivable that it has come to this”, as they preemptively mourn the loss of “irreplaceable national history”.

The National Archives of Australia doesn’t often make headlines, but when it does, it’s rarely good news.

Last year, it famously lost a years-long legal battle to keep secret the Palace letters – a trove of correspondence between Australia’s governor-general and the Queen’s private secretary in the lead up to the dismissal of Australia’s then prime minister, Gough Whitlam, in 1975.

As the institution – which is required by legislation to preserve records from Australian government agencies – was licking its financial wounds from the costly legal battle, it was dealt a further blow in this month’s federal budget, which largely ignored a “digital cliff” the archives was facing.

Last week, it was revealed the archives had resorted to launching a crowdfunding site in a last-ditch attempt to raise tens of millions of dollars to digitise disintegrating historical materials.

The crowdfunding push has outraged Australia’s archivists and historians, and raised questions about the value Australia places on its national history.

A digital cliff

In March, an internal review of the archives found it was failing to meet its legal obligations due to underfunding. The Tune review found there was 361km of at-risk audio-visual material – including magnetic tape, cellulose acetate subject to vinegar syndrome, and film negatives – some of which will be beyond recovery as early as 2025. That figure has since grown to 384km.

At the archives’ current digitisation rate of 0.26km per year, it would take 1,400 years and $5.2bn to digitise the entire collection.
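As a back-of-the-envelope check, the 1,400-year figure follows directly from the numbers above (assuming, as the review implicitly does, a constant digitisation rate):

```python
# Rough check of the Tune review's digitisation timeline.
at_risk_km = 361            # kilometres of at-risk audio-visual material (Tune review)
updated_km = 384            # the figure as it has since grown
rate_km_per_year = 0.26     # current digitisation rate

for km in (at_risk_km, updated_km):
    years = km / rate_km_per_year
    print(f"{km}km at {rate_km_per_year}km/year: about {years:,.0f} years")
```

The original 361km works out to roughly 1,390 years, matching the review’s figure of about 1,400; the updated 384km pushes it closer to 1,480.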

Instead, the review proposed a $67.7m, seven-year initiative to urgently digitise the records it deemed highest priority. Despite desperate pleas in the weeks before the budget, the archives did not receive the required injection.

Now at risk of being lost are video recordings of early Australian Antarctic exploration, Asio’s spy surveillance footage, audio recordings from the royal commission into the Stolen Generations and hearings of the high court native title tribunal, as well as prime minister John Curtin’s wartime speeches.

Damaged material from the National Archives. Photograph: National Archives of Australia

Michelle Arrow, an associate professor of modern history at Macquarie University, is scathing of government funding for the archives, as well as what she sees as the institution’s poor planning in recent years.

“This should be an international embarrassment for Australia,” she said. “Normally the public service isn’t meant to make a plea like this.

“If you think about the scale of the task, it’s still huge. They’re having to do this because there have been systemic funding issues for these institutions, but the digital cliff has been looming for the archives for many years.”

Arrow noted the archives only ramped up its campaign for digitisation funding this year. By contrast, in 2015 the National Film and Sound Archive launched its Deadline 2025 discussion paper and successfully lobbied for extra funding to digitise key collections.

While Arrow understands the impossibility of digitising all at-risk records, she is concerned that “irreplaceable national history” could still be lost even if the $67.7m is raised for the prioritised digitisation plan, given the masses of departmental material dumped on the archives by government agencies.

“We just don’t know what material there is there; it’s not all in their computer catalogue. I suspect most of this stuff, we’re never going to know what we’ve lost, and that’s troubling.”

She said the Archives had become known for a reliance on “a family historian model”, charging up to $250 to digitise one document in a file in the hope it might contain relevant family history.

However, this model prioritises certain material for digitisation, and along with the crowdfunding push, Arrow says, it risks important records being lost.

“Often donors have a vested interest in maintaining certain things and not others … We don’t know what researchers might want to know in the future.”

A disintegrating vinyl record. Photograph: National Archives of Australia

Arrow says this lack of searchability for files has resulted in lengthy requests to retrieve materials, recalling a failed pursuit of hers to access letters sent to Whitlam’s women’s adviser, Elizabeth Reid.

“Reid was the first women’s adviser to a national leader anywhere in the world. We know she received many letters, and we know they’re there in the archives somewhere. I haven’t been able to find them; hopefully someone will find them someday.”

‘A problem with priorities’

Jenny Hocking, a professor of history at Monash University’s National Centre for Australia Studies, led the legal battle which forced the archives to release the Palace letters.

She believes the archives’ determination to keep the Palace letters secret has come at the expense of digitisation.

“For them to have gone to such lengths to keep those letters away from the public is extraordinary, especially now that the cost was far greater than the roughly $1m in their legal fees; it’s closer to $2m, as they had to pay mine as well.”

The Tune review found that around $900,000 each year of the archives’ capital budget of $6m is devoted to digitisation.

The archives’ woes are ‘tremendously embarrassing’, historian Jenny Hocking says. Photograph: National Archives of Australia

“It’s an unedifying spectacle, seeing them resort to crowdfunding following the legal battle. Those two things don’t sit well,” said Hocking, who has authored a book about her legal battle and the contents of the letters.

“I have a huge respect for the archives, and it is deeply troubling to see this. I still can’t believe it, it’s almost inconceivable that it has come to this.”

Another questionable choice by the archives, in Hocking’s mind, is its decision to enter into a $10m, four-year contract to digitise war records. She believes other commonwealth institutions, such as the National Film and Sound Archive, or the War Memorial, which has been awarded $500m in federal funds for a redevelopment, should share the digitisation burden.

“You don’t suddenly face a $67.7m cliff overnight. The problem is not with the legislation, what they’re required to do, it’s a problem with priorities.

“It’s tremendously embarrassing, it’s an international disgrace that our National Archives is resorting to passing the hat around to protect $67m of material. It’s just impossible to raise that all via crowdfunding, and they and the government know that,” Hocking said.

Digital future

Nicola Laurent, president of the Australian Society of Archivists, believes the National Archives’ current financial woes should trigger a discussion about how departmental material is archived.

The National Archives is an institutional member of the society, whose membership also includes university archives and archivists at private schools.

Laurent is disappointed at the need to crowdfund, and links it to the approach of charging high fees to anyone seeking to retrieve and digitise a document.

“It’s still the people having to fund the archives in a way that doesn’t seem appropriate,” she said.

She wants to see legislative change to increase access to the archives, as well as a longer term consideration about how materials are collected from departments.

“Digital preservation is almost always harder than paper based preservation, because formats change, and you have to check in, and change file formats, and make sure files aren’t corrupting.

“It’s the legislative method that requires file-by-file release that means so much doesn’t become accessible.

“Disposal needs to be happening at a greater level by the agencies, they’re giving over such large amounts and there’s no good mechanism for it as the legislation dictates what the archives must keep.”

Laurent noted that government agencies created 2,986 terabytes of digital records and 92,966 shelf metres of physical records in 2019.

Arrow says more material needs to be culled from records before they reach the archives, which cannot dispose of them once received.

“Sometimes you’ll open a file and it has four copies of one letter,” Arrow said.

Attorney general Michaelia Cash, whose portfolio is responsible for the archives, has previously said the government will respond to the Tune review later this year.

The Guardian requested an interview with the National Archives’ director general, David Fricker, or another member of the institution. A spokeswoman said no one was available.


Amazon to let Prime users unsubscribe in two clicks to comply with EU rules


There were complaints of ‘a large number of hurdles’ to unsubscribe from Amazon Prime such as complicated menus, skewed wording, confusing choices and warnings.

Amazon has committed to making it easier for users to cancel their Prime subscription to comply with EU rules.

The tech giant will now let consumers in the EU and EEA unsubscribe from Amazon Prime with just two clicks, using a prominent cancel button.

This came following a dialogue with the European Commission and national consumer protection authorities. Complaints had been issued to the Commission by the European Consumer Organisation, the Norwegian Consumer Council and the Transatlantic Consumer Dialogue.

These consumer authorities noted “a large number of hurdles” to unsubscribe from Amazon’s service, such as complicated navigation menus, skewed wording, confusing choices and repeated nudging.

Amazon made initial changes last year, labelling the cancel button more clearly and shortening the explanatory text. This text has now been reduced further so consumers aren’t distracted by warnings and deterred from cancelling.

“Consumers must be able to exercise their rights without any pressure from platforms,” said EU commissioner for justice Didier Reynders.

“Opting for an online subscription can be very handy for consumers as it is often a very straightforward process, but the reverse action of unsubscribing should be just as easy. One thing is clear: manipulative design or ‘dark patterns’ must be banned.”

Amazon has committed to implementing the new changes on all its EU websites and for all devices. The tech giant will be monitored by the European Commission and national authorities to ensure it complies with EU consumer law.

“Customer transparency and trust are top priorities for us,” an Amazon spokesperson said.

“By design we make it clear and simple for customers to both sign up for or cancel their Prime membership. We continually listen to feedback and look for ways to improve the customer experience, as we are doing here following constructive dialogue with the European Commission.”

Amazon has had a number of dealings with the European Commission over the years regarding its business practices. The tech giant was hit with a Statement of Objections in 2020 over its use of marketplace seller data.

In 2017, an EU case led by competition commissioner Margrethe Vestager also accused Amazon of cutting an illegal deal with the Grand Duchy of Luxembourg to drastically lower its tax bill.

The country was ordered to recoup €250m in back taxes. However, Amazon won its appeal against this ruling last year, as the EU’s general court said the European Commission didn’t provide the “requisite legal standard” to prove Amazon received favour from tax authorities.

10 things you need to know direct to your inbox every weekday. Sign up for the Daily Brief, Silicon Republic’s digest of essential sci-tech news.


How to read: a guide to getting more out of the experience | Books


Why read books, in this day and age? “Haven’t we all secretly sort of come to an agreement, in the last year or two or three, that novels belonged to the age of newspapers and are going the way of newspapers, only faster?” wrote Jonathan Franzen, tongue firmly in cheek, in a 2010 essay. The comment feels trenchant a decade later, in an era marked by a saturation of streaming platforms, short-form video, podcasts and screen adaptations of said podcasts.

The proportion of non-readers in Australia has grown in recent years: results of the 2021 National Reading Survey found that 25% of people reported not reading a single book in the previous year – up from 8% in a 2017 survey.

Any bibliophile can easily rattle off a list of reasons for reading. Books enlighten and challenge us, they transport us to different worlds, they reflect essential truths about the human condition.

“People who read well and read a lot learn more, pick up more general knowledge … and can then be better critical consumers of what they read,” says Prof Pamela Snow, co-director of the Science of Language and Reading lab at La Trobe University.

So, within our busy lives, how do we better find the time for books? How can we get more out of the reading experience?

Skim/deep

We commonly interact with texts in different modes. In skimming through an article, taking in a few lines – a headline and subheadings, for example – we might gain a general but shallow understanding of its meaning. We also scan texts for specific numbers, names, or ideas – a quantity in a recipe, say.

Then there’s deep reading, what the scholars Dr Maryanne Wolf and Dr Mirit Barzillai define as “the array of sophisticated processes that propel comprehension and that include inferential and deductive reasoning, analogical skills, critical analysis, reflection, and insight. The expert reader needs milliseconds to execute these processes; the young brain needs years to develop them.”

Reading on screens has turned us into adept text skimmers. An influential 2005 study that analysed how reading behaviour had changed over the previous decade – coinciding with the global rise of the internet – found that online reading was characterised by “more time spent on browsing and scanning, keyword spotting, [and] one-time reading … while less time is spent on in-depth reading, and concentrated reading”.

Wolf has advocated for the need to cultivate a “bi-literate” reading brain, one capable of both deep reading processes and the skim reading more commonly associated with screens.

“Readers must engage in an active construction of meaning, in which they grapple with the text and apply their earlier knowledge as they question, analyse, and probe,” she and Barzillai have suggested. One technique for in-depth reading of narrative texts is RIDA: to Read, Imagine the scene, Describe it to yourself, and Add more mental detail by noting powerful imagery or salient passages.

Our brains should ideally be ‘capable of both deep reading processes and the skim reading more commonly associated with screens’. Photograph: Jacobs Stock Photography Ltd/Getty Images

Physical books, rather than devices like smartphones, tend to support more focused reading, says Naomi Baron, professor emerita of linguistics at American University, though she says the choice of medium is ultimately a matter of personal preference.

Screens themselves are not inherently detrimental to our ability to focus, says the head of the visual and cognitive neuroscience laboratory at the University of Melbourne, Prof Trichur Vidyasagar.

“People often have the belief, particularly concerned parents, that if you spend too much time on screen devices your concentration may get poorer. That’s not necessarily true,” he says. “If used correctly and not at the cost of other useful activities, they can greatly benefit learning.”

The key is the internet’s boundless potential for distraction. “When you use the screen, there are so many hyperlinks, so many sites, stories, and rabbit holes to go into,” Vidyasagar says. The temptation to multitask – “an illusory myth,” he says – can be hard to resist. “If you think you’re multitasking, what you’re actually doing is switching between two tasks at a rapid rate, and your performance in both goes down.”

“When you read a [physical] book it’s quite different – you can’t get distracted as easily.”

Research in university students has found that comprehension is generally higher for print reading. “There is something about reading digitally that seemingly increases the speed at which students move through the text and this processing time translates into reduced comprehension,” one study found. “The findings are especially true when you’re talking about longer materials,” Baron says, adding as a caveat that research tends to focus on academic rather than leisure reading.

Results seem to differ slightly for dedicated e-reader devices. One study, in which participants read a 28-page mystery story by Elizabeth George either in print or on a Kindle, found no differences in most standard comprehension measures. The print readers, however, were better at reconstructing the plot and chronological aspects of the story – potentially because “the physical substrate of paper provides material placeholders” for events within the story.

Rediscovering joy and meaning

Dr Judith Seaboyer, who retired last May as a senior lecturer in literary studies at the University of Queensland, recently went through a fiction dry spell. “There’s so much good stuff to listen to [on the radio], so much good journalism out there to read, and I was finding that I wasn’t reading novels any more.”

“As somebody … who’s done a PhD in contemporary literary fiction, and taught it for over 20 years – you think I’d know [reading books] is worth doing.”

What broke Seaboyer out of her slump was reading new work by an author she loves – Ali Smith’s Companion Piece. Synthesising ideas and making comparisons across multiple texts is also a known strategy for deepening reading comprehension, so some might find it helpful to dig into multiple books by the same author.

Seaboyer’s advice is to read with curiosity and to carefully consider an author’s choices, which can lead to a deeper understanding of language, characters and plot. “Jot things down, annotate your book, write things in the margin,” she says. “Some publishers are putting out reading guides now – that’s often quite useful.”

Nabokov believed that “One cannot read a book: one can only reread it”. For him, revisiting books – like the process of regarding a painting – meant the mind first “takes in the whole picture and then can enjoy its details”.

“You [might] remember that you really loved reading Austen,” Seaboyer says. “It’s interesting to be thinking as you read … now that I’m older and wiser, am I seeing any of this any differently than I did when I was 18?”

“There are ways to be kind to yourself, to allow yourself the opportunity not to understand something the first time through, or to say … maybe there’s a different book I should read first,” Baron says. “It’s like reading James Joyce: if you want to start with Ulysses, good luck. If you start with A Portrait of the Artist as a Young Man, you’ll have a better shot at working your way in.”

‘It’s like reading James Joyce: if you want to start with Ulysses, good luck.’ Photograph: Martin Argles/The Guardian

If reading solely for pleasure, abandoning books that are not bringing enjoyment could, in fact, increase reading time. Of frequent readers surveyed in 2021 – those who consumed at least one book a month – 54% reported not finishing a book if they disliked it. As a result, they “move[d] on more quickly to the next book for greater enjoyment … and have fewer and shorter gaps between books”.

For those wanting to read more – for relaxation or self-improvement – Baron suggests committing to short but regular periods of reading, similar to time set aside for exercise or meditation.

The speed question

Some people are naturally fast readers – celebrated academic Harold Bloom claimed to be able to read 1,000 pages an hour in his prime. Most adults, according to 2019 analysis, read English nonfiction silently at a rate of between 175 and 300 words a minute, and fiction at a rate of 200 to 320.
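To put those rates in perspective, a quick illustrative calculation (the 90,000-word novel length is an assumption for illustration, not a figure from the analysis):

```python
# How long a typical novel takes at the cited silent-reading rates for fiction.
novel_words = 90_000              # assumed length of an average novel
for wpm in (200, 320):            # the fiction range reported in the 2019 analysis
    hours = novel_words / (wpm * 60)
    print(f"{wpm} words a minute: {hours:.1f} hours")
```

Even at the slower end of the range, that is roughly seven and a half hours of reading, so an hour a day would finish such a novel in about a week.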

While speed reading techniques or apps may seem alluring for the time poor, they’re unlikely to work without compromising understanding.

“Fast readers are not necessarily better at reading comprehension,” Vidyasagar says.

There are no shortcuts to reading faster. Becoming a better reader requires persistence and “dealing with the frustration at not seeing overnight results”, Snow says. “It’s like any skill – learning a musical instrument, learning to drive a car.”

A 2016 review of the science of reading found that reading can be improved in the same way all other skills are developed: through practice. “The way to maintain high comprehension and get through text faster is to practise reading and to become a more skilled language user.”

“If two goals of reading might be to learn for the long haul, and to think – that may be part of enjoyment, that may be part of learning – then what’s the hurry?” Baron says. “Why are we feeling like the White Rabbit?”

For Seaboyer, reading a good book is akin to a meditative experience. The “wonderful, immersive process that is deep reading” reliably brings her pleasure. “Something else is picking you up, and moving your mind and body and soul into a different space so you can think about the world differently.”


Is a lack of standards holding immersion cooling back? • The Register


Comment Liquid and immersion cooling have undergone something of a renaissance in the datacenter in recent years as components have grown ever hotter.

This trend has only accelerated over the past few months as we’ve seen a flurry of innovation and development around everything from liquid-cooled servers and components to full immersion tanks, from vendors that believe the only way to cool these systems long term is to drench them in a vat of refrigerant.

Liquid and immersion cooling are by no means new technologies. They’ve had a storied history in the high-performance computing space, in systems like HPE’s Apollo and Cray lines and Lenovo’s Neptune, to name just a handful.

A major factor driving the adoption of this tech in traditional datacenters is a combination of more powerful chips and a general desire to cut operating costs by curbing energy consumption.

One of the challenges, however, is many of these systems employ radically different form factors than are typical in air-cooled datacenters. Some systems only require modest changes to the existing rack infrastructure, while others ditch that convention entirely in favor of massive tubs into which servers are vertically slotted.

The ways these technologies are being implemented are a mixed bag, to say the least.

Immersion cooling meets rack mount

This challenge was on full display this week at HPE Discover, where the IT goliath announced a collaboration with Intel and Iceotope to bring immersion-cooling tech to HPE’s enterprise-focused Proliant server line.

The systems can now be provisioned with Iceotope’s Ku:l immersion and liquid-cooling technology, via HPE’s channel partners with support provided by distributor Avnet Integrated. Iceotope’s designs meld elements of immersion cooling and closed-loop liquid cooling to enable this technology to be deployed in rack environments with minimal changes to the existing infrastructure.

Iceotope’s chassis-level immersion-cooling platform effectively uses the server’s case as a reservoir and then pumps coolant throughout to hotspots like the CPU, GPU, or memory. The company also offers a 3U conversion kit for adapting air-cooled servers to liquid cooling.

Both designs utilize a liquid-to-liquid heat exchanger toward the back of the chassis, where deionized water is pumped in and heat is removed from the system using an external dry cooler.

This is a stark departure from the approach used by rival immersion-cooling vendors, such as LiquidStack or Submer, which favor submerging multiple systems in a tub full of coolant — commonly a two-phase refrigerant or specialized oil.

While this approach has shown promise, and has even been deployed in Microsoft’s Azure datacenters, the unique form factors may require special consideration from building operators. Weight distribution is among operators’ primary concerns, Dell’Oro analyst Lucas Beran told The Register in an earlier interview.

Standardized reference designs in the works

The lack of a standardized form factor for deploying and implementing these technologies is one of several challenges Intel hopes to address with its $700 million Oregon liquid and immersion cooling lab.

Announced in late May, the 200,000-square-foot facility, located about 20 miles west of Portland at its Hillsboro campus in the US, will qualify, test, and demo its expansive datacenter portfolio using a variety of cooling tech. The chipmaker is also said to be working on an open reference design for an immersion-cooling system that’s being developed by Intel Taiwan.

Intel plans to bring other Taiwanese manufacturers into the fold before rolling out the reference design globally. Whether the x86 giant will be able to bring any consistency to the way immersion cooling will be deployed in datacenters going forward remains to be seen, however.

Even if Intel’s reference design never pans out, there are still other initiatives pursuing similar goals, including the Open Compute Project’s advanced cooling solutions sub-project, launched in 2018.

It aims to establish an ecosystem of servers, storage, and networking gear built around common standards for direct contact, immersion, and other cooling tech.

In the meantime, the industry will carry on chilling the best ways it can. ®
