More than 2 million people still pay for AOL dial-up
Technically Incorrect: AOL's quarterly earnings report throws up a glorious statistic about how slow some are to change their habits.
Technically Incorrect offers a slightly twisted take on the tech that's taken over our lives.

I don't know anyone who does it, but perhaps you do.
I have a distant memory of free disks arriving in my mailbox, of a whiny noise that sounded like it was coming from an alien drone, and of thinking: "Why do people do this?"
But perhaps you're one of the 2.1 million people who still have AOL dial-up service and actually pay for it.
AOL's quarterly earnings report, published Friday, revealed discreetly that 2.1 million people are still dialing up and paying AOL around $20 a month for the privilege of accessing the Internet.
Dial-up is infernally slow. It's about as narrowband as a contemporary connected mortal could imagine and far beyond anything they could tolerate. Just to compare: in January, the FCC redefined broadband as 25 megabits per second, though the average speed in the US is 10 Mbps. Dial-up tops out at 56 kilobits per second. (As a quick refresher: kilo-anything is much smaller, or in this case slower, than mega-anything.) About 70 percent of Americans have broadband at home, as of a September 2013 survey, the latest figures from the Pew Research Internet Project.
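To put those figures in rough perspective, here's a quick back-of-the-envelope sketch in Python. The 5 MB file size is a hypothetical example, and the math ignores real-world overhead such as protocol headers and modem compression:

```python
# Rough download-time comparison: 56 kbps dial-up vs. 25 Mbps broadband.
# The 5 MB file is a hypothetical example; transfer overhead is ignored.

FILE_SIZE_MB = 5  # e.g., roughly one MP3 track

def seconds_to_download(size_mb: float, speed_kbps: float) -> float:
    """Time in seconds to move size_mb megabytes at speed_kbps kilobits/sec."""
    bits = size_mb * 8_000_000         # 1 MB = 8,000,000 bits (decimal units)
    return bits / (speed_kbps * 1_000)

print(f"Dial-up (56 kbps):   {seconds_to_download(FILE_SIZE_MB, 56) / 60:.0f} minutes")
print(f"Broadband (25 Mbps): {seconds_to_download(FILE_SIZE_MB, 25_000):.1f} seconds")
```

That works out to roughly 12 minutes on dial-up versus under 2 seconds at the FCC's new broadband floor, for the same file.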
So who might these people be? I have contacted AOL to ask whether it could offer a breakdown and will update, should I hear.
One is left, therefore, to speculate. An obvious view would be that many of these people are senior citizens. For them, perhaps, the price is comfortable. Even more comfortable is the security of knowing how something works because they've been doing it for a long time.
Another group might be those for whom $20 a month is simply all they can afford. They might not be able to stretch to bundled cable packages or fancy computers. AOL offers, in their minds, a good deal.
Of course, it might be that some fit neither age group nor income bracket. They're simply people whose habits are too deeply ingrained. They either don't notice what is going on around them, or they just don't care.
Not all AOL dial-up subscribers actually pay for it. Some have been induced to stay by freebies when they threatened to leave. There are, though, some who are on a free trial right now. Yes, people are actually still joining, as if, for them, Kurt Cobain were still alive.
However odd it might be to conceive, for some people AOL dial-up is still synonymous with the Internet. Why, only the other week, a California man received a $24,000 AT&T bill (later rescinded) that appeared to be for his AOL dial-up use.
Internet arrangements are not dissimilar to marital ones. Sometimes, what happens on the inside isn't necessarily what people see on the outside.
Just as with the most peculiar marriages, at least some of the people who use AOL dial-up must still be happy with it, mustn't they?
From Ada to Brill: Why have we always dissed women in tech?
Silicon Valley faces tough questions about how it treats women. But the problem isn't just for modern-day women -- it goes back at least 200 years.

This story is part of Solving for XX, a CNET special report exploring what people and companies are doing to make the tech industry more diverse, more equitable and more welcoming to women.
Yvonne Brill was a rocket scientist. Literally. In the 1970s, she invented a propulsion system that kept satellites from wandering out of orbit. Today's satellites still rely on the technology. Her work was so important, President Barack Obama awarded her the National Medal of Technology and Innovation in 2011, the highest honor the United States can give a citizen for contributing to technological progress.
But when The New York Times wrote Brill's obituary in March 2013 -- an honor reserved only for the most influential newsmakers -- the first mention was of her "mean beef stroganoff," followed by a comment about her following her husband from job to job and taking off eight years from work to spend time with her family. A list of Brill's professional accolades didn't come until later.

Readers recoiled, taking to Twitter, Facebook and emails to accuse the newspaper of gender bias. The New York Times' public editor, Margaret Sullivan, who comments on the paper's approach to writing stories, said the piece "had the effect of undervaluing" Brill's work. The Web version of the story was changed.
The sad thing is, The New York Times isn't the only company that's diminished, undervalued or dismissed women's achievements and roles in technology. Less than two years after Brill's obit, toymaker Mattel came under fire for a book called "Barbie: I Can Be a Computer Engineer." Despite its admirable title, the book cast the character as a helpless airhead. "I'm only creating the design ideas," she says. "I'll need Steven's and Brian's help to turn it into a real game!"
Blogger Pamela Ribon summed up the reaction to the book in a five-word takedown: "Barbie F--ks it Up Again." Mattel apologized and said the book, originally published in 2010, "doesn't reflect the Brand's vision for what Barbie stands for. We believe girls should be empowered to understand that anything is possible and believe they live in a world without limits...All Barbie titles moving forward will be written to inspire girls' imaginations and portray an empowered Barbie character."
The Brill and Barbie examples highlight a larger issue: society's tendency to trivialize, ignore or just plain deny women's contributions to science, technology, engineering and mathematics (STEM). In the past year alone, some of the biggest companies in the tech industry have had to backtrack from public gaffes as they tried to address the "women in tech" issue (Microsoft, Google) or were hit with lawsuits over gender discrimination (Facebook, Twitter).
The thing is, some of the problems women deal with today are the same ones their predecessors faced decades -- even centuries -- before them.
Arguing about Ada
Take Ada, Countess of Lovelace, born 200 years ago. The daughter of the English poet Lord Byron, Lovelace is best known for her work on the Analytical Engine. Designed by the mathematician Charles Babbage, the Analytical Engine is now recognized as the first design for a general-purpose computer.

A mathematician in her own right, Lovelace was commissioned to translate a paper on the computer written by an Italian military engineer in 1840, which she supplemented with her own elaborate notes. Those notes contain what many consider the first algorithm designed to be carried out by a computer.
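Her Note G is widely described as tabulating the Bernoulli numbers, step by step, for Babbage's machine. As a loose modern illustration only, not a transcription of her notes, the same sequence can be produced from the standard recurrence in a few lines of Python:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n: int) -> list[Fraction]:
    """Return B_0 .. B_n using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)] + [Fraction(0)] * n
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B

# B_8 = -1/30 in the modern convention -- the value Lovelace's worked
# example is usually said to target (in her own numbering, B_7).
print(bernoulli_numbers(8))
```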
Walter Isaacson, author of a best-selling biography of Apple co-founder Steve Jobs, devoted his next book to tech innovators and included a chapter on Lovelace because he wanted to give her a "moment in the sun."
Some critics, though, dismiss her contributions. Babbage historian Bruce Collier is quoted as calling her "the most overrated figure in the history of computing." In 1983, Dorothy Stein wrote that Babbage was responsible for much of the content of Lovelace's legacy-defining notes. But Stein also argued that Lovelace -- despite being both intelligent and wealthy -- was constrained by Victorian mores that frowned on female accomplishments.
Still others have come to Lovelace's defense. "I would discredit the discreditors," said Donald Knuth, a professor emeritus at Stanford University and an expert on the history of computer science. Knuth won the prestigious Turing Award in 1974. "I could argue Lady Lovelace knew more about programming than Babbage," he said.
Suw Charman-Anderson, who organized Ada Lovelace Day to recognize women in tech, calls Stein's book a "hatchet job."
"You see in the story the double standards that modern women in STEM have to deal with. You have to prove yourself twice over," says Charman-Anderson. "Is that where we are -- 200 years after her birth?"
Same old story
Lovelace isn't the only woman whose contributions to technology and the sciences have been denigrated. Many historians point to Rosalind Franklin, whose X-ray image, known as Photograph 51, revealed the double helix structure of DNA.
Yet her colleagues, including James Watson, not only dismissed her monumental contribution, but also criticized her as a woman.
"The thought could not be avoided that the best home for a feminist was in another person's lab," Watson wrote in his 1968 book "The Double Helix." Watson faulted "Rosy" -- as he insisted on calling her -- for not wearing lipstick or emphasizing "her feminine qualities."
"So it was quite easy to imagine her the product of an unsatisfied mother who unduly stressed the desirability of professional careers that could save bright girls from marriages to dull men."
Watson, along with Francis Crick and Maurice Wilkins, was awarded the Nobel Prize in 1962 for the "discoveries concerning the molecular structure of nucleic acids." Franklin had died four years earlier, making her ineligible for the award. She was 37.
Consider also astrophysicist Jocelyn Bell Burnell, who discovered the first radio pulsars while a research student at Cambridge University. Her supervisor, Antony Hewish, along with radio astronomer Martin Ryle, received the Nobel Prize in 1974 for that discovery. She wasn't recognized. That omission caused so much controversy she was forced to respond three years later.
"I believe it would demean Nobel Prizes if they were awarded to research students, except in very exceptional cases, and I do not believe this is one of them," she said. "I am not myself upset about it."

And then there's Marie Curie, the first woman to win a Nobel Prize and the only honoree to win in two different sciences -- physics and chemistry. Yet the Royal Swedish Academy of Sciences initially wanted to ignore her work identifying radium and polonium, and instead award the honor to Curie's husband, Pierre, and their research partner, Henri Becquerel. They issued the prize only after Pierre insisted that his wife receive the public recognition she was due.
It's the same story throughout history, says Telle Whitney, co-founder of the Grace Hopper Celebration of Women in Computing, named after the computing pioneer who in 1944 was one of the first programmers of the Harvard Mark I computer.
"That is a common thread," said Whitney. "Not just historically, but today."
That's how you get cases where the death of a brilliant rocket scientist ends up becoming a discussion about beef stroganoff. No matter how mean the stroganoff was, it still leaves a bitter aftertaste.