The first long-term global map of PM2.5 was published in a recent issue of Environmental Health Perspectives. Canadian researchers Aaron van Donkelaar and Randall Martin at Dalhousie University, Halifax, Nova Scotia, Canada, created the map by blending total-column aerosol amount measurements from two NASA satellite instruments with information about the vertical distribution of aerosols from a computer model.
Their map, which shows average PM2.5 levels between 2001 and 2006, offers the most comprehensive view of the health-sapping particles to date. Though the new blending technique has not necessarily produced more accurate pollution measurements over developed regions that have well-established surface-based monitoring networks, it has provided the first satellite-based PM2.5 estimates in a number of developing countries that have had no estimates of air pollution levels until now.
Dense foliage and an abundance of species mean that the American Northwest has seen increasing numbers of tree houses popping up in its canopies.
Many of the lofty homes have been created by Pete Nelson, a renowned tree house builder who lives in Fall City, Washington, and has written several books on the subject.
Temple of the Blue Moon: Tree house expert Pete Nelson built this structure on his land in Fall City, Washington
In Europe and the U.S., recreational tree houses, for entertaining and as workshops and studios, have become increasingly popular thanks to higher disposable incomes, better technology for builders and growing interest in eco-friendly lifestyles.
In other parts of the world, tree houses are part of a more traditional way of life. Stilt houses line the banks of many tropical river valleys in South America, particularly in the Amazon and Orinoco.
Thinking outside the box: The Treehotel, which recently opened 40 miles south of the Arctic Circle in Sweden, is almost invisible among the trunks
Perfect hideaway: The glass cube is constructed from sustainably harvested wood and has underfloor heating
Most of the tree houses are complete with running water, flushing toilets and electricity. There are also special touches including hot tubs, zip lines, spiral slides, lookout towers and even an iron bridge.
Although tree houses often function as workshops, studios or places for entertaining, there are some people who live their lives permanently above solid ground.
Earlier this year a film entitled Out On A Limb was made about David ‘Squirrelman’ Csaky, a homeless man who came to global attention after Seattle authorities evicted him from the elaborate tree house he had been living in on city property for two years.
After he was evicted from his self-constructed, 300 sq ft home, 52-year-old Mr Csaky’s neighbours were so outraged by his treatment that they clubbed together to buy him a motor home to live in.
There are several construction methods when it comes to crafting a home in the trees.
Some can be supported by stilts and don’t need the tree to take any of the load of the building materials.
Rope and cable are the most common means of suspending tree houses, but these are among the most difficult to construct and access.
Google’s Voice Search, which launched on cellphones in 2008 and was added to the desktop in June, seems like such a simple proposition. You speak your query into your phone (or computer), and, ta-da, the system pops out an answer.
But teaching Google’s voice bot to understand what users are saying isn’t simple at all. And if you’re trying to get it to speak all the languages in the world, it’s even more complicated.
Enter Linne Ha, Google’s Voice Hunter. Her official title is “International Program Manager, Google Voice Search,” but Ha spends her days crisscrossing the globe, gathering the voice samples needed to train the voice bot, the way a lepidopterist might go hunting for rare butterflies.
Typically, a company would solve this problem by licensing samples from firms that specialize in assembling speech databases. But that wasn’t going to work for Google.
Many of the standard lexicons simply don’t include the kind of words people use in search queries. (And Google has found, interestingly enough, that the words people use in voice searches are pretty much the same as the ones they use in written ones.)
Plus, Google needed its bot to understand queries spoken in all the settings that someone might possibly use Voice Search (or voice input and Voice Actions, two other Android features that also use the acoustic models created from the samples). Traditional firms weren’t set up to handle those situations.
So Ha had to get scrappy. Her solution: Tap local Google users around the world and hand them Android smartphones loaded with a specially designed speech-gathering app. Then send them out into their communities to record their family and friends.
“The local experts are people who would be using our products,” Ha tells Fast Company. “We want to make sure that whatever we develop is something they would want to use.”
The program, dubbed “word of mouth,” started last year, when Ha, who previously worked on Google Maps and Google Earth, developed the project. It’s taken her everywhere from Mexico City to Hanoi, Amsterdam, and Jakarta, and kept her on the road for over 230 days a year.
So far, “word of mouth” has collected “millions” of samples, Ha says—Google won’t get more specific—including a whopping 250,000 utterances for each language or dialect.
To make sure Google’s scientists get the range of samples they need, Ha’s local teams have gotten creative about where they do their recordings. In Hong Kong, they jumped on trolleys and subways, since so many people there use their phones during commutes. In Brazil, they went to shopping centers, in Singapore to soccer matches, and in the Netherlands to the beach.
And while Ha has had to tangle with everything from power outages to typhoons, she says it hasn’t been a problem finding locals to participate in the program. In Indonesia, they put out a call for volunteers to show up at a university, and over 900 people turned out.
“People are really proud of their language,” Ha says. “They want to make sure [Voice Search] works properly and that they can use it with their native tongue.”
So far Voice Search works with 27 languages and dialects, which means it has about 273 more to go just to support the 300 languages in the world that have more than a million speakers. At the current rate, it could take Ha another decade to collect all the necessary samples herself. So instead, she’s looking to scale the program by partnering with organizations on the ground, like universities, to do some of the voice hunting on Google’s behalf.
In the meantime, Ha is getting ready for her next trip, which will take the program to Africa for the first time. But before that, Ha has a little vacation planned. Her destination? She’s staying home.
In countries across the world, IP rightsholders are pushing website blocking as the latest weapon against online copyright infringement. United Nations human rights experts, security engineers, law professors and others are pushing back, noting both the enormous collateral damage such blocking can cause and the likelihood that it will do little to actually curb infringement.

The UK government’s decision was based on a report by UK regulator Ofcom, dated 27 May 2011 but released on Wednesday as part of the UK government’s welcome response to the landmark Hargreaves report from May. As we saw in last week’s Newzbin2 judgment, rightsholders in the UK already have the ability under existing law to obtain court injunctions requiring ISPs to block websites proven to have infringed copyright, but the reserved blocking powers in sections 17-18 of the 2010 Digital Economy Act are broader and more controversial. That’s why the UK Department for Culture, Media and Sport asked Ofcom in February to review whether those website blocking provisions were workable. Ofcom concluded they weren’t, but did not reject the use of website blocking altogether.

Ofcom concludes that while it is feasible to “constrain access to prohibited locations on the Internet” using these techniques alone or in combination, “none of the techniques is 100% effective; each carries different costs and has a different impact on network performance and the risk of over-blocking,” and that “[f]or all blocking methods circumvention by site operators and Internet users is technically possible and would be relatively straightforward by determined users.”

“Although imperfect and technically challenging, site blocking could nevertheless raise the costs and undermine the viability of at least some infringing sites, while also introducing barriers for users wishing to infringe. 
Site blocking is likely to deter casual and unintentional infringers and, by requiring some degree of active circumvention, raise the threshold even for determined infringers.”

The report goes on to suggest that if blocking is to be implemented, DNS blocking is the preferable approach, because it would cause the least delay and cost, two of the key concerns voiced by copyright holders. It also suggests augmenting this by requiring search engines to delist websites. Ofcom notes that DNS blocking is at best a short-term solution, because implementation of DNS Security Extensions (DNSSEC) will shortly make DNS blocking more transparent and hence less effective. It therefore recommends deep packet inspection (DPI) as a longer-term solution, but highlights that it is technically complicated (and therefore slower to implement), expensive (translation: Internet access costs are likely to go up as costs are passed on to subscribers), and raises a number of pesky legal questions, such as whether DPI is compatible with UK privacy and data protection law. (We’d like to read that legal opinion.)

via EFF
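Ofcom’s point that DNS blocking is “relatively straightforward” to circumvent can be illustrated with a toy model. The sketch below (all domain names and addresses are invented, drawn from documentation-reserved IP ranges) shows the mechanism: an ISP resolver answers queries for blocklisted names with a block-page address, and a user who simply switches to a resolver that applies no blocklist gets the real record.

```python
# Toy model of ISP-level DNS blocking. All names and addresses are
# illustrative; 192.0.2.x / 198.51.100.x / 203.0.113.x are reserved
# for documentation (RFC 5737).

REAL_RECORDS = {
    "example-infringing.site": "203.0.113.10",
    "news.example.org": "198.51.100.7",
}

BLOCK_PAGE = "192.0.2.1"          # where the ISP sends blocked lookups
BLOCKLIST = {"example-infringing.site"}

def isp_resolve(domain: str) -> str:
    """Resolve via the ISP: blocklisted names get the block-page address."""
    if domain in BLOCKLIST:
        return BLOCK_PAGE
    return REAL_RECORDS[domain]

def third_party_resolve(domain: str) -> str:
    """A non-ISP resolver applies no blocklist, so the same query
    returns the real record -- the circumvention Ofcom describes."""
    return REAL_RECORDS[domain]
```

Because the block lives only in one resolver’s answers, not in the network path itself, circumvention is a one-line configuration change for the user; this is also why the report treats DPI, which inspects the traffic rather than the lookup, as the more robust (and more invasive) alternative.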
A handful of Internet service providers (ISPs) in the U.S. are redirecting search traffic around specific keywords to brands’ websites, presumably for affiliate marketing revenue.

A study released today by a UC Berkeley research group revealed that for some Internet users on some ISPs, using a search engine and typing in a word such as “apple” or “bloomingdales” would redirect the user to the websites of Apple or Bloomingdale’s rather than to a page of search results about the keyword in question.

The Berkeley project, called Netalyzr, was created to measure DNS behavior. Over the past few months, however, the Netalyzr team noticed some unexplained and unexpected redirections across at least 12 ISPs in the United States. The affected ISPs use services provided by a company called Paxfire to monetize certain web search requests. Paxfire’s main line of business is DNS-error traffic monetization, i.e., the practice of presenting advertisements and search results to users who mistyped a website’s address in their browser. “In addition, some ISPs employ an optional, unadvertised Paxfire feature that redirects the entire stream of affected customers’ web search requests to Bing, Google and Yahoo via HTTP proxies operated by Paxfire.”

Following the money

The Electronic Frontier Foundation helped the Netalyzr team investigate the matter. As EFF senior staff technologist Peter Eckersley told VentureBeat, “They knew the general category of false DNS responses might be possible and worth checking for, while the details that emerged about Paxfire and what it was actually up to were a bit more surprising.” The research team found that around 170 specific, brand-related keywords would trigger interference by the HTTP proxies, causing users to be redirected to affiliate marketing landing pages. “In the process, the ISPs and Paxfire presumably earn commission payments for the redirected flows,” the researchers wrote. 
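The detection idea behind a measurement tool like Netalyzr can be sketched simply: issue the same lookup through the suspect network path and through a trusted baseline, and flag any mismatch. The sketch below is not Netalyzr’s actual code; the hostnames, addresses, and keyword list are invented to illustrate keyword-triggered redirection of the kind described above.

```python
# Illustrative sketch of mismatch-based redirection detection.
# All hosts, addresses, and keywords are hypothetical.

TRUSTED = {"search.example.com": "198.51.100.20"}  # baseline answers

def suspect_path(hostname: str, keyword: str) -> str:
    """Simulated misbehaving resolver/proxy: search traffic for certain
    brand keywords is steered to an affiliate landing host instead of
    the search engine the user asked for."""
    AFFILIATE_HOST = "203.0.113.50"
    MONETIZED_KEYWORDS = {"apple", "bloomingdales"}
    if hostname in TRUSTED and keyword in MONETIZED_KEYWORDS:
        return AFFILIATE_HOST
    return TRUSTED[hostname]

def is_redirected(hostname: str, keyword: str) -> bool:
    """Flag a query whose observed answer differs from the baseline."""
    observed = suspect_path(hostname, keyword)
    return observed != TRUSTED[hostname]
```

Probing many candidate keywords this way is how one arrives at a figure like the roughly 170 brand-related keywords the researchers report: only queries containing monetized terms diverge from the baseline, while ordinary queries pass through untouched.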
Some of the ISPs involved are, according to data presented by multiple organizations involved in the investigation, Cavalier, Cincinnati Bell, Cogent, DirecPC, Frontier, Fuse, Hughes, IBBS, Insight Broadband, Megapath, Paetec, RCN, Wide Open West and XO Communication. Charter and Iowa Telecom claim to have recently stopped doing DNS redirects.

While it’s likely that ISPs had at least some knowledge of at least some of the DNS redirection, if not the search traffic redirection, it’s less likely that the brands themselves were involved in the scheme. “There is probably a chain of several intermediaries in these affiliate marketing programs between the brand itself and Paxfire,” said Eckersley.

The problem with Paxfire

“I’m not an expert on affiliate marketing programs, so I can’t comment on whether anything that Paxfire is doing might be a violation of the rules or norms of that business sector,” said Eckersley. But he did say that the marketing company “has no business” granting itself access to the keywords people are using to navigate the Internet. “If my search engine is untrustworthy or not returning the results I was actually looking for, I can go and pick a different search engine. But if Paxfire has snuck out onto the network and secretly replaced all my choices of search engine with itself, I no longer get to go elsewhere for my searches.”

And when Paxfire’s proxies malfunction, any search attempts return an error message. “Users will often blame the search engine for that, when in fact it’s the fault of the company that’s secretly hijacking them,” said Eckersley.

According to the EFF, Google has repeatedly put pressure on ISPs to stop DNS-based redirects and has been at least somewhat successful. However, the EFF notes that the Yahoo and Bing search engines are still particularly susceptible to redirects. “This is why the ISPs that were proxying Google stopped in the past couple of months,” wrote Berkeley researcher Nicholas Weaver in a Slashdot thread today. 
“Google’s abuse-detection threw up a CAPTCHA on the queries, and then Google posted about it.” Evidently, the combined noise from the web and pressure from the search engine were enough to put a stop to search redirection in some cases.

A Google spokesperson confirmed, “We aren’t aware of any DNS providers that are currently doing this hijacking for searches intended for Google.”

Hopefully, continued pressure and the watchful eyes of the media, Berkeley researchers and advocacy groups like the EFF will help to end the practice of search redirects.

by Jolie O’Dell
Well, boo-hoo. It’s no wonder many probably nodded in agreement when John Gruber fired back that if the patents were so overpriced, why did Google itself bid $3 billion for them, at one point?
I’m not a patent expert. I can’t tell you what in this portfolio might be applicable fairly to Android or not. I have no idea if they were worth $3 billion just for the security of holding them, and perhaps not enforcing them, as Google may have wanted. Maybe Google was going to go all predatory with them.
Reading some of these accusations, you come away with the impression that Google should have sat in its little search box and let other companies expand into new areas. Moreover, by not sitting in its designated search corner, the company deserves whatever anyone wants to sling at it.
For example, there’s Brian Hall’s post, where he lists Google’s sins:
That feeds into Gruber’s second post on this week’s patent actions, which seems to reassert all these copying claims:
Google seems to feel entitled to copy whatever it wants. Android copies the UI from the iPhone. Places copied data from Yelp. Google+ copies from Facebook. Their coupon thing is a clone of Groupon. And yet it’s Google that acts as though it has been offended when these competitors fight back.
And from MG Siegler today:
Increasingly, Google is trying to do everything. And they have the arrogance to think that they can. And it’s pissing people off.
The post goes on to cite examples of various companies behind the scenes who feel like Google just copies everything they do. Apparently, the three remaining people at Yahoo involved with search who haven’t jumped ship to work at Google are among them.
Time for a little push-back. Before I do so, just as I don’t believe that John Gruber is some Apple fanboy who only sees things from a positive Apple point of view, I’d hope that what I’m writing isn’t seen as coming from a Google fanboy.
I’m doing some push-back not out of great love for Google but rather from a great love of balance and reasonable discussion. Right now, we could use some balance on all the copying accusations that are flying around, I’d say.
Don’t get me wrong. It’s not that Google is perfect and only does the “nice” thing. Google’s just another company, which acts in its own self-interest and can act just as ruthlessly as any of its competitors to get what it wants.
One of the most disturbing areas is the conflict between Google hosting content and pointing outbound. Its Google Places pages are arguably destinations of their own. Its YouTube pages certainly are. Should a search engine also have destination sites? If not, should that standard be applied to all search engines? Google’s not alone here.
Google’s also a big fan of “open,” yet despite the Open Handset Alliance, I still can’t get a pure Android phone on Verizon. And despite OpenSocial, there’s still no API for Google+.
In the tech space, I think many people understand that Google is no longer that scrappy little company of old with the fun beach balls and the quirky founders. It’s not. It’s just like all the others.

By Danny Sullivan