Thursday, May 2, 2013

Do we ever stop to think about how people find our maps on the Internet?


Lately the Spanish-speaking web mapping developer community has been very active, producing a series of awesome web mapping applications.

One of the latest applications I have come across is http://www.spaininbooks.com/, a mashup that combines maps produced with the ArcGIS Online API, pictures of landmarks and points of interest from Wikimedia Commons, and the Amazon API to locate books to buy. Its author, Aitor Calero, is well known in the Spanish-speaking GIS community, especially through his participation in the RedIRIS GIS mailing list.
 
http://www.spaininbooks.com/
 
Last summer, almost coinciding with the launch of www.lookingformaps.com, another social mapping platform was published: http://natmaps.com/, whose most outstanding feature is, imho, the visual quality of its map compositions.
 
http://natmaps.com/, an application to build web maps.

Both applications offer a sensational user experience. They are fully optimized for users who open their browser and visit a web page whose address they already know, offering them high-quality graphics and good usability, where everything is a map.

However, the real question is: how do users search for geographic information on the Internet? This question was raised last summer on the MapBrief blog, in the post "How the Public Actually Uses Local Government Web Maps: Metrics from Denver". In it, the author offers a number of statistics on how users access the Denver local government geoportal, and finishes with a number of interesting conclusions.

To bring a fresh perspective to this debate, we will rely on Looking4Maps user access data, collected with Google Analytics, while recognizing that Looking4Maps is far from the levels of usability of NatMaps, SpainInBooks or widely used solutions such as CartoDB or MapBox.
 
Since August 8, 2012, the date Looking4Maps went live, 33,245 unique users have visited the site, and of these, only 13.69% did so by typing http://www.lookingformaps.com into their browser. In contrast, 71.23% of visits came from users who performed searches on a search engine. If we compare these figures with the data provided in the MapBrief post, we must conclude they are not too different: 60% in the case of Denver.

We can conclude that the vast majority of users who reach our website do so through Google, so it is critical that our website is Google-friendly: it is essential that our web mapping projects carry out a few search engine optimization tasks. Let's look at two examples:
  • If we search Google for "site:spaininbooks.com", the engine shows only 46 results. A visit to SpainInBooks.com shows that its database holds several hundred books and monuments, but the application is not designed to let Google discover this information.
  • If we do the same search for the NatMaps portal, "site:natmaps.com", we get 184 results, but the information returned by the search engine is not at all descriptive or user friendly: all the page titles are the same ("natmaps - Social Mapping"), and the URLs say nothing about the content behind them. This does not invite a user who has searched for information through Google to click on them.
For SpainInBooks, making the application more search engine friendly would force a change in the system architecture. It is designed around the metaphor of conventional GIS applications, providing detailed information about points of interest on the map in internal dialogs. It would have to evolve so that each item of information, each point of interest and each related book, is accessible through its own unique URL, because that is what search engines store: URLs and descriptive metadata about the information behind those URLs.
http://spaininbooks.com/ shows detail information in modal dialogs that are not addressable by a unique URL, which makes things difficult for search engines.
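
As a purely hypothetical sketch (none of these routes exist on the real site), a URL scheme along these lines would give each monument and each related book its own address that crawlers can index and users can link to:

    http://www.spaininbooks.com/monuments/alhambra
    http://www.spaininbooks.com/monuments/alhambra/books
    http://www.spaininbooks.com/books/tales-of-the-alhambra

Each of these pages could still open the familiar map dialog via JavaScript, but the content behind it would now have a stable, crawlable address.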


The task of optimizing NatMaps for search engines seems simpler: you just have to change the title of each map page (the HTML "title" tag) so that it is unique and representative of the map, make the URLs more self-descriptive (instead of the hexadecimal string used right now), and give each map a description, in the form of the HTML "meta description" tag.
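
As a minimal sketch, assuming a hypothetical map of hiking routes (the title, description and URL below are invented for illustration), the head of each map page could look like this:

    <!-- Hypothetical head section for an individual NatMaps map page -->
    <title>Hiking routes in Sierra de Guadarrama - natmaps</title>
    <meta name="description" content="Interactive map of hiking routes in the Sierra de Guadarrama, with trails, viewpoints and shelters.">
    <meta name="keywords" content="hiking, Sierra de Guadarrama, routes, map">

served under a self-descriptive URL such as http://natmaps.com/maps/hiking-sierra-guadarrama instead of a hexadecimal identifier.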

 

And a tip ...


Did you know that the information Google displays to search engine users is based on the content of certain HTML tags it indexes?

It is essential to pay attention to the content of the "title" tag, the "meta name='description'" tag and the keywords, as well as the friendliness of the URL that identifies our resource.


Did you know that you can help Google find the contents of your web application?

To do this, you can create one or more XML files following a format defined by Google called "sitemaps". Within each sitemap, we create an entry for each URL that we want Google to include in its index.
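
A minimal sitemap sketch, following the sitemaps.org protocol (the URL and date here are invented for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.lookingformaps.com/maps/example-map</loc>
        <lastmod>2013-04-30</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Once published, the sitemap can be submitted to Google through Google Webmaster Tools, or referenced from the site's robots.txt file with a "Sitemap:" line.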