Before the awe-inspiring photograph of the Pale Blue Dot, and the more recent image of an evening “star” in the Martian sky, the crew of Apollo 17 captured our home world in The Blue Marble (AS17-148-22727), possibly the most reproduced photograph in human history. This piece of “viz art,” which I created using Tableau Public, adds to that collection. Beyond being a low-resolution reproduction of The Blue Marble, the visualization is an interactive orthographic projection of Earth that allows you to rotate the view across lines of latitude and longitude. Coupled with Tableau’s built-in mapping capability, and an additional custom map projection, we can compare these different two-dimensional representations of our three-dimensional planet.
Given angles of latitude and longitude (and a radius), an orthographic map projection transforms these spherical coordinates into 2D Cartesian coordinates. The points are projected onto a plane tangent to the sphere at a chosen center point of projection; in the case of a sphere, the visible hemisphere projects onto a circle with diameter equal to that of the original spherical object. This type of projection mimics the view of a spherical object from an arbitrarily large distance.
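As a sketch of that transform (the function name and interface here are my own, not from any particular library), the standard forward equations for an orthographic projection centered at a chosen latitude and longitude look like this:

```python
import math

def orthographic(lat, lon, lat0=0.0, lon0=0.0, radius=1.0):
    """Project spherical coordinates (in degrees) onto the plane tangent
    at the center point (lat0, lon0). Returns (x, y), or None when the
    point lies on the far hemisphere and is hidden behind the globe."""
    phi, lam = math.radians(lat), math.radians(lon)
    phi0, lam0 = math.radians(lat0), math.radians(lon0)
    # Cosine of the angular distance from the projection center;
    # a negative value means the point is not visible from this side.
    cos_c = (math.sin(phi0) * math.sin(phi)
             + math.cos(phi0) * math.cos(phi) * math.cos(lam - lam0))
    if cos_c < 0:
        return None
    x = radius * math.cos(phi) * math.sin(lam - lam0)
    y = radius * (math.cos(phi0) * math.sin(phi)
                  - math.sin(phi0) * math.cos(phi) * math.cos(lam - lam0))
    return x, y
```

With the default center at 0°N, 0°E, the point 90°E on the Equator lands on the rim of the circle at (1, 0), and the antipode at 180° returns None because it faces away from the viewer.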
Like many other digital map services, Tableau’s OpenStreetMap-based mapping capability uses the Web Mercator projection. A variation of the 16th-century Mercator projection, the Web variant simplifies Earth’s natural shape further, treating it as a sphere, for computational ease. This simplification, along with the Mercator’s straight, parallel lines of longitude and latitude, makes geospatial calculations easier, and the projection is entirely suitable at larger map scales, where latitudinal distortion is minimized by viewing only a small region at a time (neighborhood, city, or even country scales). The distortions to relative size only become apparent at continent, hemisphere, and global scales (or wherever a wide range of latitudes is concerned).
Mapping the surface of an irregular ellipsoid such as the Earth onto a 2D surface calls for creativity and sacrifice to obtain a least-distorted view. Cartographers, geographers, and geodesists fill tomes debating and weighing the drawbacks, benefits, and appropriate use cases of the myriad map projections that attempt to realistically represent planetary surfaces.
I’ve skipped that process and chosen the Wagner IV projection to work with. It is a personal favorite and was developed with aesthetic considerations in mind, accepting some shape distortion in exchange for accuracy of area. In this type of projection, latitude is plotted along a linear axis, equally spaced. On the real, irregularly ellipsoidal Earth, the arc length subtended by each degree of latitude varies, but a spherical approximation such as this gives us an arc length of about 111 km for each 1° of latitude. That is, if you were to sail a boat along a line of longitude, either northward or southward, after 111 km across the ocean surface you will have traveled 1° of latitude. Much like on the real Earth, the projected distance between lines of longitude in the Wagner IV varies with latitude, widest at about 111 km at the equator and converging toward the poles.
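The 111 km figure falls straight out of the spherical approximation: one degree of latitude along a meridian is 1/360 of a great-circle circumference. A quick check, assuming the commonly used mean radius of 6,371 km:

```python
import math

R = 6371.0  # mean Earth radius in km (spherical approximation)

# One degree of latitude along a meridian is 1/360 of the circumference.
km_per_degree = 2 * math.pi * R / 360
print(round(km_per_degree, 2))  # 111.19
```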
To create the image of Earth within Tableau, I wrote a Python script to convert a JPG image into data points of latitude, longitude, and assigned color. First, the source image file is opened and the image’s pixel data is loaded into a list variable, pix. Column headers for the data to be saved out are set up next, including for each point: longitude, latitude, color, and a hex value of the color. Because I know my image size is 180×360 (1 pixel for every 1 degree; I resized a larger image to this convenient dimension in GIMP), I can loop through all the rows and columns of pixels to inspect each one. If I didn’t know my image size, I could determine it by looking at the size (length) of my data in pix. For each pixel, I get the red, green, and blue values (8-bit integers, ranging from 0 to 255).
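A minimal sketch of that loop might look like the following. The function names and column handling are my own, and it assumes pix has already been loaded as a flat, row-major list of (r, g, b) tuples, which is the shape Pillow’s Image.getdata() returns:

```python
import csv

def pixels_to_rows(pix, width=360, height=180):
    """Convert a flat, row-major list of (r, g, b) tuples into rows of
    longitude, latitude, and color. Assumes 1 pixel per degree:
    column 0 maps to 180 degrees W, row 0 maps to 90 degrees N."""
    rows = [["longitude", "latitude", "color", "hex"]]
    for row in range(height):
        for col in range(width):
            r, g, b = pix[row * width + col]
            hex_value = "#{:02X}{:02X}{:02X}".format(r, g, b)
            # The "color" column would hold the binned earth color;
            # the raw hex value stands in for it in this sketch.
            rows.append([col - 180, 90 - row, hex_value, hex_value])
    return rows

def write_csv(rows, path):
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```

Determining the size from the data itself, as mentioned above, would amount to checking len(pix) against an assumed aspect ratio.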
An additional idea, not currently implemented, takes the r, g, b values and bins each into one of 3 levels (corresponding to decimal values 0, 127, and 255, or hex 00, 7F, and FF); this gives a palette of 27 possible colors to work with.
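Since the idea was never implemented, this is only one way it could work: snapping each channel to the nearest of the three levels.

```python
def quantize_channel(v):
    """Snap an 8-bit channel value to the nearest of 0, 127, or 255."""
    return min((0, 127, 255), key=lambda level: abs(v - level))

def quantize(r, g, b):
    # 3 levels per channel -> 3 ** 3 = 27 possible colors
    return tuple(quantize_channel(v) for v in (r, g, b))
```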
To assign each pixel to one of the five “earth color” bins I am using, I developed the following criteria by looking at the r, g, b color space and through trial and error, tuning it for the specific source image I used. A pixel is assigned to white (clouds and snow/ice) if all three of r, g, and b are high values (>200); this includes any light greys and other very light colors. Any other pixel that doesn’t fit into one of the bins below is also assigned to white (when in doubt, make it a cloud). Green and blue pixels are assigned by the same kind of criterion: the value of green (or blue) must be greater than each of the other two channels.
Brown/tan pixels were the most challenging to identify due to their variety (an earlier version went without the fourth and fifth colors). In the source image they correspond to arid deserts (such as North Africa), semi-arid regions (portions of Central Asia), and mountains with sparser vegetation (such as western North America). Pixels with red and green that are close in value ( 400), with blue also above 150, were assigned to the brown/tan group.
I added a fifth color bin to take on additional regions that had previously fallen into either the brown/tan or green bins but that, when parsed out as a drab green/grey, made the resulting image blend better between vegetated and non-vegetated areas. What had previously been a sharp transition from tropical forests to the Sahara desert now has an intermediate green/grey between them, and regions such as western North America and Central Asia are no longer as green as the tropics, taking this new intermediate color instead. This bin is defined wherever red is the highest of the three values. I know from my source image (and some knowledge of our planet) that red is a very uncommon color in natural terrain (save for some flowers, autumn leaves, and mineral types, none of which would fill a whole pixel at this scale), so any pixel with red as the maximum channel must be a sort of subdued grey/green or brown/tan that I can include in this catchall group.
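Putting the five criteria together, a classification sketch might look like this. The brown/tan thresholds here are assumptions on my part (the originals were tuned to one specific source image, and one value above is unclear), so treat them as placeholders:

```python
def classify(r, g, b):
    """Assign an (r, g, b) pixel to one of five 'earth color' bins.
    CLOSE and BLUE_FLOOR are assumed placeholder values."""
    LIGHT = 200       # threshold for "very light" white pixels
    CLOSE = 40        # how near red and green must be for brown/tan (assumed)
    BLUE_FLOOR = 150  # minimum blue for brown/tan

    if r > LIGHT and g > LIGHT and b > LIGHT:
        return "white"        # clouds, snow/ice, light greys
    if g > r and g > b:
        return "green"        # vegetated land
    if b > r and b > g:
        return "blue"         # ocean
    if abs(r - g) <= CLOSE and b > BLUE_FLOOR:
        return "brown/tan"    # deserts, semi-arid regions, bare mountains
    if r > g and r > b:
        return "grey/green"   # drab transitional terrain (red is the max)
    return "white"            # when in doubt, make it a cloud
```

For example, a light tan pixel like (230, 210, 170) falls through the white, green, and blue checks and lands in brown/tan, while a muted (120, 100, 80) with red as the maximum lands in the grey/green catchall.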
Now, with each pixel inspected and assigned to one of my five color bins, the data is written out to a CSV file for use in the Tableau workbook!
The page will change the viz’s parameter values in response to the following inputs from the URL:
lat: (expects a number between -90 and 90)
lon: (expects a number between -180 and 180)
rotate: (expects a number between -180 and 180)
These parameter values can be included in the URL in any order following a ‘?’ character, separated by ‘&’ characters.
Example to center the projection on the Equator and Prime Meridian with no rotation:
Example for a south-up view of the Americas:
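Links like the two examples above can be built by assembling a query string. This sketch uses a placeholder base URL (the real page address goes in its place), and the south-up center coordinates are illustrative guesses rather than the values from the actual example:

```python
from urllib.parse import urlencode

# Placeholder for the real page address (assumption).
BASE_URL = "https://example.com/blue-marble"

def viz_url(lat, lon, rotate):
    """Build a link that sets the viz parameters via the query string."""
    return BASE_URL + "?" + urlencode({"lat": lat, "lon": lon, "rotate": rotate})

# Centered on the Equator and Prime Meridian with no rotation:
print(viz_url(0, 0, 0))       # ...?lat=0&lon=0&rotate=0
# A south-up view of the Americas (illustrative center point):
print(viz_url(0, -90, 180))
```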
Along with a URL action within the Tableau viz, we can now click a point on Earth to center the view while on this page.
“We succeeded in taking that picture, and, if you look at it, you see a dot. That’s here. That’s home. That’s us. On it, everyone you ever heard of, every human being who ever lived, lived out their lives. The aggregate of all our joys and sufferings, thousands of confident religions, ideologies and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilizations, every king and peasant, every young couple in love, every hopeful child, every mother and father, every inventor and explorer, every teacher of morals, every corrupt politician, every superstar, every supreme leader, every saint and sinner in the history of our species, lived there – on a mote of dust, suspended in a sunbeam…”
– Carl Sagan