This tutorial shows how to use Google Cloud Platform to build an app that receives telemetric data about geolocation, processes it, and then stores the processed and transformed data for further analysis.
The instructions in this README show you how to run the tutorial by using Google Cloud Shell with Docker. You can find a version of this tutorial that uses your own development environment, without Docker, on the Google Cloud Platform website.
Cloud Shell provides a ready-made runtime environment and can save you several steps and some time. However, Cloud Shell times out after 60 minutes of inactivity, which can cost you repeated work. For example, anything copied to the /tmp directory will be lost.
Docker provides some automation in deployment that can also make the tutorial easier to run and save you time. However, if you're not familiar with Docker, or simply want to see every step in action, you might prefer to run through the full, manual tutorial steps yourself.
The tutorial:
This tutorial uses billable components of Google Cloud Platform, including:
The cost of running this tutorial varies depending on run time. Use the pricing calculator to generate a cost estimate based on your projected usage. New Cloud Platform users might be eligible for a free trial.
The Maps API standard plan offers a free quota and pay-as-you-go billing after the quota has been exceeded. If you have an existing license for the Maps API or have a Maps APIs Premium Plan, see the documentation first for some important notes. You can purchase a Maps APIs Premium Plan for higher quotas.
You must have a Maps for Work license for any application that restricts access, such as behind a firewall or on a corporate intranet. For more details about Google Maps API pricing and plans, see the online documentation.
Click the following link to enable the required Cloud Platform APIs. If prompted, be sure to select the project you created in step 1.
These APIs include:
For this tutorial, you'll need the following credentials:
If you don't already have them, you'll need Google Maps API keys.
Click the following link to open the Cloud Console in the Credentials page. If you have more than one project, you might be prompted to select a project.
Click Create credentials > API key > Server key.
Name the key "Maps tutorial server key".
Click Create.
Click OK to dismiss the dialog box that shows you the new key. You can retrieve your keys from the Cloud Console at any time.
Stay on the page.
The browser key is required for using the Maps JavaScript API. Follow these steps:
Important: Keep your API keys secure. Publicly exposing your credentials can result in your account being compromised, which could lead to unexpected charges on your account. To keep your API keys secure, follow these best practices.
Create service account credentials and download the JSON file. Follow these steps:
You must upload the service account credential file to a Google Cloud Storage bucket so that you can transfer it to Cloud Shell in an upcoming step.
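If you have the Cloud SDK installed locally, one way to create the bucket and upload the file is with gsutil. This is a sketch; the bracketed bucket and file names are placeholders for your own values:

```shell
# Create a Cloud Storage bucket (bucket names are globally unique)
# and upload the service account JSON key file to it.
gsutil mb gs://[YOUR_BUCKET]
gsutil cp [YOUR_CREDENTIALS_FILE].json gs://[YOUR_BUCKET]/
```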
Create a client ID that you can use to authenticate end-user requests to BigQuery. Follow these steps:
To find your computer's IP address, run ifconfig on Linux or ipconfig -all on Windows. Then add the following URLs, replacing [YOUR_IP_ADDRESS] with your IP address:
http://[YOUR_IP_ADDRESS]:8000
https://[YOUR_IP_ADDRESS]:8000
Adding these URLs enables an end user to access BigQuery data through JavaScript running in a browser. You need this authorization for an upcoming section of the tutorial, when you display a visualization of data on a map in your web browser.
Cloud Pub/Sub is the messaging queue that handles moving the data from CSV files to BigQuery. You'll need to create a topic, which publishes the messages, and a subscription, which receives the published messages.
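The console steps that follow can also be done from the command line with gcloud. This sketch assumes the topic is named traffic, as used later in this tutorial; the subscription name mysub is only illustrative, so use whatever name the console steps specify:

```shell
# Create the Pub/Sub topic and a subscription attached to it.
gcloud pubsub topics create traffic
gcloud pubsub subscriptions create mysub --topic traffic
```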
The topic publishes the messages. Follow these steps to create the topic:
The subscription receives the published messages. Follow these steps to create the subscription:
For the traffic topic, click the downward arrow on the right-hand end of the row.
Follow these steps to prepare to run the code.
Open Cloud Shell. In the Cloud Platform console, in the upper-right corner, click the Activate Google Cloud Shell icon.
In Cloud Shell, clone this repository.
Change directory to resources:
cd geo_bq/resources
Edit setup.yaml in your favorite command-line text editor.
For PROJECT_ID, replace your-project-id with your project's ID string. Keep the single quotation marks in this and all other values that you replace.
For DATASET_ID, don't change sandiego_freeways.
For TABLE_ID, don't change geocoded_journeys.
For PUBSUB_TOPIC, replace your-project-id with your project's ID string.
For ROOTDIR, replace the provided path with /tmp/creds/data.
For SUBSCRIPTION, replace your-project-id with your project's ID string.
For MAPS_API_KEY, replace Your-server-key with the server key you created. You can see your credentials by clicking the following link:
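After these edits, the file should look roughly like the following sketch. The project ID and key value are placeholders, and the exact resource-path format for PUBSUB_TOPIC and SUBSCRIPTION (including the illustrative subscription name mysub) should be kept as the provided file has it; replace only the project ID portion:

```yaml
# Illustrative setup.yaml after editing (placeholder values).
PROJECT_ID: 'my-sample-project'
DATASET_ID: 'sandiego_freeways'
TABLE_ID: 'geocoded_journeys'
PUBSUB_TOPIC: 'projects/my-sample-project/topics/traffic'
ROOTDIR: '/tmp/creds/data'
SUBSCRIPTION: 'projects/my-sample-project/subscriptions/mysub'
MAPS_API_KEY: 'your-server-key-value'
```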
Save and close the file.
Run the following command:
bash setup.sh
The setup.sh script performs the following steps for you:
Creates a data directory, /tmp/creds/data, that you use to store your service account credentials (the JSON file you uploaded to your bucket) and the traffic data.
Copies the data files from your GitHub clone to the data directory.
Next, transfer your service account credentials to Cloud Shell. Change directory to /tmp/creds:
cd /tmp/creds
Then copy the credentials file from your bucket. Replace [YOUR_BUCKET] with your bucket name and [YOUR_CREDENTIALS_FILE] with the file name:
gsutil cp gs://[YOUR_BUCKET]/[YOUR_CREDENTIALS_FILE].json .
Next, run the code to push the traffic data to Cloud Pub/Sub. Run the following command. Replace [YOUR_CREDENTIALS_FILE] with the name of the file.
docker run -e "GOOGLE_APPLICATION_CREDENTIALS=/tmp/creds/[YOUR_CREDENTIALS_FILE].json" --name map-push -v /tmp/creds:/tmp/creds gcr.io/cloud-solutions-images/map-pushapp
After Docker finishes initializing, you should see repeated lines of output like this one:
Vehicle ID: 1005, location: 33.2354833333, -117.387343333; speed: 44.698 mph,
bearing: 223.810 degrees
It can take some time to push all the data to Pub/Sub.
BigQuery pulls the data by using the Cloud Pub/Sub subscription. To get it going, run the following command. Replace [YOUR_CREDENTIALS_FILE] with the name of the file.
docker run -e "GOOGLE_APPLICATION_CREDENTIALS=/tmp/creds/[YOUR_CREDENTIALS_FILE].json" --name map-app -v /tmp/creds:/tmp/creds gcr.io/cloud-solutions-images/map-pullapp
After Docker finishes initializing, you should see repeated lines of output like this one:
Appended one row to BigQuery. Address: 11th St, Oceanside, CA 92058, USA
It can take some time to pull all the data from the topic. When it's done, the terminal window stops showing lines of output as it waits for further data. You can exit the process at any time by pressing Control+C. If Cloud Shell times out, you can simply click the reconnect link.
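If you need to rerun either docker run command, note that the --name flags used above (map-push and map-app) prevent starting a second container with the same name. Remove the stopped containers first:

```shell
# Remove the stopped containers so their names can be reused.
docker rm map-push map-app
```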
Now that you have transcoded and loaded the data into BigQuery, you can use BigQuery to gain insights. This section of the tutorial shows you how to use the BigQuery console to run a few simple queries against this data. For example, the following query returns the average speed for each zip code, ordered from highest to lowest:
SELECT AVG(Speed) avg_speed, Zipcode FROM [sandiego_freeways.geocoded_journeys]
WHERE Zipcode <> ''
GROUP BY Zipcode ORDER BY avg_speed DESC
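If you prefer the command line, you can run the same query from Cloud Shell with the bq tool. The bracketed table name is BigQuery legacy SQL, which bq accepts by default:

```shell
# Run the same average-speed query with the bq command-line tool.
bq query "SELECT AVG(Speed) avg_speed, Zipcode FROM [sandiego_freeways.geocoded_journeys] WHERE Zipcode <> '' GROUP BY Zipcode ORDER BY avg_speed DESC"
```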
You can use Google Maps to visualize the data you stored in BigQuery. This tutorial shows you how to superimpose a heat map visualization onto a map of the region. The heat map shows the volume of traffic activity captured in the data in BigQuery.
To keep the tutorial straightforward, the provided example uses OAuth 2.0 to authenticate the user for the BigQuery service. You could choose another approach that might be better-suited for your scenario. For example, you could export query results from BigQuery and create a static map layer that doesn’t require the end user to authenticate against BigQuery, or you could set up authentication by using a service account and a proxy server.
To show the data visualization, follow these steps.
First, make a few modifications to bqapi.html. For these modifications, you need to use keys and credentials you created earlier. You can see these values in the Cloud Console on the Credentials page.
Open bqapi.html in a text editor. You can find the file in the following directory where you installed the source code: geo_bq/web/
In the script element, in the src attribute, replace Your-Maps-API-Key with your Google Maps API browser key:
<script src="https://maps.googleapis.com/maps/api/js?libraries=visualization,drawing&key=Your-Maps-API-Key"></script>
For the clientId variable, replace Your-Client-ID with the OAuth 2.0 client ID you created earlier. Replace Your-Project-ID with your project ID.
You can use Cloud Shell to view the web page. Follow these steps:
From the geo_bq/web
directory, run the following command to start the server:
python -m SimpleHTTPServer 8080
When the server is running, Cloud Shell prints the following message:
Serving HTTP on 0.0.0.0 port 8080 ...
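Newer environments default to Python 3, where the SimpleHTTPServer module was renamed to http.server. If the command above fails, the sketch below starts the Python 3 equivalent in the background, verifies that it responds, and then stops it:

```shell
# Python 3 equivalent of SimpleHTTPServer: serve the current
# directory on port 8080 in the background.
python3 -m http.server 8080 &
SERVER_PID=$!
sleep 1
# Request the directory listing and capture the HTTP status code.
STATUS=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:8080/)
echo "HTTP status: $STATUS"
# Stop the background server.
kill $SERVER_PID
```

In the tutorial itself you would simply run python3 -m http.server 8080 in the foreground and leave it serving.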
In the top-left corner of Cloud Shell, click Web preview and then click Preview on port 8080. Cloud Shell opens a new browser tab that connects to the web server.
In the new browser tab, note the origin URL. The origin URL has a format similar to the following example, where [RANDOM_NUMBER] could be any value:
In the Cloud Console, return to the Credentials page.
Click the name of your OAuth 2.0 client ID.
In the Restrictions section, add the origin URL you noted in the previous step. Do not add a port number.
The origin URL that you provide in this step tells OAuth 2.0 that it's safe to accept requests from the Cloud Shell browser. Without this step, the web page can't use script to access the data you loaded into BigQuery.
Click Save.
In the browser tab that Cloud Shell opened, click the link for bqapi.html. If your browser has a pop-up blocker, turn it off for this site.
In the pop-up window, follow the OAuth 2.0 authentication prompts. You won't have to repeat this flow in this session if, for example, you reload the web page.
After the map has loaded, select the rectangle tool in the upper-left corner of the map.
Use the tool to draw a rectangle around the entire currently visible land mass on the map.
The page shows a heat map. Exactly where the heat map regions display on the map depends on the data you loaded into BigQuery.
For details about how the code works, see the tutorial on the Google Cloud Platform site.