Google Cloud App Engine provides a solid, easily accessed platform for a wide range of web apps. Its interoperability with other Google Cloud services, such as Cloud Datastore, enhances its effectiveness. In this hands-on lab, we'll deploy an app to App Engine that lets users enter details into Cloud Datastore, a NoSQL database, and that displays the proper HTML template and CSS. The process requires that we customize the config.py file before deploying the app.
- From the main navigation, visit the APIs & Services > Library page, and search for "Datastore".
- Select the Cloud Datastore API.
- If the API is not yet enabled, click Enable.
- Activate the Cloud Shell.
- When it spins up, create the necessary bucket (the bucket name must be unique) and make it publicly readable:
gsutil mb -c regional -l us-east1 gs://[BUCKET_NAME]
gsutil acl ch -u AllUsers:R gs://[BUCKET_NAME]
- Clone the GitHub repository:
git clone https://github.com/linuxacademy/content-gc-essentials
- Change directory to the content-gc-essentials/app-engine-lab folder.
- From the Cloud Shell Editor, open config.py in the app-engine-lab folder.
- Change PROJECT_ID to the current project, as shown in the Cloud Shell.
- Change the CLOUD_STORAGE_BUCKET variable value to your unique bucket name.
- Save the file.
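After these edits, config.py might look like the following. The variable names come from the steps above, but the values shown here are placeholders; substitute your own project ID and bucket name:

```python
# config.py -- illustrative sketch; values below are placeholders.
PROJECT_ID = 'my-project-id'  # the current project, as shown in the Cloud Shell
CLOUD_STORAGE_BUCKET = 'my-unique-bucket-name'  # the bucket created earlier
```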
- In the Cloud Shell, enter the following code:
gcloud app deploy
- When prompted, choose the nearest region.
- In the Cloud Shell, enter the following code:
gcloud app browse
- If the browser window does not open, click the generated link.
Cloud Functions can be triggered in two ways: through a direct HTTP request or through a background event. One of the most frequently used services for background events is Cloud Pub/Sub, Google Cloud’s platform-wide messaging service. In this pairing, Cloud Functions becomes a direct subscriber to a specific Cloud Pub/Sub topic, which allows code to be run whenever a message is received on a specific topic. In this hands-on lab, we’ll walk through the entire experience, from setup to confirmation.
- Use the API Library to find and enable the Cloud Pub/Sub API.
- Use the API Library to find and enable the Cloud Functions API.
- From the main console navigation, go to Cloud Pub/Sub.
- Click Create a topic.
- In the dialog, enter greetings as the topic name.
- Click Create.
- From the main console, go to Cloud Functions.
- Click Create function.
- Configure the function with the following values:
- Name: la-pubsub-function
- Trigger: Cloud Pub/Sub
- Topic: greetings
- Source code: Inline editor
- Runtime: Python 3.7
- In the main.py field, enter the following code:
import base64

def greetings_pubsub(data, context):
    if 'data' in data:
        name = base64.b64decode(data['data']).decode('utf-8')
    else:
        name = 'folks'
    print('Greetings {} from Linux Academy!'.format(name))
- Set Function to Execute to greetings_pubsub.
- Click Create.
- Click the newly created Cloud Function name.
- Switch to the Trigger tab.
- Click the topic link to go to the Cloud Pub/Sub topic.
- From the Topic page, click Publish Message.
- In the Message field, enter everyone around the world.
- Click Publish.
- Return to the Cloud Functions dashboard.
- Click the Cloud Function's name.
- From the Cloud Function navigation, click View Logs.
- Locate the most recent log.
- Confirm function execution.
- Click Activate Cloud Shell from the top row of the console.
- In the Cloud Shell, enter the following code:
DATA=$(printf 'my friends' | base64)
gcloud functions call la-pubsub-function --data '{"data":"'$DATA'"}'
- After a moment, refresh the logs page and confirm the Cloud Function execution.
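The --data payload mirrors what Cloud Pub/Sub delivers to the function: the message body arrives base64-encoded under the data key. A minimal sketch of the round trip, using the same 'my friends' message as the command above:

```python
import base64

# Encode the message the way the gcloud command above does.
payload = {'data': base64.b64encode(b'my friends').decode('utf-8')}

# Decode it the way greetings_pubsub does.
name = base64.b64decode(payload['data']).decode('utf-8')
print('Greetings {} from Linux Academy!'.format(name))
```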
- In the Cloud Shell, enter the following command:
gcloud pubsub topics publish greetings --message "y'all"
- After a moment, refresh the log page to confirm the function has executed.
Compute Engine VM instances can use a wide range of boot disks; Google Cloud currently offers well over 50 disk images to choose from. Cloud computing professionals must be adept at spinning up a variety of operating systems, including Windows Server. In this hands-on lab, you'll create a Windows-based Compute Engine VM instance, set up an IIS server, and push your first web page live to confirm the server's operation.
- From the main navigation, choose Compute Engine > VMs.
- In the VM Instances area, click Create.
- With New VM instance chosen from the options on the left, configure your instance:
- In the name field, provide a relevant name using hyphens, like la-windows-1.
- Keep the suggested Region and Zone.
- In the Boot Disk section, click Change and select Windows Server 2019 Datacenter, Server with Desktop Experience from the list; click Select.
- Under Firewall, choose the Allow HTTP traffic option.
- Click Create.
- From the RDP options, click Set Windows password.
- In the dialog, confirm your username is correct and click Set.
- Copy the supplied password and click Close.
- Launch the RDP window by using one of the following methods:
- If you're on a Windows system, click RDP.
- If you're on a Mac using Chrome in a standard window, first install the Chrome RDP extension, and then click RDP.
- If you're on a Mac using another browser or Incognito window, from the App Store, download and install the latest version of the Microsoft Remote Desktop app. Then, choose Download the RDP file from the RDP options and open the file.
- From the Windows Start menu, right-click on Windows PowerShell and choose Run as administrator.
- In the PowerShell window, enter the following commands:
import-module servermanager
add-windowsfeature web-server -includeallsubfeature
echo '<!doctype html><html><body><h1>Greetings from Linux Academy!</h1></body></html>' > C:\inetpub\wwwroot\index.html
- From the Compute Engine VM instances page, click the External IP link for your Windows VM instance.
- Review the page in your browser.
While saving an object in a Cloud Storage bucket is relatively inexpensive, there is nonetheless a cost, and it varies with the storage class selected. Some objects need to be highly available at first, requiring the storage class with the highest availability and, correspondingly, the highest cost. Over time, such objects may be relegated to less available, less expensive storage classes and may eventually be deleted. Managing these objects over time can be handled automatically by establishing and implementing lifecycle rules. In this hands-on lab, we'll set a variety of lifecycle rules for Google Cloud Storage buckets, both from the console and the command line.
- From the Google Cloud console main navigation, choose Cloud Storage.
- Click Create bucket.
- Name the bucket uniquely.
- In the Storage Class section, select Regional.
- Click Create.
- From the Cloud Storage browser page, click None in the Lifecycle column for the bucket just created.
- Click Add rule.
- Under Select object conditions, set the following:
- Age: 180
- Storage class: Regional, Standard
- Click Continue.
- Under Select action, choose Set to Nearline.
- Click Continue.
- Click Save.
- From the Cloud Storage browser page, click Enabled in the Lifecycle column.
- Click Add rule.
- Under Select object conditions, set the following:
- Age: 365
- Storage class: Nearline
- Click Continue.
- Under Select action, choose Set to Coldline.
- Click Continue.
- Click Save.
- Click Activate Cloud Shell.
- In the Cloud Shell, enter the following code:
gsutil lifecycle get gs://[BUCKET_NAME]
- Review the output.
- Clone a repo and change to the lab's directory:
git clone https://github.com/linuxacademy/content-gc-essentials
cd content-gc-essentials/cloud-storage-lifecycle-lab
- Review the delete-after-two-years.json file in the editor.
- In the Cloud Shell, enter the following code:
gsutil lifecycle set delete-after-two-years.json gs://[BUCKET_NAME]
- Confirm the lifecycle rule has been added in the console.
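For reference, a Cloud Storage lifecycle configuration that deletes objects after two years takes roughly the following shape. This is an illustrative sketch; the exact contents of the repo's delete-after-two-years.json may differ:

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 730}
    }
  ]
}
```

The console rules set earlier follow the same JSON structure, with a SetStorageClass action instead of Delete.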
When working with Cloud SQL on any scale, you’ll need to manage multiple instances: creating, cloning, starting, stopping, restarting, and — eventually — deleting them. This hands-on lab gives you the experience of performing all those tasks so you’ll be familiar with the steps necessary to handle Cloud SQL instances completely.
- From the main navigation, click Cloud SQL.
- Click Create Instance.
- Choose your database engine.
- Enter an instance ID.
- Provide a password.
- Click Create.
- Select the recently created Cloud SQL instance.
- Choose Create database.
- Name the database and leave the other fields at their default values.
- Click Create.
- From the Instance details page, choose Create clone.
- Enter a new name for the clone if desired.
- Choose Clone latest state of instance.
- Click Create clone.
- Select the first instance created.
- Click Stop.
- After the instance has stopped, click Start.
- Select the cloned instance.
- Click Restart.
- Select the cloned instance.
- Click Delete.
- Enter the name of the instance in the dialog to confirm the deletion.
- Click Delete.
Data comes in all shapes, sizes, and use cases. A relational database service like Cloud SQL isn't always the answer. Cloud Datastore is a NoSQL database service, ideal for semi-structured data that needs to be highly scalable and available. Cloud Firestore is the next generation of Datastore with enhanced features, and in this hands-on lab, you'll see how to build a Firestore NoSQL database in Cloud Datastore mode for the best of both worlds.
- From the main navigation, click APIs & Services > Library.
- Search for Datastore.
- Click Enable.
- From the main navigation, click Datastore.
- Choose Datastore Mode.
- Select your closest location.
- Click Create database.
- For Kind, enter Flights.
- Click Create Entity.
- Click Add property for each of the following entries, of the specified type:
- Airline: String
- Flight Number: Integer
- Arrival: Date and Time
- OnTime: Boolean
- Click Save.
- Repeat steps 3 and 4 twice more with different values.
- For the final entry, add another property:
- Note: Text
- Click Save.
- Switch to Query by GQL.
- In the Query field, enter the following:
SELECT * FROM Flights
- Click Run Query.
- In the Query field, enter the following:
SELECT * FROM Flights WHERE OnTime = false
- Click Run Query.
- Review results.
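The second GQL query above filters entities on a single property. The same selection logic can be sketched in plain Python over records shaped like the Flights entities created earlier (the values here are invented for illustration):

```python
# Sample records shaped like the Flights kind above (values are made up).
flights = [
    {'Airline': 'ExampleAir', 'Flight Number': 101, 'OnTime': True},
    {'Airline': 'DemoJet', 'Flight Number': 202, 'OnTime': False},
    {'Airline': 'TestWings', 'Flight Number': 303, 'OnTime': False},
]

# Equivalent of: SELECT * FROM Flights WHERE OnTime = false
late = [f for f in flights if f['OnTime'] is False]
print(len(late))  # 2
```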
Sometimes data is relatively straightforward — there's just an overwhelming amount of it. That's exactly what Cloud Bigtable is meant for. Cloud Bigtable is a fully managed NoSQL database service designed to handle massive amounts of information. In this hands-on lab, you'll configure database instances and clusters for Cloud Bigtable in the console and then use command-line cbt commands to create a schema and populate the table with data.
- From the main navigation, choose APIs & Services > Library.
- In the search field, enter Bigtable.
- Select the Cloud Bigtable Admin API card.
- Click Enable.
- From the main navigation, choose Bigtable in the Storage section.
- Choose Create instance.
- Set the following fields:
- Name: la-data-cbt
- Instance type: Development
- Storage type: HDD
- Cluster Region: us-east1
- Cluster Zone: us-east1-b
- Click Done and then Create.
- Activate the Cloud Shell by clicking its icon in the top row.
- In the Cloud Shell, enter the following:
gcloud components update
gcloud components install cbt
- Configure cbt with the following commands:
echo project = [PROJECT_ID] > ~/.cbtrc
echo instance = la-data-cbt >> ~/.cbtrc
- In the Cloud Shell, enter the following:
cbt createtable la-table
cbt ls
- In the Cloud Shell, enter the following:
cbt createfamily la-table offerings
cbt set la-table r1 offerings:c1=labs
cbt read la-table
- Review results.
- Enter the following:
cbt set la-table r1 offerings:c2=courses
cbt read la-table
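Conceptually, the two cbt set commands build a single row keyed r1 whose cells live under the offerings column family. A rough in-memory sketch of that layout (real Bigtable also stores a timestamp per cell, omitted here for simplicity):

```python
# Rough model of the Bigtable layout after the cbt commands above:
# table -> row key -> 'family:qualifier' -> value
table = {}

def cbt_set(table, row_key, column, value):
    """Mimic `cbt set <table> <row_key> <column>=<value>`."""
    table.setdefault(row_key, {})[column] = value

cbt_set(table, 'r1', 'offerings:c1', 'labs')
cbt_set(table, 'r1', 'offerings:c2', 'courses')

print(table['r1'])  # {'offerings:c1': 'labs', 'offerings:c2': 'courses'}
```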
- From the console, select the Bigtable instance.
- Choose Delete instance.