Challenge Lab
Qwiklabs [GSP311]
Task 1: Create a Cloud Storage Bucket
Remember to replace the project ID with the one assigned to your lab.
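A minimal sketch of this step, assuming the lab names the bucket after your project ID (the bucket name and region here are placeholders; use the exact values from your lab instructions):

```shell
# Placeholder project ID -- replace with the value shown in your lab.
PROJECT_ID=your-project-id

# Create a Cloud Storage bucket named after the project ID.
gsutil mb -l us-central1 gs://$PROJECT_ID/
```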
Task 2: Create a Cloud Function
Go to the “saf-longrun-job-func” directory and deploy the function with a storage trigger on the bucket from Task 1.
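A hedged sketch of the deployment, assuming the lab uses the public Speech Analysis Framework repository and its `safLongRunJobFunc` entry point (the runtime and bucket name are assumptions; confirm them against your lab instructions):

```shell
# Clone the Speech Analysis Framework repository and enter the
# Cloud Function directory (repo name assumed from the public project).
git clone https://github.com/GoogleCloudPlatform/dataflow-contact-center-speech-analysis.git
cd dataflow-contact-center-speech-analysis/saf-longrun-job-func

# Deploy the function so it fires whenever an audio file is finalized
# in the Task 1 bucket (bucket name is a placeholder).
PROJECT_ID=your-project-id
gcloud functions deploy safLongRunJobFunc \
  --runtime nodejs10 \
  --trigger-resource $PROJECT_ID \
  --trigger-event google.storage.object.finalize
```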
Task 3: Create a BigQuery Dataset and Table
Create the BigQuery dataset.
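A one-line sketch with the `bq` CLI; the dataset name is a placeholder (use the one from your lab), and in this framework's setup the table is typically created when the Dataflow pipeline first writes to it:

```shell
# Dataset name is a placeholder -- use the name given in your lab.
DATASET=lab_dataset
bq mk $DATASET
```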
Task 4: Create Cloud Pub/Sub Topic
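A minimal sketch for this task; the topic name is a placeholder, so substitute the one from your lab instructions:

```shell
# Topic name is a placeholder -- use the name given in your lab.
TOPIC=saf-topic
gcloud pubsub topics create $TOPIC
```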
Task 5: Create a Cloud Storage Bucket for Staging Contents
Watch the video to complete the task.
Create a folder named “DFaudio” in the bucket; this can be done manually in the Cloud Console.
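If you prefer the CLI over the console, a sketch under the assumption that the staging bucket is named `<project-id>-staging` (a placeholder; use the name from your lab). Cloud Storage folders are just object-name prefixes, so copying a zero-byte placeholder object under `DFaudio/` creates the folder:

```shell
PROJECT_ID=your-project-id

# Create the staging bucket (name is an assumption).
gsutil mb gs://$PROJECT_ID-staging/

# Uploading a placeholder object under the DFaudio/ prefix creates
# the "folder" as it appears in the console.
touch placeholder
gsutil cp placeholder gs://$PROJECT_ID-staging/DFaudio/placeholder
```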
Task 6: Deploy a Cloud Dataflow Pipeline
Go to the saf-longrun-job-dataflow directory and launch the pipeline from there.
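A hedged sketch of the launch command. The script name and flags below follow the Speech Analysis Framework's README, and every name (project, topic, dataset, table, region, staging bucket) is a placeholder carried over from the earlier tasks; verify each against your lab instructions before running:

```shell
cd dataflow-contact-center-speech-analysis/saf-longrun-job-dataflow

# Placeholder names from the earlier tasks -- replace with your values.
PROJECT_ID=your-project-id
TOPIC=saf-topic
DATASET=lab_dataset
TABLE=transcripts

# Install the pipeline's Python dependencies, then launch it on Dataflow.
python -m pip install -r requirements.txt
python saflongrunjobdataflow.py \
  --project=$PROJECT_ID \
  --input_topic=projects/$PROJECT_ID/topics/$TOPIC \
  --runner=DataflowRunner \
  --region=us-central1 \
  --temp_location=gs://$PROJECT_ID-staging/tmp \
  --output_bigquery=$DATASET.$TABLE \
  --requirements_file=requirements.txt
```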
Task 7: Upload Sample Audio Files for Processing
After uploading the sample audio files, wait until the processed output appears in the table (BigQuery > dataset > table); the pipeline can take several minutes to write its first rows.
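Copying the audio files into the Task 1 bucket fires the Cloud Function and starts processing. A sketch, where `SAMPLE_BUCKET` is a placeholder for the sample-file location given in your lab instructions:

```shell
PROJECT_ID=your-project-id

# SAMPLE_BUCKET is a placeholder -- your lab instructions give the
# actual location of the sample audio files.
SAMPLE_BUCKET=gs://[SAMPLE_BUCKET]
gsutil -m cp $SAMPLE_BUCKET/* gs://$PROJECT_ID/
```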
Task 8: Run a Data Loss Prevention Job
Now click “Save Query Results” and choose the “BigQuery table” option.
Enter a name for the new table and save.
Open the new table where the results were saved, then click Export > Scan with DLP.