Data Engineering with Apache Spark, Delta Lake, and Lakehouse by Manoj Kukreja & Danil Zburivsky



Figure 6.3 – Container for Delta Lake exercises

If the preceding command is successful, you should be able to see the newly created container in the Azure portal by browsing to Home > All Resources > traininglakehouse:

Figure 6.4 – Scratch namespace in traininglakehouse
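If you prefer to verify this from the command line instead of the portal, the Azure CLI can list the filesystems (containers) in the account. This is a minimal sketch, assuming the traininglakehouse account from above and that you are already signed in to the Azure Cloud Shell; depending on your role assignments, you may need to drop --auth-mode login and authenticate with the account key instead:

# List the filesystems (containers) in the ADLS Gen2 account to confirm creation
az storage fs list --account-name traininglakehouse --auth-mode login --query "[].name" -o tsv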

For this exercise, we will use Azure Databricks to read and write data from the Azure Data Lake Storage account created previously. To read and write data, Azure Databricks requires the storage account keys for Azure Data Lake Storage. Invoke the following commands on the Azure Cloud Shell:

STORAGEACCOUNTNAME="traininglakehouse"

az storage account keys list --account-name $STORAGEACCOUNTNAME
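Rather than copying key1 by hand from the JSON output, you can also extract it directly with a JMESPath query. This is an optional sketch, assuming the same STORAGEACCOUNTNAME variable defined above:

# Capture the value of key1 (the first entry in the keys array) into a shell variable
STORAGEKEY=$(az storage account keys list --account-name $STORAGEACCOUNTNAME --query "[0].value" -o tsv)
echo $STORAGEKEY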

If the preceding commands worked as desired, you should see two storage keys in the output. Take note of the value of key1; we will need it later in the Databricks workspace configuration.
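As a preview of how this key is used, a Databricks notebook can authenticate to the storage account by setting the account key in the Spark configuration. The following is a minimal PySpark sketch, assuming the traininglakehouse account; the storage_key value and the <container> placeholder are illustrative, and in practice you would store the key in a Databricks secret scope rather than in plain text:

# Hypothetical: key1 value copied from the CLI output (use a secret scope in practice)
storage_key = "<key1-value>"

# Configure Spark to authenticate to the ADLS Gen2 account with the account key
spark.conf.set(
    "fs.azure.account.key.traininglakehouse.dfs.core.windows.net",
    storage_key
)

# Sanity check: list the contents of a container via its abfss:// URI
display(dbutils.fs.ls("abfss://<container>@traininglakehouse.dfs.core.windows.net/"))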

