Before you begin, make sure that:

- You have completed the Tutorial: Set up Azure Data Box Heavy.
- You have received your Data Box Heavy and the order status in the portal is Delivered.
- You have a host computer that has the data that you want to copy over to Data Box Heavy.

For the fastest copy speeds, two 40-GbE connections (one per node) can be utilized in parallel. If you do not have 40-GbE connections available, we recommend that you have at least two 10-GbE connections (one per node).

Based on the storage account selected, Data Box Heavy creates up to:

- Three shares for each associated storage account for GPv1 and GPv2.

These shares are created on both nodes of the device. The following table shows the UNC path to the shares on your Data Box Heavy and the Azure Storage path URL where the data is uploaded. The final Azure Storage path URL can be derived from the UNC share path.

If you are using a Linux host computer, perform the following steps to configure your device to allow access to NFS clients:

1. Supply the IP addresses of the allowed clients that can access the share. In the local web UI, go to the Connect and copy page. Under NFS settings, click NFS client access.
2. Supply the IP address of the NFS client and click Add. You can configure access for multiple NFS clients by repeating this step. Click OK.

Ensure that the Linux host computer has a supported version of the NFS client installed. Use the specific version for your Linux distribution.

Once the NFS client is installed, use the following command to mount the NFS share on your Data Box Heavy device. The following example shows how to connect via NFS to a Data Box Heavy share. The Data Box Heavy IP is 10.161.23.130, and the share Mystoracct_Blob is mounted on the ubuntuVM, with the mount point being /home/databoxheavyubuntuhost/databoxheavy:

`sudo mount -t nfs 10.161.23.130:/Mystoracct_Blob /home/databoxheavyubuntuhost/databoxheavy`

For Mac clients, you will need to add an additional option as follows:

`sudo mount -t nfs -o sec=sys,resvport 10.161.23.130:/Mystoracct_Blob /home/databoxheavyubuntuhost/databoxheavy`

Once you are connected to the Data Box Heavy shares, the next step is to copy data. Always create a folder for the files that you intend to copy under the share, and then copy the files to that folder. You cannot copy files directly to the root folder of the storage account. A folder created under the block blob or page blob shares represents a container to which data is uploaded as blobs.
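Before attempting the mount, it can help to confirm that an NFS client is actually present on the host. This preflight check is not part of the original steps; the package names mentioned in the comments (nfs-common, nfs-utils) are the usual ones for Debian/Ubuntu and RHEL-based distributions, but names vary by distribution.

```shell
# Preflight check (a sketch, not from the original article): verify that
# an NFS client is installed before running the mount command. Install it
# with your distribution's package manager if missing, for example
# nfs-common (Debian/Ubuntu) or nfs-utils (RHEL/CentOS).
if command -v mount.nfs >/dev/null 2>&1; then
    NFS_CLIENT=present
else
    NFS_CLIENT=missing
    echo "Install an NFS client first, e.g.: sudo apt-get install nfs-common"
fi
echo "NFS client: $NFS_CLIENT"
```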
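The create-a-folder-then-copy rule can be sketched as a short shell session. For illustration only, a temporary directory stands in for the mounted share, and the folder name mycontainer is a hypothetical example; on a real device you would use the actual mount point, such as /home/databoxheavyubuntuhost/databoxheavy from the example above.

```shell
# A minimal sketch of the copy workflow. MOUNT_POINT is a temporary
# directory standing in for the mounted Data Box Heavy share.
MOUNT_POINT=$(mktemp -d)

# Create a folder under the share first: it becomes a container in the
# storage account, and data copied into it is uploaded as blobs.
# Copying files directly into the share root is not supported.
mkdir -p "$MOUNT_POINT/mycontainer"

# Stage a sample file, then copy it into the container folder.
SRC=$(mktemp -d)
echo "sample data" > "$SRC/sample.txt"
cp "$SRC/sample.txt" "$MOUNT_POINT/mycontainer/"

ls "$MOUNT_POINT/mycontainer"
```

For large data sets, the same pattern applies with a recursive copy (for example, `cp -r` or `rsync`) targeting the folder under the share rather than the share root.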