Update for Azure SQL Hybrid Cloud Toolkit (#13360)

* Added azurehybridtoolkit to list of external extensions

* Added updated book

* added to recommended extensions

* extensions.js updated for build

* added small changes to extension

* small changes to extension

* tsconfig change

* gitignore and vscode changes

* changed package display name
This commit is contained in:
Alex Ma
2020-11-12 14:22:50 -08:00
committed by GitHub
parent a082c1e478
commit a2f7136728
27 changed files with 229 additions and 175 deletions

View File

@@ -213,6 +213,7 @@ const externalExtensions = [
'arc',
'asde-deployment',
'azdata',
'azurehybridtoolkit',
'cms',
'dacpac',
'data-workspace',

View File

@@ -247,6 +247,7 @@ const externalExtensions = [
'arc',
'asde-deployment',
'azdata',
'azurehybridtoolkit',
'cms',
'dacpac',
'data-workspace',

View File

@@ -1 +1,2 @@
notebooks/hybridbook/Components/**/obj
*.vsix

View File

@@ -1,4 +1,7 @@
.gitignore
src/**
out/**
tsconfig.json
extension.webpack.config.js
*.vsix
yarn.lock

View File

@@ -1,6 +1,6 @@
# Azure SQL Hybrid Cloud Toolkit Jupyter Book Extension for Azure Data Studio
# Azure SQL Hybrid Cloud Toolkit *(preview)*
Welcome to the Azure SQL Hybrid Cloud Toolkit Jupyter Book Extension for Azure Data Studio! This extension opens a Jupyter Book that has several utilities for Azure SQL such as migration assessments and setting up networking connectivity.
Adds a Jupyter Book that has several utilities for Azure SQL Hybrid Cloud.
## Code of Conduct

Binary file not shown (image added: 3.3 KiB)

View File

@@ -8,6 +8,15 @@
- title: Search
search: true
- title: Assessment
url: /Assessments/readme
not_numbered: true
expand_sections: false
sections:
- title: SQL Server Assessment Tool
url: Assessments/sql-server-assessment
- title: Compatibility Assessment
url: Assessments/compatibility-assessment
- title: Networking
url: /networking/readme
not_numbered: true
@@ -19,15 +28,6 @@
url: networking/p2svnet-creation
- title: Create Site-to-Site VPN
url: networking/s2svnet-creation
- title: Assessments
url: /Assessments/readme
not_numbered: true
expand_sections: false
sections:
- title: SQL Server Best Practices Assessment
url: Assessments/sql-server-assessment
- title: Compatibility Assessment
url: Assessments/compatibility-assessment
- title: Provisioning
url: /provisioning/readme
not_numbered: true

View File

@@ -12,4 +12,4 @@ search_page: true
<script>
// Add the lunr store since we will now search it
{% include search/lunr/lunr-store.js %}
</script>
</script>

View File

@@ -17,38 +17,31 @@
{
"cell_type": "markdown",
"source": [
"Migration Compatibility Assessment\n",
"=======================================================\n",
"# Migration Compatibility Assessment\n",
"Use dmacmd.exe to assess databases in an unattended mode, and output the results to a JSON or CSV file. This method is especially useful when assessing several databases or very large databases.\n",
"\n",
"Description\n",
"-----------\n",
"Use this notebook to analyze an on-premises SQL Server instance or database for migration compatibility with SQL Azure. The assessment will provide guidance on features not currently supported in Azure and remediation actions that can be taken to prepare for migration.\n",
""
"## Notebook Variables\n",
"\n",
"| Line | Variable | Description |\n",
"| --- | --- | --- |\n",
"| 1 | ExecutableFile | Path to DmaCmd.exe file, usually _\"C:\\\\Program Files\\\\Microsoft Data Migration Assistant\\\\DmaCmd.exe\"_ if installed to default location |\n",
"| 2 | AssessmentName | Unique name for assessment |\n",
"| 3 | Server | Target SQL Server |\n",
"| 4 | InitialCatalog | Name of the database for the specified server |\n",
"| 5 | ResultPath | Path and name of the file to store results in json format |"
],
"metadata": {
"azdata_cell_guid": "6764dd37-fb1f-400d-8f2b-70bc36fc3b61"
}
},
{
"cell_type": "markdown",
"source": [
"This notebook requires Data Migration Assistant to be installed in order to execute the commands below.\r\n",
"It can be installed from the [Data Migration Assistant download](https://www.microsoft.com/en-us/download/confirmation.aspx?id=53595) page.\r\n",
"\r\n",
"_With version 2.1 and above, a successful installation of Data Migration Assistant also installs dmacmd.exe in %ProgramFiles%\\Microsoft Data Migration Assistant\\. Use dmacmd.exe to assess databases in an unattended mode, and output the results to a JSON or CSV file. This method is especially useful when assessing several databases or very large databases_"
],
"metadata": {
"azdata_cell_guid": "68506e39-d34b-4f17-a0c6-94e978f76488"
}
},
{
"cell_type": "code",
"source": [
"$ExecutableFile = \"\" # Path of the DmaCmd.exe file, generally the path would be \"C:\\Program Files\\Microsoft Data Migration Assistant\\DmaCmd.exe\"\r\n",
"$AssessmentName = \"\" # Name of the Assessment\r\n",
"$Server = \"\" # Target SQL Server\r\n",
"$InitialCatalog = \"\" # Name of the database on the specified SQL Server\r\n",
"$ResultPath = \"\" # Path and Name of the file to store the result in json format, for example \"C:\\\\temp\\\\Results\\\\AssessmentReport.json\""
"$ExecutableFile = \"C:\\Program Files\\Microsoft Data Migration Assistant\\DmaCmd.exe\" # Update if different\r\n",
"$AssessmentName = \"\"\r\n",
"$Server = \"\"\r\n",
"$InitialCatalog = \"\"\r\n",
"$ResultPath = \"\""
],
"metadata": {
"azdata_cell_guid": "d81972c1-3b0b-47d9-b8a3-bc5ab4001a34"

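For reference, the variables defined above feed an unattended DmaCmd.exe call. A minimal sketch follows — the switch names are taken from the public Data Migration Assistant command-line documentation, but treat the exact parameters as assumptions to verify against the installed version:

```powershell
# Sketch: unattended compatibility assessment with DmaCmd.exe.
# Assumes the notebook variables above ($ExecutableFile, $AssessmentName,
# $Server, $InitialCatalog, $ResultPath) have been filled in.
& $ExecutableFile /AssessmentName="$AssessmentName" `
    /AssessmentDatabases="Server=$Server;Initial Catalog=$InitialCatalog;Integrated Security=true" `
    /AssessmentEvaluateCompatibilityIssues `
    /AssessmentResultJson="$ResultPath"
```

When the run completes, the findings land in the JSON file at $ResultPath for review.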
View File

@@ -1,9 +1,10 @@
# Assessments
[Home](../readme.md)
## Notebooks in this Chapter
- [SQL Server Best Practices Assessment](sql-server-assessment.ipynb) - Use the SQL Server Assessment API to review the configuration of instances by name or dynamically by specifying the instance of a Central Management Server. SQL Assessment API provides a mechanism to evaluate the configuration of your SQL Server for best practices. The API ships with a ruleset containing best practice rules suggested by the SQL Server team. This ruleset is enhanced with each new release, and the API is built to be a highly customizable and extensible solution, so users can tune the default rules and create their own. SQL Assessment API is useful when you want to make sure your SQL Server configuration is in line with recommended best practices. After an initial assessment, configuration stability can be tracked by regularly scheduled assessments.
Preparing for the cloud requires a crawl-walk-run mentality. The first step, or crawl, towards hybrid migration is determining the fitness of existing on-premises resources. An assessment is an analysis performed against a chosen SQL Server object such as a Server or Database instance. It is recommended to fix any issues found by the analysis before migrating a database from on-premises to Azure.
- [Compatibility Assessment](compatibility-assessment.ipynb) - Coming soon
## Notebooks in this Chapter
- [SQL Server Best Practices Assessment](sql-server-assessment.ipynb) - demonstrates the use of the [SQL Server Assessment API](https://docs.microsoft.com/en-us/sql/sql-assessment-api/sql-assessment-api-overview), a tool to review the configuration of a SQL Server and Databases for best practices.
- [Compatibility Assessment](compatibility-assessment.ipynb) - Analyze an on-premises SQL Server instance or database for migration compatibility with SQL Azure. The assessment will provide guidance on features not currently supported in Azure and remediation actions that can be taken to prepare for migration.

View File

@@ -19,17 +19,16 @@
"source": [
"# SQL Server Assessment Tool\n",
"\n",
"This notebook will demonstrate the use of the [SQL Server Assessment API](https://docs.microsoft.com/en-us/sql/sql-assessment-api/sql-assessment-api-overview), a tool to review the configuration of a SQL Server and Databases for best practices. An assessment is performed against a chosen SQL Server object. The default ruleset checks for two kinds of objects: Server and Database. In addition, the API supports Filegroup and AvailabilityGroup. When migrating a database from on-premises to Azure, it is recommended to fix any assessment findings beforehand.\n",
"\n",
"**Unlike other notebooks, do not execute all cells of this notebook!** \n",
"Unlike other notebooks, **do not execute all cells of this notebook!** \n",
"\n",
"A single assessment may take a while, so fill out the variables and execute the cell that matches the desired environment to perform the assessment needed. Only one of these cells needs to be executed after the variables are defined.\n",
"\n",
"1. Ensure that the proper APIs and modules are installed per the <a href=\"../prereqs.ipynb\">prerequisites</a> notebook\n",
"2. Define a service instance and group corresponding to the SQL Server instances to be assessed\n",
"3. Choose an example below that corresponds to the appropriate task\n",
"4. Execute only that example's code block and wait for results\n",
"5. Fix any recommended issues and rerun Assessment API until clear"
"## Notebook Variables\n",
"\n",
"| Line | Variable | Description |\n",
"| ---- | -------- | ----------- |\n",
"| 1 | ServerInstance | Name of the SQL Server instance |\n",
"| 2 | Group | (Optional) Name of the server group, if known | "
],
"metadata": {
"azdata_cell_guid": "86ecfb01-8c38-4a99-92a8-687d8ec7f4b0"
@@ -47,6 +46,20 @@
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
"## Notebook Steps\r\n",
"1. Ensure that the proper APIs and modules are installed per the <a href=\"../prereqs.ipynb\">prerequisites</a> notebook\r\n",
"2. Define a service instance and group corresponding to the SQL Server instances to be assessed\r\n",
"3. Choose an example below that corresponds to the appropriate task\r\n",
"4. Execute only that example's code block and wait for results\r\n",
"5. Fix any recommended issues and rerun Assessment API until clear"
],
"metadata": {
"azdata_cell_guid": "541f6806-f8d2-4fc5-a8fb-6d42947d1a64"
}
},
{
"cell_type": "markdown",
"source": [

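As a companion to the steps above, the assessment itself boils down to a short pipeline using the `SqlServer` PowerShell module's SQL Assessment API cmdlets. A minimal single-instance sketch (the instance name is a placeholder; the cells in the notebook cover the group and CMS variants):

```powershell
# Minimal best-practices assessment sketch using the SQL Assessment API cmdlets
# from the SqlServer PowerShell module.
Import-Module SqlServer

$ServerInstance = "localhost"   # placeholder: name of the SQL Server instance

# Run the default ruleset against the server object and list each finding
# with its severity and a link to remediation guidance.
Get-SqlInstance -ServerInstance $ServerInstance |
    Invoke-SqlAssessment |
    Select-Object Severity, Message, HelpLink |
    Format-Table -AutoSize
```

Fix the reported findings and rerun until the assessment comes back clean.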
View File

@@ -1,8 +1,10 @@
# Appendices
[Home](readme.md)
## Appendix: Locations
See the <a href="https://azure.microsoft.com/en-us/global-infrastructure/locations/">Azure locations</a> page for a complete list of Azure regions along with their general physical location. The following is a list of common North American location settings for this guide:
### US Regions
### Regions
| Setting | Location |
| ------------ | --------- |
| Central US | Iowa |

View File

@@ -18,12 +18,46 @@
"cell_type": "markdown",
"source": [
"# Export Existing Azure SQL Server Resources\r\n",
"Export notebook that will utilize the ADP resources\r\n",
"\r\n",
"\r\n",
"<!-- Disable bullets to be shown for checkbox markup -->\r\n",
"<style type=\"text/css\">\r\n",
" ul { list-style-type: none }\r\n",
"</style>\r\n",
"## Notebook Variables\r\n",
"| Line | Variable | Description |\r\n",
"| -- | -- | -- |\r\n",
"| 1 | AdpSubscription | Azure Subscription ID/Name for the ADP Resource Group (both resource groups are assumed to be in the same subscription) |\r\n",
"| 2 | AdpResourceGroup | Azure Resource Group which contains the ADP Resources | \r\n",
"| 3 | SourceResourceGroup | Azure Resource Group where the SQL Server to be exported exists | \r\n",
"| 4 | LogicalSQLServerName | Logical SQL Server name of the server to be exported | \r\n",
"| 5 | StorageAccount | Target storage account to store exported files (any storage account, but it must be in the same resource group as the ADP resources) | \r\n",
"| 6 | AdpFunc | Derived: ADP Function App name ($AdpResourceGroup + \"Control\") |\r\n",
"| 7 | AdpBatch | Derived: ADP Batch account name (lowercased $AdpResourceGroup + \"batch\") | \r\n",
"| 8 | AdpVNET | Derived: ADP Virtual Network name ($AdpResourceGroup + \"Vnet\") | "
],
"metadata": {
"azdata_cell_guid": "b72d138a-566f-4161-b7a6-7264487e446c"
}
},
{
"cell_type": "code",
"source": [
"$AdpSubscription = \"\"\r\n",
"$AdpResourceGroup = \"\"\r\n",
"$SourceResourceGroup = \"\"\r\n",
"$LogicalSQLServerName = \"\"\r\n",
"$StorageAccount = \"\"\r\n",
"$AdpFunc = $AdpResourceGroup + \"Control\"\r\n",
"$AdpBatch = $AdpResourceGroup.ToLower() + \"batch\"\r\n",
"$AdpVNET = $AdpResourceGroup + \"Vnet\""
],
"metadata": {
"azdata_cell_guid": "417edc0e-1107-4a27-a4cf-e921f79b3f6a",
"tags": []
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
"## Steps\r\n",
"Gather input:\r\n",
"* [ ] Connect to Azure Subscription\r\n",
@@ -45,39 +79,6 @@
"azdata_cell_guid": "a9da248a-20f1-4574-bd04-7324e70c05a3"
}
},
{
"cell_type": "markdown",
"source": [
"## Set Variables for the Notebook"
],
"metadata": {
"azdata_cell_guid": "b72d138a-566f-4161-b7a6-7264487e446c"
}
},
{
"cell_type": "code",
"source": [
"# ADP Resource \r\n",
"$Env:BOOTSTRAP_Subscription = \"\" # Azure Subscription ID/Name for the ADP Resource Group # Both RG are assumed to be in the same subscription\r\n",
"$Env:BOOTSTRAP_ResourceGroup = \"\" # Azure Resource Group which contains the ADP Resources\r\n",
"\r\n",
"# SQL Server \r\n",
"$SourceResourceGroupName = \"\" # Azure ResourceGroup where the sql server to be exported exists\r\n",
"$LogicalSQLServerName = \"\" # Logical sql server name of the sql server to be exported\r\n",
"$StorageAccount = \"\" # target storage account to store exported files # any storage account, but must be in the same RG as the ADP resources.\r\n",
"\r\n",
"# Set Variables for ADP Resources\r\n",
"$Env:BOOTSTRAP_FUNC = $Env:BOOTSTRAP_ResourceGroup + \"Control\"\r\n",
"$Env:BOOTSTRAP_BATCH = $Env:BOOTSTRAP_ResourceGroup.ToLower() + \"batch\"\r\n",
"$Env:BOOTSTRAP_VNET = $Env:BOOTSTRAP_ResourceGroup + \"Vnet\""
],
"metadata": {
"azdata_cell_guid": "417edc0e-1107-4a27-a4cf-e921f79b3f6a",
"tags": []
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
@@ -105,9 +106,9 @@
" $subscriptions = az account list -o JSON | ConvertFrom-Json # getting subscriptions for the user to use in gridview\r\n",
" }\r\n",
"\r\n",
" if(![string]::IsNullOrWhiteSpace($Env:BOOTSTRAP_Subscription)) #If there is a subscription specified by user in the variables section\r\n",
" if(![string]::IsNullOrWhiteSpace($AdpSubscription)) #If there is a subscription specified by user in the variables section\r\n",
" {\r\n",
" $specified_Subscription= az account show --subscription $Env:BOOTSTRAP_Subscription -o json |ConvertFrom-Json \r\n",
" $specified_Subscription= az account show --subscription $AdpSubscription -o json |ConvertFrom-Json \r\n",
" if (!$specified_Subscription) #if specified subscription is not valid\r\n",
" { \r\n",
" $currentUser= az ad signed-in-user show --query \"{displayName:displayName,UPN:userPrincipalName}\" -o json|ConvertFrom-Json # get current logged-in user information\r\n",
@@ -123,8 +124,8 @@
" $selectedSubscription = $subscriptions | Select-Object -Property Name, Id | Out-GridView -PassThru\r\n",
" $SubscriptionId = $selectedSubscription.Id\r\n",
" $Subscription = $selectedSubscription.Name \r\n",
" $Env:BOOTSTRAP_Subscription = $subscription \r\n",
" Write-Output \"Using subscription... '$Env:BOOTSTRAP_Subscription' ... '$SubscriptionId'\" \r\n",
" $AdpSubscription = $subscription \r\n",
" Write-Output \"Using subscription... '$AdpSubscription' ... '$SubscriptionId'\" \r\n",
" } \r\n",
"}\r\n",
"\r\n",
@@ -369,8 +370,8 @@
{
"cell_type": "code",
"source": [
"Verify-ADPResources -Subscription $Env:BOOTSTRAP_Subscription -ADPResourceGroupName $Env:BOOTSTRAP_ResourceGroup `\r\n",
" -BatchAccountName $Env:BOOTSTRAP_BATCH -FunctionName $Env:BOOTSTRAP_FUNC -VNetName $Env:BOOTSTRAP_VNET "
"Verify-ADPResources -Subscription $AdpSubscription -ADPResourceGroupName $AdpResourceGroup `\r\n",
" -BatchAccountName $AdpBatch -FunctionName $AdpFunc -VNetName $AdpVNET "
],
"metadata": {
"azdata_cell_guid": "8185f2ea-d368-42c5-9246-bc1871affc63"
@@ -391,7 +392,7 @@
{
"cell_type": "code",
"source": [
"Provision-FuncRBAC -FunctionName $Env:BOOTSTRAP_FUNC -ScopeRGName $SourceResourceGroupName -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Provision-FuncRBAC -FunctionName $AdpFunc -ScopeRGName $SourceResourceGroup -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "7678701e-ec40-43d9-baff-fd1cdabba1cd"
@@ -413,7 +414,7 @@
{
"cell_type": "code",
"source": [
"$sqlServer = az sql server show --name $LogicalSQLServerName --resource-group $SourceResourceGroupName --subscription $Env:BOOTSTRAP_Subscription -o JSON | ConvertFrom-JSON\r\n",
"$sqlServer = az sql server show --name $LogicalSQLServerName --resource-group $SourceResourceGroup --subscription $AdpSubscription -o JSON | ConvertFrom-JSON\r\n",
"if ($sqlServer)\r\n",
"{\r\n",
" Write-Host \"Source SQL Server: \" $sqlServer.name\r\n",
@@ -426,7 +427,7 @@
" Write-Host \"ERROR: Source server is not in Ready state. Current state is: \" $sqlServer.state\r\n",
" }\r\n",
"\r\n",
" $sqlAzureAdmin = az sql server ad-admin list --server $LogicalSQLServerName --resource-group $SourceResourceGroupName --subscription $Env:BOOTSTRAP_Subscription -o JSON | ConvertFrom-JSON\r\n",
" $sqlAzureAdmin = az sql server ad-admin list --server $LogicalSQLServerName --resource-group $SourceResourceGroup --subscription $AdpSubscription -o JSON | ConvertFrom-JSON\r\n",
" if ($sqlAzureAdmin)\r\n",
" {\r\n",
" Write-Host \"Azure AD admin set to\" $sqlAzureAdmin.login\r\n",
@@ -442,8 +443,8 @@
"{\r\n",
" Write-Host \"ERROR: Source server \" $sqlServer.name \"not found or current account lacks access to resource.\"\r\n",
" Write-Host \"Validate input settings:\"\r\n",
" Write-Host \"Resource group: \" $SourceResourceGroupName\r\n",
" Write-Host \"Subscription: \" $Env:BOOTSTRAP_Subscription\r\n",
" Write-Host \"Resource group: \" $SourceResourceGroup\r\n",
" Write-Host \"Subscription: \" $AdpSubscription\r\n",
"}"
],
"metadata": {
@@ -464,8 +465,8 @@
{
"cell_type": "code",
"source": [
"$InputForExportFunction = Prepare-InputForExportFunction -Subscription $Env:BOOTSTRAP_Subscription -ADPResourceGroupName $Env:BOOTSTRAP_ResourceGroup `\r\n",
" -BatchAccountName $Env:BOOTSTRAP_BATCH -FunctionName $Env:BOOTSTRAP_FUNC -VNetName $Env:BOOTSTRAP_VNET -SourceRGName $SourceResourceGroupName `\r\n",
"$InputForExportFunction = Prepare-InputForExportFunction -Subscription $AdpSubscription -ADPResourceGroupName $AdpResourceGroup `\r\n",
" -BatchAccountName $AdpBatch -FunctionName $AdpFunc -VNetName $AdpVNET -SourceRGName $SourceResourceGroup `\r\n",
" -SqlServerName $LogicalSQLServerName -StorageAccountName $StorageAccount\r\n",
"Write-Host \"Setting parameter variables for Export Function Call...\"\r\n",
"$InputForExportFunction.Header\r\n",
@@ -528,7 +529,7 @@
" Write-Host \"`tCreated Export Batch Job ID: \" $batchJobId\r\n",
" Write-Host \"`tExport container URL: \" $containerUrl\r\n",
"\r\n",
" $azBatchLogin = az batch account login --name $Env:BOOTSTRAP_BATCH --resource-group $Env:BOOTSTRAP_ResourceGroup -o JSON | ConvertFrom-Json\r\n",
" $azBatchLogin = az batch account login --name $AdpBatch --resource-group $AdpResourceGroup -o JSON | ConvertFrom-Json\r\n",
" $jobStatus = az batch job show --job-id $batchJobID -o JSON | ConvertFrom-Json\r\n",
" Write-Host \"Export Job running on Pool: \" $jobStatus.poolInfo.poolId\r\n",
" Write-Host \"`tExport Request Status: \" $jobStatus.state\r\n",

View File

@@ -63,8 +63,8 @@
"cell_type": "code",
"source": [
"# ADP Resource \r\n",
"$Env:BOOTSTRAP_Subscription = \"\" # Azure Subscription ID/Name # The bacpac files and ADP Resources are assumed to be in the same subscription\r\n",
"$Env:BOOTSTRAP_ResourceGroup = \"\" # Azure Resource Group which contains the ADP Resources\r\n",
"$AdpSubscription = \"\" # Azure Subscription ID/Name # The bacpac files and ADP Resources are assumed to be in the same subscription\r\n",
"$AdpResourceGroup = \"\" # Azure Resource Group which contains the ADP Resources\r\n",
"\r\n",
"# SQL Server \r\n",
"$TargetResourceGroupName = \"\" # Azure ResourceGroup into which the sql server backup needs to be restored\r\n",
@@ -74,9 +74,9 @@
"$LSqlServerPassword = \"\"\r\n",
"\r\n",
"# Set Variables for ADP Resources\r\n",
"$Env:BOOTSTRAP_FUNC = $Env:BOOTSTRAP_ResourceGroup + \"Control\" \r\n",
"$Env:BOOTSTRAP_BATCH = $Env:BOOTSTRAP_ResourceGroup.ToLower() + \"batch\"\r\n",
"$Env:BOOTSTRAP_VNET = $Env:BOOTSTRAP_ResourceGroup + \"Vnet\""
"$AdpFunc = $AdpResourceGroup + \"Control\" \r\n",
"$AdpBatch = $AdpResourceGroup.ToLower() + \"batch\"\r\n",
"$AdpVNET = $AdpResourceGroup + \"Vnet\""
],
"metadata": {
"azdata_cell_guid": "01888595-0d1c-445b-ba85-dd12caa30192",
@@ -112,9 +112,9 @@
" $subscriptions = az account list -o JSON | ConvertFrom-Json # getting subscriptions for the user to use in gridview\r\n",
" }\r\n",
"\r\n",
" if(![string]::IsNullOrWhiteSpace($Env:BOOTSTRAP_Subscription)) #If there is a subscription specified by user in the variables section\r\n",
" if(![string]::IsNullOrWhiteSpace($AdpSubscription)) #If there is a subscription specified by user in the variables section\r\n",
" {\r\n",
" $specified_Subscription= az account show --subscription $Env:BOOTSTRAP_Subscription -o json |ConvertFrom-Json \r\n",
" $specified_Subscription= az account show --subscription $AdpSubscription -o json |ConvertFrom-Json \r\n",
" if (!$specified_Subscription) #if specified subscription is not valid\r\n",
" { \r\n",
" $currentUser= az ad signed-in-user show --query \"{displayName:displayName,UPN:userPrincipalName}\" -o json|ConvertFrom-Json # get current logged-in user information\r\n",
@@ -130,8 +130,8 @@
" $selectedSubscription = $subscriptions | Select-Object -Property Name, Id | Out-GridView -PassThru\r\n",
" $SubscriptionId = $selectedSubscription.Id\r\n",
" $Subscription = $selectedSubscription.Name \r\n",
" $Env:BOOTSTRAP_Subscription = $subscription \r\n",
" Write-Output \"Using subscription... '$Env:BOOTSTRAP_Subscription' ... '$SubscriptionId'\" \r\n",
" $AdpSubscription = $subscription \r\n",
" Write-Output \"Using subscription... '$AdpSubscription' ... '$SubscriptionId'\" \r\n",
" } \r\n",
"}\r\n",
"\r\n",
@@ -381,8 +381,8 @@
{
"cell_type": "code",
"source": [
"Verify-ADPResources -Subscription $Env:BOOTSTRAP_Subscription -ADPResourceGroupName $Env:BOOTSTRAP_ResourceGroup `\r\n",
" -BatchAccountName $Env:BOOTSTRAP_BATCH -FunctionName $Env:BOOTSTRAP_FUNC -VNetName $Env:BOOTSTRAP_VNET "
"Verify-ADPResources -Subscription $AdpSubscription -ADPResourceGroupName $AdpResourceGroup `\r\n",
" -BatchAccountName $AdpBatch -FunctionName $AdpFunc -VNetName $AdpVNET "
],
"metadata": {
"azdata_cell_guid": "e89f6eb9-fcbc-4b7d-bcd1-37f1eb52cc02",
@@ -406,7 +406,7 @@
{
"cell_type": "code",
"source": [
"Provision-FuncRBAC -FunctionName $Env:BOOTSTRAP_FUNC -ScopeRGName $TargetResourceGroupName -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Provision-FuncRBAC -FunctionName $AdpFunc -ScopeRGName $TargetResourceGroupName -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "c374e57c-51ec-4a3f-9966-1e50cefc8510"
@@ -426,9 +426,9 @@
{
"cell_type": "code",
"source": [
"$InputForImportFunction = Prepare-InputForImportFunction -Subscription $Env:BOOTSTRAP_Subscription -ADPResourceGroupName $Env:BOOTSTRAP_ResourceGroup `\r\n",
" -BatchAccountName $Env:BOOTSTRAP_BATCH -FunctionName $Env:BOOTSTRAP_FUNC -TargetRGName $TargetResourceGroupName `\r\n",
" -VNetName $Env:BOOTSTRAP_VNET -BackupFiles_StorageAccount $StorageAccountName -BackupFiles_ContainerName $ContainerName `\r\n",
"$InputForImportFunction = Prepare-InputForImportFunction -Subscription $AdpSubscription -ADPResourceGroupName $AdpResourceGroup `\r\n",
" -BatchAccountName $AdpBatch -FunctionName $AdpFunc -TargetRGName $TargetResourceGroupName `\r\n",
" -VNetName $AdpVNET -BackupFiles_StorageAccount $StorageAccountName -BackupFiles_ContainerName $ContainerName `\r\n",
" -SqlServerName $LogicalSQLServerName -SqlServerPassword $LSqlServerPassword\r\n",
"Write-Host \"Setting parameter variables for Import Function Call...\"\r\n",
"$InputForImportFunction.Header\r\n",
@@ -495,7 +495,7 @@
" $containerUrl = $outputParams.Item2[3]\r\n",
"\r\n",
" Write-Host \"`tCreated Import Batch Job ID: \" $batchJobId\r\n",
" $azBatchLogin = az batch account login --name $Env:BOOTSTRAP_BATCH --resource-group $Env:BOOTSTRAP_ResourceGroup -o JSON | ConvertFrom-Json\r\n",
" $azBatchLogin = az batch account login --name $AdpBatch --resource-group $AdpResourceGroup -o JSON | ConvertFrom-Json\r\n",
" $jobStatus = az batch job show --job-id $batchJobID -o JSON | ConvertFrom-Json\r\n",
" Write-Host \"Import Job running on Pool: \" $jobStatus.poolInfo.poolId\r\n",
" Write-Host \"`Import Request Status: \" $jobStatus.state\r\n",

View File

@@ -1,15 +1,20 @@
# Data Portability
[Home](../readme.md)
Notebooks in this chapter perform a data migration using a custom Azure function that can be deployed to an Azure subscription. It enables [Azure Batch](https://azure.microsoft.com/en-us/services/batch) computing of a complex SQL Server migration to and from a single Resource Group. Azure Batch is a process that runs large-scale parallel and high-performance computing (HPC) batch jobs efficiently in Azure. This greatly reduces the processing required locally which should prevent long execution times, timeouts and retries. Importing and exporting data to and from Azure is supported for multiple SQL database instances. Data is imported and exported to and from standard SQL backup formats (*.bacpac) which "encapsulates the database schema as well as the data stored in the database" ([Microsoft Docs](https://docs.microsoft.com/en-us/sql/relational-databases/data-tier-applications/data-tier-applications)).
## Notebooks in this Chapter
- [Azure Data Portability Setup](bootstrap.ipynb) - Configure and install a custom Azure function to migrate data to and from Azure
- [Azure Data Portability Setup](setup-adp.ipynb) - Configure and install a custom Azure function to migrate data to and from Azure <br/>
<img width="25%" src="VisualBootstrapperNB.PNG"/>
- [Export SQL Server](export-sql-server.ipynb) - from SQL Azure to a standard SQL backup format
- [Import SQL Server](import-sql-server.ipynb) - from SQL backup format to Azure
The notebooks in this chapter perform a data migration using a custom Azure function that can be deployed to an Azure subscription. It enables [Azure Batch](https://azure.microsoft.com/en-us/services/batch) computing of a complex SQL Server migration to and from a single Resource Group. Azure Batch is a service that runs large-scale parallel and high-performance computing (HPC) batch jobs efficiently in Azure. This greatly reduces the processing required locally, which should prevent long execution times, timeouts, and retries. Importing and exporting data to and from Azure is supported for multiple SQL database instances. Data is imported and exported to and from standard SQL backup formats (*.bacpac) which "encapsulates the database schema as well as the data stored in the database" ([Microsoft Docs](https://docs.microsoft.com/en-us/sql/relational-databases/data-tier-applications/data-tier-applications)).
## Steps
1. The Azure function must first be deployed using the setup notebook
2. Open the notebook for the desired migration path
2. Open the notebook for the desired migration path (import or export)
3. Configure and execute notebook
4. Monitor progress with periodic notebook queries
5. Verify data has been imported/exported by reviewing the storage account for the migrated Resource Group

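Step 4 above amounts to polling the Azure Batch job that the function creates. A rough sketch, assuming the $AdpBatch and $AdpResourceGroup variables and the $batchJobId returned by the notebooks in this chapter (the polling interval is arbitrary):

```powershell
# Rough monitoring loop for the migration batch job.
# Variable names assume the export/import notebooks in this chapter.
az batch account login --name $AdpBatch --resource-group $AdpResourceGroup -o JSON | Out-Null
do {
    # Query the current state of the batch job created by the ADP function
    $jobStatus = az batch job show --job-id $batchJobId -o JSON | ConvertFrom-Json
    Write-Host "Migration job state:" $jobStatus.state
    Start-Sleep -Seconds 30
} while ($jobStatus.state -ne "completed")
```

Once the job reports completed, verify the migrated data in the storage account (step 5).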
View File

@@ -55,22 +55,22 @@
"cell_type": "code",
"source": [
"# Setup client environment variables that the rest of the notebook will use\r\n",
"$Env:BOOTSTRAP_ResourceGroup = \"\" # Target Resource Group to bootstrap with ADP components - A new one will be created if the specified Resource Group doesn't exist\r\n",
"$Env:BOOTSTRAP_RG_REGION = \"eastus\" # Region/Location of the resource group to be bootstrapped\r\n",
"$AdpResourceGroup = \"\" # Target Resource Group to bootstrap with ADP components - A new one will be created if the specified Resource Group doesn't exist\r\n",
"$AdpRegion = \"eastus\" # Region/Location of the resource group to be bootstrapped\r\n",
"\r\n",
"# Derived settings\r\n",
"$Env:BOOTSTRAP_Subscription = \"\" # Target Azure Subscription Name or ID to bootstrap data portability resources\r\n",
"$Env:BOOTSTRAP_FUNC = $Env:BOOTSTRAP_ResourceGroup + \"Control\"\r\n",
"$Env:BOOTSTRAP_STORAGE = $Env:BOOTSTRAP_ResourceGroup.ToLower() + \"storage\"\r\n",
"$Env:BOOTSTRAP_BATCH = $Env:BOOTSTRAP_ResourceGroup.ToLower() + \"batch\"\r\n",
"$Env:BOOTSTRAP_VNET = $Env:BOOTSTRAP_ResourceGroup + \"VNet\"\r\n",
"$AdpSubscription = \"\" # Target Azure Subscription Name or ID to bootstrap data portability resources\r\n",
"$AdpFunc = $AdpResourceGroup + \"Control\"\r\n",
"$AdpStorage = $AdpResourceGroup.ToLower() + \"storage\"\r\n",
"$AdpBatch = $AdpResourceGroup.ToLower() + \"batch\"\r\n",
"$AdpVNET = $AdpResourceGroup + \"VNet\"\r\n",
"\r\n",
"# Bootstrapper URLs - Update with the recommended toolkit version and build\r\n",
"$BaseToolkitUrl = \"https://hybridtoolkit.blob.core.windows.net/components\"\r\n",
"$ReleaseVersion = \"0.13\"\r\n",
"$BuildNumber = \"74938\"\r\n",
"$Env:BOOTSTRAP_URL_FUNC = \"$BaseToolkitUrl/$ReleaseVersion/ADPControl-$BuildNumber.zip\"\r\n",
"$Env:BOOTSTRAP_URL_WRAP = \"$BaseToolkitUrl/$ReleaseVersion/BatchWrapper-$BuildNumber.zip\"\r\n",
"$AdpDownloadUrl = \"$BaseToolkitUrl/$ReleaseVersion/ADPControl-$BuildNumber.zip\"\r\n",
"$AdpWrapperUrl = \"$BaseToolkitUrl/$ReleaseVersion/BatchWrapper-$BuildNumber.zip\"\r\n",
"\r\n",
"Write-Output \"Setting the notebook variables:\"\r\n",
"Get-Variable -Name Adp* | Format-Table -Property Name, Value"
@@ -121,9 +121,9 @@
" $subscriptions = az account list -o JSON | ConvertFrom-Json # getting subscriptions for the user to use in gridview\r\n",
" }\r\n",
"\r\n",
" if(![string]::IsNullOrWhiteSpace($Env:BOOTSTRAP_Subscription)) #If there is a subscription specified by user in the variables section\r\n",
" if(![string]::IsNullOrWhiteSpace($AdpSubscription)) #If there is a subscription specified by user in the variables section\r\n",
" {\r\n",
" $specified_Subscription= az account show --subscription $Env:BOOTSTRAP_Subscription -o json |ConvertFrom-Json \r\n",
" $specified_Subscription= az account show --subscription $AdpSubscription -o json |ConvertFrom-Json \r\n",
" if (!$specified_Subscription) #if specified subscription is not valid\r\n",
" { \r\n",
" $currentUser= az ad signed-in-user show --query \"{displayName:displayName,UPN:userPrincipalName}\" -o json|ConvertFrom-Json # get current logged-in user information\r\n",
@@ -139,8 +139,8 @@
" $selectedSubscription = $subscriptions | Select-Object -Property Name, Id | Out-GridView -PassThru\r\n",
" $SubscriptionId = $selectedSubscription.Id\r\n",
" $Subscription = $selectedSubscription.Name \r\n",
" $Env:BOOTSTRAP_Subscription = $subscription \r\n",
" Write-Output \"Using subscription... '$Env:BOOTSTRAP_Subscription' ... '$SubscriptionId'\" \r\n",
" $AdpSubscription = $subscription \r\n",
" Write-Output \"Using subscription... '$AdpSubscription' ... '$SubscriptionId'\" \r\n",
" } \r\n",
"}\r\n",
"\r\n",
@@ -207,8 +207,8 @@
" else { \r\n",
" #VNet or default subnet not found under specified resource group. Create a new VNet with a default Subnet, or add a default subnet to the existing VNet\r\n",
" Write-Output \"Creating new Virtual network with default Subnet ID ... \"\r\n",
" $newVNet = az network vnet create --name \"$Env:BOOTSTRAP_VNET\" --resource-group $Env:BOOTSTRAP_ResourceGroup --subscription $Env:BOOTSTRAP_Subscription --subnet-name $SubNetName -o JSON |ConvertFrom-Json #vnet create/Update command: Bug: In this command, the output variable is not getting converted to PS objects.\r\n",
" $newVNet = az network vnet subnet show -g $Env:BOOTSTRAP_ResourceGroup --vnet-name $Env:BOOTSTRAP_VNET -n $SubNetName --subscription $Env:BOOTSTRAP_Subscription -o JSON |ConvertFrom-Json # added this line due to above bug\r\n",
" $newVNet = az network vnet create --name \"$AdpVNET\" --resource-group $AdpResourceGroup --subscription $AdpSubscription --subnet-name $SubNetName -o JSON |ConvertFrom-Json #vnet create/Update command: Bug: In this command, the output variable is not getting converted to PS objects.\r\n",
" $newVNet = az network vnet subnet show -g $AdpResourceGroup --vnet-name $AdpVNET -n $SubNetName --subscription $AdpSubscription -o JSON |ConvertFrom-Json # added this line due to above bug\r\n",
" Write-Output \"Created VNet with default Subnet - ID: '$($newVNet.id)'\"\r\n",
" }\r\n",
"}\r\n",
@@ -508,7 +508,7 @@
{
"cell_type": "code",
"source": [
"Bootstrap-AzResourceGroup -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -ResourceGroupLocation $Env:BOOTSTRAP_RG_REGION -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-AzResourceGroup -ResourceGroupName $AdpResourceGroup -ResourceGroupLocation $AdpRegion -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "9beb8d22-4560-4c7e-917b-5a3c0d58e1a2",
@@ -533,7 +533,7 @@
{
"cell_type": "code",
"source": [
"Bootstrap-AzVirtualNetwork -VNetName $Env:BOOTSTRAP_VNET -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-AzVirtualNetwork -VNetName $AdpVNET -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "d014a6a6-57ff-4de7-8210-b3360bf34daa"
@@ -555,7 +555,7 @@
{
"cell_type": "code",
"source": [
"Bootstrap-AzStorageAccount -StorageAccountName $Env:BOOTSTRAP_STORAGE -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-AzStorageAccount -StorageAccountName $AdpStorage -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "290498ee-3f31-4395-adab-a5fa93d28c80",
@@ -580,8 +580,8 @@
{
"cell_type": "code",
"source": [
"Bootstrap-AzFunctionApp -FunctionName $Env:BOOTSTRAP_FUNC -StorageAccountName $Env:BOOTSTRAP_STORAGE -FunctionAppPackageURL $Env:BOOTSTRAP_URL_FUNC `\r\n",
" -ConsumptionPlanLocation $Env:BOOTSTRAP_RG_REGION -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-AzFunctionApp -FunctionName $AdpFunc -StorageAccountName $AdpStorage -FunctionAppPackageURL $AdpDownloadUrl `\r\n",
" -ConsumptionPlanLocation $AdpRegion -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "6fc2b5ec-c16f-4eb7-b2f9-c8c680d9a2df",
@@ -605,8 +605,8 @@
{
"cell_type": "code",
"source": [
"Bootstrap-AzBatchAccount -BatchAccountName $Env:BOOTSTRAP_BATCH -StorageAccountName $Env:BOOTSTRAP_STORAGE -BatchAccountLocation $Env:BOOTSTRAP_RG_REGION `\r\n",
" -ApplicationPackageURL $Env:BOOTSTRAP_URL_WRAP -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-AzBatchAccount -BatchAccountName $AdpBatch -StorageAccountName $AdpStorage -BatchAccountLocation $AdpRegion `\r\n",
" -ApplicationPackageURL $AdpWrapperUrl -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "489733c4-1162-479b-82b4-b0c18954b25b",
@@ -628,7 +628,7 @@
{
"cell_type": "code",
"source": [
"Bootstrap-FuncRBAC -AzFunctionName $Env:BOOTSTRAP_FUNC -ResourceGroupName $Env:BOOTSTRAP_ResourceGroup -Subscription $Env:BOOTSTRAP_Subscription"
"Bootstrap-FuncRBAC -AzFunctionName $AdpFunc -ResourceGroupName $AdpResourceGroup -Subscription $AdpSubscription"
],
"metadata": {
"azdata_cell_guid": "75882d3a-2004-4304-ab8f-e5146e14500c",

View File

@@ -1,4 +1,6 @@
# Glossary
[Home](readme.md)
A list of terms and their definitions can be found below
* **ADS** - *Azure Data Studio* is a desktop tool for managing Azure Data resources in the cloud, on-premises, or hybrid environments.
@@ -31,4 +33,5 @@ A list of terms and their definitions can be found below
* **SQL Assessment API** - evaluates a SQL instance configuration for best practices
* **SQL Virtual Machine** - an IaaS Azure offer that provisions and manages virtual machine with SQL Server installed
* **SQL Managed Instance** - a PaaS Azure offer for SQL Server that runs on Azure infrastructure. Microsoft manages the complexities of the infrastructure for the user
* **SMO** - SQL Management Objects are "objects designed for programmatic management of Microsoft SQL Server" ([Microsoft](https://docs.microsoft.com/en-us/sql/relational-databases/server-management-objects-smo/overview-smo))
* **VPN** - a *virtual private network* is a collection of computing resources that organizes and extends a private network configuration over the public Internet, normally using some kind of encryption for security and privacy.

View File

@@ -1,9 +1,10 @@
# High Availability and Disaster Recovery
[Home](../readme.md)
**Coming soon**: Notebooks to help with HADR tasks in a Hybrid Cloud environment.
Notebooks to help with HADR tasks in a Hybrid Cloud environment.
## Notebooks in this Chapter
- [Backup Database to Blob Storage](backup-to-blob.ipynb)
- [Add Azure Passive Secondary Replica](add-passive-secondary.ipynb)
- [Add Azure Passive Secondary Replica](add-passive-secondary.ipynb)

View File

@@ -1,12 +1,13 @@
# Networking
[Home](../readme.md)
This chapter contains notebooks to configure and make a secure network connection in an Azure hybrid cloud environment.
<img width="50%" src="https://docs.microsoft.com/en-us/azure/vpn-gateway/media/point-to-site-about/p2s.png">
## Notebooks in this Chapter
- [Download VPN Client Certificate](download-VpnClient.ipynb) - Used to install certificates that encrypt communication between on-site and Azure services
- [Create Point-to-Site VPN](p2svnet-creation.ipynb) - Enables secure **Point-to-Site** (P2S) communication between a virtual private network in Azure and local resources. P2S is used by individuals and small groups for remote connectivity. A Point-to-Site (P2S) VPN gateway connection lets you create a secure connection to your virtual network from an individual client computer. A P2S connection is established by starting it from the client computer. This solution is useful for telecommuters who want to connect to Azure VNets from a remote location, such as from home or a conference. P2S VPN is also a useful solution to use instead of S2S VPN when you have only a few clients that need to connect to a virtual network.
- [Create Site-to-Site VPN](s2svnet-creation.ipynb) - **Site-to-site** (S2S) is normally used by organizations that want greater control between on-premises and cloud resources using a VPN gateway. An S2S VPN gateway connection is used to connect your on-premises network to an Azure virtual network over an IPsec/IKE (IKEv1 or IKEv2) VPN tunnel. This type of connection requires a VPN device located on-premises that has an externally facing public IP address assigned to it. For more information about VPN gateways, see [About VPN gateway](https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-about-vpngateways) and [Create and manage S2S VPN connections using PowerShell](https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-tutorial-vpnconnection-powershell "https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-tutorial-vpnconnection-powershell"). **NOTE:** *May require the help of a Network Administrator or similar role to set up a secure Gateway*.
This chapter contains notebooks to configure and make a secure network connection in an Azure hybrid cloud environment.
<img width="50%" src="https://docs.microsoft.com/en-us/azure/vpn-gateway/media/point-to-site-about/p2s.png">
- [Create Site-to-Site VPN](s2svnet-creation.ipynb) - **Site-to-site** (S2S) is normally used by organizations that want greater control between on-premises and cloud resources using a VPN gateway. An S2S VPN gateway connection is used to connect your on-premises network to an Azure virtual network over an IPsec/IKE (IKEv1 or IKEv2) VPN tunnel. This type of connection requires a VPN device located on-premises that has an externally facing public IP address assigned to it. For more information about VPN gateways, see [About VPN gateway](https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-about-vpngateways) and [Create and manage S2S VPN connections using PowerShell](https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-tutorial-vpnconnection-powershell "https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-tutorial-vpnconnection-powershell"). **NOTE:** *May require the help of a Network Administrator or similar role to set up a secure Gateway*.
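The P2S/S2S guidance above boils down to a small decision: a few individual clients connecting from arbitrary locations favor P2S, while bridging a whole on-premises network through a VPN device favors S2S. A minimal sketch of that decision (the function name and the client-count threshold are illustrative assumptions, not part of the toolkit):

```python
def choose_vpn_topology(client_count: int, has_onprem_vpn_device: bool) -> str:
    """Illustrative only: pick a VPN topology per the guidance above.

    P2S suits individuals or small groups connecting from remote
    locations; S2S suits organizations bridging an on-premises network
    and requires a VPN device with a public IP address.
    """
    if has_onprem_vpn_device and client_count > 5:  # threshold is an assumption
        return "S2S"
    return "P2S"

# A telecommuter connecting from home:
print(choose_vpn_topology(1, False))   # P2S
# An organization bridging its on-premises network:
print(choose_vpn_topology(50, True))   # S2S
```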

View File

@@ -13,3 +13,4 @@ This chapter contains a set of notebooks useful for doing offline migration of d
- [Migrate Database to Azure SQL MI](db-to-MI.ipynb)
- [Migrate Database to Azure SQL DB](db-to-SQLDB.ipynb)

View File

@@ -126,7 +126,13 @@
"SQL Assessment API is part of the SQL Server Management Objects (SMO) and can be used with the SQL Server PowerShell module. Because installing the modules may require a local Administrator account's permission, it cannot be done automatically with this Notebook. The **Assessments** Notebooks require the following:\n",
"\n",
"- [Install SMO](https://docs.microsoft.com/en-us/sql/relational-databases/server-management-objects-smo/installing-smo?view=sql-server-ver15)\n",
"- [Install SQL Server PowerShell module](https://docs.microsoft.com/en-us/sql/powershell/download-sql-server-ps-module?view=sql-server-ver15)"
"- [Install SQL Server PowerShell module](https://docs.microsoft.com/en-us/sql/powershell/download-sql-server-ps-module?view=sql-server-ver15)\n",
"\n",
"## Compatibility Assessment Tool - Data Migration Assistant\n",
"\n",
"The Compatibility Assessment Notebook requires the Data Migration Assistant tool to be installed in order to execute. The installation link would be [Data Migration Assistant download](https://www.microsoft.com/en-us/download/confirmation.aspx?id=53595)\n",
"\n",
"With version 2.1 and above, when installation of Data Migration Assistant is successful, it will install dmacmd.exe in _%ProgramFiles%\\\\Microsoft Data Migration Assistant_ folder."
],
"metadata": {
"azdata_cell_guid": "1b49a7e5-a773-4104-8f88-bd2ea3c806a3"

View File

@@ -1,8 +1,9 @@
# Azure SQL Provisioning
# Provisioning
[Home](../readme.md)
This chapter contains Notebooks that help provision new Azure SQL resources that can be used as migration targets for existing on-premises SQL instances and databases. Use them alongside the planning notebooks, which analyze existing resources to determine the best type of resource to create and how it should be configured. You can use the notebooks and configure the settings manually, or provide a provisioning plan created by the [Create Provisioning Plan](../provisioning/provisioning-plan.ipynb) notebook.
## Notebooks in this Chapter
- [Create Azure SQL Virtual Machine](create-sqlvm.ipynb) - SQL Server on Azure Virtual Machines enables you to use full versions of SQL Server in the cloud without having to manage any on-premises hardware. The virtual machine image gallery allows you to create a SQL Server VM with the right version, edition, and operating system
- [Create Azure SQL Managed Instance](create-sqlmi.ipynb) - Azure SQL Managed Instance is the intelligent, scalable, cloud database service that combines the broadest SQL Server engine compatibility with all the benefits of a fully managed and evergreen platform as a service. An instance is a copy of the sqlservr.exe executable that runs as an operating system service
- [Create Azure SQL Database](create-sqldb.ipynb) - Azure SQL Database is Microsoft's fully managed cloud relational database service in Microsoft Azure. It shares the same code base as traditional SQL Server, but with Microsoft's cloud-first strategy the newest features of SQL Server are actually released to Azure SQL Database first. Use this notebook when you need a systematic collection of data stored in tables
This chapter contains Notebooks that help provision new Azure SQL resources that can be used as migration targets for existing on-premises SQL instances and databases. Use them alongside the planning notebooks, which analyze existing resources to determine the best type of resource to create and how it should be configured. You can use the notebooks and configure the settings manually, or provide a provisioning plan created by the [Create Provisioning Plan](../provisioning/provisioning-plan.ipynb) notebook.
- [Create Azure SQL Database](create-sqldb.ipynb) - Azure SQL Database is Microsoft's fully managed cloud relational database service in Microsoft Azure. It shares the same code base as traditional SQL Server, but with Microsoft's cloud-first strategy the newest features of SQL Server are actually released to Azure SQL Database first. Use this notebook when you need a systematic collection of data stored in tables
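For readers who prefer driving provisioning from the Azure CLI rather than the notebooks, the equivalent call can be assembled programmatically. A minimal sketch: the resource names below are placeholders, and the command is only composed and printed, never executed:

```python
def build_sql_db_create(name: str, server: str, resource_group: str) -> list:
    # `az sql db create` provisions an Azure SQL Database on an existing
    # logical server; name, server, and resource group are all required.
    return [
        "az", "sql", "db", "create",
        "--name", name,
        "--server", server,
        "--resource-group", resource_group,
        "-o", "json",
    ]

cmd = build_sql_db_create("mydb", "my-sql-server", "my-rg")
print(" ".join(cmd))
# Hand `cmd` to subprocess.run(...) once the placeholders point at real resources.
```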

View File

@@ -1,28 +1,35 @@
# Azure SQL Hybrid Cloud Toolkit
# Welcome to the Azure SQL Hybrid Cloud Toolkit!
## Chapters
* [Prerequisites and Initial Setup](prereqs.ipynb) - Notebook installation of required modules.
The **Azure SQL Hybrid Cloud Toolkit** is a [Jupyter Book](https://jupyterbook.org/intro.html) extension of [Azure Data Studio](https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio) (ADS) designed to help [Azure SQL Database](https://azure.microsoft.com/en-us/services/sql-database/) and ADS users deploy, migrate, and configure for a hybrid cloud environment. The toolkit was designed for, and is intended to be executed within, ADS to ensure the best possible experience.
* [Assessments](Assessments/readme.md) - Notebooks that contain examples to determine whether a given database or SQL Server instance is ready to migrate by utilizing SQL Assessments. SQL instances are scanned based on a "best practices" set of rules.
* [Networking](networking/readme.md) - Setup secure Point-to-Site (P2S) or Site-to-Site (S2S) network connectivity to Microsoft Azure using a Virtual Private Network (VPN). This notebook serves as a building block for other notebooks as communicating securely between on-premise and Azure is essential for many tasks.
* [Provisioning](provisioning/readme.md) - Creating and communicating with SQL Resources in Microsoft Azure. Includes common tasks such as creating SQL Virtual Machines or SQL Managed Instances in the cloud.
* [Data Portability](data-portability/readme.md) - Install a custom Azure function to facilitate importing and exporting cloud resources. The solution uses parallel tasks in Azure Batch to perform data storage work. Azure Batch is a process that runs large-scale parallel and high-performance computing jobs efficiently in Azure.
* [High Availability and Disaster Recovery](hadr/readme.md) - Notebooks to leverage Azure SQL for business continuity in a hybrid cloud environment.
* [Offline Migration](offline-migration/readme.md) - Notebooks to perform various migrations.
* [Glossary](glossary.md) - set of defined terms.
* [Appendices](appendices.md) - misc info.
## About
The **Azure SQL Hybrid Cloud Toolkit** is a [Jupyter Book](https://jupyterbook.org/intro.html) extension of [Azure Data Studio](https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio) (ADS) designed to help [Azure SQL Database](https://azure.microsoft.com/en-us/services/sql-database/) and ADS users deploy, migrate, and configure for a hybrid cloud environment. The toolkit was designed for, and is intended to be executed within, ADS to ensure the best possible user experience for those without deep knowledge of Azure services, while adhering closely to the _best practices_ standards expected by experienced cloud users.
## Goals and Methodology
The toolkit better positions a customer with regard to planning, migrating, and thriving in a hybrid cloud environment by:
* Providing SQL Azure users with reliable free software and content that is well-written and executable
* Providing SQL Azure users with reliable free software and content that is well-written and executable
* Greatly simplifying the integration of Azure Data services into an existing environment
* Positioning Azure to be the natural cloud services choice with a low-friction experience
* Notebooks are executable by a normal user (unless otherwise specified) on minimal hardware
* Most notebooks require some configuration. If so, the proper configurations should be clearly located towards the top of the notebook or cell, whichever is most appropriate
* Modify the cells to meet the desired requirements
* By design, Notebooks are written to be executed from top-to-bottom. Therefore, each notebook has a specific task to perform and should focus only on that task. It may contain several cells to execute but it will adhere to the one-task per notebook paradigm
**NOTE:** Executing notebooks could potentially create new Azure Resources which may incur charges to the Azure Subscription. Make sure the repercussions of executing any cells are understood.
## Prerequisites and Initial Setup
The notebooks may leverage various modules from Python or Microsoft PowerShell and the OSS community. To execute the notebooks in this toolkit, start with the [Prerequisites and Initial Setup Notebook](Prerequisites/prereqs.ipynb) where all prerequisite modules will be checked and installed if not found in the execution environment.
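The check-then-install pattern the prerequisites Notebook follows can be sketched in a few lines. This is an illustrative Python analog, not the Notebook's actual code: the module list is a stand-in, and the pip invocation is left as a comment so the sketch stays side-effect free:

```python
import importlib.util

def missing_modules(required):
    """Return the subset of `required` that is not importable."""
    return [m for m in required if importlib.util.find_spec(m) is None]

# Stand-in list; the real Notebook checks the modules it actually needs.
needed = missing_modules(["json", "ssl", "definitely_not_installed_xyz"])
print(needed)  # ['definitely_not_installed_xyz']
# For each missing module one would then run, e.g.:
#   subprocess.run([sys.executable, "-m", "pip", "install", mod], check=True)
```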
## Chapters
The toolkit has chapters on network configuration, on-premise SQL Server assessment, resource provisioning, and Azure migration. See below:
* [Networking](networking/readme.md) - Setup secure Point-to-Site (P2S) or Site-to-Site (S2S) network connectivity to Microsoft Azure using a Virtual Private Network (VPN). This notebook serves as a building block for other notebooks as communicating securely between on-premise and Azure is essential for many tasks
* [Assessments](Assessments/readme.md) - Notebooks that contain examples to determine whether a given database or SQL Server instance is ready to migrate by utilizing SQL Assessments. SQL instances are scanned based on a "best practices" set of rules.
* [Provisioning](provisioning/readme.md) - Creating and communicating with SQL Resources in Microsoft Azure. Includes common tasks such as creating SQL Virtual Machines or SQL Managed Instances in the cloud
* [Data Portability](data-portability/readme.md) - Install a custom Azure function to facilitate importing and exporting cloud resources. The solution uses parallel tasks in Azure Batch to perform data storage work. Azure Batch is a process that runs large-scale parallel and high-performance computing jobs efficiently in Azure.
* [High Availability and Disaster Recovery](hadr/readme.md) - Notebooks to leverage Azure SQL for business continuity in a hybrid cloud environment
* [Offline Migration](offline-migration/readme.md) - Notebooks to perform various migrations
**NOTE:** Executing notebooks could potentially create new Azure Resources which may incur charges to the Azure Subscription. Make sure the repercussions of executing any cells are understood.

View File

@@ -4,14 +4,26 @@
"description": "%description%",
"version": "0.1.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",
"icon": "images/extension.png",
"aiKey": "AIF-37eefaf0-8022-4671-a3fb-64752724682e",
"engines": {
"vscode": "*",
"azdata": "*"
},
"main": "./out/main",
"activationEvents": [
"*"
],
"repository": {
"type": "git",
"url": "https://github.com/Microsoft/azuredatastudio.git"
},
"main": "./out/main",
"extensionDependencies": [
"Microsoft.mssql",
"Microsoft.notebook"
],
"contributes": {
"commands": [
{

View File

@@ -1,5 +1,5 @@
{
"displayName": "Azure SQL Hybrid Cloud Toolkit Jupyter Book Extension",
"displayName": "Azure SQL Hybrid Cloud Toolkit",
"description": "Opens up Azure SQL Hybrid Cloud Toolkit Jupyter Book",
"title.openJupyterBook": "Open Azure SQL Hybrid Cloud Toolkit Jupyter Book",
"title.cloudHybridBooks": "Azure SQL Hybrid Cloud Toolkit",

View File

@@ -1,8 +1,8 @@
{
"extends": "../shared.tsconfig.json",
"compileOnSave": true,
"compilerOptions": {
"module": "commonjs",
"target": "es6",
"outDir": "./out",
"lib": [
"es6", "es2015.promise"

View File

@@ -46,6 +46,7 @@
"Microsoft.arc",
"Microsoft.azdata",
"Microsoft.azuredatastudio-postgresql",
"Microsoft.azurehybridtoolkit",
"Microsoft.cms",
"Microsoft.dacpac",
"Microsoft.import",