SQL Server Deployment extension for Azure Data Studio

Provides a notebook-based experience to deploy Microsoft SQL Server.

Command: Deploy SQL Server… (category: Deployment)

Deployment options:

SQL Server container image
Run SQL Server container image with docker.
Version:
  SQL Server 2017: ./notebooks/docker/2017/deploy-sql2017-image.ipynb
  SQL Server 2019 RC: ./notebooks/docker/2019/deploy-sql2019-image.ipynb

SQL Server Big Data Cluster
SQL Server Big Data Cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes.
Version: SQL Server 2019 RC
Deployment target:
  New Azure Kubernetes Service Cluster: ./notebooks/bdc/2019/deploy-bdc-aks.ipynb, ./notebooks/bdc/2019/azdata/deploy-bdc-aks.ipynb
  Existing Azure Kubernetes Service Cluster: ./notebooks/bdc/2019/deploy-bdc-existing-aks.ipynb, ./notebooks/bdc/2019/azdata/deploy-bdc-existing-aks.ipynb
  Existing Kubernetes Cluster (kubeadm): ./notebooks/bdc/2019/deploy-bdc-existing-kubeadm.ipynb, ./notebooks/bdc/2019/azdata/deploy-bdc-existing-kubeadm.ipynb

SQL Server on Windows
Run SQL Server on Windows; select a version to get started.

Deployment dialogs and fields:

Deploy SQL Server 2017 container images with docker / Deploy SQL Server 2019 container images with docker:
  Container name, SQL Server password, Confirm password, Port

Deployment target: new AKS cluster / Deployment target: existing AKS cluster:
  SQL Server Big Data Cluster settings: Cluster name, Controller username, Password, Confirm password
  Azure settings: Subscription id, Use my default Azure subscription, Resource group name, Region, AKS cluster name, VM size, VM count

Deployment target: existing Kubernetes cluster (kubeadm):
  Storage class name, Capacity for data (GB), Capacity for logs (GB)

License acceptance: I accept {0}, {1} and {2}. (The placeholders refer to the Microsoft Privacy Statement, the azdata License Terms, and the SQL Server License Terms.)

Validation and status messages:
  Unknown field type: "{0}"
  {0} doesn't meet the password complexity requirement. For more information: https://docs.microsoft.com/sql/relational-databases/security/password-policy
  {0} doesn't match the confirmation password
  Please fill out the required fields marked with red asterisks.
  Invalid output received.
  Error retrieving version information.{0}Error: {1}{0}stdout: {2}
  Open Notebook
  Failed to load extension: {0}. Error detected in the resource type definition in package.json; check the debug console for details.
  The resource type: {0} is not defined
  The notebook {0} does not exist
  Download and launch installer, URL: {0}
  Downloading from: {0}
  Successfully downloaded: {0}
  Launching: {0}
  Successfully launched: {0}
  Download failed, status code: {0}, message: {1}

Deployment actions and results:
  Save config files
  Script to Notebook
  Deploy
  Deploy SQL Server Big Data Cluster "{0}"
  Connect to Master SQL Server
  Successfully deployed SQL Server Big Data Cluster: {0}
  Failed to retrieve the endpoint list. {0}{1}
  Master SQL Server endpoint is not found.
  View error detail
  Failed to deploy SQL Server Big Data Cluster "{0}".
  An error occurred launching the output notebook. {1}{2}.
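For the SQL Server container image option listed above, the dialog collects only a container name, an SA password, and a port. The following is a minimal sketch of roughly what such a deployment boils down to, using hypothetical values and the public mcr.microsoft.com/mssql/server image; it is an illustration, not the extension's actual notebook code.

```python
# Rough sketch of a docker-based SQL Server deployment (illustration only).
# The variables below stand in for the dialog fields; values are hypothetical.
import subprocess

container_name = "sql2019"           # "Container name"
sa_password = "Str0ng!Passw0rd"      # "SQL Server password" (must meet the complexity rules)
host_port = 1433                     # "Port"

subprocess.run(
    [
        "docker", "run",
        "-e", "ACCEPT_EULA=Y",                 # accept the SQL Server license terms
        "-e", f"SA_PASSWORD={sa_password}",
        "-p", f"{host_port}:1433",             # map the chosen host port to SQL Server's default port
        "--name", container_name,
        "-d", "mcr.microsoft.com/mssql/server:2019-latest",
    ],
    check=True,
)
```

Once the container is running, the chosen port is what you would use to connect from Azure Data Studio (for example, localhost,1433 with the sa login).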
Failed to deploy SQL Server Big Data Cluster and no output notebook was generated.
Save config files
Config files saved to {0}

Deployment wizards:
  Deploy SQL Server 2019 Big Data Cluster on a new AKS cluster
  Deploy SQL Server 2019 Big Data Cluster on an existing AKS cluster
  Deploy SQL Server 2019 Big Data Cluster on an existing kubeadm cluster

Summary page:
  A browser window for logging in to Azure will be opened during the SQL Server Big Data Cluster deployment.
  Labels: Deployment target; Kube config; Cluster context; Cluster settings; Deployment profile; Cluster name; Controller username; Authentication mode (Active Directory, Basic); Organizational unit; Domain controller FQDNs; Domain DNS IP addresses; Domain DNS name; Cluster admin group; Cluster users; App owners; App readers; Service account username
  Azure settings: Subscription id, Default Azure Subscription, Resource group, Location, AKS cluster name, VM size, VM count
  Scale settings: SQL Server master instances, Compute pool instances, Data pool instances, Spark pool instances, Storage pool (HDFS) instances (Spark included)
  Storage: Storage class for data, Claim size for data (GB), Storage class for logs, Claim size for logs (GB), Controller, Storage pool (HDFS), Data, SQL Server Master
  Storage settings: SQL Server Master, Gateway, Application proxy, Management proxy, Readable secondary
  Endpoint settings
  Error occurred while closing the wizard: {0}; open 'Debugger Console' for more information.

Azure settings page:
  Configure the settings to create an Azure Kubernetes Service cluster.
  Subscription id / Use my default Azure subscription: The default subscription will be used if you leave this field blank.{0}View available Azure subscriptions
  New resource group name
  Location{0}View available Azure locations
  AKS cluster name
  VM count
  VM size{0}View available VM sizes

Cluster settings page:
  Configure the SQL Server Big Data Cluster settings.
  Cluster name
  Admin username: This username will be used for the controller and SQL Server. The username for the gateway will be root.
  Password: This password can be used to access the controller, SQL Server and gateway.
  Confirm password
  Authentication mode: Basic, Active Directory
  Docker settings: Registry, Repository, Image tag, Username, Password

Active Directory settings page (multi-value fields show the hint: Use comma to separate the values.):
  Organizational unit: Distinguished name for the organizational unit. For example: OU=bdc,DC=contoso,DC=com.
  Domain controller FQDNs: Fully qualified domain names for the domain controller. For example: DC1.CONTOSO.COM. Use comma to separate multiple FQDNs.
  Domain DNS IP addresses: Domain DNS servers' IP addresses. Use comma to separate multiple IP addresses.
  Domain DNS name
  Cluster admin group: The Active Directory group for cluster admin.
  Cluster users: The Active Directory users/groups with the cluster users role. Use comma to separate multiple users/groups.
  Service account username: Domain service account for the Big Data Cluster.
  Service account password
  App owners: The Active Directory users or groups with the app owners role. Use comma to separate multiple users/groups.
  App readers: The Active Directory users or groups with the app readers role. Use comma to separate multiple users/groups.
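The Azure settings page above corresponds to creating the AKS cluster itself. Below is a hedged sketch of roughly equivalent Azure CLI calls, with hypothetical resource names, location, and node size/count; the extension's generated notebooks may differ in detail.

```python
# Sketch only: create a resource group and an AKS cluster from the values the
# "Azure settings" page collects. All names and sizes below are hypothetical.
import subprocess

resource_group = "bdc-rg"        # "New resource group name"
location = "eastus"              # "Location"
aks_name = "bdc-aks"             # "AKS cluster name"
vm_count = 5                     # "VM count"
vm_size = "Standard_E4s_v3"      # "VM size"

subprocess.run(["az", "group", "create",
                "--name", resource_group,
                "--location", location], check=True)

subprocess.run(["az", "aks", "create",
                "--resource-group", resource_group,
                "--name", aks_name,
                "--node-count", str(vm_count),
                "--node-vm-size", vm_size,
                "--generate-ssh-keys"], check=True)
```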
Password
There are some errors on this page; click 'Show Details' to view the errors.

Service settings page:
  Scale settings: SQL Server master instances, Compute pool instances, Data pool instances, Spark pool instances, Storage pool (HDFS) instances, Include Spark in storage pool
  Storage settings: Use controller settings, Storage class for data, Claim size for data (GB), Storage class for logs, Claim size for logs (GB)
  Controller: By default, the controller storage settings are applied to the other services as well; expand the advanced storage settings to configure storage for other services.
  Advanced storage settings: Storage pool (HDFS), Data pool, SQL Server Master

Endpoint settings page:
  DNS name, Port
  Controller: Controller DNS name, Controller port
  SQL Server Master: SQL Server Master DNS name, SQL Server Master port
  Gateway: Gateway DNS name, Gateway port
  Management proxy: Management proxy DNS name, Management proxy port
  Application proxy: Application proxy DNS name, Application proxy port
  Readable secondary: Readable secondary DNS name, Readable secondary port
  Invalid Spark configuration: you must either check the 'Include Spark' checkbox or set 'Spark pool instances' to at least 1.

Target cluster context page:
  Select the kube config file and then select a cluster context from the list.
  Please select a cluster context.
  Kube config file path, Browse, Cluster Contexts, Select
  No cluster information was found in the config file, or an error occurred while loading the config file.
  Failed to load the config file

Deployment configuration template page:
  Select the target configuration template.
  Note: The settings of the deployment profile can be customized in later steps.
  Failed to load the deployment profiles: {0}
  Profile comparison columns: Service, Instances (SQL Server Master, Compute, Data, HDFS + Spark), Storage size in GB per instance (Data storage, Log storage), Features (Basic authentication, Active Directory authentication, High Availability)
  Please select a deployment profile.

Required tools page:
  Select the deployment options; Select
  You must agree to the license agreements in order to proceed.
  docker: Provides the ability to package and run an application in isolated containers.
  Azure CLI: A command-line tool for managing Azure resources.
  azdata: A command-line utility written in Python that enables cluster administrators to bootstrap and manage the Big Data Cluster via REST APIs.
  kubectl: A command-line tool that allows you to run commands against Kubernetes clusters.
  Table columns: Tool, Description, Installed (Yes, No), Version; Options; Required tools; No tools required
  {0}: {1}
  Additional status information for tool: {0}. {1}
  If the tools are installed after Azure Data Studio is launched, you will need to restart Azure Data Studio to pick up the updated PATH environment variable. You may find additional details in the debug console.
  Some required tools are not installed or do not meet the minimum version requirement.
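For the existing-cluster targets, the Target cluster context page above amounts to picking a kube config file and one of its contexts. A minimal sketch of the equivalent kubectl calls, with a hypothetical config path and context name:

```python
# Sketch: list the cluster contexts in a kube config file and make one current,
# mirroring what the "Target cluster context" page does through the UI.
import subprocess

kubeconfig = "/home/me/.kube/config"   # "Kube config file path" (hypothetical)

# Show the available cluster contexts defined in the config file.
subprocess.run(["kubectl", "config", "get-contexts", "--kubeconfig", kubeconfig],
               check=True)

# Select one of them as the current context for the deployment.
subprocess.run(["kubectl", "config", "use-context", "my-kubeadm-cluster",
                "--kubeconfig", kubeconfig], check=True)
```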
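The deployment profile, cluster settings, and license acceptance described above feed into an azdata-based deployment in the generated notebooks. The following is a hedged sketch of that flow, assuming azdata is installed and the target kube context is already selected; the profile name, credentials, and cluster details are hypothetical, and the exact commands may differ from what the notebooks actually run.

```python
# Sketch of an azdata-based Big Data Cluster deployment (illustration only).
# azdata reads the cluster credentials from environment variables.
import os
import subprocess

env = dict(os.environ)
env["AZDATA_USERNAME"] = "admin"             # "Admin username" (hypothetical)
env["AZDATA_PASSWORD"] = "Str0ng!Passw0rd"   # "Password" (hypothetical)

# Copy a built-in deployment profile so its scale/storage/endpoint settings can be
# customized before deployment ("Deployment configuration template").
subprocess.run(["azdata", "bdc", "config", "init",
                "--source", "aks-dev-test",
                "--target", "custom-bdc"], check=True)

# Deploy the cluster with the customized profile; --accept-eula corresponds to
# "You must agree to the license agreements in order to proceed."
subprocess.run(["azdata", "bdc", "create",
                "--config-profile", "custom-bdc",
                "--accept-eula", "yes"], env=env, check=True)

# After deployment, list the endpoints to find the master SQL Server instance
# ("Connect to Master SQL Server"). If the session from the create step has
# expired, run `azdata login` against the controller endpoint first.
subprocess.run(["azdata", "bdc", "endpoint", "list", "-o", "table"],
               env=env, check=True)
```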