azuredatastudio/extensions/resource-deployment/notebooks/bdc/2019/deploy-bdc-existing-aks.ipynb
Arvind Ranasaria 738ca479e4 fixes for #8165, #8167, & #8260 (#8250)
* saving untested work

* fixes for #8165 and #8167

* minor fixes

* fix for #8260

* minor quoting fixes

* fix for #8264

* minor fixes

* minor fixes.

* move tools constants to their own files

* remove execution cell results from notebooks.

* remove extraneous changes

* move ensuring of  StoragePath to platformservice

* remove fix for #8264 pending pm input
2019-11-08 09:11:21 -08:00

{
"metadata": {
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python",
"version": "3.6.6",
"mimetype": "text/x-python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"pygments_lexer": "ipython3",
"nbconvert_exporter": "python",
"file_extension": ".py"
}
},
"nbformat_minor": 2,
"nbformat": 4,
"cells": [
{
"cell_type": "markdown",
"source": [
"![Microsoft](https://raw.githubusercontent.com/microsoft/azuredatastudio/master/src/sql/media/microsoft-small-logo.png)\n",
" \n",
"## Deploy SQL Server 2019 Big Data Cluster on an existing Azure Kubernetes Service (AKS) cluster\n",
" \n",
"This notebook walks through the process of deploying a <a href=\"https://docs.microsoft.com/sql/big-data-cluster/big-data-cluster-overview?view=sqlallproducts-allversions\">SQL Server 2019 Big Data Cluster</a> on an existing AKS cluster.\n",
" \n",
"* Follow the instructions in the **Prerequisites** cell to install the tools if not already installed.\n",
"* The **Required information** will check and prompt you for password if it is not set in the environment variable. The password can be used to access the cluster controller, SQL Server, and Knox.\n",
"\n",
"<span style=\"color:red\"><font size=\"3\">Please press the \"Run Cells\" button to run the notebook</font></span>"
],
"metadata": {
"azdata_cell_guid": "82e60c1a-7acf-47ee-877f-9e85e92e11da"
}
},
{
"cell_type": "markdown",
"source": [
"### **Prerequisites** \n",
"Ensure the following tools are installed and added to PATH before proceeding.\n",
" \n",
"|Tools|Description|Installation|\n",
"|---|---|---|\n",
"|kubectl | Command-line tool for monitoring the underlying Kuberentes cluster | [Installation](https://kubernetes.io/docs/tasks/tools/install-kubectl/#install-kubectl-binary-using-native-package-management) |\n",
"|azdata | Command-line tool for installing and managing a Big Data Cluster |[Installation](https://docs.microsoft.com/en-us/sql/big-data-cluster/deploy-install-azdata?view=sqlallproducts-allversions) |"
],
"metadata": {
"azdata_cell_guid": "714582b9-10ee-409e-ab12-15a4825c9471"
}
},
{
"cell_type": "markdown",
"source": [
"### **Setup**"
],
"metadata": {
"azdata_cell_guid": "e3dd8e75-e15f-44b4-81fc-1f54d6f0b1e2"
}
},
{
"cell_type": "code",
"source": [
"import pandas,sys,os,json,html,getpass,time\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"def run_command(command):\n",
" print(\"Executing: \" + command)\n",
" !{command}\n",
" if _exit_code != 0:\n",
" sys.exit(f'Command execution failed with exit code: {str(_exit_code)}.\\n\\t{command}\\n')\n",
" print(f'Successfully executed: {command}')"
],
"metadata": {
"azdata_cell_guid": "d973d5b4-7f0a-4a9d-b204-a16480f3940d",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 1
},
{
"cell_type": "markdown",
"source": [
"### **Set variables**\n",
"Generated by Azure Data Studio using the values collected in the Deploy Big Data Cluster wizard"
],
"metadata": {
"azdata_cell_guid": "4b266b2d-bd1b-4565-92c9-3fc146cdce6d"
}
},
{
"cell_type": "markdown",
"source": [
"### **Check dependencies**"
],
"metadata": {
"azdata_cell_guid": "2544648b-59c9-4ce5-a3b6-87086e214d4c"
}
},
{
"cell_type": "code",
"source": [
"run_command('kubectl version --client=true')\n",
"run_command('azdata --version')"
],
"metadata": {
"azdata_cell_guid": "691671d7-3f05-406c-a183-4cff7d17f83d",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 0
},
{
"cell_type": "markdown",
"source": [
"### **Required information**"
],
"metadata": {
"azdata_cell_guid": "0bb02e76-fee8-4dbc-a75b-d5b9d1b187d0"
}
},
{
"cell_type": "code",
"source": [
"invoked_by_wizard = \"AZDATA_NB_VAR_BDC_ADMIN_PASSWORD\" in os.environ\n",
"if invoked_by_wizard:\n",
" mssql_password = os.environ[\"AZDATA_NB_VAR_BDC_ADMIN_PASSWORD\"]\n",
"else:\n",
" mssql_password = getpass.getpass(prompt = 'SQL Server 2019 Big Data Cluster controller password')\n",
" if mssql_password == \"\":\n",
" sys.exit(f'Password is required.')\n",
" confirm_password = getpass.getpass(prompt = 'Confirm password')\n",
" if mssql_password != confirm_password:\n",
" sys.exit(f'Passwords do not match.')\n",
"print('You can also use the controller password to access Knox and SQL Server.')"
],
"metadata": {
"azdata_cell_guid": "e7e10828-6cae-45af-8c2f-1484b6d4f9ac",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 3
},
{
"cell_type": "markdown",
"source": [
"### **Set and show current context**"
],
"metadata": {
"azdata_cell_guid": "127c8042-181f-4862-a390-96e59c181d09"
}
},
{
"cell_type": "code",
"source": [
"run_command(f'kubectl config use-context {mssql_cluster_context}')\n",
"run_command('kubectl config current-context')"
],
"metadata": {
"azdata_cell_guid": "7d1a03d4-1df8-48eb-bff0-0042603b95b1",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 0
},
{
"cell_type": "markdown",
"source": [
"### **Create deployment configuration files**"
],
"metadata": {
"azdata_cell_guid": "138536c3-1db6-428f-9e5c-8269a02fb52e"
}
},
{
"cell_type": "code",
"source": [
"mssql_target_profile = 'ads-bdc-custom-profile'\n",
"if not os.path.exists(mssql_target_profile):\n",
" os.mkdir(mssql_target_profile)\n",
"bdcJsonObj = json.loads(bdc_json)\n",
"controlJsonObj = json.loads(control_json)\n",
"bdcJsonFile = open(f'{mssql_target_profile}/bdc.json', 'w')\n",
"bdcJsonFile.write(json.dumps(bdcJsonObj, indent = 4))\n",
"bdcJsonFile.close()\n",
"controlJsonFile = open(f'{mssql_target_profile}/control.json', 'w')\n",
"controlJsonFile.write(json.dumps(controlJsonObj, indent = 4))\n",
"controlJsonFile.close()\n",
"print(f'Created deployment configuration folder: {mssql_target_profile}')"
],
"metadata": {
"azdata_cell_guid": "2ff82c8a-4bce-449c-9d91-3ac7dd272021",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 6
},
{
"cell_type": "markdown",
"source": [
"### **Create SQL Server 2019 Big Data Cluster**"
],
"metadata": {
"azdata_cell_guid": "efe78cd3-ed73-4c9b-b586-fdd6c07dd37f"
}
},
{
"cell_type": "code",
"source": [
"print (f'Creating SQL Server 2019 Big Data Cluster: {mssql_cluster_name} using configuration {mssql_target_profile}')\n",
"os.environ[\"ACCEPT_EULA\"] = 'yes'\n",
"os.environ[\"AZDATA_USERNAME\"] = mssql_username\n",
"os.environ[\"AZDATA_PASSWORD\"] = mssql_password\n",
"if os.name == 'nt':\n",
" print(f'If you don\\'t see output produced by azdata, you can run the following command in a terminal window to check the deployment status:\\n\\t{os.environ[\"AZDATA_NB_VAR_KUBECTL\"]} get pods -n {mssql_cluster_name} ')\n",
"run_command(f'azdata bdc create -c {mssql_target_profile}')"
],
"metadata": {
"azdata_cell_guid": "373947a1-90b9-49ee-86f4-17a4c7d4ca76",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 7
},
{
"cell_type": "markdown",
"source": [
"### **Login to SQL Server 2019 Big Data Cluster**"
],
"metadata": {
"azdata_cell_guid": "4e026d39-12d4-4c80-8e30-de2b782f2110"
}
},
{
"cell_type": "code",
"source": [
"run_command(f'azdata login -n {mssql_cluster_name}')"
],
"metadata": {
"azdata_cell_guid": "79adda27-371d-4dcb-b867-db025f8162a5",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 8
},
{
"cell_type": "markdown",
"source": [
"### **Show SQL Server 2019 Big Data Cluster endpoints**"
],
"metadata": {
"azdata_cell_guid": "c1921288-ad11-40d8-9aea-127a722b3df8"
}
},
{
"cell_type": "code",
"source": [
"from IPython.display import *\n",
"pandas.set_option('display.max_colwidth', -1)\n",
"cmd = f'azdata bdc endpoint list'\n",
"cmdOutput = !{cmd}\n",
"endpoints = json.loads(''.join(cmdOutput))\n",
"endpointsDataFrame = pandas.DataFrame(endpoints)\n",
"endpointsDataFrame.columns = [' '.join(word[0].upper() + word[1:] for word in columnName.split()) for columnName in endpoints[0].keys()]\n",
"display(HTML(endpointsDataFrame.to_html(index=False, render_links=True)))"
],
"metadata": {
"azdata_cell_guid": "a2202494-fd6c-4534-987d-15c403a5096f",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 9
},
{
"cell_type": "markdown",
"source": [
"### **Connect to SQL Server Master instance in Azure Data Studio**\n",
"Click the link below to connect to the SQL Server Master instance of the SQL Server 2019 Big Data Cluster."
],
"metadata": {
"azdata_cell_guid": "621863a2-aa61-46f4-a9d0-717f41c009ee"
}
},
{
"cell_type": "code",
"source": [
"sqlEndpoints = [x for x in endpoints if x['name'] == 'sql-server-master']\n",
"if sqlEndpoints and len(sqlEndpoints) == 1:\n",
" connectionParameter = '{\"serverName\":\"' + sqlEndpoints[0]['endpoint'] + '\",\"providerName\":\"MSSQL\",\"authenticationType\":\"SqlLogin\",\"userName\":' + json.dumps(mssql_username) + ',\"password\":' + json.dumps(mssql_password) + '}'\n",
" display(HTML('<br/><a href=\"command:azdata.connect?' + html.escape(connectionParameter)+'\"><font size=\"3\">Click here to connect to SQL Server Master instance</font></a><br/>'))\n",
"else:\n",
" sys.exit('Could not find the SQL Server Master instance endpoint.')"
],
"metadata": {
"azdata_cell_guid": "48342355-9d2b-4fa6-b1aa-3bc77d434dfa",
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": 10
}
]
}