Compare commits

..

75 Commits

Author SHA1 Message Date
Charles Gagnon
d296b6397e Fix HDFS node for Integrated auth (#12906) (#12907)
* Fix some HDFS issues

* Undo other changes
2020-10-13 15:10:11 -07:00
Vasu Bhog
05615c796d Fix connection when changing kernel from Kusto to SQL (#12881) (#12887)
* Fix Kusto to SQL kernel connection change

* Updated Fix - removes kernel alias mapping while ensuring multiple Kusto notebooks work properly

* Fix tests
2020-10-12 10:52:24 -07:00
Aasim Khan
e60b01ac00 Adding sql vm and sql db notebooks to october (#12880)
* SQL VM deployments (#12144)

* Added sql vm deployment option

* Added more fields for sql vm deployments

* created basic sqlvm deployment. Mostly hardcoded

* added string to package.nls

* added poc deployments for sql vm

* Made some changes in the notebook that were mentioned in the PR

* Added scaffolding for azure sql vm wizard.

* code cleanups

* added some async logic

* added loading component

* fixed loader code

* completed page2 of wizard

* added some more required fields.

* added some more fields

* added network settings page

* added sql server settings page

* added azure signin support and sql server settings page

* added some helper methods in wizard code

* added some fixes

* fixed azure and vm setting page
added validation in azure setting page

* added changes for the notebook variable

* validations and other bug fixes

* commenting sql storage optimization dropdown

* cleanedup wizard base page

* reversing vm image list to display newer images first

* cleaning model code

* added validations for network setting

* Completed summary page
fixed the code position
some additional field validations

* fixed networking page

* - fixed an error with vm size model variable
- removed byol images because it was not working with az sql vm
- Fixed vm size display names in dropdown

* added double quotes to some localized strings

* added some space inside strings

* -Added live validations
-Restyled network component
-Added required to regions
-Some bug fixes

* -redesigned summary page
-localized some strings

* Fixed summary page section titles

* -Fixed validations on sql server settings page
-Fixed some fields on Summary Page

* corrected onleave validation
using array for error messages
using Promise.all

* Fixed bug on network settings dropdowns when user does not have existing resource to populate them

* Change resource deployment display name
Added Ninar's iteration of the notebook
Changed RDP check box label
Surfacing API errors to user
Filtering regions based on Azure VM regions and user's subscription region
Made form validation async
Displaying new checkbox on network page when dropdowns empty
Fixed a small bug in SQL auth form validation
Made summary single item per row and fixed the gaps in spacing
Fixed validations in vm page
Checking if vm name already exists on azure

* Fixed sql vm eula
Fixed sql vm description
Added hyperlink for more info on vm sizes

* Replaced loading component with dropdown loaders.

* localized string
Fixed a bug in network settings page

* Added additional filtering

* added reverse to image list

* Fixing some merge related issues

* Fixed conflicts

* sql db deployments into main (WIP) (#12767)

* added my resource-deployment

* changed notebook message

* Add more advanced properties for spark job submission dialog (#12732)

* Add more advanced properties for spark job submission dialog

* Add queue

* Revert "Add more advanced properties for spark job submission dialog (#12732)"

This reverts commit e6a7e86ddbe70b39660098a8ebd9ded2a1c5530c.

* Changes made for simplification

* changed error messages

* tags added

* tags removed due to redundancy

* Update package.json

* Update resourceTypePickerDialog.ts

* changes based on feedback

* activaterealtimevalidation removed

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>

* adding tags to sql vm

* added register navigation for Azure settings page

* simplified check

Co-authored-by: Alex Ma <alma1@microsoft.com>
Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2020-10-11 13:06:41 -07:00
Charles Gagnon
b68cdbeebe Update HDFS mount path (#12865) (#12866) 2020-10-09 15:49:21 -07:00
Charles Gagnon
7429407029 [Port] Sync up arc and azdata extensions with main (#12810)
* Sync up arc and azdata extensions with main

* capture 'this' to use retrieveVariable as callback (#12828)

* capture 'this' to use retrieveVariable as callback

* remove change not needed for #12082

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-08 16:03:27 -07:00
Barbara Valdez
6adeffbc8e replace pip in notebook (#12808) (#12827) 2020-10-08 15:23:30 -07:00
Chris LaFreniere
8a078d2d68 default to relative links in images and links (#12802) (#12813) 2020-10-08 12:30:11 -07:00
Charles Gagnon
eadac3af3a Fix arc strings (#12803) 2020-10-07 20:37:59 -07:00
Charles Gagnon
8e8d9b5f59 port c679d5e1f0 (#12780)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-07 20:36:51 -07:00
Aasim Khan
93e806cca1 Aasim/release1.23/resource filter (#12796)
* Added categories and search based filtering to the resource dialog. (#12658)

* added filtering to the resource type along with a new component.

* -Added caching of cards
-Removed unused component props
-localized tags
-limited the scope of list items

* Made some changes in the PR

* - Added Iot Category to SQL edge
- Moved category names to constants
- Moved localization strings to localized constants
- Made filtering logic more concise
- Changed how category list is generated
--Category list can now be ordered
-Added back event generation for selectedCard

* Fixed bugs, and some additional changes
-Fixed radiogroup height to avoid the movement of options below it
-Restoring the focus back to the search and listview components
- Added focus behaviour for listview
- Fixed a typo in comment

* Made categories an Enum

* Added localized string

* localized category string
converted categories to enum.

* made the filtering logic more concise.

* returning string if no localized string formed
removed unnecessary returns

* fixed the filtering tag logic
resetting search when category is changed

* removing the iot tag from sql edge deployment

* made filtering logic more concise
made enum const

* added vscode list

* some cleanup

* Some PR changes
- Made PR camelcase
- added comments to SQL
- removed unnecessary export

* -Some PR related changes
-Removing unsupported style property
-scoping down css and removing unused ones.

* Fixed a comment text

* Fixed typings for listview event

* Adding tags to azure sql deployment
2020-10-07 14:55:09 -07:00
Charles Gagnon
98ed0d5274 cherry-picked from b8de69dfac (#12777)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-07 10:35:10 -07:00
Chris LaFreniere
7bca43524e Notebooks: WYSIWYG Add Redo, Fix Shortcuts (#12752) (#12784)
* Add redo and out/indent

* Check for active cell before doing shortcut

* PR feedback

* Remove unnecessary parameter
2020-10-07 10:01:46 -07:00
Charles Gagnon
a8c983519e Save username/password for BDC HDFS connections (#12667) (#12778)
* Save username/password for BDC HDFS connections

* comment
2020-10-06 21:51:04 -07:00
Charles Gagnon
ac6ef2639f Port 807a4ae8c4 (#12747) 2020-10-06 13:41:27 -07:00
Barbara Valdez
35957cc283 Fix search for pinned notebooks (#12719) (#12766)
* fix search for pinned notebooks

* fix filtering when verifying that a search folder is not a subdirectory of the current folder query's path

* Show book node on pinned notebooks search results

* fix parent node on pinned notebooks search results

* fix search for pinned notebook and modify how pinned notebooks are stored in workspace

* update format of pinned notebooks for users that used the September release version

* removed unused functions

* Address PR comments

* fix parent node for legacy version of jupyter books

* remove cast from book path
2020-10-06 13:38:39 -07:00
Charles Gagnon
b054295eac Add additional logging to spark command failures (#12706) (#12761) 2020-10-06 11:47:06 -07:00
Charles Gagnon
5b7a7c9865 Fix HDFS node to only show up for BDC connections (#12714) (#12762) 2020-10-06 11:36:40 -07:00
Charles Gagnon
867faae14f [Port] Improved behavior for accepting EULA. (#12453) (#12749)
* Improved behavior for accepting EULA. (#12453)

* working version of overloading "select" button

* promptForEula to use showErrorMessage

* make parameter optional in promptForEula

* remove test code

* PR feedback

* eula to EULA

* minor fix

* Fix compile error

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 18:52:16 -07:00
Charles Gagnon
4c6b606c82 use selected subscriptions (#12691) (#12741)
* working version

* pr feedback

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 18:51:26 -07:00
Monica Gupta
d5daaf918d Fix notebook issue when creating Kusto notebooks 2nd time after launching ADS (#12700) (#12750)
* Fix notebook issue

* Removed not required code

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-05 15:56:19 -07:00
Charles Gagnon
72d48bda61 Allow non-admin BDC connections to see BDC features (#12663) (#12737)
* Add handling for non-admin BDC users

* Bump STS

* Fix HDFS root node commands

* remove nested awaits

* colon
2020-10-05 15:55:23 -07:00
Charles Gagnon
93156ccf04 cherry-pick 7bfea07b9b (#12742)
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-10-05 15:39:48 -07:00
Udeesha Gautam
781c7de772 ML extension - revised button component (#12674) (#12746)
* Revert "Revert "ML extension updates  (#11817)" (#12645)"

This reverts commit 34a6200a47.

* Modified button template and renamed infoButton ElementRef

* fix rendering issue

* Minor code cleanup.

* add clean up previous button logic

Co-authored-by: Alan Ren <alanren@microsoft.com>

Co-authored-by: Hale Rankin <harankin@microsoft.com>
Co-authored-by: Alan Ren <alanren@microsoft.com>
2020-10-05 13:58:01 -07:00
Charles Gagnon
41e8b73ac4 vBump notebooks to get latest CU6 version of book (#12683) (#12739) 2020-10-05 13:41:16 -07:00
Udeesha Gautam
61254c7298 Updating SqltoolsService Version to Pick DacFx changes (#12743)
Co-authored-by: Benjin Dubishar <benjin.dubishar@gmail.com>
2020-10-05 13:06:44 -07:00
Charles Gagnon
5f59fa021c Fix checkbox change event not firing on enter press (#12703) (#12735)
* Fix checkbox change event not firing

* Add comment
2020-10-05 12:53:59 -07:00
Charles Gagnon
1f65216889 Port bf9646ba98 (#12738) 2020-10-05 12:52:46 -07:00
Charles Gagnon
c801d46814 Fix root group name check (#12660) (#12736) 2020-10-05 12:51:23 -07:00
Alan Ren
6c85cf2bdd update preview feature notification (#12723) (#12734) 2020-10-05 12:48:28 -07:00
Aasim Khan
9067204979 Aasim/release1.23/importfixes (#12721)
* Fixing import getting stuck on step 4  (#12677)

* Getting the proper attribute during column modification
Exposing errors of change column settings and stopping import if they occur

* removing extra space

* Added a comment for error handling

* Fixed a test error that was caused by insufficient null checks.

* removing unnecessary return

* version bump of flat file services (#12686)
2020-10-02 15:17:55 -07:00
Karl Burtram
ac6bc56c4e Bump ADS to 1.23.0 2020-10-02 14:54:53 -07:00
Charles Gagnon
1b5c54dd8c revert grid streaming changes (#12650) (#12652)
(cherry picked from commit cf9754f627)

Co-authored-by: Lucy Zhang <luczhan@microsoft.com>
2020-09-28 21:49:49 -07:00
Aditya Bist
4082170522 bump version for hotfix (#12592) 2020-09-25 21:10:34 -07:00
Alan Ren
5ecf1c6e6f bump sts version (#12636) (#12638) 2020-09-25 14:59:04 -07:00
Charles Gagnon
6de11c8107 Fix undefined error in server tree data source (#12616) (#12617)
* Fix undefined error in server tree data source

* Add comment

(cherry picked from commit 1ea33d83bf)
2020-09-25 13:43:47 -07:00
Monica Gupta
76d7b0a9fe Addressed comments (#12618)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-24 17:16:23 -07:00
Alan Ren
ce4c3e9586 clone the object to be modified (#12583) (#12590) 2020-09-23 13:42:44 -07:00
Alan Ren
5190bf376c escape the value for display (#12547) (#12571) 2020-09-22 14:50:41 -07:00
Udeesha Gautam
77b9a708df fix the reference error due to extra $ in default variable (#12524) 2020-09-21 10:21:23 -07:00
Udeesha Gautam
a4ee871b88 Port/db project fixes (#12521)
* Update default values and example text when dropdown value changes (#12493)

* remove option to add reference to same database (#12495)

Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
2020-09-20 21:09:46 -07:00
Charles Gagnon
3f4e19fc08 Arc good ARC bad (#12499) (#12511)
Co-authored-by: Chris LaFreniere <40371649+chlafreniere@users.noreply.github.com>
2020-09-20 11:44:30 -07:00
Barbara Valdez
571fca6de5 In-Viewlet Notebooks Search (#12455) (#12514)
* fix search

* Add sql carbon tags to vs files

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>
2020-09-19 18:13:10 -07:00
Barbara Valdez
5a2fdc4034 Add warning message for users using the new version of jupyter book (#12496) (#12500)
* Add warning message for users

* Address pr comments
2020-09-18 20:15:12 -07:00
Chris LaFreniere
cc6d84e7f6 Notebooks: Fix Grids Not Rendering when Unsaved Notebook Reloaded (#12483) (#12498)
* Clear Output and fix output change

* Fix tests after forced clear + append output
2020-09-18 20:14:45 -07:00
Vasu Bhog
99e11d2e22 Fix PySpark kernel connection change (#12494) (#12497) 2020-09-18 20:10:37 -07:00
Charles Gagnon
9a85123e21 Revert BDC deployment back to using old azdata check (#12470) (#12474) 2020-09-18 18:46:22 -07:00
Lucy Zhang
56669db6b6 update resultSet in data provider (#12478) (#12486) 2020-09-18 18:36:40 -07:00
Udeesha Gautam
8782eeb32f Port/ml fixes (#12491)
* change to allow refresh and delete correctly (#12477)

* add table name to models that are imported (#12445)
2020-09-18 17:44:58 -07:00
Charles Gagnon
7f3d5bac0a start with eulaCheckButton hidden (#12427) (#12458)
* start with eulaCheckButton hidden

* reset buttons on card select

* remove testcode

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-18 11:52:32 -07:00
Charles Gagnon
7a1e0a7d2e Fix resource deployment text field validation (#12421) (#12457) 2020-09-18 11:22:23 -07:00
Alan Ren
681ecbd946 fix the legacy card style issue (#12428) (#12442)
* fix the legacy card style issue

* replace the card class
2020-09-18 11:14:02 -07:00
Vasu Bhog
e7798a8e32 Fix Spark kernel connections and switch from Kusto to Spark kernels (#12436) (#12441)
* Fix connection dialog for Spark and issue when switching from Kusto to Spark

* Address comments
2020-09-17 21:32:03 -07:00
Aasim Khan
b158180ef4 Added portal link for Azure SQL (#12425) 2020-09-17 17:37:41 -07:00
Aditya Bist
7ad9da7fda fix connection dialog indentation (#12414) 2020-09-17 15:55:54 -07:00
Charles Gagnon
94e2016a16 Port updates for removing EULA acceptance checkbox from Arc deployments (#12409)
* controller dropdown field to SQL MIAA and Postgres deployment. (#12217)

* saving first draft

* throw if no controllers

* cleanup

* bug fixes

* bug fixes and caching controller access

* pr comments and bug fixes.

* fixes

* fixes

* comment fix

* remove debug prints

* comment fixes

* remove debug logs

* inputValueTransformer returns string|Promise

* PR feedback

* pr fixes

* remove _ from protected fields

* anonymous to full methods

* small fixes

(cherry picked from commit 9cf80113fc)

* fix option sources (#12387)


(cherry picked from commit fca8b85a72)

* Remove azdata eula acceptance from arc deployments (#12292)

* saving to switch tasks

* activate to exports in extApi

* working version - cleanup pending

* improve messages

* apply pr feedback from a different review

* remove unneeded strings

* redo apiService

* remove async from getVersionFromOutput

* remove _ prefix from protected fields

* error message fix

* throw specific errors from azdata extension

* arrow methods to regular methods

* pr feedback

* expand azdata extension api

* pr feedback

* remove unused var

* pr feedback

(cherry picked from commit ba44a2f02e)

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-09-17 15:05:02 -07:00
Aditya Bist
21bb577da8 fix maximize bug (#12335) 2020-09-17 14:18:53 -07:00
Udeesha Gautam
5e8325ba28 marking intermittent test failure as unstable (#12402) (#12407) 2020-09-17 13:36:13 -07:00
Aasim Khan
25b7ccade3 Added awaits to change column setting (#12315) 2020-09-17 13:28:21 -07:00
Barbara Valdez
57940c581c Update Windows command and minor update to installation cell (#12361) (#12400)
* Fix windows command and minor update to installation cell

* Add expand_section field on the first section of the book
2020-09-17 13:17:50 -07:00
Chris LaFreniere
82f9e4e24b Notebooks: Fast update WYSIWYG support for source update (#12289) (#12399)
* Fast update WYSIWYG support for source update

* Do bracket matching over hardcoding line offsets
2020-09-17 13:17:03 -07:00
Hale Rankin
3e22fcfd2d 12360 Notebook UI - Mac/Win fix for Select all. (#12383) (#12397)
* 12360 Notebook UI - Mac/Win fix for Select all.

* Fix for ctrl key selecting all in windows

* Fix undo as well

* preventDefault to prevent confusing behavior

Co-authored-by: chlafreniere <hichise@gmail.com>

Co-authored-by: chlafreniere <hichise@gmail.com>
2020-09-17 12:18:47 -07:00
Lucy Zhang
0bc81e1078 Fix notebook table rendering with multiple code cells (#12363) (#12391)
* create unique query runner for each cell

* use cellUri instead of cellId to identify runner

* disconnect each query runner connection

* remove queryrunners size check
2020-09-17 10:32:11 -07:00
Barbara Valdez
7b6328dccf Fix highlight issue (#12278) (#12362)
* Fix highlight issue

* Address PR comments
2020-09-16 13:48:14 -07:00
Vasu Bhog
05124273ea Fix Notebook Kusto Kernel Consistency (#12256) (#12352)
* fix kusto notebook consistency

* Address undefined
2020-09-16 12:08:28 -07:00
Lucy Zhang
b1d4444522 Fix notebook cancel query bug (#12300) (#12351)
* fix undefined query runner error

* store connection id

* revert sqlSessionManager change
2020-09-16 12:07:38 -07:00
Alan Ren
4ee2d369cf vbump sql-db-proj extension (#12336) (#12354)
* vbump sql-db-proj extension (#12336)

* update sqlproj dependency version (#12359)

Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>
2020-09-16 11:47:51 -07:00
Charles Gagnon
fb28b69bb0 Fix component items in declarative table not showing (#12330) (#12331)
(cherry picked from commit 4dd04cb250)
2020-09-16 11:43:01 -07:00
Chris LaFreniere
f2709c7100 Watch for on load event (#12309) (#12346) 2020-09-16 00:43:23 -07:00
Chris LaFreniere
3476f5ae38 Add newline after caption (#12276) (#12340) 2020-09-15 23:07:18 -07:00
Chris LaFreniere
b937fdee7a 12284 Removed custom CSS that positioned editor text beneath overlapping layers. Text is now selectable. (#12312) (#12339)
Co-authored-by: Hale Rankin <harankin@microsoft.com>
2020-09-15 23:04:30 -07:00
Chris LaFreniere
dd9ac2e362 Add headingStyle atx option (#12286) (#12338) 2020-09-15 22:50:46 -07:00
Alan Ren
403ff6cfec remove data-workspace dependency (#12321) (#12327) 2020-09-15 17:05:29 -07:00
Udeesha Gautam
4a6226974e adding icon for add new and open project (#12265) (#12324) 2020-09-15 16:28:42 -07:00
Charles Gagnon
6a2c47f511 Disable resource viewer (#12291) (#12298)
* Disable resource viewer

* comment

* Remove unused

(cherry picked from commit 95b76f08f2)
2020-09-15 16:13:53 -07:00
Aditya Bist
3d9a316f4b bump vscode version (#12258) 2020-09-14 14:53:26 -07:00
1448 changed files with 18758 additions and 46963 deletions


@@ -12,10 +12,6 @@
{
"file": "build\\actions\\AutoMerge\\dist\\index.js",
"_justification": "False positive from webpacked code"
},
{
"file": ".devcontainer\\devcontainer.json",
"_justification": "Local development environment - not used in production"
}
]
}


@@ -73,7 +73,6 @@ RUN apt-get update \
libnss3 \
libxss1 \
libasound2 \
libgbm1 \
xfonts-base \
xfonts-terminus \
fonts-noto \

.github/CODEOWNERS (vendored): 11 lines changed

@@ -1,11 +0,0 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon @ranasaria
/extensions/resource-deployment/ @ranasaria
/extensions/arc/ @ranasaria
/extensions/azdata/ @ranasaria
/extensions/dacpac/ @kisantia
/extensions/schema-compare/ @kisantia
/extensions/sql-database-projects/ @Benjin @kisantia


@@ -1,2 +0,0 @@
Needs Logs:
comment: "We need more info to debug your particular issue. If you could attach your logs to the issue (ensure no private data is in them), it would help us fix the issue much faster.\n\nTo find your logs:\n\n- Open command palette (Click **View** -> **Command Palette**)\n- Run the command: **`Developer: Open Logs Folder`**\n\nThis will open the log file locally. Please zip up this folder and attach it to the issue."


@@ -1,7 +0,0 @@
{
"label-to-subscribe-to": [
"list of usernames to subscribe",
"such as:",
"JacksonKearl"
]
}


@@ -31,10 +31,7 @@ jobs:
with:
node-version: 10
# TODO: cache node modules
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
@@ -82,10 +79,7 @@ jobs:
- uses: actions/setup-python@v1
with:
python-version: '2.x'
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron
name: Download Electron
@@ -118,10 +112,7 @@ jobs:
- uses: actions/setup-node@v1
with:
node-version: 10
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
- run: yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron


@@ -1,50 +0,0 @@
name: "Deep Classifier: Runner"
on:
schedule:
- cron: 0 * * * *
repository_dispatch:
types: [trigger-deep-classifier-runner]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
ref: v35
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that arent needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/apply/fetch-sources
with:
# slightly overlapping to protect against issues slipping through the cracks if a run is delayed
from: 80
until: 5
configPath: classifier
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
- name: Set up Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install --upgrade numpy scipy scikit-learn joblib nltk simpletransformers torch torchvision
- name: "Run Classifier: Generator"
run: python ./actions/classifier-deep/apply/generate-labels/main.py
- name: "Run Classifier: Labeler"
uses: ./actions/classifier-deep/apply/apply-labels
with:
configPath: classifier
allowLabels: "needs more info|new release"
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,27 +0,0 @@
name: "Deep Classifier: Scraper"
on:
repository_dispatch:
types: [trigger-deep-classifier-scraper]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
ref: v35
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that arent needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/train/fetch-issues
with:
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.ISSUE_SCRAPER_TOKEN}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}


@@ -1,27 +0,0 @@
name: Latest Release Monitor
on:
schedule:
- cron: 0/5 * * * *
repository_dispatch:
types: [trigger-latest-release-monitor]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
path: ./actions
ref: v35
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Storage Module
run: npm install @azure/storage-blob@12.1.1
- name: Run Latest Release Monitor
uses: ./actions/latest-release-monitor
with:
storageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -1,15 +0,0 @@
name: On Label
on:
issues:
types: [labeled]
jobs:
processLabelAction:
name: Process Label Action
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Process Label Action
uses: hramos/label-actions@v1
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"September 2020\"",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"August 2020\"",
"editable": true
},
{


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"September 2020\"",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"August 2020\"",
"editable": true
},
{


@@ -14,7 +14,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"September 2020\"",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"July 2020\"",
"editable": true
},
{


@@ -1,194 +0,0 @@
# Query: .innerHTML =
# Flags: CaseSensitive WordMatch
# Including: src/vs/**/*.{t,j}s
# Excluding: *.test.ts
# ContextLines: 3
22 results - 14 files
src/vs/base/browser/markdownRenderer.ts:
161 const strValue = values[0];
162 const span = element.querySelector(`div[data-code="${id}"]`);
163 if (span) {
164: span.innerHTML = strValue;
165 }
166 }).catch(err => {
167 // ignore
243 return true;
244 }
245
246: element.innerHTML = insane(renderedMarkdown, {
247 allowedSchemes,
248 // allowedTags should included everything that markdown renders to.
249 // Since we have our own sanitize function for marked, it's possible we missed some tag so let insane make sure.
src/vs/base/browser/ui/contextview/contextview.ts:
157 this.shadowRootHostElement = DOM.$('.shadow-root-host');
158 this.container.appendChild(this.shadowRootHostElement);
159 this.shadowRoot = this.shadowRootHostElement.attachShadow({ mode: 'open' });
160: this.shadowRoot.innerHTML = `
161 <style>
162 ${SHADOW_ROOT_CSS}
163 </style>
src/vs/code/electron-sandbox/issue/issueReporterMain.ts:
57 const platformClass = platform.isWindows ? 'windows' : platform.isLinux ? 'linux' : 'mac';
58 addClass(document.body, platformClass); // used by our fonts
59
60: document.body.innerHTML = BaseHtml();
61 const issueReporter = new IssueReporter(configuration);
62 issueReporter.render();
63 document.body.style.display = 'block';
src/vs/code/electron-sandbox/processExplorer/processExplorerMain.ts:
320 content.push(`.highest { color: ${styles.highlightForeground}; }`);
321 }
322
323: styleTag.innerHTML = content.join('\n');
324 if (document.head) {
325 document.head.appendChild(styleTag);
326 }
src/vs/editor/browser/view/domLineBreaksComputer.ts:
107 allCharOffsets[i] = tmp[0];
108 allVisibleColumns[i] = tmp[1];
109 }
110: containerDomNode.innerHTML = sb.build();
111
112 containerDomNode.style.position = 'absolute';
113 containerDomNode.style.top = '10000';
src/vs/editor/browser/view/viewLayer.ts:
507 private _finishRenderingNewLines(ctx: IRendererContext<T>, domNodeIsEmpty: boolean, newLinesHTML: string, wasNew: boolean[]): void {
508 const lastChild = <HTMLElement>this.domNode.lastChild;
509 if (domNodeIsEmpty || !lastChild) {
510: this.domNode.innerHTML = newLinesHTML;
511 } else {
512 lastChild.insertAdjacentHTML('afterend', newLinesHTML);
513 }
525 private _finishRenderingInvalidLines(ctx: IRendererContext<T>, invalidLinesHTML: string, wasInvalid: boolean[]): void {
526 const hugeDomNode = document.createElement('div');
527
528: hugeDomNode.innerHTML = invalidLinesHTML;
529
530 for (let i = 0; i < ctx.linesLength; i++) {
531 const line = ctx.lines[i];
src/vs/editor/browser/widget/diffEditorWidget.ts:
2157
2158 let domNode = document.createElement('div');
2159 domNode.className = `view-lines line-delete ${MOUSE_CURSOR_TEXT_CSS_CLASS_NAME}`;
2160: domNode.innerHTML = sb.build();
2161 Configuration.applyFontInfoSlow(domNode, fontInfo);
2162
2163 let marginDomNode = document.createElement('div');
2164 marginDomNode.className = 'inline-deleted-margin-view-zone';
2165: marginDomNode.innerHTML = marginHTML.join('');
2166 Configuration.applyFontInfoSlow(marginDomNode, fontInfo);
2167
2168 return {
src/vs/editor/standalone/browser/colorizer.ts:
40 let text = domNode.firstChild ? domNode.firstChild.nodeValue : '';
41 domNode.className += ' ' + theme;
42 let render = (str: string) => {
43: domNode.innerHTML = str;
44 };
45 return this.colorize(modeService, text || '', mimeType, options).then(render, (err) => console.error(err));
46 }
src/vs/editor/standalone/browser/standaloneThemeServiceImpl.ts:
212 if (!this._globalStyleElement) {
213 this._globalStyleElement = dom.createStyleSheet();
214 this._globalStyleElement.className = 'monaco-colors';
215: this._globalStyleElement.innerHTML = this._css;
216 this._styleElements.push(this._globalStyleElement);
217 }
218 return Disposable.None;
221 private _registerShadowDomContainer(domNode: HTMLElement): IDisposable {
222 const styleElement = dom.createStyleSheet(domNode);
223 styleElement.className = 'monaco-colors';
224: styleElement.innerHTML = this._css;
225 this._styleElements.push(styleElement);
226 return {
227 dispose: () => {
291 ruleCollector.addRule(generateTokensCSSForColorMap(colorMap));
292
293 this._css = cssRules.join('\n');
294: this._styleElements.forEach(styleElement => styleElement.innerHTML = this._css);
295
296 TokenizationRegistry.setColorMap(colorMap);
297 this._onColorThemeChange.fire(theme);
src/vs/editor/test/browser/controller/imeTester.ts:
55 let content = this._model.getModelLineContent(i);
56 r += content + '<br/>';
57 }
58: output.innerHTML = r;
59 }
60 }
61
69 let title = document.createElement('div');
70 title.className = 'title';
71
72: title.innerHTML = description + '. Type <strong>' + inputStr + '</strong>';
73 container.appendChild(title);
74
75 let startBtn = document.createElement('button');
src/vs/workbench/contrib/notebook/browser/view/renderers/cellRenderer.ts:
454
455 private getMarkdownDragImage(templateData: MarkdownCellRenderTemplate): HTMLElement {
456 const dragImageContainer = DOM.$('.cell-drag-image.monaco-list-row.focused.markdown-cell-row');
457: dragImageContainer.innerHTML = templateData.container.outerHTML;
458
459 // Remove all rendered content nodes after the
460 const markdownContent = dragImageContainer.querySelector('.cell.markdown')!;
611 return null;
612 }
613
614: editorContainer.innerHTML = richEditorText;
615
616 return dragImageContainer;
617 }
src/vs/workbench/contrib/notebook/browser/view/renderers/webviewPreloads.ts:
375 addMouseoverListeners(outputNode, outputId);
376 const content = data.content;
377 if (content.type === RenderOutputType.Html) {
378: outputNode.innerHTML = content.htmlContent;
379 cellOutputContainer.appendChild(outputNode);
380 domEval(outputNode);
381 } else {
src/vs/workbench/contrib/webview/browser/pre/main.js:
386 // apply default styles
387 const defaultStyles = newDocument.createElement('style');
388 defaultStyles.id = '_defaultStyles';
389: defaultStyles.innerHTML = defaultCssRules;
390 newDocument.head.prepend(defaultStyles);
391
392 applyStyles(newDocument, newDocument.body);
src/vs/workbench/contrib/welcome/walkThrough/browser/walkThroughPart.ts:
281
282 const content = model.main.textEditorModel.getValue(EndOfLinePreference.LF);
283 if (!strings.endsWith(input.resource.path, '.md')) {
284: this.content.innerHTML = content;
285 this.updateSizeClasses();
286 this.decorateContent();
287 this.contentDisposables.push(this.keybindingService.onDidUpdateKeybindings(() => this.decorateContent()));
303 const innerContent = document.createElement('div');
304 innerContent.classList.add('walkThroughContent'); // only for markdown files
305 const markdown = this.expandMacros(content);
306: innerContent.innerHTML = marked(markdown, { renderer });
307 this.content.appendChild(innerContent);
308
309 model.snippets.forEach((snippet, i) => {


@@ -2,31 +2,43 @@
# Flags: CaseSensitive WordMatch
# ContextLines: 2
12 results - 4 files
14 results - 4 files
src/vs/base/browser/dom.ts:
83 };
84
81 };
82
83: /** @deprecated ES6 - use classList*/
84 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
85: /** @deprecated ES6 - use classList*/
86 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
86 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
87: /** @deprecated ES6 - use classList*/
88 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
88 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
89: /** @deprecated ES6 - use classList*/
90 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
90 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
91: /** @deprecated ES6 - use classList*/
92 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
92 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
93: /** @deprecated ES6 - use classList*/
94 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
95: /** @deprecated ES6 - use classList*/
96 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
97
94 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
95
src/vs/base/common/arrays.ts:
401
402 /**
403: * @deprecated ES6: use `Array.find`
403: * @deprecated ES6: use `Array.findIndex`
404 */
405 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
405 export function firstIndex<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean): number {
417
418 /**
419: * @deprecated ES6: use `Array.find`
420 */
421 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
568
569 /**
570: * @deprecated ES6: use `Array.find`
571 */
572 export function find<T>(arr: ArrayLike<T>, predicate: (value: T, index: number, arr: ArrayLike<T>) => any): T | undefined {
src/vs/base/common/objects.ts:
115
@@ -54,8 +66,8 @@ src/vs/base/common/strings.ts:
170 */
171 export function endsWith(haystack: string, needle: string): boolean {
857
858 /**
859: * @deprecated ES6
860 */
861 export function repeat(s: string, count: number): string {
861
862 /**
863: * @deprecated ES6
864 */
865 export function repeat(s: string, count: number): string {
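
Context only, not part of the diff above: the "@deprecated ES6 - use classList" annotations in the dom.ts hunk point at the native classList API as the replacement for these helpers. A minimal TypeScript sketch of that migration, with hypothetical element and class names (not taken from this repo):

// Illustrative only; the class names below are made-up examples.
const node: HTMLElement = document.createElement('div');

// Deprecated helper style, as flagged in dom.ts:
//   addClass(node, 'monaco-list');
//   toggleClass(node, 'selected', true);
//   if (hasClass(node, 'focused')) { removeClass(node, 'focused'); }

// Native ES6 classList equivalents:
node.classList.add('monaco-list');
node.classList.toggle('selected', true);
if (node.classList.contains('focused')) {
    node.classList.remove('focused');
}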


@@ -2,52 +2,18 @@
# Flags: RegExp
# ContextLines: 2
8 results - 4 files
2 results - 2 files
src/vs/base/browser/ui/tree/asyncDataTree.ts:
241 } : () => 'treeitem',
242 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
243: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
244 } : undefined,
245 getAriaLabel(e) {
243 } : () => 'treeitem',
244 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
245: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
246 } : undefined,
247 getAriaLabel(e) {
src/vs/platform/list/browser/listService.ts:
463
464 if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
465: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
466 this._register(options?.configurationService.onDidChangeConfiguration(() => {
467: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
468 }));
469 } else {
src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530
1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539
1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557
1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571
src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }
src/vs/workbench/contrib/debug/browser/debugConfigurationManager.ts:
254
255 return debugDynamicExtensions.map(e => {
256: const type = e.contributes?.debuggers![0].type!;
257 return {
258 label: this.getDebuggerLabel(type)!,


@@ -1,3 +1,3 @@
disturl "https://atom.io/download/electron"
target "9.3.0"
target "9.2.1"
runtime "electron"


@@ -1,75 +1,5 @@
# Change Log
## Version 1.25.0
* Release date: December 8, 2020
* Release status: General Availability
* Kusto extension improvements
* SQL Project extension improvements
* Notebook improvements
* Azure Browse Connections Preview performance improvements
* Bug Fixes
## Version 1.24.0
* Release date: November 12, 2020
* Release status: General Availability
* SQL Project improvements
* Notebook improvements, including in WYSIWYG editor enhancements
* Azure Arc improvements
* Azure SQL Deployment UX improvements
* Azure Browse Connections Preview
* Bug Fixes
## Version 1.23.0
* Release date: October 14, 2020
* Release status: General Availability
* Added deployments of Azure SQL DB and VM
* Added PowerShell kernel results streaming support
* Added improvements to SQL Database Projects extension
* Bug Fixes
* Extension Updates:
* SQL Server Import
* Machine Learning
* Schema Compare
* Kusto
* SQL Assessment
* SQL Database Projects
* Azure Arc
* azdata
## Version 1.22.1
* Release date: September 30, 2020
* Release status: General Availability
* Fix bug #12615 Active connection filter doesn't untoggle | [#12615](https://github.com/microsoft/azuredatastudio/issues/12615)
* Fix bug #12572 Edit Data grid doesn't escape special characters | [#12572](https://github.com/microsoft/azuredatastudio/issues/12572)
* Fix bug #12570 Dashboard Explorer table doesn't escape special characters | [#12570](https://github.com/microsoft/azuredatastudio/issues/12570)
* Fix bug #12582 Delete row on Edit Data fails | [#12582](https://github.com/microsoft/azuredatastudio/issues/12582)
* Fix bug #12646 SQL Notebooks: Cells being treated isolated | [#12646](https://github.com/microsoft/azuredatastudio/issues/12646)
## Version 1.22.0
* Release date: September 22, 2020
* Release status: General Availability
* New Notebook Features
* Supports brand new text cell editing experience based on rich text formatting and seamless conversion to markdown, also known as WYSIWYG toolbar (What You See Is What You Get)
* Supports Kusto kernel
* Supports pinning of notebooks
* Added support for new version of Jupyter Books
* Improved Jupyter Shortcuts
* Introduced perf loading improvements
* Added Azure Arc extension - Users can try out Azure Arc public preview through Azure Data Studio. This includes:
* Deploy data controller
* Deploy Postgres
* Deploy Managed Instance for Azure Arc
* Connect to data controller
* Access data service dashboards
* Azure Arc Jupyter Book
* Added new deployment options
* Azure SQL Database Edge
* (Edge will require Azure SQL Edge Deployment Extension)
* Added SQL Database Projects extension - The SQL Database Projects extension brings project-based database development to Azure Data Studio. In this preview release, SQL projects can be created and published from Azure Data Studio.
* Added Kusto (KQL) extension - Brings native Kusto experiences in Azure Data Studio for data exploration and data analytics against massive amount of real-time streaming data stored in Azure Data Explorer. This preview release supports connecting and browsing Azure Data Explorer clusters, writing KQL queries as well as authoring notebooks with Kusto kernel.
* SQL Server Import extension GA - Announcing the GA of the SQL Server Import extension, features no longer in preview. This extension facilitates importing csv/txt files. Learn more about the extension in [this article](sql-server-import-extension.md).
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22September+2020+Release%22+is%3Aclosed).
## Version 1.21.0
* Release date: August 12, 2020
* Release status: General Availability


@@ -19,7 +19,7 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| [Linux DEB][linux-deb] |
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
## Try out the latest insiders build from `main`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
@@ -29,8 +29,6 @@ Go to our [download page](https://aka.ms/getazuredatastudio) for more specific i
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## **Feature Highlights**
@@ -131,10 +129,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2150927
[win-system]: https://go.microsoft.com/fwlink/?linkid=2150928
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2151312
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2151311
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2151508
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2151407
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2151506
[win-user]: https://go.microsoft.com/fwlink/?linkid=2138608
[win-system]: https://go.microsoft.com/fwlink/?linkid=2138704
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2138705
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2138609
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2138706
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2138507
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2138508


@@ -1 +0,0 @@
* text eol=lf


@@ -15,7 +15,7 @@
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.6",
"@actions/core": "^1.2.3",
"@actions/github": "^2.1.1",
"axios": "^0.19.2",
"ts-node": "^8.6.2",


@@ -2,10 +2,10 @@
# yarn lockfile v1
"@actions/core@^1.2.6":
version "1.2.6"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.6.tgz#a78d49f41a4def18e88ce47c2cac615d5694bf09"
integrity sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA==
"@actions/core@^1.2.3":
version "1.2.3"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.3.tgz#e844b4fa0820e206075445079130868f95bfca95"
integrity sha512-Wp4xnyokakM45Uuj4WLUxdsa8fJjKVl1fDTsPbTEcTcuu0Nb26IPQbOtjmnfaCPGcaoPOOqId8H9NapZ8gii4w==
"@actions/github@^2.1.1":
version "2.1.1"


@@ -53,7 +53,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}
function getEnv(name: string): string {


@@ -17,7 +17,7 @@ const fileNames = [
];
async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> {
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> {
@@ -33,7 +33,7 @@ async function uploadBlob(blobService: azure.BlobService, container: string, blo
}
};
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit: string, files: readonly string[]): Promise<void> {
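
Context only, not part of the diff above: the hunks in this file differ only in the explicit void type argument on Promise. A minimal TypeScript sketch of what that argument changes (compiler behavior introduced around TypeScript 4.0; the reason for this particular change is not stated in the diff):

// Without a type argument, calling resolve() with no value fails to type-check
// under TypeScript 4.0+ ("Expected 1 arguments, but got 0"):
//   const p = new Promise((resolve) => resolve());

// With an explicit void type argument, resolving with no value is allowed:
const done: Promise<void> = new Promise<void>((resolve) => {
    setTimeout(() => resolve(), 10); // resolve with no value
});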


@@ -43,7 +43,6 @@ function createDefaultConfig(quality: string): Config {
}
function getConfig(quality: string): Promise<Config> {
console.log(`Getting config for quality ${quality}`);
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
@@ -53,13 +52,13 @@ function getConfig(quality: string): Promise<Config> {
]
};
return retry(() => new Promise<Config>((c, e) => {
return new Promise<Config>((c, e) => {
client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err && err.code !== 409) { return e(err); }
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config);
});
}));
});
}
interface Asset {
@@ -87,7 +86,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
updateTries++;
return new Promise<void>((c, e) => {
console.log(`Querying existing documents to update...`);
client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); }
@@ -103,7 +101,6 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
release.updates[platform] = type;
}
console.log(`Replacing existing document with updated version`);
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) { return c(update()); }
if (err) { return e(err); }
@@ -115,8 +112,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
});
}
return retry(() => new Promise<void>((c, e) => {
console.log(`Attempting to create document`);
return new Promise<void>((c, e) => {
client.createDocument(collection, release, err => {
if (err && err.code === 409) { return c(update()); }
if (err) { return e(err); }
@@ -124,7 +120,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
console.log('Build successfully published.');
c();
});
}));
});
}
async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
@@ -192,6 +188,7 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
@@ -250,22 +247,6 @@ async function publish(commit: string, quality: string, platform: string, type:
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry<T>(fn: () => Promise<T>): Promise<T> {
for (let run = 1; run <= RETRY_TIMES; run++) {
try {
return await fn();
} catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
}
}
throw new Error('Retried too many times');
}
function main(): void {
const commit = process.env['BUILD_SOURCEVERSION'];


@@ -87,6 +87,10 @@ steps:
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
- script: |


@@ -96,6 +96,8 @@ steps:
set -e
yarn gulp package-rebuild-extensions
yarn gulp vscode-darwin-min-ci
yarn gulp vscode-reh-darwin-min-ci
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -123,19 +125,19 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Electron)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# - script: |
# set -e
# node ./node_modules/playwright/install.js
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
# yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
# displayName: Run smoke tests (Browser)
# continueOnError: true
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
node ./node_modules/playwright/install.js
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Browser)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e


@@ -31,10 +31,10 @@ steps:
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git checkout origin/electron-11.x.y
git checkout origin/electron-x.y.z
git merge origin/master
# Push master branch into exploration branch
git push origin HEAD:electron-11.x.y
git push origin HEAD:electron-x.y.z
displayName: Sync & Merge Exploration


@@ -52,25 +52,21 @@ steps:
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
echo -n $VSCODE_ARCH > .build/arch
displayName: Prepare arch cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
set -e
CHILD_CONCURRENCY=1 npm_config_arch=$(NPM_ARCH) yarn --frozen-lockfile
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
@@ -89,64 +85,64 @@ steps:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
yarn gulp vscode-linux-x64-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
yarn gulp vscode-reh-linux-x64-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
yarn gulp vscode-reh-web-linux-x64-min-ci
displayName: Build
- script: |
set -e
service xvfb start
displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-x64" \
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: 'crash-dump-linux-$(VSCODE_ARCH)'
artifactName: crash-dump-linux
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
@@ -161,26 +157,15 @@ steps:
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
# needed for code signing
- task: UseDotNet@2
displayName: 'Install .NET Core SDK 2.x'
inputs:
version: 2.x
yarn gulp "vscode-linux-x64-build-deb"
yarn gulp "vscode-linux-x64-build-rpm"
yarn gulp "vscode-linux-x64-prepare-snap"
displayName: Build packages
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '.build/linux/rpm'
FolderPath: '.build/linux/rpm/x86_64'
Pattern: '*.rpm'
signConfigType: inlineSignParams
inlineOperation: |
@@ -201,16 +186,14 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- task: PublishPipelineArtifact@0
displayName: 'Publish Pipeline Artifact'
inputs:
artifactName: 'snap-$(VSCODE_ARCH)'
artifactName: snap-x64
targetPath: .build/linux/snap-tarball
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'

View File
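The RestoreCache/SaveCache steps in the file above key the node_modules cache on a list of files; the dropped '.build/arch' entry was a one-line flag file written from $VSCODE_ARCH, so each architecture previously produced its own cache key. A minimal TypeScript sketch of that idea — this is not the 1ESLighthouseEng task itself, the hashing scheme is only illustrative, and the real task also expands glob patterns:

import { createHash } from 'crypto';
import { readFileSync } from 'fs';

// Hash the contents of every key file into one cache key. Writing the target
// architecture into .build/arch and listing it among the key files is what made
// the node_modules cache per-architecture; dropping it collapses all builds onto
// a single x64 key.
function cacheKey(keyFiles: string[]): string {
  const hash = createHash('sha256');
  for (const file of keyFiles) {
    hash.update(file);
    hash.update(readFileSync(file));
  }
  return hash.digest('hex');
}

// Example (literal files only; the pipeline keyfile list also contains globs):
// cacheKey(['.build/arch', 'build/.cachesalt', '.yarnrc', 'yarn.lock']);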

@@ -4,10 +4,11 @@ REPO="$(pwd)"
ROOT="$REPO/.."
# Publish tarball
PLATFORM_LINUX="linux-$VSCODE_ARCH"
PLATFORM_LINUX="linux-x64"
BUILDNAME="VSCode-$PLATFORM_LINUX"
BUILD="$ROOT/$BUILDNAME"
BUILD_VERSION="$(date +%s)"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$VSCODE_ARCH-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$VSCODE_ARCH-$BUILD_VERSION.tar.gz"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
rm -rf $ROOT/code-*.tar.*
@@ -27,36 +28,24 @@ rm -rf $ROOT/vscode-server-*.tar.*
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish DEB
case $VSCODE_ARCH in
x64) DEB_ARCH="amd64" ;;
*) DEB_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
PLATFORM_DEB="linux-deb-x64"
DEB_ARCH="amd64"
DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
node build/azure-pipelines/common/createAsset.js "$PLATFORM_DEB" package "$DEB_FILENAME" "$DEB_PATH"
# Publish RPM
case $VSCODE_ARCH in
x64) RPM_ARCH="x86_64" ;;
armhf) RPM_ARCH="armv7hl" ;;
arm64) RPM_ARCH="aarch64" ;;
*) RPM_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
PLATFORM_RPM="linux-rpm-x64"
RPM_ARCH="x86_64"
RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
node build/azure-pipelines/common/createAsset.js "$PLATFORM_RPM" package "$RPM_FILENAME" "$RPM_PATH"
if [ "$VSCODE_ARCH" == "x64" ]; then
# Publish Snap
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
fi
# Publish Snap
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)

View File
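The case statements removed from publish.sh translate the build's VSCODE_ARCH value into the architecture labels that Debian and RPM tooling use, which is how the script locates the built package directories. A TypeScript sketch that only spells out the same table; the shell version above is the authoritative one:

// Debian and RPM name architectures differently from VSCODE_ARCH, so the
// publish step mapped the value before looking up the .deb/.rpm output folders.
function debArch(vscodeArch: string): string {
  return vscodeArch === 'x64' ? 'amd64' : vscodeArch; // armhf and arm64 pass through
}

function rpmArch(vscodeArch: string): string {
  switch (vscodeArch) {
    case 'x64': return 'x86_64';
    case 'armhf': return 'armv7hl';
    case 'arm64': return 'aarch64';
    default: return vscodeArch;
  }
}

// e.g. a .deb lands under .build/linux/deb/<debArch(arch)>/deb/
//      a .rpm lands under .build/linux/rpm/<rpmArch(arch)>/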

@@ -91,7 +91,8 @@ steps:
- script: |
set -e
yarn gulp vscode-linux-x64-min-ci
yarn gulp vscode-web-min-ci
yarn gulp vscode-reh-linux-x64-min-ci
yarn gulp vscode-reh-web-linux-x64-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -133,8 +134,7 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
export INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
export NO_CLEANUP=1
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
@@ -149,15 +149,6 @@ steps:
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_UNSTABLE_TESTS'], 'true'))
- script: |
set -e
mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
cd /tmp
tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/logs-linux-x64.tar.gz adsuser*
displayName: Archive Logs
continueOnError: true
condition: succeededOrFailed()
- script: |
set -e
yarn gulp vscode-linux-x64-build-deb
@@ -230,7 +221,6 @@ steps:
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
condition: succeededOrFailed()
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'

View File

@@ -13,12 +13,6 @@ resources:
- container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub
- container: vscode-arm64
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-arm64
endpoint: VSCodeHub
- container: vscode-armhf
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-armhf
endpoint: VSCodeHub
- container: snapcraft
image: snapcore/snapcraft:stable
@@ -70,9 +64,6 @@ stages:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
@@ -81,28 +72,22 @@ stages:
- Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: vscode-armhf
variables:
VSCODE_ARCH: armhf
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux.yml
- template: linux/product-build-linux-multiarch.yml
- job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: vscode-arm64
variables:
VSCODE_ARCH: arm64
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux.yml
- template: linux/product-build-linux-multiarch.yml
- job: LinuxAlpine
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))

View File

@@ -52,13 +52,9 @@ steps:
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
echo -n $VSCODE_ARCH > .build/arch
displayName: Prepare arch cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
@@ -71,7 +67,7 @@ steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
@@ -116,8 +112,8 @@ steps:
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
displayName: Compile
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))

View File

@@ -17,7 +17,7 @@ jobs:
- template: sql-product-compile.yml
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
pool:
vmImage: macOS-latest
dependsOn:
@@ -27,7 +27,7 @@ jobs:
timeoutInMinutes: 180
- job: macOS_Signing
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true))
pool:
vmImage: macOS-latest
dependsOn:
@@ -50,7 +50,7 @@ jobs:
timeoutInMinutes: 70
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
@@ -61,15 +61,15 @@ jobs:
steps:
- template: web/sql-product-build-web.yml
# - job: Docker
# condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
# pool:
# vmImage: 'Ubuntu-16.04'
# container: linux-x64
# dependsOn:
# - Linux
# steps:
# - template: docker/sql-product-build-docker.yml
- job: Docker
condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
dependsOn:
- Linux
steps:
- template: docker/sql-product-build-docker.yml
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -98,7 +98,7 @@ jobs:
dependsOn:
- macOS
- Linux
# - Docker
- Docker
- Windows
- Windows_Test
- LinuxWeb

View File

@@ -96,8 +96,8 @@ steps:
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
displayName: Compile
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))

View File

@@ -12,9 +12,9 @@ $ServerZipLocation = "$Repo\.build\win32-$Arch\server"
$ServerZip = "$ServerZipLocation\azuredatastudio-server-win32-$Arch.zip"
# Create server archive
# New-Item $ServerZipLocation -ItemType Directory # this will throw even when success for we don't want to exec this
New-Item $ServerZipLocation -ItemType Directory # this will throw even when success for we don't want to exec this
$global:LASTEXITCODE = 0
# exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
# exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
exec { node build/azure-pipelines/common/copyArtifacts.js } "Copy Artifacts"

View File

@@ -95,8 +95,8 @@ steps:
$ErrorActionPreference = "Stop"
exec { yarn gulp "package-rebuild-extensions" }
exec { yarn gulp "vscode-win32-x64-min-ci" }
exec { yarn gulp "vscode-reh-win32-x64-min" }
exec { yarn gulp "vscode-reh-web-win32-x64-min" }
exec { yarn gulp "vscode-reh-win32-x64-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-x64-min-ci" }
exec { yarn gulp "vscode-win32-x64-code-helper" }
exec { yarn gulp "vscode-win32-x64-inno-updater" }
displayName: Build
@@ -131,7 +131,7 @@ steps:
$AppRoot = "$(agent.builddirectory)\azuredatastudio-win32-x64"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
# exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\azuredatastudio-reh-win32-x64"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\azuredatastudio-reh-win32-x64"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

View File

@@ -104,7 +104,6 @@ const indentationFilter = [
'!extensions/admin-tool-ext-win/ssmsmin/**',
'!extensions/resource-deployment/notebooks/**',
'!extensions/mssql/notebooks/**',
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/integration-tests/testData/**',
'!extensions/arc/src/controller/generated/**',
'!extensions/sql-database-projects/resources/templates/*.xml',
@@ -179,9 +178,7 @@ const copyrightFilter = [
'!extensions/mssql/src/prompts/**',
'!extensions/kusto/src/prompts/**',
'!extensions/notebook/resources/jupyter_config/**',
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/query-history/images/**',
'!extensions/sql/build/update-grammar.js',
'!**/*.gif',
'!**/*.xlf',
'!**/*.dacpac',

View File

@@ -261,7 +261,7 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
.pipe(fileLengthFilter.restore)
.pipe(util.skipDirectories())
.pipe(util.fixWin32DirectoryPermissions())
.pipe(electron(_.extend({}, config, { platform, arch: arch === 'armhf' ? 'arm' : arch, ffmpegChromium: true })))
.pipe(electron(_.extend({}, config, { platform, arch, ffmpegChromium: true })))
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version'], { dot: true }));
if (platform === 'linux') {
@@ -345,7 +345,7 @@ const BUILD_TARGETS = [
{ platform: 'darwin', arch: null, opts: { stats: true } },
{ platform: 'linux', arch: 'ia32' },
{ platform: 'linux', arch: 'x64' },
{ platform: 'linux', arch: 'armhf' },
{ platform: 'linux', arch: 'arm' },
{ platform: 'linux', arch: 'arm64' },
];
BUILD_TARGETS.forEach(buildTarget => {

View File

@@ -23,7 +23,7 @@ const commit = util.getVersion(root);
const linuxPackageRevision = Math.floor(new Date().getTime() / 1000);
function getDebPackageArch(arch) {
return { x64: 'amd64', armhf: 'armhf', arm64: 'arm64' }[arch];
return { x64: 'amd64', arm: 'armhf', arm64: 'arm64' }[arch];
}
function prepareDebPackage(arch) {
@@ -53,11 +53,6 @@ function prepareDebPackage(arch) {
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const workspaceMime = gulp.src('resources/linux/code-workspace.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('usr/share/mime/packages/' + product.applicationName + '-workspace.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('usr/share/pixmaps/' + product.linuxIconName + '.png'));
@@ -101,7 +96,7 @@ function prepareDebPackage(arch) {
.pipe(replace('@@UPDATEURL@@', product.updateUrl || '@@UPDATEURL@@'))
.pipe(rename('DEBIAN/postinst'));
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, workspaceMime, icon, bash_completion, zsh_completion, code);
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, icon, bash_completion, zsh_completion, code);
return all.pipe(vfs.dest(destination));
};
@@ -121,7 +116,7 @@ function getRpmBuildPath(rpmArch) {
}
function getRpmPackageArch(arch) {
return { x64: 'x86_64', armhf: 'armv7hl', arm64: 'aarch64' }[arch];
return { x64: 'x86_64', arm: 'armhf', arm64: 'arm64' }[arch];
}
function prepareRpmPackage(arch) {
@@ -150,11 +145,6 @@ function prepareRpmPackage(arch) {
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const workspaceMime = gulp.src('resources/linux/code-workspace.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('BUILD/usr/share/mime/packages/' + product.applicationName + '-workspace.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('BUILD/usr/share/pixmaps/' + product.linuxIconName + '.png'));
@@ -185,7 +175,7 @@ function prepareRpmPackage(arch) {
const specIcon = gulp.src('resources/linux/rpm/code.xpm', { base: '.' })
.pipe(rename('SOURCES/' + product.applicationName + '.xpm'));
const all = es.merge(code, desktops, appdata, workspaceMime, icon, bash_completion, zsh_completion, spec, specIcon);
const all = es.merge(code, desktops, appdata, icon, bash_completion, zsh_completion, spec, specIcon);
return all.pipe(vfs.dest(getRpmBuildPath(rpmArch)));
};
@@ -259,23 +249,33 @@ function buildSnapPackage(arch) {
const BUILD_TARGETS = [
{ arch: 'x64' },
{ arch: 'armhf' },
{ arch: 'arm' },
{ arch: 'arm64' },
];
BUILD_TARGETS.forEach(({ arch }) => {
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
BUILD_TARGETS.forEach((buildTarget) => {
const arch = buildTarget.arch;
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
{
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
// gulp.task(prepareDebTask);
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
}
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
{
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
// gulp.task(prepareRpmTask);
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
}
{
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
}
});

View File

@@ -55,7 +55,7 @@ function getElectron(arch) {
return () => {
const electronOpts = _.extend({}, exports.config, {
platform: process.platform,
arch: arch === 'armhf' ? 'arm' : arch,
arch,
ffmpegChromium: true,
keepDefaultApp: true
});

View File

@@ -61,7 +61,7 @@ function getElectron(arch: string): () => NodeJS.ReadWriteStream {
return () => {
const electronOpts = _.extend({}, config, {
platform: process.platform,
arch: arch === 'armhf' ? 'arm' : arch,
arch,
ffmpegChromium: true,
keepDefaultApp: true
});

View File

@@ -207,25 +207,25 @@ const externalExtensions = [
// they get packaged separately. Adding extension name here, will make the build to create
// a separate vsix package for the extension and the extension will be excluded from the main package.
// Any extension not included here will be installed by default.
'admin-pack',
'admin-tool-ext-win',
'agent',
'arc',
'asde-deployment',
'azdata',
'azurehybridtoolkit',
'cms',
'dacpac',
'import',
'profiler',
'admin-pack',
'dacpac',
'schema-compare',
'cms',
'query-history',
'kusto',
'liveshare',
'machine-learning',
'profiler',
'query-history',
'schema-compare',
'sql-assessment',
'sql-database-projects',
'machine-learning',
'sql-assessment',
'asde-deployment',
'sql-migration',
'data-workspace'
];
// extensions that require a rebuild since they have native parts
const rebuildExtensions = [

View File
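Per the comment above the externalExtensions list, an extension named there is packaged as its own vsix and excluded from the main product, while anything not listed is installed by default. A hedged TypeScript sketch of that split — the helper name and shapes are invented for illustration; the actual logic lives elsewhere in the gulpfile:

interface ExtensionDesc { name: string; path: string; }

// Split workspace extensions into the set bundled with the product and the set
// packaged as standalone .vsix files (the externalExtensions list).
function splitExtensions(all: ExtensionDesc[], external: string[]): { bundled: ExtensionDesc[]; separateVsix: ExtensionDesc[] } {
  const externalSet = new Set(external);
  return {
    bundled: all.filter(e => !externalSet.has(e.name)),
    separateVsix: all.filter(e => externalSet.has(e.name)),
  };
}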

@@ -241,25 +241,25 @@ const externalExtensions = [
// they get packaged separately. Adding extension name here, will make the build to create
// a separate vsix package for the extension and the extension will be excluded from the main package.
// Any extension not included here will be installed by default.
'admin-pack',
'admin-tool-ext-win',
'agent',
'arc',
'asde-deployment',
'azdata',
'azurehybridtoolkit',
'cms',
'dacpac',
'import',
'profiler',
'admin-pack',
'dacpac',
'schema-compare',
'cms',
'query-history',
'kusto',
'liveshare',
'machine-learning',
'profiler',
'query-history',
'schema-compare',
'sql-assessment',
'sql-database-projects',
'machine-learning',
'sql-assessment',
'asde-deployment',
'sql-migration',
'data-workspace'
];
// extensions that require a rebuild since they have native parts

View File

@@ -206,10 +206,6 @@
"name": "vs/workbench/contrib/webview",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/webviewPanel",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/customEditor",
"project": "vscode-workbench"

View File

@@ -1004,7 +1004,7 @@ function createResource(project: string, slug: string, xlfFile: File, apiHostnam
* https://dev.befoolish.co/tx-docs/public/projects/updating-content#what-happens-when-you-update-files
*/
function updateResource(project: string, slug: string, xlfFile: File, apiHostname: string, credentials: string): Promise<any> {
return new Promise<void>((resolve, reject) => {
return new Promise((resolve, reject) => {
const data = JSON.stringify({ content: xlfFile.contents.toString() });
const options = {
hostname: apiHostname,

View File

@@ -53,13 +53,6 @@ const CORE_TYPES = [
'trimLeft',
'trimRight'
];
// Types that are defined in a common layer but are known to be only
// available in native environments should not be allowed in browser
const NATIVE_TYPES = [
'NativeParsedArgs',
'INativeEnvironmentService',
'INativeWindowConfiguration'
];
const RULES = [
// Tests: skip
{
@@ -75,37 +68,6 @@ const RULES = [
'MessageEvent',
'data'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/argv.ts
{
target: '**/{vs,sql}/platform/environment/common/argv.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/environment.ts
{
target: '**/{vs,sql}/platform/environment/common/environment.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/windows/common/windows.ts
{
target: '**/{vs,sql}/platform/windows/common/windows.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -119,7 +81,6 @@ const RULES = [
// Safe access to global
'global'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -129,7 +90,6 @@ const RULES = [
{
target: '**/{vs,sql}/**/common/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -139,7 +99,6 @@ const RULES = [
{
target: '**/{vs,sql}/**/browser/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -148,7 +107,6 @@ const RULES = [
{
target: '**/src/{vs,sql}/editor/contrib/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -174,7 +132,7 @@ const RULES = [
},
// Electron (sandbox)
{
target: '**/{vs,sql}/**/electron-sandbox/**',
target: '**/vs/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
@@ -204,7 +162,7 @@ let hasErrors = false;
function checkFile(program, sourceFile, rule) {
checkNode(sourceFile);
function checkNode(node) {
var _a, _b;
var _a;
if (node.kind !== ts.SyntaxKind.Identifier) {
return ts.forEachChild(node, checkNode); // recurse down
}
@@ -212,12 +170,6 @@ function checkFile(program, sourceFile, rule) {
if ((_a = rule.allowedTypes) === null || _a === void 0 ? void 0 : _a.some(allowed => allowed === text)) {
return; // override
}
if ((_b = rule.disallowedTypes) === null || _b === void 0 ? void 0 : _b.some(disallowed => disallowed === text)) {
const { line, character } = sourceFile.getLineAndCharacterOfPosition(node.getStart());
console.log(`[build/lib/layersChecker.ts]: Reference to '${text}' violates layer '${rule.target}' (${sourceFile.fileName} (${line + 1},${character + 1})`);
hasErrors = true;
return;
}
const checker = program.getTypeChecker();
const symbol = checker.getSymbolAtLocation(node);
if (symbol) {

View File

@@ -55,14 +55,6 @@ const CORE_TYPES = [
'trimRight'
];
// Types that are defined in a common layer but are known to be only
// available in native environments should not be allowed in browser
const NATIVE_TYPES = [
'NativeParsedArgs',
'INativeEnvironmentService',
'INativeWindowConfiguration'
];
const RULES = [
// Tests: skip
@@ -81,40 +73,6 @@ const RULES = [
'MessageEvent',
'data'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/argv.ts
{
target: '**/{vs,sql}/platform/environment/common/argv.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/environment.ts
{
target: '**/{vs,sql}/platform/environment/common/environment.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/windows/common/windows.ts
{
target: '**/{vs,sql}/platform/windows/common/windows.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -130,7 +88,6 @@ const RULES = [
// Safe access to global
'global'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -141,7 +98,6 @@ const RULES = [
{
target: '**/{vs,sql}/**/common/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -152,7 +108,6 @@ const RULES = [
{
target: '**/{vs,sql}/**/browser/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -162,7 +117,6 @@ const RULES = [
{
target: '**/src/{vs,sql}/editor/contrib/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -191,7 +145,7 @@ const RULES = [
// Electron (sandbox)
{
target: '**/{vs,sql}/**/electron-sandbox/**',
target: '**/vs/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
@@ -227,7 +181,6 @@ interface IRule {
skip?: boolean;
allowedTypes?: string[];
disallowedDefinitions?: string[];
disallowedTypes?: string[];
}
let hasErrors = false;
@@ -246,14 +199,6 @@ function checkFile(program: ts.Program, sourceFile: ts.SourceFile, rule: IRule)
return; // override
}
if (rule.disallowedTypes?.some(disallowed => disallowed === text)) {
const { line, character } = sourceFile.getLineAndCharacterOfPosition(node.getStart());
console.log(`[build/lib/layersChecker.ts]: Reference to '${text}' violates layer '${rule.target}' (${sourceFile.fileName} (${line + 1},${character + 1})`);
hasErrors = true;
return;
}
const checker = program.getTypeChecker();
const symbol = checker.getSymbolAtLocation(node);
if (symbol) {

View File
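The NATIVE_TYPES machinery removed above worked by listing type names that only exist in native (Node/Electron) environments and flagging any identifier in a common- or browser-layer file that matches one of them. A reduced TypeScript sketch of the per-identifier check, reusing the rule shape the file declares; it shows only the control flow and is not a replacement for the real layersChecker:

interface IRule {
  target: string;
  allowedTypes?: string[];
  disallowedTypes?: string[];
}

// Given an identifier encountered while walking a source file, decide whether
// the rule for that file's layer forbids it.
function violatesLayer(identifier: string, rule: IRule): boolean {
  if (rule.allowedTypes?.includes(identifier)) {
    return false; // explicitly allowed core type overrides everything
  }
  return rule.disallowedTypes?.includes(identifier) ?? false;
}

// e.g. violatesLayer('INativeEnvironmentService',
//   { target: '**/{vs,sql}/**/common/**', disallowedTypes: ['INativeEnvironmentService'] }) === true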

@@ -15,7 +15,7 @@ const yarn = process.platform === 'win32' ? 'yarn.cmd' : 'yarn';
const rootDir = path.resolve(__dirname, '..', '..');
function runProcess(command: string, args: ReadonlyArray<string> = []) {
return new Promise<void>((resolve, reject) => {
return new Promise((resolve, reject) => {
const child = spawn(command, args, { cwd: rootDir, stdio: 'inherit', env: process.env });
child.on('exit', err => !err ? resolve() : process.exit(err ?? 1));
child.on('error', reject);

View File

@@ -60,12 +60,12 @@
"git": {
"name": "electron",
"repositoryUrl": "https://github.com/electron/electron",
"commitHash": "fb03807cd21915ddc3aa2521ba4f5ba14597bd7e"
"commitHash": "03c7a54dc534ce1867d4393b9b1a6989d4a7e005"
}
},
"isOnlyProductionDependency": true,
"license": "MIT",
"version": "9.3.0"
"version": "9.2.1"
},
{
"component": {

View File

@@ -2,7 +2,7 @@
"name": "agent",
"displayName": "SQL Server Agent",
"description": "Manage and troubleshoot SQL Server Agent jobs",
"version": "0.49.0",
"version": "0.48.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",

View File

@@ -1,3 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8.7 7.9L15.8 15L15 15.8L7.9 8.7L0.8 15.8L0 15L7.1 7.9L0 0.8L0.8 0L7.9 7.1L15 0L15.8 0.8L8.7 7.9Z" fill="#0078D4"/>
</svg>


View File

@@ -1,3 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 2048 2048" width="16" height="16">
<path d="M960 1920q-133 0-255-34t-230-96-194-150-150-195-97-229T0 960q0-133 34-255t96-230 150-194 195-150 229-97T960 0q133 0 255 34t230 96 194 150 150 195 97 229 34 256q0 133-34 255t-96 230-150 194-195 150-229 97-256 34zm0-1792q-115 0-221 30t-198 84-169 130-130 168-84 199-30 221q0 114 30 220t84 199 130 169 168 130 199 84 221 30q114 0 220-30t199-84 169-130 130-168 84-199 30-221q0-114-30-220t-84-199-130-169-168-130-199-84-221-30zm-64 640h128v640H896V768zm0-256h128v128H896V512z" />
</svg>


View File

@@ -1,3 +0,0 @@
<svg width="16" height="14" viewBox="0 0 16 14" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M14 0H14.4L14.7 0.2L14.9 0.5C14.9524 0.570883 14.9885 0.652432 15.0058 0.738849C15.023 0.825265 15.0211 0.914429 15 1V14H2.8L0.999997 12.2V1C0.985033 0.85904 1.02046 0.717335 1.1 0.6L1.3 0.3L1.6 0.1H14V0ZM14 1H13V7H3V1H2V11.8L3.2 13H4V9H11V13H14V1ZM4 6H12V1H4V6ZM10 10H5V13H6V11H7V13H10V10Z" fill="#0078D4"/>
</svg>


View File

@@ -65,7 +65,13 @@
{
"cell_type": "code",
"source": [
"import sys,os,json,html,getpass,time, tempfile\n",
"import pandas,sys,os,json,html,getpass,time, tempfile\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"def run_command(command):\n",
" print(\"Executing: \" + command)\n",
" !{command}\n",
@@ -132,13 +138,7 @@
" sys.exit(f'Password is required.')\n",
" confirm_password = getpass.getpass(prompt = 'Confirm password')\n",
" if arc_admin_password != confirm_password:\n",
" sys.exit(f'Passwords do not match.')\n",
"\n",
"os.environ[\"SPN_CLIENT_ID\"] = sp_client_id\n",
"os.environ[\"SPN_TENANT_ID\"] = sp_tenant_id\n",
"if \"AZDATA_NB_VAR_SP_CLIENT_SECRET\" in os.environ:\n",
" os.environ[\"SPN_CLIENT_SECRET\"] = os.environ[\"AZDATA_NB_VAR_SP_CLIENT_SECRET\"]\n",
"os.environ[\"SPN_AUTHORITY\"] = \"https://login.microsoftonline.com\""
" sys.exit(f'Passwords do not match.')"
],
"metadata": {
"azdata_cell_guid": "e7e10828-6cae-45af-8c2f-1484b6d4f9ac",
@@ -188,7 +188,7 @@
"os.environ[\"AZDATA_PASSWORD\"] = arc_admin_password\n",
"if os.name == 'nt':\n",
" print(f'If you don\\'t see output produced by azdata, you can run the following command in a terminal window to check the deployment status:\\n\\t {os.environ[\"AZDATA_NB_VAR_KUBECTL\"]} get pods -n {arc_data_controller_namespace}')\n",
"run_command(f'azdata arc dc create --connectivity-mode {arc_data_controller_connectivity_mode} -n {arc_data_controller_name} -ns {arc_data_controller_namespace} -s {arc_subscription} -g {arc_resource_group} -l {arc_data_controller_location} -sc {arc_data_controller_storage_class} --profile-name {arc_profile}')\n",
"run_command(f'azdata arc dc create --connectivity-mode Indirect -n {arc_data_controller_name} -ns {arc_data_controller_namespace} -s {arc_subscription} -g {arc_resource_group} -l {arc_data_controller_location} -sc {arc_data_controller_storage_class} --profile-name {arc_profile}')\n",
"print(f'Azure Arc Data Controller: {arc_data_controller_name} created.') "
],
"metadata": {

View File

@@ -114,8 +114,6 @@
"# Login to the data controller.\n",
"#\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_CONTROLLER_PASSWORD\"]\n",
"os.environ[\"KUBECONFIG\"] = controller_kubeconfig\n",
"os.environ[\"KUBECTL_CONTEXT\"] = controller_kubectl_context\n",
"cmd = f'azdata login -e {controller_endpoint} -u {controller_username}'\n",
"out=run_command()"
],

View File

@@ -114,8 +114,6 @@
"# Login to the data controller.\n",
"#\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_CONTROLLER_PASSWORD\"]\n",
"os.environ[\"KUBECONFIG\"] = controller_kubeconfig\n",
"os.environ[\"KUBECTL_CONTEXT\"] = controller_kubectl_context\n",
"cmd = f'azdata login -e {controller_endpoint} -u {controller_username}'\n",
"out=run_command()"
],

View File

@@ -2,14 +2,14 @@
"name": "arc",
"displayName": "%arc.displayName%",
"description": "%arc.description%",
"version": "0.7.0",
"version": "0.5.1",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",
"icon": "images/extension.png",
"engines": {
"vscode": "*",
"azdata": ">=1.25.0"
"azdata": ">=1.22.0"
},
"activationEvents": [
"onCommand:arc.connectToController",
@@ -136,12 +136,11 @@
"displayName": "%resource.type.azure.arc.display.name%",
"description": "%resource.type.azure.arc.description%",
"platforms": "*",
"icon": "./images/data_controller.svg",
"tags": [
"Hybrid",
"SQL Server",
"PostgreSQL"
],
"icon": {
"light": "./images/data_controller.svg",
"dark": "./images/data_controller.svg"
},
"tags": ["Hybrid", "SQL Server", "PostgreSQL"],
"providers": [
{
"notebookWizard": {
@@ -200,7 +199,7 @@
]
},
{
"title": "%arc.data.controller.create.azureconfig.title%",
"title": "%arc.data.controller.data.controller.create.title%",
"sections": [
{
"title": "%arc.data.controller.project.details.title%",
@@ -214,14 +213,53 @@
"type": "azure_account",
"required": true,
"subscriptionVariableName": "AZDATA_NB_VAR_ARC_SUBSCRIPTION",
"displaySubscriptionVariableName": "AZDATA_NB_VAR_ARC_DISPLAY_SUBSCRIPTION",
"resourceGroupVariableName": "AZDATA_NB_VAR_ARC_RESOURCE_GROUP"
}
]
},
{
"title": "%arc.data.controller.data.controller.details.title%",
"fields": [
{
"type": "readonly_text",
"label": "%arc.data.controller.data.controller.details.description%",
"labelWidth": "600px"
},
{
"type": "text",
"label": "%arc.data.controller.arc.data.controller.namespace%",
"textValidationRequired": true,
"textValidationRegex": "^[a-z0-9]([-a-z0-9]{0,61}[a-z0-9])?$",
"textValidationDescription": "%arc.data.controller.arc.data.controller.namespace.validation.description%",
"defaultValue": "arc",
"required": true,
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_NAMESPACE"
},
{
"type": "text",
"label": "%arc.data.controller.arc.data.controller.name%",
"textValidationRequired": true,
"textValidationRegex": "^[a-z0-9]([-.a-z0-9]{0,251}[a-z0-9])?$",
"textValidationDescription": "%arc.data.controller.arc.data.controller.name.validation.description%",
"defaultValue": "arc-dc",
"required": true,
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_NAME"
},
{
"label": "%arc.storage-class.dc.label%",
"description": "%arc.sql.storage-class.dc.description%",
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_STORAGE_CLASS",
"type": "kube_storage_class",
"required": true
},
{
"type": "azure_locations",
"label": "%arc.data.controller.location%",
"label": "%arc.data.controller.arc.data.controller.location%",
"defaultValue": "eastus",
"required": true,
"locationVariableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_LOCATION",
"displayLocationVariableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_DISPLAY_LOCATION",
"locations": [
"australiaeast",
"centralus",
@@ -239,141 +277,6 @@
}
]
},
{
"title": "%arc.data.controller.connectivitymode%",
"fields": [
{
"type": "readonly_text",
"label": "%arc.data.controller.connectivitymode.description%",
"labelWidth": "600px"
},
{
"type": "options",
"label": "%arc.data.controller.connectivitymode%",
"required": true,
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_CONNECTIVITY_MODE",
"options": {
"values": [
{
"name": "indirect",
"displayName": "%arc.data.controller.indirect%"
},
{
"name": "direct",
"displayName": "%arc.data.controller.direct%"
}
],
"defaultValue": "%arc.data.controller.indirect%",
"optionsType": "radio"
}
},
{
"type": "readonly_text",
"label": "%arc.data.controller.serviceprincipal.description%",
"labelWidth": "600px",
"links": [
{
"text": "%arc.data.controller.readmore%",
"url": "https://docs.microsoft.com/azure/azure-arc/data/upload-metrics"
}
]
},
{
"label": "%arc.data.controller.spclientid%",
"description": "%arc.data.controller.spclientid.description%",
"variableName": "AZDATA_NB_VAR_SP_CLIENT_ID",
"type": "text",
"required": true,
"defaultValue": "",
"placeHolder": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"enabled": {
"target": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_CONNECTIVITY_MODE",
"value": "direct"
},
"validations" : [{
"type": "regex_match",
"regex": "^[0-9A-Fa-f]{8}-([0-9A-Fa-f]{4}-){3}[0-9A-Fa-f]{12}$",
"description": "%arc.data.controller.spclientid.validation.description%"
}]
},
{
"label": "%arc.data.controller.spclientsecret%",
"description": "%arc.data.controller.spclientsecret.description%",
"variableName": "AZDATA_NB_VAR_SP_CLIENT_SECRET",
"type": "password",
"required": true,
"defaultValue": "",
"enabled": {
"target": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_CONNECTIVITY_MODE",
"value": "direct"
}
},
{
"label": "%arc.data.controller.sptenantid%",
"description": "%arc.data.controller.sptenantid.description%",
"variableName": "AZDATA_NB_VAR_SP_TENANT_ID",
"type": "text",
"required": true,
"defaultValue": "",
"enabled": false,
"valueProvider": {
"providerId": "subscription-id-to-tenant-id",
"triggerField": "AZDATA_NB_VAR_ARC_SUBSCRIPTION"
},
"validations" : [{
"type": "regex_match",
"regex": "^[0-9A-Fa-f]{8}-([0-9A-Fa-f]{4}-){3}[0-9A-Fa-f]{12}$",
"description": "%arc.data.controller.sptenantid.validation.description%"
}]
}
]
}
]
},
{
"title": "%arc.data.controller.create.controllerconfig.title%",
"sections": [
{
"title": "%arc.data.controller.details.title%",
"fields": [
{
"type": "readonly_text",
"label": "%arc.data.controller.details.description%",
"labelWidth": "600px"
},
{
"type": "text",
"label": "%arc.data.controller.namespace%",
"validations" : [{
"type": "regex_match",
"regex": "^[a-z0-9]([-a-z0-9]{0,61}[a-z0-9])?$",
"description": "%arc.data.controller.namespace.validation.description%"
}],
"defaultValue": "arc",
"required": true,
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_NAMESPACE"
},
{
"type": "text",
"label": "%arc.data.controller.name%",
"validations" : [{
"type": "regex_match",
"regex": "^[a-z0-9]([-.a-z0-9]{0,251}[a-z0-9])?$",
"description": "%arc.data.controller.name.validation.description%"
}],
"defaultValue": "arc-dc",
"required": true,
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_NAME"
},
{
"label": "%arc.storage-class.dc.label%",
"description": "%arc.sql.storage-class.dc.description%",
"variableName": "AZDATA_NB_VAR_ARC_DATA_CONTROLLER_STORAGE_CLASS",
"type": "kube_storage_class",
"required": true
}
]
},
{
"title": "%arc.data.controller.admin.account.title%",
"fields": [
@@ -400,7 +303,7 @@
]
},
{
"title": "%arc.data.controller.create.summary.title%",
"title": "%arc.data.controller.data.controller.create.summary.title%",
"isSummaryPage": true,
"fieldHeight": "16px",
"sections": [
@@ -553,30 +456,6 @@
},
{
"title": "%arc.data.controller.summary.azure%",
"fields": [
{
"label": "%arc.data.controller.summary.subscription%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_SUBSCRIPTION)",
"inputWidth": "600"
},
{
"label": "%arc.data.controller.summary.resource.group%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_RESOURCE_GROUP)"
},
{
"label": "%arc.data.controller.summary.location%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_DATA_CONTROLLER_LOCATION)"
}
]
},
{
"title": "%arc.data.controller.summary.controller%",
"fields": [
{
"label": "%arc.data.controller.summary.data.controller.namespace%",
@@ -591,10 +470,23 @@
"defaultValue": "$(AZDATA_NB_VAR_ARC_DATA_CONTROLLER_NAME)"
},
{
"label": "%arc.data.controller.connectivitymode%",
"label": "%arc.data.controller.summary.subscription%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_DATA_CONTROLLER_CONNECTIVITY_MODE)"
"defaultValue": "$(AZDATA_NB_VAR_ARC_DISPLAY_SUBSCRIPTION)",
"inputWidth": "600"
},
{
"label": "%arc.data.controller.summary.resource.group%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_RESOURCE_GROUP)"
},
{
"label": "%arc.data.controller.summary.location%",
"type": "readonly_text",
"isEvaluated": true,
"defaultValue": "$(AZDATA_NB_VAR_ARC_DATA_CONTROLLER_DISPLAY_LOCATION)"
}
]
}
@@ -620,11 +512,11 @@
"displayName": "%resource.type.arc.sql.display.name%",
"description": "%resource.type.arc.sql.description%",
"platforms": "*",
"icon": "./images/miaa.svg",
"tags": [
"Hybrid",
"SQL Server"
],
"icon": {
"light": "./images/miaa.svg",
"dark": "./images/miaa.svg"
},
"tags": ["Hybrid", "SQL Server"],
"providers": [
{
"notebookWizard": {
@@ -672,23 +564,21 @@
"variableName": "AZDATA_NB_VAR_SQL_INSTANCE_NAME",
"type": "text",
"defaultValue": "sqlinstance1",
"description": "%arc.sql.invalid.instance.name%",
"required": true,
"validations" : [{
"type": "regex_match",
"regex": "^[a-z]([-a-z0-9]{0,11}[a-z0-9])?$",
"description": "%arc.sql.invalid.instance.name%"
}]
"textValidationRequired": true,
"textValidationRegex": "^[a-z]([-a-z0-9]{0,11}[a-z0-9])?$",
"textValidationDescription": "%arc.sql.invalid.instance.name%"
},
{
"label": "%arc.sql.username%",
"variableName": "AZDATA_NB_VAR_SQL_USERNAME",
"type": "text",
"description": "%arc.sql.invalid.username%",
"required": true,
"validations" : [{
"type": "regex_match",
"regex": "^(?!sa$)",
"description": "%arc.sql.invalid.username%"
}]
"textValidationRequired": true,
"textValidationRegex": "^(?!sa$)",
"textValidationDescription": "%arc.sql.invalid.username%"
},
{
"label": "%arc.password%",
@@ -725,14 +615,7 @@
"variableName": "AZDATA_NB_VAR_SQL_CORES_REQUEST",
"type": "number",
"min": 1,
"required": false,
"validations": [
{
"type": "<=",
"target": "AZDATA_NB_VAR_SQL_CORES_LIMIT",
"description": "%requested.cores.less.than.or.equal.to.cores.limit%"
}
]
"required": false
},
{
"label": "%arc.cores-limit.label%",
@@ -740,14 +623,7 @@
"variableName": "AZDATA_NB_VAR_SQL_CORES_LIMIT",
"type": "number",
"min": 1,
"required": false,
"validations": [
{
"type": ">=",
"target": "AZDATA_NB_VAR_SQL_CORES_REQUEST",
"description": "%cores.limit.greater.than.or.equal.to.requested.cores%"
}
]
"required": false
},
{
"label": "%arc.memory-request.label%",
@@ -755,12 +631,7 @@
"variableName": "AZDATA_NB_VAR_SQL_MEMORY_REQUEST",
"type": "number",
"min": 2,
"required": false,
"validations": [{
"type": "<=",
"target": "AZDATA_NB_VAR_SQL_MEMORY_LIMIT",
"description": "%requested.memory.less.than.or.equal.to.memory.limit%"
}]
"required": false
},
{
"label": "%arc.memory-limit.label%",
@@ -768,12 +639,7 @@
"variableName": "AZDATA_NB_VAR_SQL_MEMORY_LIMIT",
"type": "number",
"min": 2,
"required": false,
"validations": [{
"type": ">=",
"target": "AZDATA_NB_VAR_SQL_MEMORY_REQUEST",
"description": "%memory.limit.greater.than.or.equal.to.requested.memory%"
}]
"required": false
}
]
}
@@ -812,11 +678,11 @@
"displayName": "%resource.type.arc.postgres.display.name%",
"description": "%resource.type.arc.postgres.description%",
"platforms": "*",
"icon": "./images/postgres.svg",
"tags": [
"Hybrid",
"PostgreSQL"
],
"icon": {
"light": "./images/postgres.svg",
"dark": "./images/postgres.svg"
},
"tags": ["Hybrid", "PostgreSQL"],
"providers": [
{
"notebookWizard": {
@@ -864,11 +730,9 @@
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME",
"type": "text",
"description": "%arc.postgres.server.group.name.validation.description%",
"validations" : [{
"type": "regex_match",
"regex": "^[a-z]([-a-z0-9]{0,10}[a-z0-9])?$",
"description": "%arc.postgres.server.group.name.validation.description%"
}],
"textValidationRequired": true,
"textValidationRegex": "^[a-z]([-a-z0-9]{0,10}[a-z0-9])?$",
"textValidationDescription": "%arc.postgres.server.group.name.validation.description%",
"required": true
},
{
@@ -885,10 +749,6 @@
"description": "%arc.postgres.server.group.workers.description%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_WORKERS",
"type": "number",
"validations": [{
"type": "is_integer",
"description": "%should.be.integer%"
}],
"defaultValue": "0",
"min": 0
},
@@ -896,10 +756,6 @@
"label": "%arc.postgres.server.group.port%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PORT",
"type": "number",
"validations": [{
"type": "is_integer",
"description": "%should.be.integer%"
}],
"defaultValue": "5432",
"min": 1,
"max": 65535
@@ -980,48 +836,28 @@
"description": "%arc.postgres.server.group.cores.request.description%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_REQUEST",
"type": "number",
"min": 1,
"validations": [{
"type": "<=",
"target": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_LIMIT",
"description": "%requested.cores.less.than.or.equal.to.cores.limit%"
}]
"min": 1
},
{
"label": "%arc.postgres.server.group.cores.limit.label%",
"description": "%arc.postgres.server.group.cores.limit.description%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_LIMIT",
"type": "number",
"min": 1,
"validations": [{
"type": ">=",
"target": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_REQUEST",
"description": "%cores.limit.greater.than.or.equal.to.requested.cores%"
}]
"min": 1
},
{
"label": "%arc.postgres.server.group.memory.request.label%",
"description": "%arc.postgres.server.group.memory.request.description%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_REQUEST",
"type": "number",
"min": 0.25,
"validations": [{
"type": "<=",
"target": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_LIMIT",
"description": "%requested.memory.less.than.or.equal.to.memory.limit%"
}]
"min": 0.25
},
{
"label": "%arc.postgres.server.group.memory.limit.label%",
"description": "%arc.postgres.server.group.memory.limit.description%",
"variableName": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_LIMIT",
"type": "number",
"min": 0.25,
"validations": [{
"type": ">=",
"target": "AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_REQUEST",
"description": "%memory.limit.greater.than.or.equal.to.requested.memory%"
}]
"min": 0.25
}
]
}
@@ -1060,8 +896,7 @@
"dependencies": {
"request": "^2.88.0",
"uuid": "^8.3.0",
"vscode-nls": "^4.1.2",
"yamljs": "^0.3.0"
"vscode-nls": "^4.1.2"
},
"devDependencies": {
"@types/mocha": "^5.2.5",
@@ -1069,7 +904,6 @@
"@types/request": "^2.48.3",
"@types/sinon": "^9.0.4",
"@types/uuid": "^8.3.0",
"@types/yamljs": "^0.2.31",
"mocha": "^5.2.0",
"mocha-junit-reporter": "^1.17.0",
"mocha-multi-reporters": "^1.1.7",

View File
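The package.json hunks above swap the newer declarative "validations" arrays (regex_match plus cross-field "<="/">=" checks) back to the older textValidation* properties. A minimal TypeScript sketch of how such validation entries might be evaluated against the wizard's variable values — the types here are simplified assumptions, not the resource-deployment extension's actual model:

type Validation =
  | { type: 'regex_match'; regex: string; description: string }
  | { type: '<=' | '>='; target: string; description: string };

// Evaluate one field's validations against the current wizard values and
// return the description of every rule that fails.
function validateField(value: string, validations: Validation[], values: Record<string, string>): string[] {
  const errors: string[] = [];
  for (const v of validations) {
    if (v.type === 'regex_match') {
      if (!new RegExp(v.regex).test(value)) { errors.push(v.description); }
    } else {
      // Cross-field comparison, e.g. requested cores <= cores limit.
      const left = Number(value);
      const right = Number(values[v.target]);
      const ok = v.type === '<=' ? left <= right : left >= right;
      if (!ok) { errors.push(v.description); }
    }
  }
  return errors;
}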

@@ -20,35 +20,21 @@
"arc.data.controller.kube.cluster.context": "Cluster context",
"arc.data.controller.cluster.config.profile.title": "Choose the config profile",
"arc.data.controller.cluster.config.profile": "Config profile",
"arc.data.controller.create.azureconfig.title": "Azure and Connectivity Configuration",
"arc.data.controller.connectivitymode.description": "Select the connectivity mode for the controller.",
"arc.data.controller.create.controllerconfig.title": "Controller Configuration",
"arc.data.controller.project.details.title": "Azure details",
"arc.data.controller.data.controller.create.title": "Provide details to create Azure Arc data controller",
"arc.data.controller.project.details.title": "Project details",
"arc.data.controller.project.details.description": "Select the subscription to manage deployed resources and costs. Use resource groups like folders to organize and manage all your resources.",
"arc.data.controller.details.title": "Data controller details",
"arc.data.controller.details.description": "Provide a namespace, name and storage class for your Azure Arc data controller. This name will be used to identify your Arc instance for remote management and monitoring.",
"arc.data.controller.namespace": "Data controller namespace",
"arc.data.controller.namespace.validation.description": "Namespace must consist of lower case alphanumeric characters or '-', start/end with an alphanumeric character, and be 63 characters or fewer in length.",
"arc.data.controller.name": "Data controller name",
"arc.data.controller.name.validation.description": "Name must consist of lower case alphanumeric characters, '-' or '.', start/end with an alphanumeric character and be 253 characters or less in length.",
"arc.data.controller.location": "Location",
"arc.data.controller.data.controller.details.title": "Data controller details",
"arc.data.controller.data.controller.details.description": "Provide an Azure region and a name for your Azure Arc data controller. This name will be used to identify your Arc location for remote management and monitoring.",
"arc.data.controller.arc.data.controller.namespace": "Data controller namespace",
"arc.data.controller.arc.data.controller.namespace.validation.description": "Namespace must consist of lower case alphanumeric characters or '-', start/end with an alphanumeric character, and be 63 characters or fewer in length.",
"arc.data.controller.arc.data.controller.name": "Data controller name",
"arc.data.controller.arc.data.controller.name.validation.description": "Name must consist of lower case alphanumeric characters, '-' or '.', start/end with an alphanumeric character and be 253 characters or less in length.",
"arc.data.controller.arc.data.controller.location": "Location",
"arc.data.controller.admin.account.title": "Administrator account",
"arc.data.controller.admin.account.name": "Data controller login",
"arc.data.controller.admin.account.password": "Password",
"arc.data.controller.admin.account.confirm.password": "Confirm password",
"arc.data.controller.connectivitymode": "Connectivity Mode",
"arc.data.controller.direct": "Direct",
"arc.data.controller.indirect": "Indirect",
"arc.data.controller.serviceprincipal.description": "When deploying a controller in direct connected mode a Service Principal is required for uploading metrics to Azure. {0} about how to create this Service Principal and assign it the correct roles.",
"arc.data.controller.spclientid": "Service Principal Client ID",
"arc.data.controller.spclientid.description": "The Application (client) ID of the created Service Principal",
"arc.data.controller.spclientid.validation.description": "The client ID must be a GUID in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"arc.data.controller.spclientsecret": "Service Principal Client Secret",
"arc.data.controller.spclientsecret.description": "The password generated during creation of the Service Principal",
"arc.data.controller.sptenantid": "Service Principal Tenant ID",
"arc.data.controller.sptenantid.description": "The Tenant ID of the Service Principal. This must be the same as the Tenant ID of the subscription selected to create this controller for.",
"arc.data.controller.sptenantid.validation.description": "The tenant ID must be a GUID in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"arc.data.controller.create.summary.title": "Review your configuration",
"arc.data.controller.data.controller.create.summary.title": "Review your configuration",
"arc.data.controller.summary.arc.data.controller": "Azure Arc data controller",
"arc.data.controller.summary.estimated.cost.per.month": "Estimated cost per month",
"arc.data.controller.summary.arc.by.microsoft" : "by Microsoft",
@@ -69,10 +55,8 @@
"arc.data.controller.summary.resource.group": "Resource group",
"arc.data.controller.summary.data.controller.name": "Data controller name",
"arc.data.controller.summary.data.controller.namespace": "Data controller namespace",
"arc.data.controller.summary.controller": "Controller",
"arc.data.controller.summary.location": "Location",
"arc.data.controller.agreement": "I accept {0} and {1}.",
"arc.data.controller.readmore": "Read more",
"arc.data.controller.arc.data.controller.agreement": "I accept {0} and {1}.",
"microsoft.agreement.privacy.statement":"Microsoft Privacy Statement",
"deploy.script.action":"Script to notebook",
"deploy.done.action":"Deploy",
@@ -145,11 +129,6 @@
"arc.postgres.server.group.memory.limit.label": "Memory limit (GB per node)",
"arc.postgres.server.group.memory.limit.description": "The memory limit of the Postgres instance per node in GB.",
"arc.agreement": "I accept {0} and {1}.",
"arc.agreement.sql.terms.conditions": "Azure SQL managed instance - Azure Arc terms and conditions",
"arc.agreement.postgres.terms.conditions": "Azure Arc enabled PostgreSQL Hyperscale terms and conditions",
"should.be.integer": "Value must be an integer",
"requested.cores.less.than.or.equal.to.cores.limit": "Requested cores must be less than or equal to cores limit",
"cores.limit.greater.than.or.equal.to.requested.cores": "Cores limit must be greater than or equal to requested cores",
"requested.memory.less.than.or.equal.to.memory.limit": "Requested memory must be less than or equal to memory limit",
"memory.limit.greater.than.or.equal.to.requested.memory": "Memory limit must be greater than or equal to requested memory"
"arc.agreement.sql.terms.conditions":"Azure SQL managed instance - Azure Arc terms and conditions",
"arc.agreement.postgres.terms.conditions":"Azure Arc enabled PostgreSQL Hyperscale terms and conditions"
}

View File

@@ -4,17 +4,11 @@
*--------------------------------------------------------------------------------------------*/
import * as arc from 'arc';
import * as rd from 'resource-deployment';
import * as loc from '../localizedConstants';
import { PasswordToControllerDialog } from '../ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
import { ControllerTreeNode } from '../ui/tree/controllerTreeNode';
import { UserCancelledError } from './utils';
export class UserCancelledError extends Error implements rd.ErrorWithType {
public get type(): rd.ErrorType {
return rd.ErrorType.userCancelled;
}
}
export function arcApi(treeDataProvider: AzureArcTreeDataProvider): arc.IExtension {
return {
getRegisteredDataControllers: () => getRegisteredDataControllers(treeDataProvider),
@@ -22,13 +16,12 @@ export function arcApi(treeDataProvider: AzureArcTreeDataProvider): arc.IExtensi
reacquireControllerPassword: (controllerInfo: arc.ControllerInfo) => reacquireControllerPassword(treeDataProvider, controllerInfo)
};
}
export async function reacquireControllerPassword(treeDataProvider: AzureArcTreeDataProvider, controllerInfo: arc.ControllerInfo): Promise<string> {
const dialog = new PasswordToControllerDialog(treeDataProvider);
dialog.showDialog(controllerInfo);
const model = await dialog.waitForClose();
if (!model) {
throw new UserCancelledError(loc.userCancelledError);
throw new UserCancelledError();
}
return model.password;
}

View File

@@ -1,39 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as os from 'os';
import * as path from 'path';
import * as yamljs from 'yamljs';
import * as loc from '../localizedConstants';
import { throwUnless } from './utils';
export interface KubeClusterContext {
name: string;
isCurrentContext: boolean;
}
export function getKubeConfigClusterContexts(configFile: string): Promise<KubeClusterContext[]> {
const config: any = yamljs.load(configFile);
const rawContexts = <any[]>config['contexts'];
throwUnless(rawContexts && rawContexts.length, loc.noContextFound(configFile));
const currentContext = <string>config['current-context'];
throwUnless(currentContext, loc.noCurrentContextFound(configFile));
const contexts: KubeClusterContext[] = [];
rawContexts.forEach(rawContext => {
const name = <string>rawContext['name'];
throwUnless(name, loc.noNameInContext(configFile));
if (name) {
contexts.push({
name: name,
isCurrentContext: name === currentContext
});
}
});
return Promise.resolve(contexts);
}
export function getDefaultKubeConfigPath(): string {
return path.join(os.homedir(), '.kube', 'config');
}

View File

@@ -9,6 +9,8 @@ import * as vscode from 'vscode';
import { ConnectionMode, IconPath, IconPathHelper } from '../constants';
import * as loc from '../localizedConstants';
export class UserCancelledError extends Error { }
/**
* Converts the resource type name into the localized Display Name for that type.
* @param resourceType The resource type name to convert
@@ -65,7 +67,7 @@ export function getResourceTypeIcon(resourceType: string | undefined): IconPath
/**
* Returns the text to display for known connection modes
* @param connectionMode The string representing the connection mode
* @param connectionMode The string representing the connection mode
*/
export function getConnectionModeDisplayText(connectionMode: string | undefined): string {
connectionMode = connectionMode ?? '';
@@ -216,63 +218,6 @@ export function createCredentialId(controllerId: string, resourceType: string, i
}
/**
* Calculates the gibibyte (GiB) conversion of a quantity that could currently be represented by a range
* of SI suffixes (E, P, T, G, M, K, m) or their power-of-two equivalents (Ei, Pi, Ti, Gi, Mi, Ki)
* @param value The string of a quantity to be converted
* @returns String of GiB conversion
*/
export function convertToGibibyteString(value: string): string {
if (!value) {
throw new Error(`Value provided is not a valid Kubernetes resource quantity`);
}
let base10ToBase2Multiplier;
let floatValue = parseFloat(value);
let splitValue = value.split(String(floatValue));
let unit = splitValue[1];
if (unit === 'K') {
base10ToBase2Multiplier = 1000 / 1024;
floatValue = (floatValue * base10ToBase2Multiplier) / Math.pow(1024, 2);
} else if (unit === 'M') {
base10ToBase2Multiplier = Math.pow(1000, 2) / Math.pow(1024, 2);
floatValue = (floatValue * base10ToBase2Multiplier) / 1024;
} else if (unit === 'G') {
base10ToBase2Multiplier = Math.pow(1000, 3) / Math.pow(1024, 3);
floatValue = floatValue * base10ToBase2Multiplier;
} else if (unit === 'T') {
base10ToBase2Multiplier = Math.pow(1000, 4) / Math.pow(1024, 4);
floatValue = (floatValue * base10ToBase2Multiplier) * 1024;
} else if (unit === 'P') {
base10ToBase2Multiplier = Math.pow(1000, 5) / Math.pow(1024, 5);
floatValue = (floatValue * base10ToBase2Multiplier) * Math.pow(1024, 2);
} else if (unit === 'E') {
base10ToBase2Multiplier = Math.pow(1000, 6) / Math.pow(1024, 6);
floatValue = (floatValue * base10ToBase2Multiplier) * Math.pow(1024, 3);
} else if (unit === 'm') {
floatValue = (floatValue / 1000) / Math.pow(1024, 3);
} else if (unit === '') {
floatValue = floatValue / Math.pow(1024, 3);
} else if (unit === 'Ki') {
floatValue = floatValue / Math.pow(1024, 2);
} else if (unit === 'Mi') {
floatValue = floatValue / 1024;
} else if (unit === 'Gi') {
floatValue = floatValue;
} else if (unit === 'Ti') {
floatValue = floatValue * 1024;
} else if (unit === 'Pi') {
floatValue = floatValue * Math.pow(1024, 2);
} else if (unit === 'Ei') {
floatValue = floatValue * Math.pow(1024, 3);
} else {
throw new Error(`${value} is not a valid Kubernetes resource quantity`);
}
return String(floatValue);
}
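A quick worked check of the arithmetic above, using the 'M' branch as an example (values are illustrative only): 1100M is 1100 × 1000² bytes, and dividing by 1024³ gives roughly 1.0245 GiB, which the function returns as a string.
// Illustrative only: (1100 * 1000^2 / 1024^2) / 1024 ≈ 1.0245 GiB
const approxGib = parseFloat(convertToGibibyteString('1100M'));
console.log(approxGib.toFixed(4)); // "1.0245"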
/*
* Throws an Error with given {@link message} unless {@link condition} is true.
* This also tells the typescript compiler that the condition is 'truthy' in the remainder of the scope
* where this function was called.
@@ -280,18 +225,8 @@ export function convertToGibibyteString(value: string): string {
* @param condition
* @param message
*/
export function throwUnless(condition: any, message?: string): asserts condition {
export function throwUnless(condition: boolean, message?: string): asserts condition {
if (!condition) {
throw new Error(message);
}
}
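A minimal usage sketch of the assertion above (the function and parameter names here are illustrative): after the call, the compiler narrows the checked value for the remainder of the scope.
function greetController(name: string | undefined): string {
	// Throws with the given message if name is undefined; afterwards `name` is narrowed to string.
	throwUnless(name !== undefined, 'controller name is required');
	return `Connected to ${name}`;
}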
export async function tryExecuteAction<T>(action: () => T | PromiseLike<T>): Promise<{ result: T | undefined, error: any }> {
let error: any, result: T | undefined;
try {
result = await action();
} catch (e) {
error = e;
}
return { result, error };
}

View File

@@ -40,10 +40,7 @@ export class IconPathHelper {
public static controller: IconPath;
public static health: IconPath;
public static success: IconPath;
public static save: IconPath;
public static discard: IconPath;
public static fail: IconPath;
public static information: IconPath;
public static setExtensionContext(context: vscode.ExtensionContext) {
IconPathHelper.context = context;
@@ -119,22 +116,10 @@ export class IconPathHelper {
light: context.asAbsolutePath('images/success.svg'),
dark: context.asAbsolutePath('images/success.svg'),
};
IconPathHelper.save = {
light: context.asAbsolutePath('images/save.svg'),
dark: context.asAbsolutePath('images/save.svg'),
};
IconPathHelper.discard = {
light: context.asAbsolutePath('images/discard.svg'),
dark: context.asAbsolutePath('images/discard.svg'),
};
IconPathHelper.fail = {
light: context.asAbsolutePath('images/fail.svg'),
dark: context.asAbsolutePath('images/fail.svg'),
};
IconPathHelper.information = {
light: context.asAbsolutePath('images/information.svg'),
dark: context.asAbsolutePath('images/information.svg'),
};
}
}

View File

@@ -67,7 +67,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<arc.IE
// register option sources
const rdApi = <rd.IExtension>vscode.extensions.getExtension(rd.extension.name)?.exports;
context.subscriptions.push(rdApi.registerOptionsSourceProvider(new ArcControllersOptionsSourceProvider(treeDataProvider)));
rdApi.registerOptionsSourceProvider(new ArcControllersOptionsSourceProvider(treeDataProvider));
return arcApi(treeDataProvider);
}

View File

@@ -32,8 +32,6 @@ export const resourceHealth = localize('arc.resourceHealth', "Resource health");
export const newInstance = localize('arc.createNew', "New Instance");
export const deleteText = localize('arc.delete', "Delete");
export const saveText = localize('arc.save', "Save");
export const discardText = localize('arc.discard', "Discard");
export const resetPassword = localize('arc.resetPassword', "Reset Password");
export const openInAzurePortal = localize('arc.openInAzurePortal', "Open in Azure Portal");
export const resourceGroup = localize('arc.resourceGroup', "Resource Group");
@@ -61,10 +59,6 @@ export const yes = localize('arc.yes', "Yes");
export const no = localize('arc.no', "No");
export const feedback = localize('arc.feedback', "Feedback");
export const selectConnectionString = localize('arc.selectConnectionString', "Select from available client connection strings below.");
export const addingWokerNodes = localize('arc.addingWokerNodes', "adding worker nodes");
export const workerNodesDescription = localize('arc.workerNodesDescription', "Expand your server group and scale your database by adding worker nodes.");
export const postgresConfigurationInformation = localize('arc.postgres.configurationInformation', "You can configure the number of CPU cores and storage size that will apply to both worker nodes and coordinator node. Each worker node will have the same configuration. Adjust the number of CPU cores and memory settings for your server group.");
export const workerNodesInformation = localize('arc.workerNodeInformation', "In preview it is not possible to reduce the number of worker nodes. Please refer to documentation linked above for more information.");
export const vCores = localize('arc.vCores', "vCores");
export const ram = localize('arc.ram', "RAM");
export const refresh = localize('arc.refresh', "Refresh");
@@ -120,27 +114,9 @@ export const databaseName = localize('arc.databaseName', "Database name");
export const enterNewPassword = localize('arc.enterNewPassword', "Enter a new password");
export const confirmNewPassword = localize('arc.confirmNewPassword', "Confirm the new password");
export const learnAboutPostgresClients = localize('arc.learnAboutPostgresClients', "Learn more about Azure PostgreSQL Hyperscale client interfaces");
export const scalingCompute = localize('arc.scalingCompute', "scaling compute vCores and memory.");
export const postgresComputeAndStorageDescriptionPartOne = localize('arc.postgresComputeAndStorageDescriptionPartOne', "You can scale your Azure Arc enabled");
export const miaaComputeAndStorageDescriptionPartOne = localize('arc.miaaComputeAndStorageDescriptionPartOne', "You can scale your Azure SQL managed instance - Azure Arc by");
export const postgresComputeAndStorageDescriptionPartTwo = localize('arc.postgres.computeAndStorageDescriptionPartTwo', "PostgreSQL Hyperscale server group by");
export const computeAndStorageDescriptionPartThree = localize('arc.computeAndStorageDescriptionPartThree', "without downtime and by");
export const computeAndStorageDescriptionPartFour = localize('arc.computeAndStorageDescriptionPartFour', "Before doing so, you need to ensure");
export const computeAndStorageDescriptionPartFive = localize('arc.computeAndStorageDescriptionPartFive', "there are sufficient resources available");
export const computeAndStorageDescriptionPartSix = localize('arc.computeAndStorageDescriptionPartSix', "in your Kubernetes cluster to honor this configuration.");
export const node = localize('arc.node', "node");
export const nodes = localize('arc.nodes', "nodes");
export const workerNodes = localize('arc.workerNodes', "Worker Nodes");
export const storagePerNode = localize('arc.storagePerNode', "storage per node");
export const workerNodeCount = localize('arc.workerNodeCount', "Worker node count:");
export const configurationPerNode = localize('arc.configurationPerNode', "Configuration (per node)");
export const coresLimit = localize('arc.coresLimit', "CPU limit:");
export const coresRequest = localize('arc.coresRequest', "CPU request:");
export const memoryLimit = localize('arc.memoryLimit', "Memory limit (in GB):");
export const memoryRequest = localize('arc.memoryRequest', "Memory request (in GB):");
export const workerValidationErrorMessage = localize('arc.workerValidationErrorMessage', "The number of workers cannot be decreased.");
export const memoryRequestValidationErrorMessage = localize('arc.memoryRequestValidationErrorMessage', "Memory request must be at least 0.25 GiB");
export const memoryLimitValidationErrorMessage = localize('arc.memoryLimitValidationErrorMessage', "Memory limit must be at least 0.25 GiB");
export const arcResources = localize('arc.arcResources', "Azure Arc Resources");
export const enterANonEmptyPassword = localize('arc.enterANonEmptyPassword', "Enter a non-empty password or press escape to exit.");
export const thePasswordsDoNotMatch = localize('arc.thePasswordsDoNotMatch', "The passwords do not match. Confirm the password or press escape to exit.");
@@ -154,9 +130,7 @@ export const podsReady = localize('arc.podsReady', "pods ready");
export function databaseCreated(name: string): string { return localize('arc.databaseCreated', "Database {0} created", name); }
export function deletingInstance(name: string): string { return localize('arc.deletingInstance', "Deleting instance '{0}'...", name); }
export function updatingInstance(name: string): string { return localize('arc.updatingInstance', "Updating instance '{0}'...", name); }
export function instanceDeleted(name: string): string { return localize('arc.instanceDeleted', "Instance '{0}' deleted", name); }
export function instanceUpdated(name: string): string { return localize('arc.instanceUpdated', "Instance '{0}' updated", name); }
export function copiedToClipboard(name: string): string { return localize('arc.copiedToClipboard', "{0} copied to clipboard", name); }
export function clickTheTroubleshootButton(resourceType: string): string { return localize('arc.clickTheTroubleshootButton', "Click the troubleshoot button to open the Azure Arc {0} troubleshooting notebook.", resourceType); }
export function numVCores(vCores: string | undefined): string {
@@ -171,7 +145,6 @@ export function numVCores(vCores: string | undefined): string {
}
}
export function updated(when: string): string { return localize('arc.updated', "Updated {0}", when); }
export function validationMin(min: number): string { return localize('arc.validationMin', "Value must be greater than or equal to {0}.", min); }
// Errors
export const connectionRequired = localize('arc.connectionRequired', "A connection is required to show all properties. Click refresh to re-enter connection information");
@@ -179,8 +152,6 @@ export const couldNotFindControllerRegistration = localize('arc.couldNotFindCont
export function refreshFailed(error: any): string { return localize('arc.refreshFailed', "Refresh failed. {0}", getErrorMessage(error)); }
export function openDashboardFailed(error: any): string { return localize('arc.openDashboardFailed', "Error opening dashboard. {0}", getErrorMessage(error)); }
export function instanceDeletionFailed(name: string, error: any): string { return localize('arc.instanceDeletionFailed', "Failed to delete instance {0}. {1}", name, getErrorMessage(error)); }
export function instanceUpdateFailed(name: string, error: any): string { return localize('arc.instanceUpdateFailed', "Failed to update instance {0}. {1}", name, getErrorMessage(error)); }
export function pageDiscardFailed(error: any): string { return localize('arc.pageDiscardFailed', "Failed to discard user input. {0}", getErrorMessage(error)); }
export function databaseCreationFailed(name: string, error: any): string { return localize('arc.databaseCreationFailed', "Failed to create database {0}. {1}", name, getErrorMessage(error)); }
export function connectToControllerFailed(url: string, error: any): string { return localize('arc.connectToControllerFailed', "Could not connect to controller {0}. {1}", url, getErrorMessage(error)); }
export function connectToSqlFailed(serverName: string, error: any): string { return localize('arc.connectToSqlFailed', "Could not connect to SQL managed instance - Azure Arc Instance {0}. {1}", serverName, getErrorMessage(error)); }
@@ -202,7 +173,3 @@ export const variableValueFetchForUnsupportedVariable = (variableName: string) =
export const isPasswordFetchForUnsupportedVariable = (variableName: string) => localize('getIsPassword.unknownVariableName', "Attempt to get isPassword for unknown variable:{0}", variableName);
export const noControllerInfoFound = (name: string) => localize('noControllerInfoFound', "Controller Info could not be found with name: {0}", name);
export const noPasswordFound = (controllerName: string) => localize('noPasswordFound', "Password could not be retrieved for controller: {0} and user did not provide a password. Please retry later.", controllerName);
export const noContextFound = (configFile: string) => localize('noContextFound', "No 'contexts' found in the config file: {0}", configFile);
export const noCurrentContextFound = (configFile: string) => localize('noCurrentContextFound', "No context is marked as 'current-context' in the config file: {0}", configFile);
export const noNameInContext = (configFile: string) => localize('noNameInContext', "No name field was found in a cluster context in the config file: {0}", configFile);
export const userCancelledError = localize('userCancelledError', "User cancelled the dialog");

View File

@@ -6,7 +6,7 @@
import { ControllerInfo, ResourceType } from 'arc';
import * as azdataExt from 'azdata-ext';
import * as vscode from 'vscode';
import { UserCancelledError } from '../common/api';
import { UserCancelledError } from '../common/utils';
import * as loc from '../localizedConstants';
import { ConnectToControllerDialog } from '../ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
@@ -71,7 +71,7 @@ export class ControllerModel {
await this.treeDataProvider.addOrUpdateController(model.controllerModel, model.password, false);
this._password = model.password;
} else {
throw new UserCancelledError(loc.userCancelledError);
throw new UserCancelledError();
}
}
}

View File

@@ -7,9 +7,8 @@ import { MiaaResourceInfo } from 'arc';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as vscode from 'vscode';
import { UserCancelledError } from '../common/api';
import { Deferred } from '../common/promise';
import { createCredentialId, parseIpAndPort } from '../common/utils';
import { createCredentialId, parseIpAndPort, UserCancelledError } from '../common/utils';
import { credentialNamespace } from '../constants';
import * as loc from '../localizedConstants';
import { ConnectToSqlDialog } from '../ui/dialogs/connectSqlDialog';

View File

@@ -9,7 +9,6 @@ import * as vscode from 'vscode';
import * as loc from '../localizedConstants';
import { ControllerModel, Registration } from './controllerModel';
import { ResourceModel } from './resourceModel';
import { Deferred } from '../common/promise';
import { parseIpAndPort } from '../common/utils';
export class PostgresModel extends ResourceModel {
@@ -20,8 +19,6 @@ export class PostgresModel extends ResourceModel {
public onConfigUpdated = this._onConfigUpdated.event;
public configLastUpdated?: Date;
private _refreshPromise?: Deferred<void>;
constructor(private _controllerModel: ControllerModel, info: ResourceInfo, registration: Registration) {
super(info, registration);
this._azdataApi = <azdataExt.IExtension>vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
@@ -58,10 +55,7 @@ export class PostgresModel extends ResourceModel {
const cpuRequest = this._config.spec.scheduling?.default?.resources?.requests?.cpu;
const ramRequest = this._config.spec.scheduling?.default?.resources?.requests?.memory;
const storage = this._config.spec.storage?.data?.size;
// scale.shards was renamed to scale.workers. Check both for backwards compatibility.
const scale = this._config.spec.scale;
const nodes = (scale?.workers ?? scale?.shards ?? 0) + 1; // An extra node for the coordinator
const nodes = (this._config.spec.scale?.shards ?? 0) + 1; // An extra node for the coordinator
let configuration: string[] = [];
configuration.push(`${nodes} ${nodes > 1 ? loc.nodes : loc.node}`);
@@ -84,23 +78,9 @@ export class PostgresModel extends ResourceModel {
/** Refreshes the model */
public async refresh() {
// Only allow one refresh to be happening at a time
if (this._refreshPromise) {
return this._refreshPromise.promise;
}
this._refreshPromise = new Deferred();
try {
await this._controllerModel.azdataLogin();
this._config = (await this._azdataApi.azdata.arc.postgres.server.show(this.info.name)).result;
this.configLastUpdated = new Date();
this._onConfigUpdated.fire(this._config);
this._refreshPromise.resolve();
} catch (err) {
this._refreshPromise.reject(err);
throw err;
} finally {
this._refreshPromise = undefined;
}
await this._controllerModel.azdataLogin();
this._config = (await this._azdataApi.azdata.arc.postgres.server.show(this.info.name)).result;
this.configLastUpdated = new Date();
this._onConfigUpdated.fire(this._config);
}
}
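The single-flight guard in refresh() relies on the Deferred helper imported from '../common/promise', which is not part of this diff. A minimal sketch of the shape it assumes (the real implementation may differ):
class Deferred<T = void> {
	public promise: Promise<T>;
	public resolve!: (value?: T | PromiseLike<T>) => void;
	public reject!: (reason?: any) => void;
	constructor() {
		this.promise = new Promise<T>((resolve, reject) => {
			// The cast lets resolve() be called with no argument when T is void,
			// matching the `this._refreshPromise.resolve()` call above.
			this.resolve = resolve as any;
			this.reject = reject;
		});
	}
}
With this shape, concurrent callers of refresh() all await the same promise, and the field is cleared in the finally block once the underlying azdata call settles.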

View File

@@ -17,7 +17,7 @@ import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
*/
export class ArcControllersOptionsSourceProvider implements rd.IOptionsSourceProvider {
private _cacheManager = new CacheManager<string, string>();
readonly id = 'arc.controllers';
readonly optionsSourceId = 'arc.controllers';
constructor(private _treeProvider: AzureArcTreeDataProvider) { }
async getOptions(): Promise<string[] | azdata.CategoryValue[]> {

extensions/arc/src/test/.gitignore
View File

@@ -0,0 +1,2 @@
/env
/__pycache__

View File

@@ -0,0 +1,21 @@
# Tests for deploying Arc resources via Jupyter notebook
## Prerequisites
- Python >= 3.6
- Pip package manager
- Azdata CLI installed and logged into an Arc controller
## Running the tests
### 1. (Optional, recommended) Create and activate a Python virtual environment
- `python -m venv env`
- `source env/bin/activate` (Linux)
- `env\Scripts\activate.bat` (Windows)
### 2. Upgrade pip
- `pip install --upgrade pip`
### 3. Install the dependencies
- `pip install -r requirements.txt`
### 4. Run the tests
- `pytest`

View File

@@ -1,62 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import 'mocha';
import * as path from 'path';
import * as sinon from 'sinon';
import * as yamljs from 'yamljs';
import { getDefaultKubeConfigPath, getKubeConfigClusterContexts, KubeClusterContext } from '../../common/kubeUtils';
import { tryExecuteAction } from '../../common/utils';
const kubeConfig =
{
'contexts': [
{
'context': {
'cluster': 'docker-desktop',
'user': 'docker-desktop'
},
'name': 'docker-for-desktop'
},
{
'context': {
'cluster': 'kubernetes',
'user': 'kubernetes-admin'
},
'name': 'kubernetes-admin@kubernetes'
}
],
'current-context': 'docker-for-desktop'
};
describe('KubeUtils', function (): void {
const configFile = 'kubeConfig';
afterEach('KubeUtils cleanup', () => {
sinon.restore();
});
it('getDefaultKubeConfigPath', async () => {
getDefaultKubeConfigPath().should.endWith(path.join('.kube', 'config'));
});
describe('get Kube Config Cluster Contexts', () => {
it('success', async () => {
sinon.stub(yamljs, 'load').returns(<any>kubeConfig);
const verifyContexts = (contexts: KubeClusterContext[], testName: string) => {
contexts.length.should.equal(2, `test: ${testName} failed`);
contexts[0].name.should.equal('docker-for-desktop', `test: ${testName} failed`);
contexts[0].isCurrentContext.should.be.true(`test: ${testName} failed`);
contexts[1].name.should.equal('kubernetes-admin@kubernetes', `test: ${testName} failed`);
contexts[1].isCurrentContext.should.be.false(`test: ${testName} failed`);
};
verifyContexts(await getKubeConfigClusterContexts(configFile), 'getKubeConfigClusterContexts');
});
it('throws error when unable to load config file', async () => {
const error = new Error('unknown error accessing file');
sinon.stub(yamljs, 'load').throws(error); //erroring config file load
((await tryExecuteAction(() => getKubeConfigClusterContexts(configFile))).error).should.equal(error, `test: getKubeConfigClusterContexts failed`);
});
});
});

View File

@@ -7,7 +7,7 @@ import { ResourceType } from 'arc';
import 'mocha';
import * as should from 'should';
import * as vscode from 'vscode';
import { getAzurecoreApi, getConnectionModeDisplayText, getDatabaseStateDisplayText, getErrorMessage, getResourceTypeIcon, parseEndpoint, parseIpAndPort, promptAndConfirmPassword, promptForInstanceDeletion, resourceTypeToDisplayName, convertToGibibyteString } from '../../common/utils';
import { getAzurecoreApi, getConnectionModeDisplayText, getDatabaseStateDisplayText, getErrorMessage, getResourceTypeIcon, parseEndpoint, parseIpAndPort, promptAndConfirmPassword, promptForInstanceDeletion, resourceTypeToDisplayName } from '../../common/utils';
import { ConnectionMode as ConnectionMode, IconPathHelper } from '../../constants';
import * as loc from '../../localizedConstants';
import { MockInputBox } from '../stubs';
@@ -254,116 +254,3 @@ describe('parseIpAndPort', function (): void {
should(() => parseIpAndPort(ip)).throwError();
});
});
describe('convertToGibibyteString Method Tests', function () {
const tolerance = 0.001;
it('Value is in KB', function (): void {
const value = '44000K';
const conversion = 0.04097819;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in MB', function (): void {
const value = '1100M';
const conversion = 1.02445483;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in GB', function (): void {
const value = '1G';
const conversion = 0.931322575;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in TB', function (): void {
const value = '1T';
const conversion = 931.32257;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in PB', function (): void {
const value = '0.1P';
const conversion = 93132.25746;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in EB', function (): void {
const value = '1E';
const conversion = 931322574.6154;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in mB', function (): void {
const value = '1073741824000m';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in B', function (): void {
const value = '1073741824';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in KiB', function (): void {
const value = '1048576Ki';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in MiB', function (): void {
const value = '256Mi';
const conversion = 0.25;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in GiB', function (): void {
const value = '1000Gi';
const conversion = 1000;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in TiB', function (): void {
const value = '1Ti';
const conversion = 1024;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in PiB', function (): void {
const value = '1Pi';
const conversion = 1048576;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in EiB', function (): void {
const value = '1Ei';
const conversion = 1073741824;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is empty', function (): void {
const value = '';
const error = new Error(`Value provided is not a valid Kubernetes resource quantity`);
should(() => convertToGibibyteString(value)).throwError(error);
});
it('Value is not a valid Kubernetes resource quantity', function (): void {
const value = '1J';
const error = new Error(`${value} is not a valid Kubernetes resource quantity`);
should(() => convertToGibibyteString(value)).throwError(error);
});
});

View File

@@ -49,7 +49,6 @@ export class FakeAzdataApi implements azdataExt.IAzdataApi {
replaceEngineSettings?: boolean,
workers?: number
},
_engineVersion?: string,
_additionalEnvVars?: { [key: string]: string }): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); }
}
},
@@ -57,16 +56,7 @@ export class FakeAzdataApi implements azdataExt.IAzdataApi {
mi: {
delete(_name: string): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); },
async list(): Promise<azdataExt.AzdataOutput<azdataExt.SqlMiListResult[]>> { return <any>{ result: self.miaaInstances }; },
show(_name: string): Promise<azdataExt.AzdataOutput<azdataExt.SqlMiShowResult>> { throw new Error('Method not implemented.'); },
edit(
_name: string,
_args: {
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string,
noWait?: boolean
}): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); }
show(_name: string): Promise<azdataExt.AzdataOutput<azdataExt.SqlMiShowResult>> { throw new Error('Method not implemented.'); }
}
}
};

View File

@@ -1,91 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as vscode from 'vscode';
export class FakeRadioButton implements azdata.RadioButtonComponent {
private _onDidClickEmitter = new vscode.EventEmitter<any>();
onDidClick = this._onDidClickEmitter.event;
constructor(props: azdata.RadioButtonProperties) {
this.label = props.label;
this.value = props.value;
this.checked = props.checked;
this.enabled = props.enabled;
}
//#region RadioButtonProperties implementation
label?: string;
value?: string;
checked?: boolean;
//#endregion
click() {
this.checked = true;
this._onDidClickEmitter.fire(this);
}
//#region Component Implementation
id: string = '';
updateProperties(_properties: { [key: string]: any; }): Thenable<void> {
throw new Error('Method not implemented.');
}
updateProperty(_key: string, _value: any): Thenable<void> {
throw new Error('Method not implemented.');
}
updateCssStyles(_cssStyles: { [key: string]: string; }): Thenable<void> {
throw new Error('Method not implemented.');
}
onValidityChanged: vscode.Event<boolean> = <vscode.Event<boolean>>{};
valid: boolean = false;
validate(): Thenable<boolean> {
throw new Error('Method not implemented.');
}
focus(): Thenable<void> {
throw new Error('Method not implemented.');
}
ariaHidden?: boolean | undefined;
//#endregion
//#region ComponentProperties Implementation
height?: number | string;
width?: number | string;
/**
* The position CSS property. Empty by default.
* This is particularly useful if laying out components inside a FlexContainer and
* the size of the component is meant to be a fixed size. In this case the position must be
* set to 'absolute', with the parent FlexContainer having 'relative' position.
* Without this the component will fail to correctly size itself
*/
position?: azdata.PositionType;
/**
* Whether the component is enabled in the DOM
*/
enabled?: boolean;
/**
* Corresponds to the display CSS property for the element
*/
display?: azdata.DisplayType;
/**
* Corresponds to the aria-label accessibility attribute for this component
*/
ariaLabel?: string;
/**
* Corresponds to the role accessibility attribute for this component
*/
ariaRole?: string;
/**
* Corresponds to the aria-selected accessibility attribute for this component
*/
ariaSelected?: boolean;
/**
* Matches the CSS style key and its available values.
*/
CSSStyles?: { [key: string]: string };
//#endregion
}

View File

@@ -11,8 +11,7 @@ import * as sinon from 'sinon';
import * as TypeMoq from 'typemoq';
import { v4 as uuid } from 'uuid';
import * as vscode from 'vscode';
import * as loc from '../../localizedConstants';
import { UserCancelledError } from '../../common/api';
import { UserCancelledError } from '../../common/utils';
import { ControllerModel } from '../../models/controllerModel';
import { ConnectToControllerDialog } from '../../ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from '../../ui/tree/azureArcTreeDataProvider';
@@ -40,11 +39,11 @@ describe('ControllerModel', function (): void {
// Returning an undefined model here indicates that the dialog closed without clicking "Ok" - usually through the user clicking "Cancel"
sinon.stub(ConnectToControllerDialog.prototype, 'waitForClose').returns(Promise.resolve(undefined));
const model = new ControllerModel(new AzureArcTreeDataProvider(mockExtensionContext.object), { id: uuid(), url: '127.0.0.1', username: 'admin', name: 'arc', rememberPassword: true, resources: [] });
await should(model.azdataLogin()).be.rejectedWith(new UserCancelledError(loc.userCancelledError));
await should(model.azdataLogin()).be.rejectedWith(new UserCancelledError());
});
it('Reads password from cred store', async function (): Promise<void> {
const password = 'password123'; // [SuppressMessage("Microsoft.Security", "CS001:SecretInline", Justification="Test password, not actually used")]
const password = 'password123';
// Set up cred store to return our password
const credProviderMock = TypeMoq.Mock.ofType<azdata.CredentialProvider>();
@@ -65,7 +64,7 @@ describe('ControllerModel', function (): void {
});
it('Prompt for password when not in cred store', async function (): Promise<void> {
const password = 'password123'; // [SuppressMessage("Microsoft.Security", "CS001:SecretInline", Justification="Stub value for testing")]
const password = 'password123';
// Set up cred store to return empty password
const credProviderMock = TypeMoq.Mock.ofType<azdata.CredentialProvider>();
@@ -91,7 +90,7 @@ describe('ControllerModel', function (): void {
});
it('Prompt for password when rememberPassword is true but prompt reconnect is true', async function (): Promise<void> {
const password = 'password123'; // [SuppressMessage("Microsoft.Security", "CS001:SecretInline", Justification="Stub value for testing")]
const password = 'password123';
// Set up cred store to return a password to start with
const credProviderMock = TypeMoq.Mock.ofType<azdata.CredentialProvider>();
credProviderMock.setup(x => x.readCredential(TypeMoq.It.isAny())).returns(() => Promise.resolve({ credentialId: 'id', password: 'originalPassword' }));
@@ -117,7 +116,7 @@ describe('ControllerModel', function (): void {
});
it('Prompt for password when we already have a password but prompt reconnect is true', async function (): Promise<void> {
const password = 'password123'; // [SuppressMessage("Microsoft.Security", "CS001:SecretInline", Justification="Stub value for testing")]
const password = 'password123';
// Set up cred store to return a password to start with
const credProviderMock = TypeMoq.Mock.ofType<azdata.CredentialProvider>();
credProviderMock.setup(x => x.readCredential(TypeMoq.It.isAny())).returns(() => Promise.resolve({ credentialId: 'id', password: 'originalPassword' }));

View File

@@ -0,0 +1,2 @@
pytest==5.3.5
notebook==6.0.3

View File

@@ -3,66 +3,8 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as TypeMoq from 'typemoq';
import * as vscode from 'vscode';
export function createModelViewMock() {
const mockModelBuilder = TypeMoq.Mock.ofType<azdata.ModelBuilder>();
const mockTextBuilder = setupMockComponentBuilder<azdata.TextComponent, azdata.TextComponentProperties>();
const mockInputBoxBuilder = setupMockComponentBuilder<azdata.InputBoxComponent, azdata.InputBoxProperties>();
const mockRadioButtonBuilder = setupMockComponentBuilder<azdata.RadioButtonComponent, azdata.RadioButtonProperties>();
const mockDivBuilder = setupMockContainerBuilder<azdata.DivContainer, azdata.DivContainerProperties, azdata.DivBuilder>();
const mockLoadingBuilder = setupMockLoadingBuilder();
mockModelBuilder.setup(b => b.loadingComponent()).returns(() => mockLoadingBuilder.object);
mockModelBuilder.setup(b => b.text()).returns(() => mockTextBuilder.object);
mockModelBuilder.setup(b => b.inputBox()).returns(() => mockInputBoxBuilder.object);
mockModelBuilder.setup(b => b.radioButton()).returns(() => mockRadioButtonBuilder.object);
mockModelBuilder.setup(b => b.divContainer()).returns(() => mockDivBuilder.object);
const mockModelView = TypeMoq.Mock.ofType<azdata.ModelView>();
mockModelView.setup(mv => mv.modelBuilder).returns(() => mockModelBuilder.object);
return { mockModelView, mockModelBuilder, mockTextBuilder, mockInputBoxBuilder, mockRadioButtonBuilder, mockDivBuilder };
}
function setupMockLoadingBuilder(
loadingBuilderGetter?: (item: azdata.Component) => azdata.LoadingComponentBuilder,
mockLoadingBuilder?: TypeMoq.IMock<azdata.LoadingComponentBuilder>
): TypeMoq.IMock<azdata.LoadingComponentBuilder> {
mockLoadingBuilder = mockLoadingBuilder ?? setupMockComponentBuilder<azdata.LoadingComponent, azdata.LoadingComponentProperties, azdata.LoadingComponentBuilder>();
let item: azdata.Component;
mockLoadingBuilder.setup(b => b.withItem(TypeMoq.It.isAny())).callback((_item) => item = _item).returns(() => loadingBuilderGetter ? loadingBuilderGetter(item) : mockLoadingBuilder!.object);
return mockLoadingBuilder;
}
export function setupMockComponentBuilder<T extends azdata.Component, P extends azdata.ComponentProperties, B extends azdata.ComponentBuilder<T, P> = azdata.ComponentBuilder<T, P>>(
componentGetter?: (props: P) => T,
mockComponentBuilder?: TypeMoq.IMock<B>,
): TypeMoq.IMock<B> {
mockComponentBuilder = mockComponentBuilder ?? TypeMoq.Mock.ofType<B>();
const returnComponent = TypeMoq.Mock.ofType<T>();
// Need to set up 'then' for when a mocked object is resolved, otherwise the test will hang: https://github.com/florinn/typemoq/issues/66
returnComponent.setup((x: any) => x.then).returns(() => { });
let compProps: P;
mockComponentBuilder.setup(b => b.withProperties(TypeMoq.It.isAny())).callback((props: P) => compProps = props).returns(() => mockComponentBuilder!.object);
mockComponentBuilder.setup(b => b.component()).returns(() => {
return componentGetter ? componentGetter(compProps) : Object.assign<T, P>(Object.assign({}, returnComponent.object), compProps);
});
// For now just have these be passthrough - can hook up additional functionality later if needed
mockComponentBuilder.setup(b => b.withValidation(TypeMoq.It.isAny())).returns(() => mockComponentBuilder!.object);
return mockComponentBuilder;
}
export function setupMockContainerBuilder<T extends azdata.Container<any, any>, P extends azdata.ComponentProperties, B extends azdata.ContainerBuilder<T, any, any, any> = azdata.ContainerBuilder<T, any, any, any>>(
mockContainerBuilder?: TypeMoq.IMock<B>
): TypeMoq.IMock<B> {
mockContainerBuilder = mockContainerBuilder ?? setupMockComponentBuilder<T, P, B>();
// For now just have these be passthrough - can hook up additional functionality later if needed
mockContainerBuilder.setup(b => b.withItems(TypeMoq.It.isAny(), undefined)).returns(() => mockContainerBuilder!.object);
mockContainerBuilder.setup(b => b.withLayout(TypeMoq.It.isAny())).returns(() => mockContainerBuilder!.object);
return mockContainerBuilder;
}
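A minimal sketch of how these stub builders are typically consumed (illustrative test snippet): the mocked builder records the properties passed to withProperties, and component() hands back an object with those properties merged in.
const { mockModelView } = createModelViewMock();
const input = mockModelView.object.modelBuilder.inputBox()
	.withProperties<azdata.InputBoxProperties>({ value: 'my-controller' })
	.component();
console.log(input.value); // 'my-controller', merged in by the stubbed component()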
export class MockInputBox implements vscode.InputBox {
private _value: string = '';
public get value(): string {

View File

@@ -0,0 +1,111 @@
##---------------------------------------------------------------------------------------------
## Copyright (c) Microsoft Corporation. All rights reserved.
## Licensed under the Source EULA. See License.txt in the project root for license information.
##--------------------------------------------------------------------------------------------
import json
import nbformat
import os
import random
import string
import sys
import uuid
from nbconvert.preprocessors import ExecutePreprocessor
from subprocess import Popen, PIPE, TimeoutExpired
## Variables
notebook_path = '../../notebooks/arcDeployment/'
## Helper functions
def generate_name(prefix, length=8):
return (prefix + '-' + ''.join(
[random.choice(string.ascii_lowercase)
for n in range(length - len(prefix) - 1)]))
def clear_env():
for k in [k for k in os.environ.keys() if k.startswith('AZDATA_NB_VAR_')]:
del os.environ[k]
def azdata(commands, timeout=None, stdin=None):
commands.insert(0, "azdata")
print('Executing command: \n', ' '.join(commands))
proc = Popen(commands, stdin=PIPE if stdin is not None else None, stdout=PIPE, stderr=PIPE, shell=os.name=='nt')
try:
(stdout, stderr) = proc.communicate(input=stdin, timeout=timeout)
except TimeoutExpired:
# https://docs.python.org/3.5/library/subprocess.html#subprocess.Popen.communicate
# The child process is not killed if the timeout expires, so in order to
# cleanup properly we should kill the child process and finish communication.
proc.kill()
(stdout, stderr) = proc.communicate(timeout=timeout)
sys.stdout.buffer.write(stdout)
sys.stderr.buffer.write(stderr)
raise
sys.stdout.buffer.write(stdout)
if proc.returncode != 0:
raise Exception(stderr)
else:
sys.stderr.buffer.write(stderr)
return (stdout.decode(sys.stdout.encoding),
stderr.decode(sys.stderr.encoding))
## Tests
def test_postgres_create():
# Load the notebook
with open(notebook_path + 'deploy.postgres.existing.arc.ipynb') as f:
nb = nbformat.read(f, as_version=nbformat.NO_CONVERT)
name = generate_name('pg')
try:
# Setup the environment
os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME'] = name
subscription = os.environ['AZDATA_NB_VAR_ARC_SUBSCRIPTION'] = str(uuid.uuid4())
resource_group = os.environ['AZDATA_NB_VAR_ARC_RESOURCE_GROUP_NAME'] = 'test'
namespace = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAMESPACE'] = 'default'
workers = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_WORKERS'] = '1'
service_type = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_SERVICE_TYPE'] = 'NodePort'
data_size = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_DATA_SIZE'] = '512'
port = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PORT'] = '5431'
extensions = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_EXTENSIONS'] = 'pg_cron,postgis'
cpu_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CPU_MIN'] = '1'
cpu_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CPU_MAX'] = '2'
memory_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_MIN'] = '256'
memory_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_MAX'] = '1023'
backup_sizes = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_SIZES'] = '512,1023'
backup_full_interval = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_FULL_INTERVAL'] = '20'
backup_delta_interval = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_DELTA_INTERVAL'] = '10'
backup_retention_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_RETENTION_MIN'] = '1,1GB;2,2GB'
backup_retention_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_RETENTION_MAX'] = '2,2GB;3,3GB'
# Execute the notebook that creates Postgres
ExecutePreprocessor(timeout=1200).preprocess(nb, {'metadata': {'path': notebook_path}})
# Verify that Postgres was created successfully
(out, _) = azdata(['postgres', 'server', 'show', '-n', name])
db = json.loads(out)
assert db['metadata']['name'] == name
assert db['metadata']['namespace'] == namespace
assert db['spec']['scale']['shards'] == int(workers)
assert db['spec']['service']['type'] == service_type
assert db['spec']['storage']['volumeSize'] == data_size + 'Mi'
assert db['spec']['service']['port'] == int(port)
assert [p['name'] for p in db['spec']['engine']['plugins']] == ['pg_cron' ,'postgis']
assert db['spec']['scheduling']['default']['resources']['requests']['cpu'] == cpu_min
assert db['spec']['scheduling']['default']['resources']['limits']['cpu'] == cpu_max
assert db['spec']['scheduling']['default']['resources']['requests']['memory'] == memory_min + 'Mi'
assert db['spec']['scheduling']['default']['resources']['limits']['memory'] == memory_max + 'Mi'
assert [t['storage']['volumeSize'] for t in db['spec']['backups']['tiers']] == [b + 'Mi' for b in backup_sizes.split(',')]
assert db['spec']['backups']['fullMinutes'] == int(backup_full_interval)
assert db['spec']['backups']['deltaMinutes'] == int(backup_delta_interval)
for i in range(len(db['spec']['backups']['tiers'])):
assert db['spec']['backups']['tiers'][i]['retention']['minimums'] == backup_retention_min.split(';')[i].split(',')
assert db['spec']['backups']['tiers'][i]['retention']['maximums'] == backup_retention_max.split(';')[i].split(',')
except Exception:
# Capture cell outputs to help with debugging
print([c['outputs'] for c in nb['cells'] if c.get('outputs')])
raise
finally:
clear_env()
azdata(['postgres', 'server', 'delete', '-n', name])

View File

@@ -1,88 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as should from 'should';
import { getErrorMessage } from '../../../common/utils';
import { RadioOptionsGroup, RadioOptionsInfo } from '../../../ui/components/radioOptionsGroup';
import { FakeRadioButton } from '../../mocks/fakeRadioButton';
import { setupMockComponentBuilder, createModelViewMock } from '../../stubs';
const loadingError = new Error('Error loading options');
const radioOptionsInfo = <RadioOptionsInfo>{
values: [
'value1',
'value2'
],
defaultValue: 'value2'
};
const divItems: azdata.Component[] = [];
let radioOptionsGroup: RadioOptionsGroup;
describe('radioOptionsGroup', function (): void {
beforeEach(async () => {
const { mockModelView, mockRadioButtonBuilder, mockDivBuilder } = createModelViewMock();
mockRadioButtonBuilder.reset(); // reset any previous mock so that we can set our own.
setupMockComponentBuilder<azdata.RadioButtonComponent, azdata.RadioButtonProperties>(
(props) => new FakeRadioButton(props),
mockRadioButtonBuilder,
);
mockDivBuilder.reset(); // reset previous setups so the new setups we are about to create replace them instead of creating a recording chain
// create new setups for the DivContainer with custom behavior
setupMockComponentBuilder<azdata.DivContainer, azdata.DivContainerProperties, azdata.DivBuilder>(
() => <azdata.DivContainer>{
addItem: (item) => { divItems.push(item); },
clearItems: () => { divItems.length = 0; },
get items() { return divItems; },
},
mockDivBuilder
);
radioOptionsGroup = new RadioOptionsGroup(mockModelView.object, (_disposable) => { });
await radioOptionsGroup.load(async () => radioOptionsInfo);
});
it('verify construction and load', async () => {
should(radioOptionsGroup).not.be.undefined();
should(radioOptionsGroup.value).not.be.undefined();
radioOptionsGroup.value!.should.equal('value2', 'radio options group should be the default checked value');
// verify all the radioButtons created in the group
verifyRadioGroup();
});
it('onClick', async () => {
// click the radioButton corresponding to 'value1'
(divItems as FakeRadioButton[]).filter(r => r.value === 'value1').pop()!.click();
radioOptionsGroup.value!.should.equal('value1', 'radio options group should correspond to the radioButton that we clicked');
// verify all the radioButtons created in the group
verifyRadioGroup();
});
it('load throws', async () => {
radioOptionsGroup.load(() => { throw loadingError; });
// In the error case the radioButtons array won't hold radio buttons; it holds a single TextComponent whose value is the error string
divItems.length.should.equal(1, 'There should be only one element in the divContainer when a loading error happens');
const label = divItems[0] as azdata.TextComponent;
should(label.value).not.be.undefined();
label.value!.should.deepEqual(getErrorMessage(loadingError));
should(label.CSSStyles).not.be.undefined();
should(label.CSSStyles!.color).not.be.undefined();
label.CSSStyles!.color.should.equal('Red');
});
});
function verifyRadioGroup() {
const radioButtons = divItems as FakeRadioButton[];
radioButtons.length.should.equal(radioOptionsInfo.values!.length);
radioButtons.forEach(rb => {
should(rb.label).not.be.undefined();
should(rb.value).not.be.undefined();
should(rb.enabled).not.be.undefined();
rb.label!.should.equal(rb.value);
rb.enabled!.should.be.true();
});
}

View File

@@ -1,72 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as vscode from 'vscode';
import { getErrorMessage } from '../../common/utils';
export interface RadioOptionsInfo {
values?: string[],
defaultValue: string
}
export class RadioOptionsGroup {
static id: number = 1;
private _divContainer!: azdata.DivContainer;
private _loadingBuilder: azdata.LoadingComponentBuilder;
private _currentRadioOption!: azdata.RadioButtonComponent;
constructor(private _view: azdata.ModelView, private _onNewDisposableCreated: (disposable: vscode.Disposable) => void, private _groupName: string = `RadioOptionsGroup${RadioOptionsGroup.id++}`) {
const divBuilder = this._view.modelBuilder.divContainer();
const divBuilderWithProperties = divBuilder.withProperties<azdata.DivContainerProperties>({ clickable: false });
this._divContainer = divBuilderWithProperties.component();
const loadingComponentBuilder = this._view.modelBuilder.loadingComponent();
this._loadingBuilder = loadingComponentBuilder.withItem(this._divContainer);
}
public component(): azdata.LoadingComponent {
return this._loadingBuilder.component();
}
async load(optionsInfoGetter: () => Promise<RadioOptionsInfo>): Promise<void> {
this.component().loading = true;
this._divContainer.clearItems();
try {
const optionsInfo = await optionsInfoGetter();
const options = optionsInfo.values!;
let defaultValue: string = optionsInfo.defaultValue!;
options.forEach((option: string) => {
const radioOption = this._view!.modelBuilder.radioButton().withProperties<azdata.RadioButtonProperties>({
label: option,
checked: option === defaultValue,
name: this._groupName,
value: option,
enabled: true
}).component();
if (radioOption.checked) {
this._currentRadioOption = radioOption;
}
this._onNewDisposableCreated(radioOption.onDidClick(() => {
if (this._currentRadioOption !== radioOption) {
// Uncheck the previously selected radio option. The UI is handled correctly even without this because of the shared 'groupName',
// but the checked property on the previously selected radio button would not be updated, so we clear it explicitly
// to keep the component state consistent.
this._currentRadioOption.checked = false;
this._currentRadioOption = radioOption;
}
}));
this._divContainer.addItem(radioOption);
});
}
catch (e) {
const errorLabel = this._view!.modelBuilder.text().withProperties({ value: getErrorMessage(e), CSSStyles: { 'color': 'Red' } }).component();
this._divContainer.addItem(errorLabel);
}
this.component().loading = false;
}
get value(): string | undefined {
return this._currentRadioOption?.value;
}
}
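A minimal usage sketch under assumed host wiring (`view` comes from the hosting dialog's ModelView and `disposables` is the host's disposable list): load the options, then read back the current selection.
async function chooseConnectivityMode(view: azdata.ModelView, disposables: vscode.Disposable[]): Promise<string | undefined> {
	const group = new RadioOptionsGroup(view, d => disposables.push(d));
	await group.load(async () => ({ values: ['direct', 'indirect'], defaultValue: 'indirect' }));
	return group.value; // 'indirect' until the user clicks another option
}
In real use the host would also add group.component() to its container so the loading spinner and radio buttons are rendered.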

View File

@@ -1,369 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as vscode from 'vscode';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as loc from '../../../localizedConstants';
import { IconPathHelper, cssStyles } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
import { convertToGibibyteString } from '../../../common/utils';
import { MiaaModel } from '../../../models/miaaModel';
export class MiaaComputeAndStoragePage extends DashboardPage {
private configurationContainer?: azdata.DivContainer;
private coresLimitBox?: azdata.InputBoxComponent;
private coresRequestBox?: azdata.InputBoxComponent;
private memoryLimitBox?: azdata.InputBoxComponent;
private memoryRequestBox?: azdata.InputBoxComponent;
private discardButton?: azdata.ButtonComponent;
private saveButton?: azdata.ButtonComponent;
private saveArgs: {
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string
} = {};
private readonly _azdataApi: azdataExt.IExtension;
constructor(protected modelView: azdata.ModelView, private _miaaModel: MiaaModel) {
super(modelView);
this._azdataApi = vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
this.initializeConfigurationBoxes();
this.disposables.push(this._miaaModel.onConfigUpdated(
() => this.eventuallyRunOnInitialized(() => this.handleServiceUpdated())));
}
protected get title(): string {
return loc.computeAndStorage;
}
protected get id(): string {
return 'miaa-compute-and-storage';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.computeStorage;
}
protected get container(): azdata.Component {
const root = this.modelView.modelBuilder.divContainer().component();
const content = this.modelView.modelBuilder.divContainer().component();
root.addItem(content, { CSSStyles: { 'margin': '20px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorage,
CSSStyles: { ...cssStyles.title }
}).component());
const infoComputeStorage_p1 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.miaaComputeAndStorageDescriptionPartOne,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px', 'max-width': 'auto' }
}).component();
const memoryVCoreslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.scalingCompute,
url: 'https://docs.microsoft.com/azure/azure-arc/data/configure-managed-instance',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p4 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFour,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p5 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFive,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p6 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartSix,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const computeInfoAndLinks = this.modelView.modelBuilder.flexContainer()
.withLayout({ flexWrap: 'wrap' })
.withItems([
infoComputeStorage_p1,
memoryVCoreslink,
infoComputeStorage_p4,
infoComputeStorage_p5,
infoComputeStorage_p6
], { CSSStyles: { 'margin-right': '5px' } }).component();
content.addItem(computeInfoAndLinks, { CSSStyles: { 'min-height': '30px' } });
this.configurationContainer = this.modelView.modelBuilder.divContainer().component();
this.configurationContainer.addItems(this.createUserInputSection(), { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.configurationContainer, { CSSStyles: { 'margin-top': '30px' } });
this.initialized = true;
return root;
}
protected get toolbarContainer(): azdata.ToolbarContainer {
// Save Edits
this.saveButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.saveText,
iconPath: IconPathHelper.save,
enabled: false
}).component();
this.disposables.push(
this.saveButton.onDidClick(async () => {
this.saveButton!.enabled = false;
try {
await vscode.window.withProgress(
{
location: vscode.ProgressLocation.Notification,
title: loc.updatingInstance(this._miaaModel.info.name),
cancellable: false
},
async (_progress, _token): Promise<void> => {
try {
await this._azdataApi.azdata.arc.sql.mi.edit(
this._miaaModel.info.name, this.saveArgs);
} catch (err) {
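// If an error occurs while editing the instance then re-enable the save button since
// the edit wasn't successfully applied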
this.saveButton!.enabled = true;
throw err;
}
await this._miaaModel.refresh();
}
);
vscode.window.showInformationMessage(loc.instanceUpdated(this._miaaModel.info.name));
this.discardButton!.enabled = false;
} catch (error) {
vscode.window.showErrorMessage(loc.instanceUpdateFailed(this._miaaModel.info.name, error));
}
}));
// Discard
this.discardButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.discardText,
iconPath: IconPathHelper.discard,
enabled: false
}).component();
this.disposables.push(
this.discardButton.onDidClick(async () => {
this.discardButton!.enabled = false;
try {
this.editCores();
this.editMemory();
} catch (error) {
vscode.window.showErrorMessage(loc.pageDiscardFailed(error));
} finally {
this.saveButton!.enabled = false;
}
}));
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems([
{ component: this.saveButton },
{ component: this.discardButton }
]).component();
}
private initializeConfigurationBoxes() {
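// Create the cores and memory request/limit input boxes and stage any user edits in saveArgs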
this.coresLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresLimitBox!))) {
this.saveArgs.coresLimit = undefined;
} else {
this.saveArgs.coresLimit = this.coresLimitBox!.value;
}
})
);
this.coresRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresRequestBox!))) {
this.saveArgs.coresRequest = undefined;
} else {
this.saveArgs.coresRequest = this.coresRequestBox!.value;
}
})
);
this.memoryLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 2,
validationErrorMessage: loc.memoryLimitValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryLimitBox!))) {
this.saveArgs.memoryLimit = undefined;
} else {
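// Memory values are entered in gibibytes, so append the Kubernetes-style 'Gi' unit suffix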
this.saveArgs.memoryLimit = this.memoryLimitBox!.value + 'Gi';
}
})
);
this.memoryRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 2,
validationErrorMessage: loc.memoryRequestValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryRequestBox!))) {
this.saveArgs.memoryRequest = undefined;
} else {
this.saveArgs.memoryRequest = this.memoryRequestBox!.value + 'Gi';
}
})
);
}
private createUserInputSection(): azdata.Component[] {
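// Build the editable configuration rows; placeholders are filled in once the config has been loaded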
if (this._miaaModel.configLastUpdated) {
this.editCores();
this.editMemory();
}
return [
this.createConfigurationSectionContainer(loc.coresRequest, this.coresRequestBox!),
this.createConfigurationSectionContainer(loc.coresLimit, this.coresLimitBox!),
this.createConfigurationSectionContainer(loc.memoryRequest, this.memoryRequestBox!),
this.createConfigurationSectionContainer(loc.memoryLimit, this.memoryLimitBox!)
];
}
private createConfigurationSectionContainer(key: string, input: azdata.Component): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: key,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(input, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private handleOnTextChanged(component: azdata.InputBoxComponent): boolean {
if ((!component.value)) {
// If there is no text in the input box component, return false
return false;
} else if ((!component.valid)) {
// If the value given by the user is not valid, enable the discard button so the user
// can clear all inputs, then return false
this.discardButton!.enabled = true;
return false;
} else {
// If a valid value has been entered into the input box, enable the save and discard
// buttons so that the user can choose to either edit the instance or clear all inputs,
// then return true
this.saveButton!.enabled = true;
this.discardButton!.enabled = true;
return true;
}
}
private editCores(): void {
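// Show the current vCore request and limit from the config as placeholders and clear any pending edits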
let currentCPUSize = this._miaaModel.config?.spec?.requests?.vcores;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresRequestBox!.validationErrorMessage = loc.validationMin(this.coresRequestBox!.min!);
this.coresRequestBox!.placeHolder = currentCPUSize;
this.coresRequestBox!.value = '';
this.saveArgs.coresRequest = undefined;
currentCPUSize = this._miaaModel.config?.spec?.limits?.vcores;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresLimitBox!.validationErrorMessage = loc.validationMin(this.coresLimitBox!.min!);
this.coresLimitBox!.placeHolder = currentCPUSize;
this.coresLimitBox!.value = '';
this.saveArgs.coresLimit = undefined;
}
private editMemory(): void {
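// Show the current memory request and limit (converted to gibibytes) as placeholders and clear any pending edits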
let currentMemSizeConversion: string;
let currentMemorySize = this._miaaModel.config?.spec?.requests?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryRequestBox!.placeHolder = currentMemSizeConversion!;
this.memoryRequestBox!.value = '';
this.saveArgs.memoryRequest = undefined;
currentMemorySize = this._miaaModel.config?.spec?.limits?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryLimitBox!.placeHolder = currentMemSizeConversion!;
this.memoryLimitBox!.value = '';
this.saveArgs.memoryLimit = undefined;
}
private handleServiceUpdated() {
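// Refresh the displayed values whenever the service configuration is updated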
this.editCores();
this.editMemory();
}
}

View File

@@ -91,7 +91,8 @@ export class MiaaConnectionStringsPage extends DashboardPage {
$serverName = "${externalEndpoint.ip},${externalEndpoint.port}";
$conn = sqlsrv_connect($serverName, $connectionInfo);`),
new InputKeyValue(this.modelView.modelBuilder, 'Python', `dbname='master' user='${username}' host='${externalEndpoint.ip}' password='{your_password_here}' port='${externalEndpoint.port}' sslmode='true'`),
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${externalEndpoint.ip}; user=${username} password={your_password_here} port=${externalEndpoint.port} sslmode=require`)
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${externalEndpoint.ip}; user=${username} password={your_password_here} port=${externalEndpoint.port} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'Web App', `Database=master; Data Source=${externalEndpoint.ip}; User Id=${username}; Password={your_password_here}`)
];
}

View File

@@ -10,7 +10,6 @@ import { ControllerModel } from '../../../models/controllerModel';
import * as loc from '../../../localizedConstants';
import { MiaaConnectionStringsPage } from './miaaConnectionStringsPage';
import { MiaaModel } from '../../../models/miaaModel';
import { MiaaComputeAndStoragePage } from './miaaComputeAndStoragePage';
export class MiaaDashboard extends Dashboard {
@@ -28,14 +27,12 @@ export class MiaaDashboard extends Dashboard {
protected async registerTabs(modelView: azdata.ModelView): Promise<(azdata.DashboardTab | azdata.DashboardTabGroup)[]> {
const overviewPage = new MiaaDashboardOverviewPage(modelView, this._controllerModel, this._miaaModel);
const connectionStringsPage = new MiaaConnectionStringsPage(modelView, this._controllerModel, this._miaaModel);
const computeAndStoragePage = new MiaaComputeAndStoragePage(modelView, this._miaaModel);
return [
overviewPage.tab,
{
title: loc.settings,
tabs: [
connectionStringsPage.tab,
computeAndStoragePage.tab
connectionStringsPage.tab
]
},
];

View File

@@ -1,502 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as vscode from 'vscode';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as loc from '../../../localizedConstants';
import { IconPathHelper, cssStyles } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
import { PostgresModel } from '../../../models/postgresModel';
import { convertToGibibyteString } from '../../../common/utils';
export class PostgresComputeAndStoragePage extends DashboardPage {
private workerContainer?: azdata.DivContainer;
private workerBox?: azdata.InputBoxComponent;
private coresLimitBox?: azdata.InputBoxComponent;
private coresRequestBox?: azdata.InputBoxComponent;
private memoryLimitBox?: azdata.InputBoxComponent;
private memoryRequestBox?: azdata.InputBoxComponent;
private discardButton?: azdata.ButtonComponent;
private saveButton?: azdata.ButtonComponent;
private saveArgs: {
workers?: number,
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string
} = {};
private readonly _azdataApi: azdataExt.IExtension;
constructor(protected modelView: azdata.ModelView, private _postgresModel: PostgresModel) {
super(modelView);
this._azdataApi = vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
this.initializeConfigurationBoxes();
this.disposables.push(this._postgresModel.onConfigUpdated(
() => this.eventuallyRunOnInitialized(() => this.handleServiceUpdated())));
}
protected get title(): string {
return loc.computeAndStorage;
}
protected get id(): string {
return 'postgres-compute-and-storage';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.computeStorage;
}
protected get container(): azdata.Component {
const root = this.modelView.modelBuilder.divContainer().component();
const content = this.modelView.modelBuilder.divContainer().component();
root.addItem(content, { CSSStyles: { 'margin': '20px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorage,
CSSStyles: { ...cssStyles.title }
}).component());
const infoComputeStorage_p1 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.postgresComputeAndStorageDescriptionPartOne,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px', 'max-width': 'auto' }
}).component();
const infoComputeStorage_p2 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.postgresComputeAndStorageDescriptionPartTwo,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const workerNodeslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.addingWokerNodes,
url: 'https://docs.microsoft.com/azure/azure-arc/data/scale-up-down-postgresql-hyperscale-server-group-using-cli',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p3 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartThree,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const memoryVCoreslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.scalingCompute,
url: 'https://docs.microsoft.com/azure/azure-arc/data/scale-up-down-postgresql-hyperscale-server-group-using-cli',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p4 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFour,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p5 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFive,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p6 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartSix,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const computeInfoAndLinks = this.modelView.modelBuilder.flexContainer()
.withLayout({ flexWrap: 'wrap' })
.withItems([
infoComputeStorage_p1,
infoComputeStorage_p2,
workerNodeslink,
infoComputeStorage_p3,
memoryVCoreslink,
infoComputeStorage_p4,
infoComputeStorage_p5,
infoComputeStorage_p6
], { CSSStyles: { 'margin-right': '5px' } })
.component();
content.addItem(computeInfoAndLinks, { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.workerNodes,
CSSStyles: { ...cssStyles.title, 'margin-top': '25px' }
}).component());
this.workerContainer = this.modelView.modelBuilder.divContainer().component();
this.workerContainer.addItems(this.createUserInputSection(), { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.workerContainer, { CSSStyles: { 'min-height': '30px' } });
this.initialized = true;
return root;
}
protected get toolbarContainer(): azdata.ToolbarContainer {
// Save Edits
this.saveButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.saveText,
iconPath: IconPathHelper.save,
enabled: false
}).component();
this.disposables.push(
this.saveButton.onDidClick(async () => {
this.saveButton!.enabled = false;
try {
await vscode.window.withProgress(
{
location: vscode.ProgressLocation.Notification,
title: loc.updatingInstance(this._postgresModel.info.name),
cancellable: false
},
async (_progress, _token): Promise<void> => {
try {
await this._azdataApi.azdata.arc.postgres.server.edit(
this._postgresModel.info.name,
this.saveArgs,
this._postgresModel.engineVersion);
} catch (err) {
// If an error occurs while editing the instance then re-enable the save button since
// the edit wasn't successfully applied
this.saveButton!.enabled = true;
throw err;
}
await this._postgresModel.refresh();
}
);
vscode.window.showInformationMessage(loc.instanceUpdated(this._postgresModel.info.name));
this.discardButton!.enabled = false;
} catch (error) {
vscode.window.showErrorMessage(loc.instanceUpdateFailed(this._postgresModel.info.name, error));
}
}));
// Discard
this.discardButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.discardText,
iconPath: IconPathHelper.discard,
enabled: false
}).component();
this.disposables.push(
this.discardButton.onDidClick(async () => {
this.discardButton!.enabled = false;
try {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
} catch (error) {
vscode.window.showErrorMessage(loc.pageDiscardFailed(error));
} finally {
this.saveButton!.enabled = false;
}
}));
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems([
{ component: this.saveButton },
{ component: this.discardButton }
]).component();
}
private initializeConfigurationBoxes() {
this.workerBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
validationErrorMessage: loc.workerValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.workerBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.workerBox!))) {
this.saveArgs.workers = undefined;
} else {
this.saveArgs.workers = parseInt(this.workerBox!.value!);
}
})
);
this.coresLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresLimitBox!))) {
this.saveArgs.coresLimit = undefined;
} else {
this.saveArgs.coresLimit = this.coresLimitBox!.value;
}
})
);
this.coresRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresRequestBox!))) {
this.saveArgs.coresRequest = undefined;
} else {
this.saveArgs.coresRequest = this.coresRequestBox!.value;
}
})
);
this.memoryLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 0.25,
validationErrorMessage: loc.memoryLimitValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryLimitBox!))) {
this.saveArgs.memoryLimit = undefined;
} else {
this.saveArgs.memoryLimit = this.memoryLimitBox!.value + 'Gi';
}
})
);
this.memoryRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 0.25,
validationErrorMessage: loc.memoryRequestValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryRequestBox!))) {
this.saveArgs.memoryRequest = undefined;
} else {
this.saveArgs.memoryRequest = this.memoryRequestBox!.value + 'Gi';
}
})
);
}
private createUserInputSection(): azdata.Component[] {
if (this._postgresModel.configLastUpdated) {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
}
return [
this.createWorkerNodesSectionContainer(),
this.createCoresMemorySection(),
this.createConfigurationSectionContainer(loc.coresRequest, this.coresRequestBox!),
this.createConfigurationSectionContainer(loc.coresLimit, this.coresLimitBox!),
this.createConfigurationSectionContainer(loc.memoryRequest, this.memoryRequestBox!),
this.createConfigurationSectionContainer(loc.memoryLimit, this.memoryLimitBox!)
];
}
private createWorkerNodesSectionContainer(): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.workerNodeCount,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
const information = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
iconPath: IconPathHelper.information,
title: loc.workerNodesInformation,
width: '12px',
height: '12px',
enabled: false
}).component();
keyContainer.addItem(information, { CSSStyles: { 'margin-left': '5px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(this.workerBox!, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private createConfigurationSectionContainer(key: string, input: azdata.Component): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: key,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(input, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private handleOnTextChanged(component: azdata.InputBoxComponent): boolean {
if ((!component.value)) {
// If there is no text in the input box component, return false
return false;
} else if ((!component.valid)) {
// If the value given by the user is not valid, enable the discard button so the user
// can clear all inputs, then return false
this.discardButton!.enabled = true;
return false;
} else {
// If a valid value has been entered into the input box, enable the save and discard
// buttons so that the user can choose to either edit the instance or clear all inputs,
// then return true
this.saveButton!.enabled = true;
this.discardButton!.enabled = true;
return true;
}
}
private editWorkerNodeCount() {
// scale.shards was renamed to scale.workers. Check both for backwards compatibility.
let scale = this._postgresModel.config?.spec.scale;
let currentWorkers = scale?.workers ?? scale?.shards ?? 0;
this.workerBox!.min = currentWorkers;
this.workerBox!.placeHolder = currentWorkers.toString();
this.workerBox!.value = '';
this.saveArgs.workers = undefined;
}
private createCoresMemorySection(): azdata.DivContainer {
const titleFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const titleComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.configurationPerNode,
CSSStyles: { ...cssStyles.title, 'font-weight': 'bold', 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const titleContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
titleContainer.addItem(titleComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
const information = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
iconPath: IconPathHelper.information,
title: loc.postgresConfigurationInformation,
width: '12px',
height: '12px',
enabled: false
}).component();
titleContainer.addItem(information, { CSSStyles: { 'margin-left': '5px', 'margin-bottom': '15px' } });
flexContainer.addItem(titleContainer, titleFlex);
let configurationSection = this.modelView.modelBuilder.divContainer().component();
configurationSection.addItem(flexContainer);
return configurationSection;
}
private editCores() {
let currentCPUSize = this._postgresModel.config?.spec.scheduling?.default?.resources?.requests?.cpu;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresRequestBox!.validationErrorMessage = loc.validationMin(this.coresRequestBox!.min!);
this.coresRequestBox!.placeHolder = currentCPUSize;
this.coresRequestBox!.value = '';
this.saveArgs.coresRequest = undefined;
currentCPUSize = this._postgresModel.config?.spec.scheduling?.default?.resources?.limits?.cpu;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresLimitBox!.validationErrorMessage = loc.validationMin(this.coresLimitBox!.min!);
this.coresLimitBox!.placeHolder = currentCPUSize;
this.coresLimitBox!.value = '';
this.saveArgs.coresLimit = undefined;
}
private editMemory() {
let currentMemSizeConversion: string;
let currentMemorySize = this._postgresModel.config?.spec.scheduling?.default?.resources?.requests?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryRequestBox!.placeHolder = currentMemSizeConversion!;
this.memoryRequestBox!.value = '';
this.saveArgs.memoryRequest = undefined;
currentMemorySize = this._postgresModel.config?.spec.scheduling?.default?.resources?.limits?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryLimitBox!.placeHolder = currentMemSizeConversion!;
this.memoryLimitBox!.value = '';
this.saveArgs.memoryLimit = undefined;
}
private handleServiceUpdated() {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
}
}

View File

@@ -49,7 +49,7 @@ export class PostgresConnectionStringsPage extends DashboardPage {
const link = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.learnAboutPostgresClients,
url: 'https://docs.microsoft.com/azure/azure-arc/data/get-connection-endpoints-and-connection-strings-postgres-hyperscale',
url: 'https://docs.microsoft.com/azure/postgresql/concepts-connection-libraries',
}).component();
const infoAndLink = this.modelView.modelBuilder.flexContainer().withLayout({ flexWrap: 'wrap' }).component();
@@ -82,7 +82,8 @@ export class PostgresConnectionStringsPage extends DashboardPage {
new InputKeyValue(this.modelView.modelBuilder, 'PHP', `host=${endpoint.ip} port=${endpoint.port} dbname=postgres user=postgres password={your_password_here} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'psql', `psql "host=${endpoint.ip} port=${endpoint.port} dbname=postgres user=postgres password={your_password_here} sslmode=require"`),
new InputKeyValue(this.modelView.modelBuilder, 'Python', `dbname='postgres' user='postgres' host='${endpoint.ip}' password='{your_password_here}' port='${endpoint.port}' sslmode='true'`),
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${endpoint.ip}; dbname=postgres user=postgres password={your_password_here} port=${endpoint.port} sslmode=require`)
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${endpoint.ip}; dbname=postgres user=postgres password={your_password_here} port=${endpoint.port} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'Web App', `Database=postgres; Data Source=${endpoint.ip}; User Id=postgres; Password={your_password_here}`)
];
}

View File

@@ -13,7 +13,6 @@ import { PostgresConnectionStringsPage } from './postgresConnectionStringsPage';
import { Dashboard } from '../../components/dashboard';
import { PostgresDiagnoseAndSolveProblemsPage } from './postgresDiagnoseAndSolveProblemsPage';
import { PostgresSupportRequestPage } from './postgresSupportRequestPage';
import { PostgresComputeAndStoragePage } from './postgresComputeAndStoragePage';
export class PostgresDashboard extends Dashboard {
constructor(private _context: vscode.ExtensionContext, private _controllerModel: ControllerModel, private _postgresModel: PostgresModel) {
@@ -31,7 +30,6 @@ export class PostgresDashboard extends Dashboard {
protected async registerTabs(modelView: azdata.ModelView): Promise<(azdata.DashboardTab | azdata.DashboardTabGroup)[]> {
const overviewPage = new PostgresOverviewPage(modelView, this._controllerModel, this._postgresModel);
const connectionStringsPage = new PostgresConnectionStringsPage(modelView, this._postgresModel);
const computeAndStoragePage = new PostgresComputeAndStoragePage(modelView, this._postgresModel);
// TODO: Removed properties page while investigating bug where refreshed values don't appear in UI
// const propertiesPage = new PostgresPropertiesPage(modelView, this._controllerModel, this._postgresModel);
const diagnoseAndSolveProblemsPage = new PostgresDiagnoseAndSolveProblemsPage(modelView, this._context, this._postgresModel);
@@ -42,8 +40,7 @@ export class PostgresDashboard extends Dashboard {
{
title: loc.settings,
tabs: [
connectionStringsPage.tab,
computeAndStoragePage.tab
connectionStringsPage.tab
]
},
{

View File

@@ -157,7 +157,6 @@ export class PostgresOverviewPage extends DashboardPage {
adminPassword: true,
noWait: true
},
this._postgresModel.engineVersion,
{ 'AZDATA_PASSWORD': password });
vscode.window.showInformationMessage(loc.passwordReset);
}

View File

@@ -5,7 +5,7 @@
import { MiaaResourceInfo, ResourceInfo, ResourceType } from 'arc';
import * as vscode from 'vscode';
import { UserCancelledError } from '../../common/api';
import { UserCancelledError } from '../../common/utils';
import * as loc from '../../localizedConstants';
import { ControllerModel, Registration } from '../../models/controllerModel';
import { MiaaModel } from '../../models/miaaModel';

View File

@@ -270,11 +270,6 @@
resolved "https://registry.yarnpkg.com/@types/uuid/-/uuid-8.3.0.tgz#215c231dff736d5ba92410e6d602050cce7e273f"
integrity sha512-eQ9qFW/fhfGJF8WKHGEHZEyVWfZxrT+6CLIJGBcZPfxUh/+BnEj+UCGYMlr9qZuX/2AltsvwrGqp0LhEW8D0zQ==
"@types/yamljs@^0.2.31":
version "0.2.31"
resolved "https://registry.yarnpkg.com/@types/yamljs/-/yamljs-0.2.31.tgz#b1a620b115c96db7b3bfdf0cf54aee0c57139245"
integrity sha512-QcJ5ZczaXAqbVD3o8mw/mEBhRvO5UAdTtbvgwL/OgoWubvNBh6/MxLBAigtcgIFaq3shon9m3POIxQaLQt4fxQ==
ajv@^6.5.5:
version "6.12.0"
resolved "https://registry.yarnpkg.com/ajv/-/ajv-6.12.0.tgz#06d60b96d87b8454a5adaba86e7854da629db4b7"
@@ -304,13 +299,6 @@ append-transform@^2.0.0:
dependencies:
default-require-extensions "^3.0.0"
argparse@^1.0.7:
version "1.0.10"
resolved "https://registry.yarnpkg.com/argparse/-/argparse-1.0.10.tgz#bcd6791ea5ae09725e17e5ad988134cd40b3d911"
integrity sha512-o5Roy6tNG4SL/FOkCAN6RzjiakZS25RLYFrcMttJqbdd8BWrnA+fGz57iN5Pb06pvBGvl5gQ0B48dJlslXvoTg==
dependencies:
sprintf-js "~1.0.2"
asn1@~0.2.3:
version "0.2.4"
resolved "https://registry.yarnpkg.com/asn1/-/asn1-0.2.4.tgz#8d2475dfab553bb33e77b54e59e880bb8ce23136"
@@ -592,7 +580,7 @@ glob@7.1.2:
once "^1.3.0"
path-is-absolute "^1.0.0"
glob@^7.0.5, glob@^7.1.2, glob@^7.1.3:
glob@^7.1.2, glob@^7.1.3:
version "7.1.6"
resolved "https://registry.yarnpkg.com/glob/-/glob-7.1.6.tgz#141f33b81a7c2492e125594307480c46679278a6"
integrity sha512-LwaxwyZ72Lk7vZINtNNrywX0ZuLyStrdDtabefZKAY5ZGJhVtgdznluResxNmPitE0SAO+O26sWTHeKSI2wMBA==
@@ -1121,11 +1109,6 @@ source-map@^0.6.1:
resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==
sprintf-js@~1.0.2:
version "1.0.3"
resolved "https://registry.yarnpkg.com/sprintf-js/-/sprintf-js-1.0.3.tgz#04e6926f662895354f3dd015203633b857297e2c"
integrity sha1-BOaSb2YolTVPPdAVIDYzuFcpfiw=
sshpk@^1.7.0:
version "1.16.1"
resolved "https://registry.yarnpkg.com/sshpk/-/sshpk-1.16.1.tgz#fb661c0bef29b39db40769ee39fa70093d6f6877"
@@ -1268,11 +1251,3 @@ xml@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/xml/-/xml-1.0.1.tgz#78ba72020029c5bc87b8a81a3cfcd74b4a2fc1e5"
integrity sha1-eLpyAgApxbyHuKgaPPzXS0ovweU=
yamljs@^0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/yamljs/-/yamljs-0.3.0.tgz#dc060bf267447b39f7304e9b2bfbe8b5a7ddb03b"
integrity sha512-C/FsVVhht4iPQYXOInoxUM/1ELSf9EsgKH34FofQOp6hwCPrW4vG4w5++TED3xRUo8gD7l0P1J1dLlDYzODsTQ==
dependencies:
argparse "^1.0.7"
glob "^7.0.5"

View File

@@ -1,226 +1,20 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
id="eec7d136-f112-46af-a3cf-bf91a5c75666"
data-name="Layer 1"
width="59.994473"
height="60"
viewBox="0 0 59.994473 60"
version="1.1"
sodipodi:docname="sqldb_edge.svg"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)">
<metadata
id="metadata71">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
</cc:Work>
</rdf:RDF>
</metadata>
<sodipodi:namedview
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1"
objecttolerance="10"
gridtolerance="10"
guidetolerance="10"
inkscape:pageopacity="0"
inkscape:pageshadow="2"
inkscape:window-width="1356"
inkscape:window-height="974"
id="namedview69"
showgrid="false"
inkscape:zoom="2.36"
inkscape:cx="29.997181"
inkscape:cy="30"
inkscape:window-x="0"
inkscape:window-y="0"
inkscape:window-maximized="0"
inkscape:current-layer="eec7d136-f112-46af-a3cf-bf91a5c75666" />
<defs
id="defs38">
<linearGradient
id="b5c6b104-454d-4729-acd6-907263c2d466"
x1="57.186001"
y1="740.06403"
x2="70.387001"
y2="740.06403"
gradientTransform="matrix(1,0,0,-1,0,770)"
gradientUnits="userSpaceOnUse">
<stop
offset="0"
stop-color="#005ba1"
id="stop2" />
<stop
offset="0.068"
stop-color="#0060a9"
id="stop4" />
<stop
offset="0.356"
stop-color="#0071c8"
id="stop6" />
<stop
offset="0.517"
stop-color="#0078d4"
id="stop8" />
<stop
offset="0.642"
stop-color="#0074cd"
id="stop10" />
<stop
offset="0.82"
stop-color="#006abb"
id="stop12" />
<stop
offset="1"
stop-color="#005ba1"
id="stop14" />
</linearGradient>
<linearGradient
id="a7ff9d6f-dfc1-44bf-b13b-162305f1fa8a"
x1="52.797001"
y1="704.80798"
x2="52.797001"
y2="742.84399"
gradientTransform="matrix(1,0,0,-1,0,770)"
gradientUnits="userSpaceOnUse">
<stop
offset="0"
stop-color="#198ab3"
id="stop17" />
<stop
offset="0.097"
stop-color="#209ec5"
id="stop19" />
<stop
offset="0.242"
stop-color="#28b6da"
id="stop21" />
<stop
offset="0.396"
stop-color="#2ec7e9"
id="stop23" />
<stop
offset="0.565"
stop-color="#31d1f2"
id="stop25" />
<stop
offset="0.775"
stop-color="#32d4f5"
id="stop27" />
</linearGradient>
<linearGradient
id="bc37103c-430d-46e7-aa83-30a9235b7b74"
x1="36.847"
y1="702.948"
x2="59.476002"
y2="702.948"
xlink:href="#b5c6b104-454d-4729-acd6-907263c2d466" />
<radialGradient
id="e36b5364-8f7a-422d-a36a-9a21681809ae"
cx="48.790001"
cy="702.229"
r="12.478"
gradientTransform="matrix(1,0,0,-1,0,770)"
gradientUnits="userSpaceOnUse">
<stop
offset="0"
stop-color="#f2f2f2"
id="stop31" />
<stop
offset="0.58"
stop-color="#eee"
id="stop33" />
<stop
offset="1"
stop-color="#e6e6e6"
id="stop35" />
</radialGradient>
<linearGradient
inkscape:collect="always"
xlink:href="#b5c6b104-454d-4729-acd6-907263c2d466"
id="linearGradient4581"
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(1,0,0,-1,0,770)"
x1="57.186001"
y1="740.06403"
x2="70.387001"
y2="740.06403" />
</defs>
<title
id="title40">SQL_Database_Edge_100x</title>
<g
id="g66"
transform="translate(-20.002819,-20)">
<path
d="m 57.135,30.379 c 5.493,1.494 6.882,2.438 11.833,6.466 -0.037,-0.064 -0.078,-0.12 -0.12,-0.189 -0.23,-0.369 -0.48,-0.72 -0.731,-1.062 -0.094,-0.12 -0.186,-0.253 -0.285,-0.373 -0.223,-0.275 -0.463,-0.527 -0.7,-0.78 a 13.488,13.488 0 0 0 -1.182,-1.1 l -0.012,-0.01 a 13.364,13.364 0 0 0 -4.393,-2.346 l -0.2,-0.066 q -0.507,-0.15 -1.032,-0.26 c -0.273,-0.055 -0.552,-0.1 -0.832,-0.138 -0.215,-0.031 -0.428,-0.073 -0.645,-0.094 q -0.767,-0.068 -1.537,-0.059 z"
style="opacity:0.4;isolation:isolate;fill:#ffffff"
id="path42"
inkscape:connector-curvature="0" />
<path
d="m 63.786,24.777 c -3.642,0 -6.6,-1.069 -6.6,-2.389 V 35.1 c 0,1.308 2.9,2.371 6.506,2.388 h 0.094 c 3.643,0 6.6,-1.068 6.6,-2.388 V 22.388 c -0.009,1.32 -2.962,2.389 -6.6,2.389 z"
id="path44"
inkscape:connector-curvature="0"
style="fill:url(#linearGradient4581)" />
<path
d="m 70.377,22.388 c 0,1.32 -2.953,2.389 -6.6,2.389 -3.647,0 -6.591,-1.069 -6.591,-2.389 0,-1.32 2.954,-2.388 6.6,-2.388 3.646,0 6.6,1.069 6.6,2.388"
id="path46"
inkscape:connector-curvature="0"
style="fill:#e8e8e8" />
<path
d="m 68.837,22.2 c 0,0.84 -2.263,1.519 -5.055,1.519 -2.792,0 -5.056,-0.68 -5.056,-1.519 0,-0.839 2.263,-1.518 5.06,-1.518 2.797,0 5.056,0.679 5.056,1.518"
id="path48"
inkscape:connector-curvature="0"
style="fill:#50e6ff" />
<path
d="m 63.786,22.545 a 12.247,12.247 0 0 0 -4,0.576 13.658,13.658 0 0 0 8,0 12.246,12.246 0 0 0 -4,-0.576 z"
id="path50"
inkscape:connector-curvature="0"
style="fill:#198ab3" />
<path
d="M 79.627,53.279 A 12.057,12.057 0 0 0 69.163,41.69 C 69.083,33.65 62.109,27.158 53.512,27.158 a 15.592,15.592 0 0 0 -14.9,10.16 C 31.441,38.41 25.967,44.19 25.967,51.16 c 0,7.75 6.764,14.032 15.108,14.032 0.449,0 0.893,-0.023 1.332,-0.058 h 24.467 a 2.434,2.434 0 0 0 0.646,-0.1 C 74.257,64.761 79.627,59.6 79.627,53.279 Z"
id="path52"
inkscape:connector-curvature="0"
style="fill:url(#a7ff9d6f-dfc1-44bf-b13b-162305f1fa8a)" />
<path
d="m 78.463,29.993 c -1.653,-2.16 -4.49,-3.21 -8.086,-3.254 v 3.6 c 2.52,0.042 4.355,0.68 5.239,1.839 C 78.651,36.156 73.137,49.2 57.263,61.4 41.389,73.6 27.411,75.561 24.378,71.581 c -1.719,-2.254 -0.653,-7.43 3.445,-13.7 a 13.266,13.266 0 0 1 -1.571,-4.006 c -5.86,8.04 -7.976,15.626 -4.721,19.9 5.217,6.841 22.191,2.587 37.913,-9.5 C 75.166,52.188 83.679,36.833 78.463,29.993 Z"
id="path54"
inkscape:connector-curvature="0"
style="fill:#0072c6" />
<path
d="m 48.161,58.2 c -6.247,0 -11.314,-1.835 -11.314,-4.1 v 21.8 c 0,2.244 4.983,4.067 11.16,4.1 h 0.155 c 6.249,0 11.314,-1.834 11.314,-4.1 V 54.1 c -10e-4,2.268 -5.067,4.1 -11.315,4.1 z"
id="path56"
inkscape:connector-curvature="0"
style="fill:url(#bc37103c-430d-46e7-aa83-30a9235b7b74)" />
<path
d="m 59.475,54.105 c 0,2.263 -5.066,4.1 -11.314,4.1 -6.248,0 -11.314,-1.835 -11.314,-4.1 0,-2.265 5.067,-4.1 11.314,-4.1 6.247,0 11.314,1.835 11.314,4.1"
id="path58"
inkscape:connector-curvature="0"
style="fill:#e8e8e8" />
<path
d="m 56.834,53.773 c 0,1.441 -3.883,2.606 -8.673,2.606 -4.79,0 -8.673,-1.167 -8.673,-2.606 0,-1.439 3.884,-2.605 8.673,-2.605 4.789,0 8.673,1.167 8.673,2.605"
id="path60"
inkscape:connector-curvature="0"
style="fill:#50e6ff" />
<path
d="m 48.161,54.373 a 20.975,20.975 0 0 0 -6.87,0.989 20.252,20.252 0 0 0 6.87,1.017 20.262,20.262 0 0 0 6.871,-1.017 20.981,20.981 0 0 0 -6.871,-0.989 z"
id="path62"
inkscape:connector-curvature="0"
style="fill:#198ab3" />
<path
d="m 55.038,69.236 v -5.96 H 53.4 v 7.292 h 4.343 V 69.236 Z M 42.452,66.3 a 3.63,3.63 0 0 1 -0.907,-0.547 0.77,0.77 0 0 1 -0.221,-0.567 0.633,0.633 0 0 1 0.274,-0.536 1.245,1.245 0 0 1 0.748,-0.2 2.887,2.887 0 0 1 1.68,0.509 v -1.52 a 4.809,4.809 0 0 0 -1.77,-0.281 2.969,2.969 0 0 0 -1.934,0.593 1.932,1.932 0 0 0 -0.722,1.569 2.394,2.394 0 0 0 1.668,2.135 4.887,4.887 0 0 1 1.086,0.631 0.751,0.751 0 0 1 0.268,0.575 0.633,0.633 0 0 1 -0.279,0.539 1.312,1.312 0 0 1 -0.786,0.2 2.917,2.917 0 0 1 -1.927,-0.743 v 1.627 a 3.968,3.968 0 0 0 1.888,0.407 3.371,3.371 0 0 0 2.079,-0.569 1.9,1.9 0 0 0 0.753,-1.608 1.832,1.832 0 0 0 -0.433,-1.241 4.491,4.491 0 0 0 -1.465,-0.973 z m 9.153,2.8 a 4.517,4.517 0 0 0 0.145,-4.17 3.154,3.154 0 0 0 -1.236,-1.32 3.523,3.523 0 0 0 -1.8,-0.464 3.786,3.786 0 0 0 -1.913,0.48 3.276,3.276 0 0 0 -1.296,1.374 4.319,4.319 0 0 0 -0.46,2.021 4.03,4.03 0 0 0 0.424,1.854 3.222,3.222 0 0 0 1.2,1.309 3.533,3.533 0 0 0 1.752,0.514 l 1.511,1.694 h 2.135 L 49.953,70.434 A 3.119,3.119 0 0 0 51.605,69.1 Z m -1.643,-0.447 a 1.639,1.639 0 0 1 -1.347,0.618 1.617,1.617 0 0 1 -1.337,-0.638 3.136,3.136 0 0 1 0.009,-3.425 1.669,1.669 0 0 1 1.37,-0.646 1.561,1.561 0 0 1 1.32,0.645 2.864,2.864 0 0 1 0.48,1.752 2.618,2.618 0 0 1 -0.495,1.698 z"
id="path64"
inkscape:connector-curvature="0"
style="fill:url(#e36b5364-8f7a-422d-a36a-9a21681809ae)" />
</g>
</svg>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 24.0.3, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="ee7a2d03-25e0-4b7b-970e-53f514516b83"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 50 50"
style="enable-background:new 0 0 50 50;" xml:space="preserve">
<title>Azure_SQL_Edge</title>
<path d="M49.3,22.2c-0.7-1.7-1.9-3.1-3.5-4.1c3.9-5.8,5.6-11.5,2.9-14.8c-2.8-3.4-9.3-2.9-17.8,1.4l0,0.1C29.5,4.3,28,4,26.6,4
c-1.2,0-2.4,0.1-3.5,0.4c-2.3,0.6-4.3,1.7-6,3.3c-0.9,0.8-1.6,1.7-2.3,2.7c-0.7-0.1-1.5-0.2-2.3-0.2c-1.7,0-3.3,0.3-4.9,1
c-3,1.3-5.4,3.7-6.7,6.7c-1.3,3.1-1.3,6.6,0,9.7c0.5,1,1.1,2,1.8,2.9l-0.1-0.1c-3,5.4-3.5,9.6-1.3,12.3c1.4,1.6,3.5,2.4,5.6,2.2
c2.1-0.1,4.1-0.4,6-1.1c3.2,2.8,7.4,4.2,11.6,3.9c7.5,0,13.1-3.1,13.1-7.3v-5.2h2.9c1.2,0,2.5-0.2,3.6-0.7c2.2-1,4-2.8,5-5
C50.2,27.2,50.2,24.5,49.3,22.2L49.3,22.2z M46.3,5.2c1.5,1.8,0.4,6.1-3.4,11.6c-0.8-0.2-1.6-0.3-2.4-0.3c-0.2-1.7-0.7-3.4-1.6-5
c-0.8-1.5-1.8-2.8-3.1-4c-0.5-0.4-1-0.8-1.6-1.2C40.2,3.8,44.8,3.3,46.3,5.2z M3.9,19.1c1-2.2,2.8-4,5-5c1.2-0.5,2.4-0.7,3.6-0.7
c1.3,0,2.6,0.3,3.9,0.9c0.4-1,0.9-2,1.6-2.9c1.3-1.7,3.2-3,5.3-3.7C25.7,6.9,28.4,7,30.8,8c2.6,1.1,4.7,3.2,5.8,5.8
c0.6,1.3,0.9,2.8,0.9,4.3v1.6h3.1c0.1,0,0.2,0,0.3,0c-0.9,1.2-2,2.4-3.2,3.7v-0.2h-0.1c-0.6-3.8-6-6.6-13-6.6s-12.4,2.8-13,6.6h-0.1
V32c-0.9-0.1-1.8-0.3-2.7-0.6c-2.2-1-4-2.8-5-5C2.9,24.1,2.9,21.4,3.9,19.1L3.9,19.1z M24.7,19.6c5.7,0,10,2.2,10,4.2
s-4.3,4.2-10,4.2s-10-2.2-10-4.2S18.9,19.6,24.7,19.6z M3.7,40.8c-1.2-1.4-0.7-4.4,1.2-8.1c0.8,0.7,1.7,1.2,2.7,1.6
c1.3,0.5,2.6,0.8,3.9,0.9v5.3c0,0.2,0.1,0.5,0.1,0.7C7.8,42.3,4.8,42.2,3.7,40.8z M24.7,44.6c-5.7,0-10-2.2-10-4.2V28.6
c3,1.8,6.5,2.6,10,2.5c3.5,0.1,6.9-0.8,10-2.5v11.8C34.6,42.4,30.4,44.6,24.7,44.6z M46.4,28.3c-0.6,1.5-1.8,2.7-3.3,3.3
c-0.8,0.3-1.6,0.5-2.4,0.5h-2.9v-4.4c2.2-2.2,4.3-4.6,6.2-7.1c1.1,0.7,1.9,1.7,2.4,2.8C47,25,47,26.7,46.4,28.3L46.4,28.3z"/>
</svg>

Before: 9.3 KiB
After: 1.9 KiB

View File

@@ -0,0 +1,10 @@
<svg width="50" height="50" viewBox="0 0 50 50" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0)">
<path d="M49.2998 22.2C48.5998 20.5 47.3998 19.1 45.7998 18.1C49.6998 12.3 51.3998 6.60003 48.6998 3.30003C45.8998 -0.099967 39.3998 0.400033 30.8998 4.70003V4.80003C29.4998 4.30003 27.9998 4.00003 26.5998 4.00003C25.3998 4.00003 24.1998 4.10003 23.0998 4.40003C20.7998 5.00003 18.7998 6.10003 17.0998 7.70003C16.1998 8.50003 15.4998 9.40003 14.7998 10.4C14.0998 10.3 13.2998 10.2 12.4998 10.2C10.7998 10.2 9.1998 10.5 7.5998 11.2C4.5998 12.5 2.1998 14.9 0.899805 17.9C-0.400195 21 -0.400195 24.5 0.899805 27.6C1.3998 28.6 1.9998 29.6 2.6998 30.5L2.5998 30.4C-0.400195 35.8 -0.900195 40 1.2998 42.7C2.6998 44.3 4.7998 45.1 6.8998 44.9C8.9998 44.8 10.9998 44.5 12.8998 43.8C16.0998 46.6 20.2998 48 24.4998 47.7C31.9998 47.7 37.5998 44.6 37.5998 40.4V35.2H40.4998C41.6998 35.2 42.9998 35 44.0998 34.5C46.2998 33.5 48.0998 31.7 49.0998 29.5C50.1998 27.2 50.1998 24.5 49.2998 22.2ZM46.2998 5.20003C47.7998 7.00003 46.6998 11.3 42.8998 16.8C42.0998 16.6 41.2998 16.5 40.4998 16.5C40.2998 14.8 39.7998 13.1 38.8998 11.5C38.0998 10 37.0998 8.70003 35.7998 7.50003C35.2998 7.10003 34.7998 6.70003 34.1998 6.30003C40.1998 3.80003 44.7998 3.30003 46.2998 5.20003ZM3.8998 19.1C4.8998 16.9 6.6998 15.1 8.8998 14.1C10.0998 13.6 11.2998 13.4 12.4998 13.4C13.7998 13.4 15.0998 13.7 16.3998 14.3C16.7998 13.3 17.2998 12.3 17.9998 11.4C19.2998 9.70003 21.1998 8.40003 23.2998 7.70003C25.6998 6.90003 28.3998 7.00003 30.7998 8.00003C33.3998 9.10003 35.4998 11.2 36.5998 13.8C37.1998 15.1 37.4998 16.6 37.4998 18.1V19.7H40.5998C40.6998 19.7 40.7998 19.7 40.8998 19.7C39.9998 20.9 38.8998 22.1 37.6998 23.4V23.2H37.5998C36.9998 19.4 31.5998 16.6 24.5998 16.6C17.5998 16.6 12.1998 19.4 11.5998 23.2H11.4998V32C10.5998 31.9 9.6998 31.7 8.7998 31.4C6.5998 30.4 4.7998 28.6 3.7998 26.4C2.8998 24.1 2.8998 21.4 3.8998 19.1ZM24.6998 19.6C30.3998 19.6 34.6998 21.8 34.6998 23.8C34.6998 25.8 30.3998 28 24.6998 28C18.9998 28 14.6998 25.8 14.6998 23.8C14.6998 21.8 18.8998 19.6 24.6998 19.6ZM3.6998 40.8C2.4998 39.4 2.9998 36.4 4.8998 32.7C5.6998 33.4 6.5998 33.9 7.5998 34.3C8.8998 34.8 10.1998 35.1 11.4998 35.2V40.5C11.4998 40.7 11.5998 41 11.5998 41.2C7.7998 42.3 4.7998 42.2 3.6998 40.8ZM24.6998 44.6C18.9998 44.6 14.6998 42.4 14.6998 40.4V28.6C17.6998 30.4 21.1998 31.2 24.6998 31.1C28.1998 31.2 31.5998 30.3 34.6998 28.6V40.4C34.5998 42.4 30.3998 44.6 24.6998 44.6ZM46.3998 28.3C45.7998 29.8 44.5998 31 43.0998 31.6C42.2998 31.9 41.4998 32.1 40.6998 32.1H37.7998V27.7C39.9998 25.5 42.0998 23.1 43.9998 20.6C45.0998 21.3 45.8998 22.3 46.3998 23.4C46.9998 25 46.9998 26.7 46.3998 28.3Z" fill="white"/>
</g>
<defs>
<clipPath id="clip0">
<rect width="50" height="50" fill="white"/>
</clipPath>
</defs>
</svg>

After: 2.7 KiB

View File

@@ -29,7 +29,7 @@
"1. Create an Azure Edge VM as a virtual IoT device, you can go the \"Default Settings\" cell and adjust the vm_size variable based on your needs. Available sizes and pricing information can be found [here](https://docs.microsoft.com/azure/virtual-machines/linux/sizes).\n",
"1. Create an Azure IoT hub, you can go to the \"Default Settings\" cell and adjust value of the following variables based on your needs: iot_hub_sku and iot_hub_units. Available SKUs and pricing information can be found [here](https://azure.microsoft.com/pricing/details/iot-hub/).\n",
"1. Add the device to the IoT hub\n",
"1. Deploy SQL Edge module to the device with optional package file\n",
"1. Deploy SQL Edge module to the device with optional dacpac\n",
"1. Enable connecting to the SQL Edge instance on the device\n",
"\n",
"### Dependencies\n",
@@ -53,7 +53,13 @@
{
"cell_type": "code",
"source": [
"import sys,os,json,html,getpass,time,ntpath,uuid\n",
"import pandas,sys,os,json,html,getpass,time,ntpath,uuid\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"\n",
"def run_command(command:str, displayCommand:str = \"\", returnObject:bool = False):\n",
" print(\"Executing: \" + (displayCommand if displayCommand != \"\" else command))\n",
@@ -91,11 +97,7 @@
{
"cell_type": "code",
"source": [
"extensions = run_command('az extension list', returnObject=True)\r\n",
"extensions = [ext for ext in extensions if ext['name'] == 'azure-cli-iot-ext']\r\n",
"if len(extensions) > 0:\r\n",
" run_command('az extension remove --name azure-cli-iot-ext')\r\n",
"run_command('az extension add --name azure-iot')"
"run_command('az extension add --name azure-cli-iot-ext')"
],
"metadata": {
"azdata_cell_guid": "55bb2f96-6f7f-4aa0-9daf-d0f7f9d9243c",
@@ -124,7 +126,7 @@
"sa_password = os.environ[\"AZDATA_NB_VAR_SA_PASSWORD\"]\n",
"vm_admin = os.environ[\"AZDATA_NB_VAR_ASDE_VM_ADMIN\"]\n",
"vm_password = os.environ[\"AZDATA_NB_VAR_ASDE_VM_PASSWORD\"]\n",
"package_path = os.environ[\"AZDATA_NB_VAR_ASDE_PACKAGE_PATH\"]\n",
"dacpac_path = os.environ[\"AZDATA_NB_VAR_ASDE_DACPAC_PATH\"]\n",
"sql_port = os.environ[\"AZDATA_NB_VAR_ASDE_SQL_PORT\"]\n",
"new_rg_flag = os.environ[\"AZDATA_NB_VAR_ASDE_NEW_RESOURCEGROUP\"]\n",
"new_rg_name = os.environ[\"AZDATA_NB_VAR_ASDE_NEW_RESOURCEGROUP_NAME\"]\n",
@@ -138,7 +140,7 @@
"print(f'VM admin password: ******')\n",
"print(f'SQL Server port: {sql_port}')\n",
"print(f'SQL Server sa password: ******')\n",
"print(f'Package path: {package_path}')"
"print(f'Dacpac path: {dacpac_path}')"
],
"metadata": {
"azdata_cell_guid": "dde9388b-f623-4d62-bb74-36a05f5d2ea3",
@@ -174,7 +176,7 @@
"subnet_address_prefix = '10.0.0.0/24'\n",
"vnet_address_prefix = '10.0.0.0/16'\n",
"azure_storage_account = f'sa{suffix}'\n",
"storage_account_container = 'sqldatabasepackage'\n",
"storage_account_container = 'sqldatabasedacpac'\n",
"sql_lcid = '1033'\n",
"sql_collation = 'SQL_Latin1_General_CP1_CI_AS'"
],
@@ -268,7 +270,7 @@
{
"cell_type": "markdown",
"source": [
"### Create storage account and storage account container, then upload the package"
"### Create storage account and storage account container, then upload the dacpac"
],
"metadata": {
"azdata_cell_guid": "90ec2b26-0c4a-4aa4-b397-f16b09b454ea"
@@ -278,11 +280,11 @@
"cell_type": "code",
"source": [
"storage_account_created = False\n",
"if package_path == \"\":\n",
" print(f'Package file not provided')\n",
"if dacpac_path == \"\":\n",
" print(f'Dacpac zip file not provided')\n",
" blob_sas = ''\n",
"else: \n",
" package_name = ntpath.basename(package_path)\n",
" dacpac_name = ntpath.basename(dacpac_path)\n",
" storage_accounts = run_command(f'az storage account list --resource-group {azure_resource_group} --subscription {azure_subscription_id}', returnObject=True)\n",
" storage_accounts = [storage_account for storage_account in storage_accounts if storage_account['name'] == azure_storage_account]\n",
" if len(storage_accounts) == 0:\n",
@@ -298,14 +300,14 @@
" else:\n",
" run_command(f'az storage container create --name {storage_account_container} --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key')\n",
"\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" if blob_exists:\n",
" print(f'blob \\\"{package_name}\\\" already exists.')\n",
" print(f'blob \\\"{dacpac_name}\\\" already exists.')\n",
" else:\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {package_name} --file \\\"{package_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {dacpac_name} --file \\\"{dacpac_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" now = time.localtime()\n",
" expiry = f'{(now.tm_year + 1)}-{now.tm_mon}-{now.tm_mday}'\n",
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
],
"metadata": {
"azdata_cell_guid": "7ab2b3ec-0832-40b3-98c0-4aa87320e7ce",
@@ -473,9 +475,8 @@
{
"cell_type": "code",
"source": [
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"PlanId=asde-developer-on-iot-edge\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}<PACKAGE_INFO>},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_SAS_URL>\\\"}}}}'\n",
"package_info = '' if blob_sas == ''else ',\\\"MSSQL_PACKAGE\\\":{\\\"value\\\":\\\"'+blob_sas+'\\\"}'\n",
"manifest = manifest.replace('<PACKAGE_INFO>', package_info).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge-developer\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"MSSQL_PID=Developer\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"SqlPackage\\\":\\\"<Optional_DACPAC_ZIP_SAS_URL>\\\",\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_ZIP_SAS_URL>\\\"}}}}'\n",
"manifest = manifest.replace('<Optional_DACPAC_ZIP_SAS_URL>', blob_sas).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"file_name = f'{uuid.uuid4().hex}.json'\n",
"manifest_file = open(file_name, 'w')\n",
"manifest_file.write(manifest)\n",
@@ -524,7 +525,7 @@
"source": [
"if storage_account_created:\r\n",
" delete_storage_account_command = \"run_command(f'az storage account delete -n {azure_storage_account} -g {azure_resource_group} --yes')\"\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the package file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the dacpac file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span><font size=\"2\">'+delete_storage_account_command+'</font></span>'))"
],
"metadata": {

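The cell above assembles the IoT Edge deployment manifest as a single heavily escaped JSON string and patches placeholders such as <PACKAGE_INFO> and <Default_SQL_SA_Password> with str.replace. A minimal sketch of the same conditional-environment idea, written with a plain dict and json.dumps instead of hand-escaped strings; this covers only the module "env" fragment, and the argument values are placeholders, not the notebook's full manifest:

import json

# Simplified stand-in for the module "env" section the cell above builds inside
# its escaped manifest string; placeholder values only, not the full manifest.
def build_sql_edge_env(sa_password, sql_lcid, sql_collation, blob_sas=''):
    env = {
        'ACCEPT_EULA': {'value': 'Y'},
        'SA_PASSWORD': {'value': sa_password},
        'MSSQL_LCID': {'value': sql_lcid},
        'MSSQL_COLLATION': {'value': sql_collation},
    }
    # The notebook only injects MSSQL_PACKAGE when a blob SAS URL was generated.
    if blob_sas:
        env['MSSQL_PACKAGE'] = {'value': blob_sas}
    return env

print(json.dumps(build_sql_edge_env('<sa password>', '1033', 'SQL_Latin1_General_CP1_CI_AS'), indent=2))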

@@ -48,8 +48,14 @@
{
"cell_type": "code",
"source": [
"import sys,os,getpass,json,html,time\n",
"import pandas,sys,os,getpass,json,html,time\n",
"from string import Template\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"\n",
"def run_command(displayCommand = \"\"):\n",
" print(\"Executing: \" + (displayCommand if displayCommand != \"\" else cmd))\n",
@@ -187,7 +193,7 @@
{
"cell_type": "code",
"source": [
"template = Template(f'docker run -e ACCEPT_EULA=Y -e \"SA_PASSWORD=$password\" -e \"MSSQL_PID=Developer\" -p {sql_port}:1433 --name {container_name} -d {docker_registry}/{docker_repository}:{docker_imagetag}')\n",
"template = Template(f'docker run -e ACCEPT_EULA=Y -e \"SA_PASSWORD=$password\" -p {sql_port}:1433 --name {container_name} -d {docker_registry}/{docker_repository}:{docker_imagetag}')\n",
"cmd = template.substitute(password=sql_password)\n",
"run_command(template.substitute(password='******'))"
],

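The version gate added above compares the pandas major, minor, and patch numbers with a chained boolean expression. A hedged, equivalent sketch using tuple comparison, assuming the same minimum version (0.24.2) as the diff and a plain X.Y.Z version string:

import sys
import pandas

# Equivalent gate using tuple comparison; assumes a plain X.Y.Z version string,
# which is also what the notebook's own int() parsing assumes.
version = tuple(int(part) for part in pandas.__version__.split('.')[:3])
if version < (0, 24, 2):
    sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it '
             'by running the "Reinstall Notebook dependencies" command in the command palette '
             '(View menu -> Command Palette...).')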

@@ -48,7 +48,13 @@
{
"cell_type": "code",
"source": [
"import sys,os,json,html,getpass,time,ntpath,uuid\n",
"import pandas,sys,os,json,html,getpass,time,ntpath,uuid\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"\n",
"def run_command(command:str, displayCommand:str = \"\", returnObject:bool = False):\n",
" print(\"Executing: \" + (displayCommand if displayCommand != \"\" else command))\n",
@@ -86,15 +92,13 @@
{
"cell_type": "code",
"source": [
"extensions = run_command('az extension list', returnObject=True)\r\n",
"extensions = [ext for ext in extensions if ext['name'] == 'azure-cli-iot-ext']\r\n",
"if len(extensions) > 0:\r\n",
" run_command('az extension remove --name azure-cli-iot-ext')\r\n",
"run_command('az extension add --name azure-iot')"
"run_command('az extension add --name azure-cli-iot-ext')"
],
"metadata": {
"azdata_cell_guid": "55bb2f96-6f7f-4aa0-9daf-d0f7f9d9243c",
"tags": []
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": null
@@ -114,7 +118,7 @@
"azure_subscription_id = os.environ[\"AZDATA_NB_VAR_ASDE_SUBSCRIPTIONID\"]\n",
"azure_resource_group = os.environ[\"AZDATA_NB_VAR_ASDE_RESOURCEGROUP\"]\n",
"sa_password = os.environ[\"AZDATA_NB_VAR_SA_PASSWORD\"]\n",
"package_path = os.environ[\"AZDATA_NB_VAR_ASDE_PACKAGE_PATH\"]\n",
"dacpac_path = os.environ[\"AZDATA_NB_VAR_ASDE_DACPAC_PATH\"]\n",
"sql_port = os.environ[\"AZDATA_NB_VAR_ASDE_SQL_PORT\"]\n",
"iot_hub_name = os.environ[\"AZDATA_NB_VAR_ASDE_HUBNAME\"]\n",
"target_condition = os.environ[\"AZDATA_NB_VAR_ASDE_TARGET_CONDITION\"]\n",
@@ -124,7 +128,7 @@
"print(f'Target condition: {target_condition}')\n",
"print(f'Azure SQL Edge instance port: {sql_port}')\n",
"print(f'Azure SQL Edge instance sa password: ******')\n",
"print(f'Package path: {package_path}')"
"print(f'Dacpac path: {dacpac_path}')"
],
"metadata": {
"azdata_cell_guid": "dde9388b-f623-4d62-bb74-36a05f5d2ea3",
@@ -150,7 +154,7 @@
"suffix = time.strftime(\"%y%m%d%H%M%S\", time.localtime())\n",
"azure_storage_account = f'sa{suffix}'\n",
"deployment_id = f'asde{suffix}'\n",
"storage_account_container = 'sqldatabasepackage'\n",
"storage_account_container = 'sqldatabasedacpac'\n",
"sql_lcid = '1033'\n",
"sql_collation = 'SQL_Latin1_General_CP1_CI_AS'\n",
"deployment_priority = 10"
@@ -243,7 +247,7 @@
{
"cell_type": "markdown",
"source": [
"### Create storage account and storage account container, then upload the package"
"### Create storage account and storage account container, then upload the dacpac"
],
"metadata": {
"azdata_cell_guid": "90ec2b26-0c4a-4aa4-b397-f16b09b454ea"
@@ -253,11 +257,11 @@
"cell_type": "code",
"source": [
"storage_account_created = False\n",
"if package_path == \"\":\n",
" print(f'Package file not provided')\n",
"if dacpac_path == \"\":\n",
" print(f'Dacpac zip file not provided')\n",
" blob_sas = ''\n",
"else:\n",
" package_name = ntpath.basename(package_path)\n",
" dacpac_name = ntpath.basename(dacpac_path)\n",
" storage_accounts = run_command(f'az storage account list --resource-group {azure_resource_group} --subscription {azure_subscription_id}', returnObject=True)\n",
" storage_accounts = [storage_account for storage_account in storage_accounts if storage_account['name'] == azure_storage_account]\n",
" if len(storage_accounts) == 0:\n",
@@ -273,14 +277,14 @@
" else:\n",
" run_command(f'az storage container create --name {storage_account_container} --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key')\n",
"\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" if blob_exists:\n",
" print(f'blob \\\"{package_name}\\\" already exists.')\n",
" print(f'blob \\\"{dacpac_name}\\\" already exists.')\n",
" else:\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {package_name} --file \\\"{package_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {dacpac_name} --file \\\"{dacpac_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" now = time.localtime()\n",
" expiry = f'{(now.tm_year + 1)}-{now.tm_mon}-{now.tm_mday}'\n",
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
],
"metadata": {
"azdata_cell_guid": "7ab2b3ec-0832-40b3-98c0-4aa87320e7ce",
@@ -303,9 +307,8 @@
{
"cell_type": "code",
"source": [
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"PlanId=asde-developer-on-iot-edge\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}<PACKAGE_INFO>},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_SAS_URL>\\\"}}}}'\n",
"package_info = '' if blob_sas == ''else ',\\\"MSSQL_PACKAGE\\\":{\\\"value\\\":\\\"'+blob_sas+'\\\"}'\n",
"manifest = manifest.replace('<PACKAGE_INFO>', package_info).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge-developer\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"MSSQL_PID=Developer\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"SqlPackage\\\":\\\"<Optional_DACPAC_ZIP_SAS_URL>\\\",\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_ZIP_SAS_URL>\\\"}}}}'\n",
"manifest = manifest.replace('<Optional_DACPAC_ZIP_SAS_URL>', blob_sas).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"file_name = f'{uuid.uuid4().hex}.json'\n",
"manifest_file = open(file_name, 'w')\n",
"manifest_file.write(manifest)\n",
@@ -328,7 +331,7 @@
"from IPython.display import *\r\n",
"if storage_account_created:\r\n",
" delete_storage_account_command = \"run_command(f'az storage account delete -n {azure_storage_account} -g {azure_resource_group} --yes')\"\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the package file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the dacpac file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span><font size=\"2\">'+delete_storage_account_command+'</font></span>'))"
],
"metadata": {

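The notebook above writes the manifest to a uuid-named JSON file and defines deployment_id, target_condition, and deployment_priority; applying it to every matching device is presumably done with az iot edge deployment create, though that cell is not shown in this diff. A sketch under that assumption, with placeholder values standing in for the notebook's variables:

import subprocess
import shlex

# Placeholder values; in the notebook these come from AZDATA_NB_VAR_* environment
# variables and the uuid-named manifest file written in the cell above.
iot_hub_name = '<iot hub name>'
deployment_id = 'asde<timestamp>'
target_condition = "tags.environment='dev'"
deployment_priority = 10
file_name = '<manifest file>.json'

# Assumed follow-up step, not shown in the diff: create a hub-wide IoT Edge
# deployment targeting devices that match target_condition.
cmd = (
    f'az iot edge deployment create --deployment-id {deployment_id} '
    f'--hub-name {iot_hub_name} --content {file_name} '
    f'--target-condition "{target_condition}" --priority {deployment_priority}'
)
subprocess.run(shlex.split(cmd), check=True)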

@@ -46,7 +46,13 @@
{
"cell_type": "code",
"source": [
"import sys,getpass,os,json,html,time\n",
"import pandas,sys,getpass,os,json,html,time\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"\n",
"def run_command():\n",
" print(\"Executing: \" + cmd)\n",
@@ -255,7 +261,7 @@
{
"cell_type": "code",
"source": [
"cmd = f'docker run -e ACCEPT_EULA=Y -e \"SA_PASSWORD={sql_password}\" -e \"MSSQL_PID=Developer\" -p {sql_port}:1433 --name {container_name} -d {docker_registry}/{docker_repository}:{docker_imagetag}'\n",
"cmd = f'docker run -e ACCEPT_EULA=Y -e \"SA_PASSWORD={sql_password}\" -p {sql_port}:1433 --name {container_name} -d {docker_registry}/{docker_repository}:{docker_imagetag}'\n",
"run_command_remote()"
],
"metadata": {

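Both docker-based notebooks in this diff build the docker run command once and print it with the sa password masked. A short sketch of that masking pattern, following the Template-based cell shown earlier; the image tag and other values here are placeholders:

from string import Template

# Sketch of the masking pattern used in the docker notebooks above: build the
# command once as a Template, substitute the real password for execution and a
# masked value for display.
sql_password = '<sa password>'        # placeholder; the notebook collects this from the user
sql_port = 1433
container_name = 'sql_container'
image = 'mcr.microsoft.com/azure-sql-edge-developer:latest'   # placeholder tag

template = Template(
    f'docker run -e ACCEPT_EULA=Y -e "SA_PASSWORD=$password" '
    f'-p {sql_port}:1433 --name {container_name} -d {image}'
)
cmd = template.substitute(password=sql_password)              # what actually runs
print('Executing:', template.substitute(password='******'))   # what gets printed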

@@ -26,8 +26,8 @@
"![Microsoft](https://raw.githubusercontent.com/microsoft/azuredatastudio/main/extensions/resource-deployment/images/microsoft-small-logo.png)\n",
"## Deploy Azure SQL Edge to an existing device via IoT hub\n",
"This notebook will walk you through process of deploying Azure SQL Edge to an existing device of an IoT hub:\n",
"1. Deploy Azure SQL Edge module to the device with optional package\n",
"1. If a package is selected, a storage account will be created to host the package file\n",
"1. Deploy Azure SQL Edge module to the device with optional dacpac\n",
"1. If a dacpac is selected, a storage account will be created to host the dacpac file\n",
"1. Enable connecting to the Azure SQL Edge instance on the device\n",
"\n",
"### Dependencies\n",
@@ -51,7 +51,13 @@
{
"cell_type": "code",
"source": [
"import sys,os,json,html,getpass,time,ntpath,uuid\n",
"import pandas,sys,os,json,html,getpass,time,ntpath,uuid\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"\n",
"def run_command(command:str, displayCommand:str = \"\", returnObject:bool = False):\n",
" print(\"Executing: \" + (displayCommand if displayCommand != \"\" else command))\n",
@@ -89,11 +95,7 @@
{
"cell_type": "code",
"source": [
"extensions = run_command('az extension list', returnObject=True)\r\n",
"extensions = [ext for ext in extensions if ext['name'] == 'azure-cli-iot-ext']\r\n",
"if len(extensions) > 0:\r\n",
" run_command('az extension remove --name azure-cli-iot-ext')\r\n",
"run_command('az extension add --name azure-iot')"
"run_command('az extension add --name azure-cli-iot-ext')"
],
"metadata": {
"azdata_cell_guid": "55bb2f96-6f7f-4aa0-9daf-d0f7f9d9243c",
@@ -119,7 +121,7 @@
"azure_subscription_id = os.environ[\"AZDATA_NB_VAR_ASDE_SUBSCRIPTIONID\"]\n",
"azure_resource_group = os.environ[\"AZDATA_NB_VAR_ASDE_RESOURCEGROUP\"]\n",
"sa_password = os.environ[\"AZDATA_NB_VAR_SA_PASSWORD\"]\n",
"package_path = os.environ[\"AZDATA_NB_VAR_ASDE_PACKAGE_PATH\"]\n",
"dacpac_path = os.environ[\"AZDATA_NB_VAR_ASDE_DACPAC_PATH\"]\n",
"sql_port = os.environ[\"AZDATA_NB_VAR_ASDE_SQL_PORT\"]\n",
"iot_hub_name = os.environ[\"AZDATA_NB_VAR_ASDE_HUBNAME\"]\n",
"iot_device_id = os.environ[\"AZDATA_NB_VAR_ASDE_DEVICE_ID\"]\n",
@@ -131,7 +133,7 @@
"print(f'Device IP address: {ip_address}')\n",
"print(f'Azure SQL Edge instance port: {sql_port}')\n",
"print(f'Azure SQL Edge instance sa password: ******')\n",
"print(f'Package path: {package_path}')"
"print(f'Dacpac path: {dacpac_path}')"
],
"metadata": {
"azdata_cell_guid": "dde9388b-f623-4d62-bb74-36a05f5d2ea3",
@@ -156,13 +158,15 @@
"source": [
"suffix = time.strftime(\"%y%m%d%H%M%S\", time.localtime())\n",
"azure_storage_account = f'sa{suffix}'\n",
"storage_account_container = 'sqldatabasepackage'\n",
"storage_account_container = 'sqldatabasedacpac'\n",
"sql_lcid = '1033'\n",
"sql_collation = 'SQL_Latin1_General_CP1_CI_AS'"
],
"metadata": {
"azdata_cell_guid": "19ebeaf4-94c9-4d2b-bd9f-e3c6bf7f2dda",
"tags": []
"tags": [
"hide_input"
]
},
"outputs": [],
"execution_count": null
@@ -251,7 +255,7 @@
{
"cell_type": "markdown",
"source": [
"### Create storage account and storage account container, then upload the package"
"### Create storage account and storage account container, then upload the dacpac"
],
"metadata": {
"azdata_cell_guid": "90ec2b26-0c4a-4aa4-b397-f16b09b454ea"
@@ -261,11 +265,11 @@
"cell_type": "code",
"source": [
"storage_account_created = False\n",
"if package_path == \"\":\n",
" print(f'Package file not provided')\n",
"if dacpac_path == \"\":\n",
" print(f'Dacpac zip file not provided')\n",
" blob_sas = ''\n",
"else:\n",
" package_name = ntpath.basename(package_path)\n",
" dacpac_name = ntpath.basename(dacpac_path)\n",
" storage_accounts = run_command(f'az storage account list --resource-group {azure_resource_group} --subscription {azure_subscription_id}', returnObject=True)\n",
" storage_accounts = [storage_account for storage_account in storage_accounts if storage_account['name'] == azure_storage_account]\n",
" if len(storage_accounts) == 0:\n",
@@ -281,14 +285,14 @@
" else:\n",
" run_command(f'az storage container create --name {storage_account_container} --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key')\n",
"\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" blob_exists = run_command(f'az storage blob exists --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-key {storage_account_key} --account-name {azure_storage_account} --auth-mode key', returnObject=True)['exists']\n",
" if blob_exists:\n",
" print(f'blob \\\"{package_name}\\\" already exists.')\n",
" print(f'blob \\\"{dacpac_name}\\\" already exists.')\n",
" else:\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {package_name} --file \\\"{package_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" run_command(f'az storage blob upload --account-name {azure_storage_account} --container-name {storage_account_container} --name {dacpac_name} --file \\\"{dacpac_path}\\\" --account-key {storage_account_key} --auth-mode key')\n",
" now = time.localtime()\n",
" expiry = f'{(now.tm_year + 1)}-{now.tm_mon}-{now.tm_mday}'\n",
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{package_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
" blob_sas = run_command(f'az storage blob generate-sas --container-name {storage_account_container} --name \\\"{dacpac_name}\\\" --account-name {azure_storage_account} --account-key {storage_account_key} --auth-mode key --full-uri --https-only --permissions r --expiry {expiry}', returnObject=True)"
],
"metadata": {
"azdata_cell_guid": "7ab2b3ec-0832-40b3-98c0-4aa87320e7ce",
@@ -311,9 +315,8 @@
{
"cell_type": "code",
"source": [
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"PlanId=asde-developer-on-iot-edge\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}<PACKAGE_INFO>},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_SAS_URL>\\\"}}}}'\n",
"package_info = '' if blob_sas == ''else ',\\\"MSSQL_PACKAGE\\\":{\\\"value\\\":\\\"'+blob_sas+'\\\"}'\n",
"manifest = manifest.replace('<PACKAGE_INFO>', package_info).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"manifest = '{\\\"modulesContent\\\":{\\\"$edgeAgent\\\":{\\\"properties.desired\\\":{\\\"modules\\\":{\\\"AzureSQLEdge\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azure-sql-edge-developer\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"CapAdd\\\\\\\":[\\\\\\\"SYS_PTRACE\\\\\\\"],\\\\\\\"Binds\\\\\\\":[\\\\\\\"sqlvolume:/sqlvolume\\\\\\\"],\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"1433/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"<SQL_Port>\\\\\\\"}]},\\\\\\\"Mounts\\\\\\\":[{\\\\\\\"Type\\\\\\\":\\\\\\\"volume\\\\\\\",\\\\\\\"Source\\\\\\\":\\\\\\\"sqlvolume\\\\\\\",\\\\\\\"Target\\\\\\\":\\\\\\\"/var/opt/mssql\\\\\\\"}]},\\\\\\\"User\\\\\\\":\\\\\\\"0:0\\\\\\\",\\\\\\\"Env\\\\\\\":[\\\\\\\"MSSQL_AGENT_ENABLED=TRUE\\\\\\\",\\\\\\\"ClientTransportType=AMQP_TCP_Only\\\\\\\",\\\\\\\"MSSQL_PID=Developer\\\\\\\"]}\\\"},\\\"type\\\":\\\"docker\\\",\\\"version\\\":\\\"1.0\\\",\\\"env\\\":{\\\"ACCEPT_EULA\\\":{\\\"value\\\":\\\"Y\\\"},\\\"SA_PASSWORD\\\":{\\\"value\\\":\\\"<Default_SQL_SA_Password>\\\"},\\\"MSSQL_LCID\\\":{\\\"value\\\":\\\"<SQL_LCID>\\\"},\\\"MSSQL_COLLATION\\\":{\\\"value\\\":\\\"<SQL_Collation>\\\"}},\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}},\\\"runtime\\\":{\\\"settings\\\":{\\\"minDockerVersion\\\":\\\"v1.25\\\"},\\\"type\\\":\\\"docker\\\"},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"systemModules\\\":{\\\"edgeAgent\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-agent:1.0\\\",\\\"createOptions\\\":\\\"\\\"},\\\"type\\\":\\\"docker\\\"},\\\"edgeHub\\\":{\\\"settings\\\":{\\\"image\\\":\\\"mcr.microsoft.com/azureiotedge-hub:1.0\\\",\\\"createOptions\\\":\\\"{\\\\\\\"HostConfig\\\\\\\":{\\\\\\\"PortBindings\\\\\\\":{\\\\\\\"443/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"443\\\\\\\"}],\\\\\\\"5671/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"5671\\\\\\\"}],\\\\\\\"8883/tcp\\\\\\\":[{\\\\\\\"HostPort\\\\\\\":\\\\\\\"8883\\\\\\\"}]}}}\\\"},\\\"type\\\":\\\"docker\\\",\\\"status\\\":\\\"running\\\",\\\"restartPolicy\\\":\\\"always\\\"}}}},\\\"$edgeHub\\\":{\\\"properties.desired\\\":{\\\"routes\\\":{},\\\"schemaVersion\\\":\\\"1.0\\\",\\\"storeAndForwardConfiguration\\\":{\\\"timeToLiveSecs\\\":7200}}},\\\"AzureSQLEdge\\\":{\\\"properties.desired\\\":{\\\"SqlPackage\\\":\\\"<Optional_DACPAC_ZIP_SAS_URL>\\\",\\\"ASAJobInfo\\\":\\\"<Optional_ASA_Job_ZIP_SAS_URL>\\\"}}}}'\n",
"manifest = manifest.replace('<Optional_DACPAC_ZIP_SAS_URL>', blob_sas).replace('<Default_SQL_SA_Password>',sa_password).replace('<SQL_LCID>',sql_lcid).replace('<SQL_Port>',sql_port).replace('<SQL_Collation>',sql_collation)\n",
"file_name = f'{uuid.uuid4().hex}.json'\n",
"manifest_file = open(file_name, 'w')\n",
"manifest_file.write(manifest)\n",
@@ -365,7 +368,7 @@
"source": [
"if storage_account_created:\r\n",
" delete_storage_account_command = \"run_command(f'az storage account delete -n {azure_storage_account} -g {azure_resource_group} --yes')\"\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the package file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span style=\"color:red\"><font size=\"2\">NOTE: A storage account was created to host the dacpac file, you can delete it after the database is created and populated successfully. To delete the storage account, copy the following code to a new code cell and run the cell.</font></span>'))\r\n",
" display(HTML('<span><font size=\"2\">'+delete_storage_account_command+'</font></span>'))"
],
"metadata": {

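For this "existing device" notebook the manifest targets one device rather than a target condition, so the apply step is presumably az iot edge set-modules rather than a hub-wide deployment; that cell is not shown in the diff. A sketch under that assumption, again with placeholder values:

import subprocess
import shlex

# Placeholder values; the notebook reads these from AZDATA_NB_VAR_* environment
# variables and writes the manifest to a uuid-named JSON file.
iot_hub_name = '<iot hub name>'
iot_device_id = '<edge device id>'
file_name = '<manifest file>.json'

# Assumed command, not shown in the diff: apply the module manifest to a single
# device instead of creating a hub-wide deployment.
cmd = f'az iot edge set-modules --hub-name {iot_hub_name} --device-id {iot_device_id} --content {file_name}'
subprocess.run(shlex.split(cmd), check=True)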

@@ -2,7 +2,7 @@
"name": "asde-deployment",
"displayName": "%extension-displayName%",
"description": "%extension-description%",
"version": "0.4.1",
"version": "0.1.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",
@@ -24,11 +24,11 @@
"displayName": "%resource-type-sql-edge-display-name%",
"description": "%resource-type-sql-edge-description%",
"platforms": "*",
"icon": "./images/sqldb_edge.svg",
"tags": [
"Hybrid",
"SQL Server"
],
"icon": {
"light": "./images/sqldb_edge.svg",
"dark": "./images/sqldb_edge_inverse.svg"
},
"tags": ["Hybrid", "SQL Server"],
"options": [
{
"name": "type",
@@ -114,7 +114,7 @@
"label": "%docker-repository-field%",
"variableName": "AZDATA_NB_VAR_DOCKER_REPOSITORY",
"type": "text",
"defaultValue": "azure-sql-edge",
"defaultValue": "azure-sql-edge-developer",
"required": true
},
{
@@ -235,7 +235,7 @@
"label": "%docker-repository-field%",
"variableName": "AZDATA_NB_VAR_DOCKER_REPOSITORY",
"type": "text",
"defaultValue": "azure-sql-edge",
"defaultValue": "azure-sql-edge-developer",
"required": true
},
{
@@ -296,6 +296,7 @@
"defaultValue": "westus",
"required": true,
"locationVariableName": "AZDATA_NB_VAR_ASDE_AZURE_LOCATION",
"displayLocationVariableName": "AZDATA_NB_VAR_ASDE_AZURE_LOCATION_TEXT",
"locations": [
"australiaeast",
"australiasoutheast",
@@ -337,12 +338,7 @@
"type": "password",
"confirmationRequired": true,
"confirmationLabel": "%vm_password_confirm%",
"required": true,
"validations" : [{
"type": "regex_match",
"regex": "^(?=.*[a-z])(?=.*[A-Z])(?=.*\\d)(?=.*[\\W_])[A-Za-z\\d\\W_]{12,123}$",
"description": "%vm_password_validation_error_message%"
}]
"required": true
}
]
},
@@ -368,17 +364,14 @@
"required": true
},
{
"label": "%package_path%",
"description": "%package_path_description%",
"variableName": "AZDATA_NB_VAR_ASDE_PACKAGE_PATH",
"label": "%dacpac_path%",
"variableName": "AZDATA_NB_VAR_ASDE_DACPAC_PATH",
"type": "file_picker",
"required": false,
"filter": {
"displayName": "%package-files%",
"displayName": "%dacpac-zip-files%",
"fileTypes": [
"zip",
"bacpac",
"dacpac"
"zip"
]
}
}
@@ -391,7 +384,7 @@
"requiredTools": [
{
"name": "azure-cli",
"version": "2.13.0"
"version": "2.9.1"
}
],
"when": "type=azure-create-new"
@@ -458,17 +451,14 @@
"required": true
},
{
"label": "%package_path%",
"description": "%package_path_description%",
"variableName": "AZDATA_NB_VAR_ASDE_PACKAGE_PATH",
"label": "%dacpac_path%",
"variableName": "AZDATA_NB_VAR_ASDE_DACPAC_PATH",
"type": "file_picker",
"required": false,
"filter": {
"displayName": "%package-files%",
"displayName": "%dacpac-zip-files%",
"fileTypes": [
"zip",
"bacpac",
"dacpac"
"zip"
]
}
}
@@ -481,7 +471,7 @@
"requiredTools": [
{
"name": "azure-cli",
"version": "2.13.0"
"version": "2.9.1"
}
],
"when": "type=azure-single-device"
@@ -551,17 +541,14 @@
"required": true
},
{
"label": "%package_path%",
"description": "%package_path_description%",
"variableName": "AZDATA_NB_VAR_ASDE_PACKAGE_PATH",
"label": "%dacpac_path%",
"variableName": "AZDATA_NB_VAR_ASDE_DACPAC_PATH",
"type": "file_picker",
"required": false,
"filter": {
"displayName": "%package-files%",
"displayName": "%dacpac-zip-files%",
"fileTypes": [
"zip",
"bacpac",
"dacpac"
"zip"
]
}
}
@@ -574,7 +561,7 @@
"requiredTools": [
{
"name": "azure-cli",
"version": "2.13.0"
"version": "2.9.1"
}
],
"when": "type=azure-multi-device"
@@ -595,10 +582,5 @@
}
}
]
},
"__metadata": {
"id": "76",
"publisherDisplayName": "Microsoft",
"publisherId": "Microsoft"
}
}

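The package.json above wires each wizard field to a "variableName" such as AZDATA_NB_VAR_ASDE_DACPAC_PATH; the notebooks then read those values back through os.environ, as the cells earlier in this diff do. A minimal sketch of that hand-off on the notebook side, with variable names taken from the diff and illustrative fallback defaults that the notebooks themselves do not use:

import os

# Values entered in the deployment wizard are surfaced to the notebook as
# environment variables named by "variableName" in package.json.
sa_password = os.environ.get('AZDATA_NB_VAR_SA_PASSWORD', '')
dacpac_path = os.environ.get('AZDATA_NB_VAR_ASDE_DACPAC_PATH', '')
sql_port = os.environ.get('AZDATA_NB_VAR_ASDE_SQL_PORT', '1433')

print(f'Azure SQL Edge instance port: {sql_port}')
print(f'Dacpac path: {dacpac_path}')
print('Azure SQL Edge instance sa password: ******')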

@@ -33,12 +33,10 @@
"vm_admin": "VM admin username",
"vm_password": "VM admin password",
"vm_password_confirm": "Confirm VM admin password",
"vm_password_validation_error_message": "VM password must be 12 to 123 characters in length and consists of upper case characters, lower case characters, numbers and special characters.",
"package_path": "Package file",
"package_path_description": "Path of the SQL Server package file(dacpac, bacpac) or compressed package file.",
"dacpac_path": "Dacpac zip file",
"azure-info-section-title": "Azure information",
"sqlserver-info-section-title": "Azure SQL Edge information",
"package-files": "SQL Server Package files",
"dacpac-zip-files": "Dacpac zip files",
"sql-edge-azure-single-device-display-name": "Existing device of an Azure IoT Hub",
"sql-edge-azure-single-device-title": "Deploy Azure SQL Edge to an existing device",
"iot-hub-name": "IoT Hub name",


@@ -2,14 +2,14 @@
"name": "azdata",
"displayName": "%azdata.displayName%",
"description": "%azdata.description%",
"version": "0.5.0",
"version": "0.3.1",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",
"icon": "images/extension.png",
"engines": {
"vscode": "*",
"azdata": ">=1.25.0"
"azdata": ">=1.22.0"
},
"activationEvents": [
"*"

Some files were not shown because too many files have changed in this diff.