Compare commits

...

499 Commits

Author SHA1 Message Date
Karl Burtram
4d4917d328 Bump version for 1.25.2 release (#14007) 2021-01-20 14:05:08 -08:00
Hale Rankin
edea311757 Revised section scrolling logic to fix broken user experience. (#13926) (#14005) 2021-01-20 13:55:16 -08:00
Karl Burtram
e7eacc32c0 Bump ADS version for Hotfix (#13761) 2020-12-10 13:19:22 -08:00
Charles Gagnon
12f50cca8d Update STS to revert SqlClient update (#13758) (#13760)
(cherry picked from commit 94feb1a80d)
2020-12-10 13:08:03 -08:00
Charles Gagnon
88a4dba695 [Port] Fix env var names for Arc deployment (#13735)
* Fix environment variables for controller create (#13732)


(cherry picked from commit aee8bc2759)

* vBump and update engine version
2020-12-09 10:50:16 -08:00
Charles Gagnon
634ea0ab6a Use console.log for retry logging (#13722) (#13723)
(cherry picked from commit a74119038f)
2020-12-08 10:28:02 -08:00
Charles Gagnon
cd0b5cbc7a Retry getConfig (#13712) (#13713)
* Retry getConfig

* Add logging

(cherry picked from commit d6e1e8eb52)
2020-12-07 15:14:38 -08:00
Charles Gagnon
0b7de6608a Retry publish and always try adding asset (#13700) (#13704)
* Retry publish and always try adding asset

* Undo asset upload change

* Add logging

(cherry picked from commit 6c89c61b0d)
2020-12-07 12:36:03 -08:00
Charles Gagnon
8c6bd8c857 [Port] Add descriptions/validations for Arc connected mode deployment (#13689)
* Add descriptions and validation to connected mode (#13676)


(cherry picked from commit 757ac1d4aa)

* bump version
2020-12-04 16:52:39 -08:00
Monica Gupta
def5775e00 Fix issue with pasting results in Teams (#13673) (#13687)
* Fix issue with pasting results in Teams

* Addressed comment to change header tag to th

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-12-04 16:31:08 -08:00
Chris LaFreniere
91da9aea98 Prevent Table from Disappearing due to exception when looking for tHead (#13680) (#13685)
* Prevent exception when tHead doesn't exist at node

* Add test for no thead
2020-12-04 15:14:32 -08:00
Vasu Bhog
1c898402f8 Fix notebook unordered grid values after papermill execution (#13614) (#13665)
* Fix unordered table

* check entire first row schema:

* SQL Notebooks should not get affected

* delete unused variable and edit comments

* refactor for efficient table ordering

* nit naming
2020-12-03 18:10:24 -08:00
Monica Gupta
54210cf479 Fix empty column issue (#13641) (#13653)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-12-03 14:57:16 -08:00
Barbara Valdez
cbcea87a82 add right padding to notebook toolbar action item (#13640) (#13650)
* add right padding to action item

* remove extra line and add space
2020-12-03 11:30:10 -08:00
Barbara Valdez
2d50d2c5d1 add await to thenable method (#13635) (#13638) 2020-12-02 18:12:48 -08:00
Chris LaFreniere
7448c6c32c WYSIWYG Improvements to highlight (#13032) (#13636)
* Improvements to highlight

* wip

* Tests pass

* Leverage escaping mechanism

* Tweak highlight logic

* PR comments
2020-12-02 16:26:18 -08:00
Karl Burtram
3196cf5be0 Bump distro to pickup new icons (#13598) (#13625) 2020-12-02 11:24:23 -08:00
Kim Santiago
666726a5fa Update open existing dialog icons (#13571) (#13593)
* update open existing dialog icons

* undo removing folder.svg

* remove max width and max height
2020-12-01 14:38:06 -08:00
Benjin Dubishar
818a3d204e Update tools service to .61 (#13591) (#13595) 2020-12-01 14:36:01 -08:00
Lucy Zhang
d45758b4f4 don't add column header in continue request (#13568) (#13577) 2020-11-30 15:12:06 -08:00
Benjin Dubishar
1eda5eb33a Adding additional parameter to data workspace provider API (#13570) (#13574) 2020-11-30 13:50:31 -08:00
Sakshi Sharma
6ac5b7c8a5 Add Import UI to Data-workspace Ellipsis (#13544) 2020-11-30 08:48:09 -08:00
Monica Gupta
397354ebc3 Copy clipboard command in ADS (html/plain text supported) (#13527)
* draft commit

* few changes

* Changes to copy query with results in plain and html formatting

* undo changes

* undo unintended change

* remove comments

* Addressed comments

* Some clean up

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-11-25 21:08:29 -08:00
Vasu Bhog
2a7b90fd70 Fix WYSIWYG text + image paste (#13542)
* Fix WYSIWYG text + image paste

* add test for a link and text
2020-11-25 12:42:07 -06:00
Lucy Zhang
1554e51932 Add logs to smoke tests (#13533)
* add logs

* fix log option

* update log folder
2020-11-25 06:58:43 -08:00
Arvind Ranasaria
d060f1b9a0 Classes for adding kube config and kube cluster picker to Controller connection dialog (#13479) 2020-11-24 20:08:27 -08:00
Alan Ren
c8632c255a scoped refresh commands (#13541) 2020-11-24 17:32:50 -08:00
Charles Gagnon
2ac03b9ef4 Fix error path for predict button on ML import (#13540) 2020-11-24 16:08:26 -08:00
Charles Gagnon
a0d89449cc Fix BDC table icons (#13539) 2020-11-24 15:36:22 -08:00
Charles Gagnon
b03a914934 Update required icon for labels for dynamic enablement (#13515) 2020-11-24 15:05:29 -08:00
Barbara Valdez
d3bcb942f5 Add synapse repo to dropdown option (#13536)
* Add synapse repo

* fix typo
2020-11-24 15:01:34 -08:00
Charles Gagnon
84822b23ac Fix BDC Deployment table (#13538)
* Fix BDC Deployment table

* fix mappings

* Fix names
2020-11-24 14:28:19 -08:00
Charles Gagnon
40ca82c63d Fix declarative table display issues with ML ext (#13529)
* Fix declarative table display issues with ML ext

* Fix test
2020-11-24 12:55:04 -08:00
Udeesha Gautam
f4a6b42b3a Adding ML and DB project to Recommended Extns (#13526)
* adding DB Project to recommended extensions

* adding ML extensions to recommended extensions
2020-11-24 12:09:40 -08:00
Alex Ma
7ad631d4a5 assets folder removed (#13525) 2020-11-24 11:02:32 -08:00
Maddy
4b7baa652f update css to remove extra padding (#13491) 2020-11-24 10:47:52 -08:00
Alan Ren
148e802f4a show the resources as soon as they become available (#13530)
* results streaming

* remove variable
2020-11-24 10:45:14 -08:00
Charles Gagnon
7bb4d00073 Let child ModelView components control their own enabled status by default (#13524) 2020-11-23 15:49:02 -08:00
Leila Lali
3b20e8a61c Fixed an issue when azure account is expired or not valid (#13483) 2020-11-23 13:03:50 -08:00
Alan Ren
6e0a4f27de fix the icon sizing issue (#13522) 2020-11-23 13:02:52 -08:00
Charles Gagnon
21ddf30a7b Increase head size for sql script compile (#13520) 2020-11-23 10:59:44 -08:00
Alex Ma
2ade45858e Removal of placeholder notebooks in Hybrid Toolkit Extension (#13505)
* placeholder notebooks removed and readme changed

* toc updated as well
2020-11-23 10:43:41 -08:00
Vladimir Chernov
e0b1a3460d bumping versions and using ComponentWithIcon props to set icon size (#13517)
* bumping versions and using ComponentWithIcon props to set icon size

* combine withProps sections
2020-11-23 21:41:45 +03:00
Charles Gagnon
f44c714cf2 Update SqlToolsService to .59 (#13519) 2020-11-23 10:29:19 -08:00
Kim Santiago
f72e12fe32 Move focus to inside sql database projects dialogs when they open (#13512) 2020-11-23 10:06:35 -08:00
Alan Ren
0b6fb504dc fix icon size issue (#13514) 2020-11-20 21:31:13 -08:00
Charles Gagnon
145b2491df Cleanup Resource Deployment ModelView (#13510) 2020-11-20 18:29:00 -08:00
Lucy Zhang
aa30b52d03 Notebooks: Fix query results not displaying table rows (#13488)
* fix PQSQL queries not displaying rows

* comment

* change comment and fix unit test
2020-11-20 15:31:01 -08:00
Charles Gagnon
6edcbbb738 Suppress scan warnings (#13507) 2020-11-20 14:28:57 -08:00
Alan Ren
815c61315c add option to show link icon (#13506) 2020-11-20 14:06:07 -08:00
Charles Gagnon
172a044ba7 Remove inputValueTransformer and getInputComponentValue (#13502) 2020-11-20 10:51:00 -08:00
Vladimir Chernov
2a81a0a70f Sql-Assessment api info button (#13490) 2020-11-20 20:53:34 +03:00
Sakshi Sharma
749989cd0b Create project from database UI dialog (#13179)
* UI hook up

* Add tests

* Add back the missing statement for opening project

* Fix failures

* Add a few more tests

* Fix test failure

* Addressed comments

* Update UI to match the mocks

* Update UI to match updated mockups

* Addressed comments to match UI with mockup

* Updated all import strings to be called as Create Project From Database strings

* Fix a couple of test failures and one comment addressed

* Update one missed import string

* Skipping a failing test for now

* Fix failures. Fix alignment of icons

* Addressed PR comments

* Addressed couple more PR comments
2020-11-20 09:38:16 -08:00
Chris LaFreniere
8d42182db8 Attempt to Colorize Code Cells from Notebook Contents (#13473)
* Attempt to colorize from saved language info

* Simplify colorization change

* Fixup
2020-11-19 19:54:09 -08:00
Charles Gagnon
bb2a1db6e8 Add connectivity mode option to Arc controller create (#13495)
* Add connectivity mode option to Arc controller create

* Add connectivity mode to summary

* Use name instead of display name for dropdown values
2020-11-19 17:06:49 -08:00
Kim Santiago
c579ecb111 update sql-database-projects and schema compare versions (#13489) 2020-11-19 16:25:16 -08:00
Charles Gagnon
175d46d508 Add support for dynamic enablement of resource deployment components (#13464)
* saving wip to merge main

* temp fixes for textValidation* removal

* save wip to switch tasks

* save wip to switch tasks

* save wip to switch tasks

* code complete - with known bugs

* missed test file

* fix extHostModelView changes

* validation module

* missed test changes

* missed change

* pr feedback

* pr feedback

* revert inadvertent change

* remove unneeded change

* merge from bug/12082-2

* pr feedback

* pr feedback

* bdd -> tdd for validation tests

* bdd -> tdd for validation tests

* pr feedback

* remove unneeded file

* pr feedback

* EOL instead of '\n'

* pr feedback

* pr feedback

* minor fixes.

* pr feedback

* fix comment

* comments and var renames

* test fixes

* working version after validation simplification

* working version after validation simplification

* remove inadvertent change

* simplified validations

* undo uneeded change

* cleanup

* working version after latest merge

* comments and whitespace fixes

* remove is_integer checks

* sentence case field validation messages

* Use generic strings in sample fields

* minor fixes to sample extension strings

* spaces to tabs for indentation

* request fields before limit fields

* rearrange request/limit fields

* is_integer checks for PG Server Group number fields

* Thenable to Promise

* InputBoxInfo

* pr feedback

* pr feedback

* isUndefinedOrEmpty to utils

* include asde package.json

* use ValidationValueType

* ValidationValueType -> InputValueType

* Add support for dynamic enablement of resource deployment components

* use instanceof function

* getValue returns InputValueType

Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
2020-11-19 16:23:28 -08:00
Leila Lali
870ff39527 Fixed the issue with adding and removing flex container (#13480) 2020-11-19 09:19:39 -08:00
Charles Gagnon
ddd0b8b4bc Remove pandas import from some notebooks (#13481) 2020-11-19 08:19:20 -08:00
Arvind Ranasaria
c7cca5afea Improved Validations for ARC Wizards (#12945) 2020-11-18 22:03:59 -08:00
Kim Santiago
e63e4f0901 update a few strings in projects (#13482) 2020-11-18 20:04:33 -08:00
Kim Santiago
ddc8c00090 Data workspace projects changes (#13466)
* Fix project context menu actions (#12541)

* delete works again

* make fewer changes

* update all sql db project commands

* cleanup

* Remove old projects view (#12563)

* remove old projects view from file explorer view

* fix tests failing

* remove projects in open folder opening up in old view

* Update db reference dialog to show projects in the workspace (#12580)

* update database reference dialog to show projects in the workspace in the project dropdown

* remove workspace stuff from sql projects extension

* undo change

* add class that implements IExtension

* undo a change

* update DataWorkspaceExtension to take workspaceService as a parameter

* add type

* Update sql database project commands (#12595)

* remove sql proj's open and create new project from command palette

* hook up create project from database to data workspace

* rename the remaining import databases to create project from database

* remove open, new, and close commands

* expose addProjectsToWorkspace() in IExtension instead of calling command

* Addressing comments

* fix failing sql project tests (#12651)

* update SSDT projects opened in projects viewlet (#12669)

* fix action not refreshing the tree issue (#12692)

* fix adding project references in new projects viewlet (#12688)

* Remove old projects tree provider (#12702)

* Remove old projects tree provider and fix tests

* formatting

* update refreshProjectsTree() to accept workspaceTreeItem()

* Cleanup ProjectsController (#12718)

* remove openProject from ProjectController and some cleanup

* rename

* add project and open project dialogs (#12729)

* empty dialogs

* wip

* new project dialog implementation

* revert gitattributes

* open project dialog

* implement add project

* remove icon helper

* refactor

* revert script change

* adjust views

* more updates

* make data-workspace a builtin extension

* show the view only when project provider is detected (#12819)

* only show the view when proj provider is available

* update

* fix sql project tests after merge (#12793)

* Update dialogs to be closer to mockups (#12879)

* small UI changes to dialogs

* center radio card group text

* Create workspace if needed when opening/new project (#12930)

* empty dialogs

* wip

* new project dialog implementation

* revert gitattributes

* open project dialog

* implement add project

* remove icon helper

* refactor

* revert script change

* create workspace

* initial changes

* create new workspace working

* fix tests

* cleanup

* remove showWorkspaceRequiredNotification()

* Add test for no workspace open

* update blue buttons

* move loading temp project to activate() instead of workspaceService constructor

* move workspace creation warning message to before project is created

* pass uri to createWorkspace

* add tests

Co-authored-by: Alan Ren <alanren@microsoft.com>

* Additional create workspace changes (#13004)

* Dialogs workspace updates (#13010)

* adding workspace text boxes

* match new project dialog to mockups

* Add validation error message for workspace file

* add enterWorkspace api

* add warning message for opening workspace

* cleanup

* update commands to remove project so they're more generic

* remove 'empty' from string

* Move default project location setting to data workspace extension (#13022)

* remove project location setting and notification from sql database projects extension

* add default project location setting to data workspace extension

* fix typo

* Add back project name incrementing

* other merge fixes

* fix strings from other PR

* default to last opened directory instead of home directory if no specified default location

* A few small updates (#13092)

* fix build error

* update title for inputboxes

* add missing file

* Add tests for data workspace dialogs (#13324)

* add tests for dialogs

* create helper functions

* New project dialog workspace inputbox fixes (#13407)

* workspace inputbox fixes

* fix folder icons

* Update package.jsons and readme (#13451)

* update package.jsons

* update readme

* add workspace information to open existing dialog (#13455)

Co-authored-by: Alan Ren <alanren@microsoft.com>
2020-11-18 16:13:43 -08:00
Charles Gagnon
34170e7741 Fix Loading component removal (#13478)
* Fix Loading component removal

* More undefined checks
2020-11-18 15:42:33 -08:00
Karl Burtram
f5e4b32d01 Update VM notebook 2020-11-18 15:16:40 -08:00
Kim Santiago
28fef53731 fix schema compare database dropdown not starting with values (#13461) 2020-11-17 13:42:00 -08:00
Aasim Khan
438bc67072 Fixed the generate script logic for notebook wizards. (#13418)
* Fixed the generate script logic for notebook wizards.

* -reverted previous changes
-added last page check to the page validation change logic.

* checking if the page is valid when entering it.

* removing unnecessary index variable in forEach loop

* added comments for generate script button enabling on notebookwizard page.
2020-11-17 10:26:45 -08:00
Charles Gagnon
472c9decfa Display error when doing notebook convert (#13438)
* Display error when doing notebook convert

* Update STS
2020-11-17 10:03:33 -08:00
Charles Gagnon
6cd2d6c942 Fix extension install version check (#13436) 2020-11-17 09:58:50 -08:00
Arvind Ranasaria
39e1181c5d Remove WizardBase.ts (#13350) 2020-11-16 19:21:26 -08:00
Lucy Zhang
c898b50b94 Remove resultSet from IDisplayResult metadata (#13450)
* remove resultSet from IDisplayResult metadata

* remove metadata from IDisplayResult
2020-11-16 17:54:09 -08:00
Justin M
271fe62344 12567 Fixed Notebooks not adding to recent connections (#13113)
* 12567 Changed tryAddActiveConnection to always add recent connection

* 12567 Reverted change to tryAddActiveConnection. Removed this._params.input from connectionDialogService > createModel

* 12567 Simplified conditional in connectionDialogService
2020-11-16 11:27:09 -08:00
Justin M
c18a54bc1d 12666 Passed azureAccount into onFetchDatabases and set on tempProfile. (#13239) 2020-11-16 11:24:22 -08:00
Charles Gagnon
b57bf53b67 Fix ModelView container child layout issues (#13412) 2020-11-16 10:59:21 -08:00
Charles Gagnon
c699179e15 Re-register contributed book commands when new extension loaded (#13403) 2020-11-16 10:47:27 -08:00
Alan Ren
690937443c enable the outline for active tab header (#13415) 2020-11-16 10:21:30 -08:00
Kim Santiago
698b79f0f3 bump sql database projects version (#13408) 2020-11-14 11:10:14 -08:00
Alan Ren
798af5fc2d uncomment the hideContextMenu (#13411) 2020-11-13 16:30:17 -08:00
Charles Gagnon
af55dcfb42 Convert ModelView validate to Promise (#13390)
* Convert ModelView validate to Promise

* more cleanup
2020-11-13 15:31:22 -08:00
Charles Gagnon
76781d6cf4 Fix ModelView logging error (#13410) 2020-11-13 14:40:00 -08:00
Alan Ren
99e3da5b48 Editable dropdown component improvement (#13389)
* replace Tree with List

* comments
2020-11-13 13:36:54 -08:00
Chris LaFreniere
6b657259a5 Ensure that we close editors before utils tests (#13391) 2020-11-13 11:16:14 -08:00
Charles Gagnon
cbe2ba0901 Add some more logging to ModelView components (#13387)
* Add some more logging to ModelView components

* Remove catch

* remove unused
2020-11-13 10:30:40 -08:00
Arvind Ranasaria
b3d99117ca onComplete -> onLeave for toolsAndEulaPage (#13394) 2020-11-13 09:59:08 -08:00
Charles Gagnon
32ac586431 Add MIAA Compute+Storage page (#13367)
* Add MIAA Compute+Storage page

* update min memory limits

* update strings

* feedback
2020-11-13 07:20:46 -08:00
Aasim Khan
bd4676ac8c removing separate heading for page number and clubbing it with wizard title (#13380) 2020-11-13 00:38:04 -08:00
Charles Gagnon
536628603e Fix ModelView container validation ordering (#13386)
* Fix ModelView container validation ordering

* Also validate component after adding

* undo child component validate call
2020-11-12 17:17:27 -08:00
Kim Santiago
ea7fe08b98 remove sqlproj from Project tree node (#13379) 2020-11-12 15:49:34 -08:00
Aasim Khan
6c920f6d54 Fixed the table width so it does not overflow from the wizard. (#13372) 2020-11-12 15:06:03 -08:00
Alex Ma
a2f7136728 Update for Azure SQL Hybrid Cloud Toolkit (#13360)
* Added azurehybridtoolkit to list of external extensions

* Added updated book

* added to recommended extensions

* extensions.js updated for build

* added small changes to extension

* small changes to extension

* tsconfig change

* gitignore and vscode changes

* changed package display name
2020-11-12 14:22:50 -08:00
Charles Gagnon
a082c1e478 Ignore built notebook component files (#13369) 2020-11-12 13:17:54 -08:00
Lucy Zhang
32a6385fef Notebooks: Save cell connection name in cell metadata (#13208)
* save connection info in notebook metadata

* update attachTo dropdown based on saved alias

* add setting for saving connection (default=false)

* save/read cell connection name to/from metadata

* get started on toggling multi connection mode

* add activeConnection property to cell model

* add changeContext method for cell

* add comments

* add unit test for reading connection name

* save connection mode in metadata

* clean up code

* address PR comments
2020-11-12 10:44:34 -08:00
Karl Burtram
468119caa4 Update readme and changelog for 1.24.0 (#13370) 2020-11-12 10:09:59 -08:00
Kim Santiago
abdf575885 fix wizard page name and step getting announced when pages are added and removed (#13362) 2020-11-12 09:44:33 -08:00
Lucy Zhang
07df54ee61 Notebook: Add smoke test for trusted notebooks (#13330)
* add smoke test for trusted notebooks

* change repo

* update trustNotebook method

* pr comment
2020-11-12 06:45:20 -08:00
Kim Santiago
850422164c Add setting for default dacpac save location (#13194)
* add setting for default save location

* lowercase

* addressing comments
2020-11-11 17:20:58 -08:00
Charles Gagnon
865d49c2fb Always upload build artifacts + logs during product build (#13356) 2020-11-11 15:16:07 -08:00
Charles Gagnon
b17ce7a531 Use progress notification for azdata install/updates (#13355) 2020-11-11 15:14:02 -08:00
Charles Gagnon
d64062dfe0 Update minimum azdata version to 20.2.4 (#13357) 2020-11-11 15:13:43 -08:00
Karl Burtram
1b0f3af094 Bump ADS to 1.25 for Dec release (#13358) 2020-11-11 14:04:57 -08:00
Alex Ma
fa608f9f80 Azure SQL Hybrid Cloud Toolkit Notebooks Extension Command (#13286)
* added extension folder incomplete

* WIP extension progress

* notebook finally opens in side panel

* notebook now opens via notebook extension

* html file spaces restored

* package json fixed

* fixed vscode import issue

* more cleanup

* remove git stuff

* placeholder icon logos added

* fixed gulpfile

* cleanup changes

* vscode import fixed

* fixed main and yarn.lock

* added provided notebooks view

* formatting for package.json

* removed first command as it's not necessary

* fixed notebook typo

* readded spaces
2020-11-11 13:50:36 -08:00
Charles Gagnon
b32e5f8f25 Add debug logging for ModelView (#13346) 2020-11-11 10:33:13 -08:00
Charles Gagnon
c37f178e71 Undo temporary changes made (#13349) 2020-11-11 10:26:22 -08:00
Alan Ren
3be00b1635 add tabindex and handle keyboard selection (#13348) 2020-11-10 23:00:03 -08:00
Lucy Zhang
b397150264 Notebooks: Add smoke test (#13196)
* add new smoketest

* change repro

* rename methods

* add waitforallresults method

* pr comment
2020-11-10 17:31:17 -08:00
Chris LaFreniere
8c6a966bb9 Re-enable notebook editor unit tests (#13328)
* wip

* Re-enable notebook editor tests

* PR Feedback
2020-11-10 17:04:30 -08:00
Charles Gagnon
b83da2dfa8 Run initial modelview actions first (#13317)
* Run initial modelview actions first

* add param

* Comments and typings

* Catch promise error

* fix db projects test

* remove extra calls
2020-11-10 17:00:16 -08:00
Charles Gagnon
b4dd0442c5 Fix extension unit test script cleanup (#13345) 2020-11-10 16:11:43 -08:00
Charles Gagnon
d4e8053797 Increase yarn timeout when installing packages (#13347)
* Increase yarn timeout when installing packages

* Update other platforms
2020-11-10 16:05:50 -08:00
Aditya Bist
8bbcfff119 fix agent types (#13340) 2020-11-10 13:38:57 -08:00
Arvind Ranasaria
d7a6b55f82 Hook up generateScriptButton to lastPageValidation (#13339) 2020-11-10 13:16:25 -08:00
Kim Santiago
3dd74971d8 Adding @kisantia as a codeowner for dacpac and schema compare extensions (#13336)
* adding @kisantia as a codeowner for dacpac and schema compare extensions

* fix paths for other extensions

* add owners for sql database projects
2020-11-10 13:15:28 -08:00
Alan Ren
e076062d4f restore focus on escape (#13285)
* restore focus on escape

* obtain focus on action click

* comment

* comment 2

* simplify the implementation

* remove the toggle action completely

* remove the import

* implement aria requirements
2020-11-10 11:31:53 -08:00
Srivatsan Vasudevan
fbe8e9d9f3 Schema Compare object type labels missing space (#13315)
* Added spaces to the database trigger, symmetric key and stored procedure labels in Schema Compare object types.
2020-11-10 10:32:19 -08:00
Aasim Khan
1c9322c0e8 Fixes for incorrect button labels and the password validation error. (#13333) 2020-11-10 09:05:28 -08:00
Charles Gagnon
0d49744061 Upload user dir from test runs (#13326)
* Add archive logs step to linux build

* right file

* try

* try this

* correct order

* build artifact

* export

* log

* use tmp

* zip up all

* Remove extra publish

* other scripts

* add test name to dir
2020-11-10 08:57:57 -08:00
Vasu Bhog
7cd4964f35 Fix for < > (non HTML) tags disappearing in WYSIWYG (#13267)
* Push the latest update for WYSIWYG bug

* Improvements to nested lists

* OL tests and PR feedback

* Fixed all toolbar options for tags

* Address PR comments

* Ensure style is kept and not escaped

* Add all markdown toolbar action tests

* Style text edge case fix

* Address repeat function and type comment

* add more clarifying test
2020-11-09 20:26:28 -06:00
nasc17
689c7ab27e Removed web app connection string from postgres and miaa. Remove '!' (#13323) 2020-11-09 17:15:17 -08:00
Karl Burtram
8cf5e4c9fd Revert "Verify the token belongs to the proper user. (#11593)" (#13321)
This reverts commit 45cbaca31f.
2020-11-09 15:32:21 -08:00
Aditya Bist
9d766198b5 fix connection and notebook icons not highlighting (#13314) 2020-11-09 13:12:34 -08:00
Monica Gupta
295aa99e05 Update kusto extension sqltoolsservice (#13306)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-11-09 11:13:29 -08:00
Barbara Valdez
3f76d343a5 Table with no headings gets corrupted when editing (#13293)
* fix data loss issue when modifying table with no headings

* fix comment

* Add tests and extra empty space to empty header format
2020-11-09 09:42:42 -08:00
Aditya Bist
fa9a38d74a Update/agent (#13298)
* Revert "Fix Agent not working (#13126)"

This reverts commit a0ee8b00fb.

* agent fix and bump
2020-11-09 09:00:03 -08:00
Charles Gagnon
8b73391845 Call ModelView base init after view init (#13261)
* Call ModelView base init after view init

* baseInit to end
2020-11-07 09:41:49 -08:00
Karl Burtram
5e00dcb78c Update welcome page icon (#13296) 2020-11-06 17:06:29 -08:00
Kim Santiago
c8db2b60f3 Add more tests for duplicate database references (#13292)
* Add test for duplicate project reference

* add test for different slashes that caused problem before

* split up duplicate database reference tests
2020-11-06 16:48:26 -08:00
Benjin Dubishar
aaa8831a7d Enables streaming job validation, minor bugfix (#13287)
* Enabling validation

* Fixing issue where ESJ property isn't set during creation, only from sqlproj load

* Bump dependency on azdata
2020-11-06 16:15:50 -08:00
Barbara Valdez
e83d3ba57f Remove button property setter causing exception (#13284)
* Fix navigation buttons

* Remove icon set in welcome page
2020-11-06 14:49:14 -08:00
Karl Burtram
8b40e8e4bf Update insider icons (#13281)
* Update insider icons

* Update welcome page icon

* Update distro
2020-11-06 13:27:43 -08:00
Aasim Khan
712c6ae5d8 Adding tools and Eula page to Resource Deployment (#13182)
* SQL VM wizard migration to ResourceType Wizard

* Revert "SQL VM wizard migration to ResourceType Wizard"

This reverts commit e58cd47707a7e2812be20d915f1fe638b96b035f.

* migrated notebook wizard

* SQL VM wizard migration to ResourceType Wizard

* Fixed some imports on SQL VM wizard

* migrated sqldb wizard to generic ResourceTypeWizard

* Added missing import
Solving errors from the merge

* Moved some common functionality into ResourceTypeWizard

* Changed logic of start deployment

* fixed some import after changing files.

* added pageless model and tools and Eula Page

* Hacky solution to fix wizard update bugs

* Removed tools and Eula components from resourceTypePickerDialog

* Removing changes in ext host

* reverting every change in ext host dialog

* Fixed setting the first provider for resourceTypeWizard.

* Some PR related changes
-Fixed typo in localized constants
-made some code logic concise
-Removed unnecessary check in tools&Eula

* Added some fixes for compilation error

* some refactoring for PRs

* moved comment

* cleaning up some code to make it more readable

* fixed comment typo

* Some additional cleaning up of code.

* Adding a public getter for model

* -Adding error message for failed EULA validation
-Removed unnecessary check for selected resource.

* Added comment to explain model variable behavior

* Added additional comments

* Fixed a comment to make it accurate

* Better phrasing for a comment
2020-11-06 13:05:41 -08:00
Kim Santiago
bb35276652 bump sql-database-projects extension version (#13283) 2020-11-06 11:46:05 -08:00
Charles Gagnon
335c6bf544 Update sql grammar (#13282)
* Update sql grammar

* fix

Co-authored-by: Alex Ross <alros@microsoft.com>
2020-11-06 11:38:23 -08:00
Aasim Khan
054583e0de Adding aria label to "check all" check box in declarative table. (#13216)
* added aria label for check all checkboxes and added some missing roles for the table elements

* removed duplicate attribute

* Moved column header aria label logic to a function.

* fixed typos in declarative table

* Changed the aria label text to something that is more intuitive.

* fixed typo in localized string identifier
2020-11-06 10:44:33 -08:00
Karl Burtram
cd6fa08543 Bump distro for icon refresh (#13258) 2020-11-06 09:09:42 -08:00
Charles Gagnon
a2265f8ccd Update to latest SQL Tools Service (.52) (#13264) 2020-11-06 08:47:06 -08:00
Chris LaFreniere
e4390db779 WYSIWYG fix list nesting (#13251)
* Improvements to nested lists

* OL tests and PR feedback
2020-11-05 18:21:33 -08:00
Leila Lali
fe546e3791 Fixed a bug with validating empty table name (#13259) 2020-11-05 16:07:18 -08:00
Aasim Khan
415a20f9f7 Adding custom focus outline-offset for checkboxes (#13229)
* adding focus outline styling for checkboxes

* Merge pull request #110038 from aasimkhan30/aasim/fix/checkboxOutline

Added outline offset to checkbox to make focus visible.

* Revert "adding focus outline styling for checkboxes"

This reverts commit 1f9de7a00f7e947725e9e935d0be5bca2d662600.

Co-authored-by: Miguel Solorio <miguel.solorio@microsoft.com>
2020-11-05 14:44:16 -08:00
Maddy
6c37ac56b7 fix delete behavior on text cells. (#13153)
* remove default <p> text and add css instead

* remove the top/bottom padding on preview markdown
2020-11-05 14:19:11 -08:00
nasc17
27fbad5884 Changed hyperlink (#13252) 2020-11-05 14:09:49 -08:00
nasc17
bc452ec9be Compute and Storage Page, progress notification waits for model to be refreshed before saying instance has been updated. (#13254)
* Progress notification does not say updated until after the model has been refreshed

* git errors
2020-11-05 13:49:11 -08:00
Benjin Dubishar
9225d6d6aa Disabling validation, changing template for Streaming Jobs (#13253)
* Disable streaming job validation by hiding menu item

* Adding comment to and genericizing column names in external streaming job script
2020-11-05 11:42:26 -08:00
Hale Rankin
5174f3b2f6 ML fine tuning (#13247)
* Added new warning icon for model input mapping view. Adjusted padding between it and the input field beside it.

* Added new warning icon for model input mapping view. Adjusted padding between it and the input field beside it.

* Modified icon fill colors for better presentation in dark/HC themes. Modified font sizes and padding in various places. Increased input field widths. Removed unnecessary width properties.
2020-11-05 10:56:03 -08:00
Chris LaFreniere
8877d74034 Check for clientSession before property (#13198) 2020-11-04 16:28:59 -08:00
Charles Gagnon
d1f26844ef Remove hardcoded colors for HDFS access dialog (#13240) 2020-11-04 15:16:08 -08:00
Charles Gagnon
fd8f88db4b Only show resource viewer on insiders and dev (#13238) 2020-11-04 14:34:13 -08:00
Charles Gagnon
bfb2c20e0f Add Azure account initialization logging (#13235) 2020-11-04 13:24:09 -08:00
Aditya Bist
6b2c409cff Feature: Tasks (core) (#12184)
* initial

* string sanitize

* hygiene

* build task

* added build task

* hygiene

* initial tasks view

* finished tasks

* add sql carbon tags and undo changes

* remove task message

* fix tasks tests

* removed extension stuff

* add problem matcher

* remove extra space

* undo extension changes

* add problem matcher to mssql
2020-11-04 13:07:20 -08:00
Hale Rankin
4af67c9734 ML extension - View models styles / layout updates (#13091)
* Revised styles and added elements to view models stage of the View and import models.

* Implemented Leila's DataInfoComponent, replacing my use of icon and title, description fields. Corrected some styles.

* Fixed the issue with icon title and description

* Fixed several issues

* Added method to output localized text for number of models shown. Added component to display models shown text.

* Fixed the issues with order of components

* Fixed showing number of models

Co-authored-by: llali <llali@microsoft.com>
2020-11-04 12:07:54 -08:00
Vladimir Chernov
06a4b0d1a2 fix for history reload upon target change and filter out unsupported … (#13205)
* fix for history reload upon target change and filter out unsupported asmt messages
2020-11-04 22:22:56 +03:00
Hale Rankin
3677a69c76 Import models - Browse location styles. (#13110)
* Added icon and style changes for browse location step in import models.

* Revised icon path to point to one for light and one for dark.
2020-11-04 10:27:57 -08:00
Kim Santiago
e51f16fa8d fix folder icon being cutoff (#13199) 2020-11-04 10:00:54 -08:00
Charles Gagnon
a0ee8b00fb Fix Agent not working (#13126)
* Fix Agent display

* more fixes

* cleanup
2020-11-04 09:23:16 -08:00
Charles Gagnon
eeb7da4ace Fix resource viewer grid resizing (#13197)
* Fix resource viewer grid resizing

* keep height
2020-11-04 08:34:30 -08:00
Aasim Khan
b68d504f68 Added file path check (#13201) 2020-11-03 22:05:11 -08:00
Arvind Ranasaria
3ec1f7cc2b Kube Service tests (#13183) 2020-11-03 15:46:31 -08:00
Lucy Zhang
7bdb7c328a add unit test for data conversion (#13190) 2020-11-03 15:05:57 -08:00
Arvind Ranasaria
b175c97dfe Notebook Service tests (#13181) 2020-11-03 14:16:11 -08:00
Arvind Ranasaria
f10ac10f6d platformService tests and move tests from tdd to bdd (#13131) 2020-11-03 13:34:33 -08:00
Monica Gupta
64510506e8 Update sqltoolsservice in kusto extension for kusto aria fix (#13207)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-11-03 13:18:25 -08:00
Charles Gagnon
e6c265b254 Post release vBump for Arc and Azdata ext (#13204) 2020-11-03 07:46:48 -08:00
Benjin Dubishar
342ff47e51 Adding External Streaming Job I/O validation (#13195)
* Added Tools Service call for ValidateStreamingJob

* Partial addition of ESJ

* adding test mocks

* Validation working

* Modifying command visibility logic to submatch ESJs in addition to files

* Changed string literal to constant, corrected attribute order

* Added tests

* correcting casing that's causing test failures on linux

* Swapping Thenable for Promise

* excluded validate from command palette
2020-11-02 19:02:20 -08:00
Kim Santiago
ba80000e27 add loading indicators to schema compare dialog (#13189) 2020-11-02 17:50:28 -08:00
Benjin Dubishar
f792684763 Updating SqlToolsService version (#13192) 2020-11-02 16:38:04 -08:00
Barbara Valdez
036faeb06d [Backend] Editing and creating books (#13089)
* Add interface for modifying the table of contents of books

* Add logic for creating toc

* Fix issue with toc

* Add test for creating toc

* Delete bookTocManager.test.ts

* update allowed extensions

* Fix failing tests and add test

* Add tests for creating books

* Remove unused methods

* add section to section
2020-11-02 14:55:44 -08:00
Chris LaFreniere
04117b2333 Remove prose from notebooks default package list (#13173) 2020-11-02 10:54:32 -08:00
Chris LaFreniere
7559d8463f Ensure trusted action is enabled (#13174) 2020-11-02 10:48:13 -08:00
Vasu Bhog
351516f08d Change the way we insert Injected Parameter (#13171) 2020-11-02 09:14:37 -08:00
Vladimir Chernov
b2e06fd440 grid item find by message as well (#13184)
icon for generate html report
2020-11-02 20:02:56 +03:00
Vasu Bhog
338beaff29 Update powershell kernel version to 0.1.4 (#13167)
* Update powershell kernel version to 0.1.4
2020-10-31 09:01:05 -07:00
Alan Ren
cb1c8503b0 handle keyboard selection (#13175) 2020-10-30 21:17:49 -07:00
Charles Gagnon
d7767c7d91 Resource viewer cleanup (#13168)
* Resource viewer cleanup

* uneeded

* Only show in insiders
2020-10-30 16:48:50 -07:00
Srivatsan Vasudevan
c6f72e6504 Disable the upgradeExistingDatabases radio button if databases don't exist. (#13067)
* Created a function to disable the upgrade existing databases radio button if no databases exist.

* Created helper functions for the upgrade database radio button and new database radio button and abstracted the code accordingly.

* Made changes to helper function names and signatures, and added the enableUpgradeRadioButton() helper function.

* Made some changes to the enableUpgradeRadioButton helper function.

* Made changes to helper functions to ensure the radio buttons enable accordingly when a server is changed from the server dropdown.

* Made changes to helper functions and checks.

* Made additional checks to the populateDatabaseDropdown function.

* Added the logic to select the new radio button when upgrade radio button is disabled.
2020-10-30 16:48:38 -07:00
Vladimir Chernov
da6f800f11 table column with iconcss (#13056) 2020-10-31 02:23:38 +03:00
Alan Ren
2f571d868b show folder icon for some tree nodes (#13161) 2020-10-30 15:38:23 -07:00
Charles Gagnon
3015845093 Add title property for data grid providers (#13155) 2020-10-30 15:37:11 -07:00
Charles Gagnon
341f7aa7ad Enable script to notebook actions under preview flag (#13164) 2020-10-30 15:24:35 -07:00
Charles Gagnon
8717a03466 vBump arc extension post release (#13165) 2020-10-30 15:21:29 -07:00
BranislavGrbicMDCS
3b18f9f8ec Hiding dacpac from search for SQLOD (#13142)
* Hiding dacpac from search for SQLOD

* Removing Schema compare from search bar for SQL OD

* Updating STS version for removing ServerRoles folder change for SQL OD
2020-10-30 22:05:19 +01:00
Lucy Zhang
69527f91b0 Notebooks: Save connection information in metadata (#13060)
* save connection info in notebook metadata

* update attachTo dropdown based on saved alias

* add setting for saving connection (default=false)

* don't show saved conn if setting off + added test

* show conn dialog if save conn name setting off

* address PR comments and fix unit test

* change connectionName to connection_name
2020-10-30 13:56:33 -07:00
Monica Gupta
17073f9d2a Bump sqltoolsservice version for Kusto extension (#13154)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-30 13:02:12 -07:00
Aasim Khan
4f96ac46be Migrating other deployment wizards to the generic ResourceTypeWizard (#13132)
* SQL VM wizard migration to ResourceType Wizard

* Revert "SQL VM wizard migration to ResourceType Wizard"

This reverts commit e58cd47707a7e2812be20d915f1fe638b96b035f.

* migrated notebook wizard

* SQL VM wizard migration to ResourceType Wizard

* Fixed some imports on SQL VM wizard

* migrated sqldb wizard to generic ResourceTypeWizard

* Added missing import
Solving errors from the merge

* Moved some common functionality into ResourceTypeWizard

* Changed logic of start deployment

* fixed some import after changing files.
2020-10-30 12:42:20 -07:00
Charles Gagnon
76625012dd Add Loading Spinner plugin for SlickGrid table (#13152)
* Add Loading Spinner plugin for SlickGrid table

* better comment

* add aria

* remove
2020-10-30 11:42:22 -07:00
Chris LaFreniere
dfb40e0159 Notebooks: Ensure Python Environment Variables are Removed Before First Python Command (#13097)
* Add CodeQL Analysis workflow (#10195)

* Add CodeQL Analysis workflow

* Fix path

* Ensure we delete python env vars correctly

* Move delete to top of method

Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>
2020-10-30 11:06:46 -07:00
Alan Ren
82d5fe3821 put feature behind preview flag (#13147)
* put feature behind preview flag

* remove unused import
2020-10-30 10:17:54 -07:00
nasc17
f5fc5c648e Nasc/Compute and Storage Loading Page Bug Fix (#13150)
* fix git issues

* Merged fixes for loading error on compute&storage page. Clear out input boxes after save.
2020-10-30 09:03:37 -07:00
Amir Omidi
680dc1b5da Fix #13108 (#13145) 2020-10-30 08:19:08 -07:00
Alan Ren
ac476ba973 make the tree theme aware and remove group color (#13143)
* make the tree theme aware and remove group color

* fix eslint error
2020-10-29 22:46:11 -07:00
Charles Gagnon
7857e8aeb9 Fix filtering for resource viewer (#13141) 2020-10-29 17:47:17 -07:00
Leila Lali
d450588e39 ML - Added a link in models page to run predict on a model (#13124)
* Added a link in models page to run predict on a model

* Updated the icons
2020-10-29 16:37:23 -07:00
Charles Gagnon
e31d563f61 Implement open in portal link for Azure resource viewer (#13139)
* Implement open in portal link for Azure resource viewer

* localize
2020-10-29 14:49:02 -07:00
Alan Ren
7819d25c95 resource label update (#13129)
* resource label update

* preserve existing behavior

* fix connection group color

* comments
2020-10-29 14:42:01 -07:00
Kim Santiago
8c956cdb79 Update sqlcmd table to use dataValues instead of deprecated data (#13121)
* Update sqlcmd table to use dataValues instead of deprecated data

* fix in declarativeTableComponent
2020-10-29 13:19:59 -07:00
Aasim Khan
e15ad17967 Aasim/fix/sqldbtypos (#13130)
* fixed some easy typos on sql db wizard.

* Fixed some instructions in the notebook

* - Added option to enable or disable firewall rules

* converted toggle firewall dropdown to checkbox
2020-10-29 11:53:56 -07:00
Charles Gagnon
66da2a46c5 Add support for "More Actions" column in Resource Viewer (#13093)
* Add support for "More Actions" column in resource viewer

* update provider

* remove import

* Use menu contribution and make actions column always show

* cleanup

* move context menu anchor

* Comments
2020-10-29 10:50:27 -07:00
Brian Bergeron
10f6fe2d09 add --force to azdata arc postgres server delete (#13127)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-10-29 09:51:04 -07:00
Aasim Khan
dd77c79090 Refactoring Deploy Cluster to use a generic Wizard (#13105)
* Refactoring Deploy Cluster to use the generic ResourceTypeWizard

* - Remove unnecessary error message
- Fixed spacing
- Fixed onCancel method implementation in deploy cluster
2020-10-28 20:47:27 -07:00
BranislavGrbicMDCS
9fcd85e640 Adjusting ADS to support new changes for CloudServerType for SQLOD (#12824)
* Add CodeQL Analysis workflow (#10195)

* Add CodeQL Analysis workflow

* Fix path

* Adjusting ADS to support new changes for CloudServerType for SQLOD

* Using existing constants instead of hardcoded values

* Updating STS version

Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>
2020-10-29 01:37:42 +01:00
Brian Bergeron
d1b8c15e11 Arc Postgres - Rename shards to workers
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-10-28 16:57:17 -07:00
Vladimir Chernov
e679d70a4b append data functionality (#13120)
append data functionality
2020-10-29 01:57:08 +03:00
Chris LaFreniere
31817c5494 Fix plotly responsiveness (#13119) 2020-10-28 15:40:46 -07:00
v-bbrady
9d776863a1 updates copy (#13079)
* updates copy

* localizes copy
2020-10-28 14:10:37 -07:00
Alan Ren
1d4398388c bring Azure context menu to the new tree and enhance start cloud shell command (#13101)
* bring context menu to the new tree

* enhance start cloud shell command

* text
2020-10-28 13:48:28 -07:00
nasc17
281592fa97 Cores and memory had needed request/limit commands switched, Workers needs to be checked for undefined (#13116)
* Cores and memory had needed request/limit commands switched

* If saving returns an error, leave discard button enabled

* Check if workers is undefined
2020-10-28 13:36:52 -07:00
Kim Santiago
dccccd0110 fix intermittent sql database project test failure (#13114) 2020-10-28 13:15:07 -07:00
Leila Lali
5c474d8614 ML - model source type page (#13077)
* Initial checkin

* Style adjustments.

* addressed PR comments

Co-authored-by: Hale Rankin <harankin@microsoft.com>
2020-10-28 11:57:59 -07:00
Leila Lali
429d8fe584 ML - New UI component for icon, title and description of an item (#13109)
* initial checkin

* addressed PR comment
2020-10-28 11:57:07 -07:00
Lucy Zhang
86357b45b0 Notebooks: re-factor grid streaming (#12937)
* refactor grid streaming (convert to data first)

* change convertRowsToHtml method to return value

* remove griddataconversioncomplete checks

* send row data from STS to gridoutput component

* clean up code

* send data updates to cell model

* serialize cell output at the end of cell execution

* remove unused parameters

* update output contents instead of output reference

* remove unnecessary promise

* move azdata changes to proposed

* update comment
2020-10-28 09:08:15 -07:00
Barbara Valdez
42e16b1752 Support trusted books in new version of Jupyter Books (#12874)
* Fix path for v1 and v2 books

* Add tests and address PR comments

* fix failing tests

* Address pr comments

* address pr comments
2020-10-27 20:57:44 -07:00
Alan Ren
e2c9d3899b make sure saved connections are up to date (#13098) 2020-10-27 19:41:44 -07:00
Kim Santiago
4587fd06c3 add missing get and set in extHostModelView for title and validationErrorMessage (#13095) 2020-10-27 15:30:35 -07:00
Aasim Khan
f9d34cb18b Changed the welcome screen text for deploy (#13086) 2020-10-27 08:26:37 -07:00
Leila Lali
f97ae9e570 ML - Fixed a failing test (#13080)
* Fixed a failing test
2020-10-27 06:42:33 -07:00
Leila Lali
eec6f64d62 ML - Bug fixing (#13018)
* Fixing couple of bugs
2020-10-26 17:36:37 -07:00
Leila Lali
20ed569a71 Updating sqlmlutils to 1.0.3 (#13029) 2020-10-26 17:34:30 -07:00
Alan Ren
79800902db enable filtering, account node context menu and introduce flat account tree node (#13066)
* add search box

* switch back to the traditional azure tree

* Revert "switch back to the traditional azure tree"

This reverts commit 7904b9cd599591e94412ec79da23590068de46b6.

* flat account tree node and filtering

* add comment

* context menu

* fix test

* handle disposable

* add logging
2020-10-26 17:00:44 -07:00
Kim Santiago
1e3c9b722e Expose inputbox title so hover text can be set (#13084)
* expose inputbox title so hover text can be set

* only update title if there's a value
2020-10-26 16:30:53 -07:00
Charles Gagnon
c2bd11fa9e Add error logging when fetching account provider accounts (#13082)
* Add error logging when fetching account provider accounts

* fix test

* fix tests #2
2020-10-26 16:06:55 -07:00
Charles Gagnon
791dee1457 Update query editor taskbar on language flavor change (#13057) 2020-10-26 15:29:34 -07:00
Charles Gagnon
2db51ca243 Allow string for deployment icons and update a couple (#13076)
* Update to colorized versions of bdc and container deployment icons

* update edge

* Allow string for icons
2020-10-26 13:27:12 -07:00
Arvind Ranasaria
2bc2f7f520 New Validation Module for Resource Deployment Extension (#13075) 2020-10-26 11:41:04 -07:00
Kim Santiago
7222c698aa Fix changing database sqlcmdvar value (#13049)
* fix db sqlcmdvar not updating in sqlproj

* Add tests
2020-10-26 11:17:42 -07:00
Arvind Ranasaria
1c6d7866e8 Revert "New Validation Module for Resource-Deployment extension. (#12953)" (#13074)
quick revert of a merge done from PR too soon. Will merge it again later.
2020-10-26 10:03:37 -07:00
Arvind Ranasaria
c810c5a0bb New Validation Module for Resource-Deployment extension. (#12953) 2020-10-26 09:50:25 -07:00
Charles Gagnon
ff45bdd072 Add hyperlink support to DataGrid columns (#13061)
* Add hyperlink support to DataGrid columns

* pr feedback

* Remove unused aria label

* fix error message display

* fix compile
2020-10-26 08:43:09 -07:00
Leila Lali
3ad39bd0d3 ML- Updated extension icon (#13043)
* Updated extension icon
2020-10-26 08:39:45 -07:00
Vasu Bhog
cb30dd1893 Notebook Parameterization - Papermill Compatibility (#13034)
* Parameterization papermill fix

* Utilize isParameter instead

* Address PR comments, and fix tests

* Address comment
2020-10-23 20:32:55 -05:00
Vasu Bhog
bf9fd5a3b8 UI Component for Parameterized Notebook (#13021)
* Backend work for  Parameterization + Tests

* address comments

* Add Parameters Tag upon state change

* Edit CSS Styling for accessibility

* more generic tag names
2020-10-23 19:51:03 -05:00
Udeesha Gautam
4c4d2d4463 Adding backup/restore back for pgsql db and restore for server in dashboard (#13064) 2020-10-23 15:36:49 -07:00
Charles Gagnon
2d182fcd03 Remove calls to DOM.addClass and DOM.removeClass (#13063) 2020-10-23 14:42:22 -07:00
Kim Santiago
c7ab69d46d mark dacpac and schema compare tests as unstable (#13059) 2020-10-23 11:22:51 -07:00
Alan Ren
89b935e2df vbump asde deployment extension (#13051) 2020-10-23 10:22:10 -07:00
Charles Gagnon
3ef2650e69 Fix kubectl storage class check (#13046) 2020-10-22 15:05:14 -07:00
Charles Gagnon
dbb30110ac Update BDC deployment min azdata version (#13044) 2020-10-22 13:24:37 -07:00
Chris LaFreniere
ff8e451af9 Further enhancements to spans (#13035) 2020-10-22 13:21:05 -07:00
Charles Gagnon
c97f75283b Use static name value for azdata sudo commands (#13041)
* Use static name value for azdata sudo commands

* remove unused import
2020-10-22 11:41:17 -07:00
Charles Gagnon
6550c032ee Azure provider cleanup and add rg property (#13030)
* Move Azure DataGrid Provider into own class

* Fix compile
2020-10-22 10:30:01 -07:00
Kim Santiago
dfb1d5411e Don't open schema compare if project build fails (#13027)
* don't open schema compare if project build fails

* update error message
2020-10-22 10:24:11 -07:00
Arvind Ranasaria
98774527bc Bdc wizard pages now track all components in wizard.InputCompnentsInfo (#13023)
* track all components in wizard.InputCompnentsInfo

* pr feedback

* fix formatting
2020-10-22 08:49:48 -07:00
rajeshka
de7cc8ea53 Fix: Switching powershell notebooks reset kernel.json and failed to resolve kernel paths. (#13026)
* Set python path in kernel specs when running on SAW devices.

* Use tab spacer for kernel json.

* Update path to jupyter kernelspec.

* removing the kernelspec write

* Changed powershell kernel.json to use  appdata folder

* Addressed PR and added try catches around the code.

* removed redundant try catch

* removed redundant try catch

* removed another try catch

* removed space

* Fix for multiple powershell notebook failing issue

Co-authored-by: Cory Rivera <corivera@microsoft.com>
2020-10-21 17:36:47 -07:00
Chris LaFreniere
a427606050 WYSIWYG Span Style Fixes, Refactor, and Tests (#13011)
* Refactor into own class, add tests

* Add more tests

* Test fixes

* Test fix hopefully

* Tests D vs C drive
2020-10-21 16:55:41 -07:00
Charles Gagnon
656d727854 Skip getting icon if we don't have a server info (#13017) 2020-10-21 15:56:22 -07:00
nasc17
4184c28ce7 Removed configuration indent and bold font (#13006) 2020-10-21 13:45:22 -07:00
Charles Gagnon
ab7aaf8d2f Include instructions on filtering test runs (#13016) 2020-10-21 10:43:50 -07:00
Charles Gagnon
09d559e7e0 Add some more Azure regions (#13015) 2020-10-21 09:53:28 -07:00
Charles Gagnon
94b34350a3 More azdata tests (#12999)
* More azdata tests

* comment

* fix
2020-10-21 08:53:11 -07:00
Lucy Zhang
bed70ebd09 Re-enable notebook smoke test and increase timeout (#13007) 2020-10-21 07:57:45 -07:00
Charles Gagnon
660c1d6f21 Add icon for resource view items (#13009)
* Add icon for resource view items

* Remove unneeded stuff
2020-10-21 07:43:04 -07:00
Charles Gagnon
f783a26a56 Update STS for Notebook Convert fixes (#13008) 2020-10-20 15:28:29 -07:00
Vasu Bhog
fcec690546 Backend work for Notebook Parameterization and Tests (#12914)
* Backend work for  Parameterization + Tests

* minor comments

* fix test

* address comments
2020-10-20 13:26:59 -05:00
Kim Santiago
c6b3b797c5 fix inputbox width (#12998) 2020-10-20 09:39:54 -07:00
Arvind Ranasaria
1e916e93ad Add @ranasaria code reviewer for 3 extensions (#12973)
* Add @ranasaria code reviewer for 3 extensions

Adding myself as a code reviewer for arc, azdata, and resource-deployment extensions.

* pr feedback
2020-10-19 21:25:11 -07:00
Chris LaFreniere
81122538d2 skip notebook smoke tests for now (#12996) 2020-10-19 18:43:49 -07:00
rajeshka
c4f649a849 Changes to use bundled python package (#12967)
* Set python path in kernel specs when running on SAW devices.

* Use tab spacer for kernel json.

* Update path to jupyter kernelspec.

* removing the kernelspec write

* Changed powershell kernel.json to use  appdata folder

* Addressed PR and added try catches around the code.

* removed redundant try catch

* removed redundant try catch

* removed another try catch

* removed space

Co-authored-by: Cory Rivera <corivera@microsoft.com>
2020-10-19 18:20:48 -07:00
Lucy Zhang
eb36a275a2 increase wait time for python installation (#12993) 2020-10-19 13:36:24 -07:00
Vladimir Chernov
873668a7cf sql-assessment extension code (#12948)
sql-assessment extension on model view components base
2020-10-19 22:43:22 +03:00
Alan Ren
72f7e8de52 add password validation regex (#12976) 2020-10-19 10:52:17 -07:00
Alan Ren
a1c8d4d34a azure-cli-iot-ext is deprecated, use azure-iot now (#12970) 2020-10-17 15:38:40 -07:00
Karl Burtram
f96a96a60c Bump SQL Tools to 3.0.0-release.43 (#12972) 2020-10-16 18:52:57 -07:00
Barbara Valdez
2801e59edc Fix links on WYSIWYG (#12952)
* fix for removed links in untrusted notebooks

* replace whitespaces on link for %20

* remove dot from hyperlinks

* Address PR comments

* Change name of variable
2020-10-16 17:51:14 -07:00
nasc17
39b6cc193f Nasc/compute storage db tab (#12917)
* Git problem fix

* Formatted doc

* checkbox feature works correctly

* Functional page, edits needed for visual design

* Fix git problems

* Check which inputs have acceptable values before running edit commands

* fix git error

* Updating constants

* Format doc

* fix git error

* Corrected Worker node count and added missing localized constant

* Updated discard button function

* Fixed constants off of review

* Rework service updating and discard. Renaming of functions and variables. Rework box initialization

* Fix git error, redo UserInputSection

* Cleaning up

* Added unit tests for GiB conversion function

* Cleaned up edit vcores and memory input boxes

* Removed unused constants, added throw error to gib conversion function

Co-authored-by: chgagnon <chgagnon@microsoft.com>
2020-10-16 14:40:55 -07:00
Alan Ren
49983a6f05 fix connection dialog scroll issue (#12956) 2020-10-16 11:09:30 -07:00
Lucy Zhang
767465edbf wait for kernel change before running cell (#12949) 2020-10-16 09:59:03 -07:00
Alex Ma
f6949d834b When Clause processing added to getokbutton and getprovider (#12886)
* processwhenclause added

* processWhenClause explanation added

* changed comment to be more generic.

* changed expected comparison

* test and space fix added

* fixed tests

* resourceTypeService now uses forloop
2020-10-16 09:38:07 -07:00
Arvind Ranasaria
f4c7ab29f0 Allow WithValidation on ComponentBuilder to register async callbacks (#12950) 2020-10-15 17:38:20 -07:00
Christopher Suh
36f758dfca Revert "Bump SQL Tools to 3.0.0-release.42 (#12929)" (#12946)
This reverts commit 52e7bcdf09.
2020-10-15 14:50:34 -07:00
Alan Ren
545a5504e0 fix missing icon issue (#12928)
* fix missing icon issue

* move the logic to renderServerIcon
2020-10-15 12:01:35 -07:00
Charles Gagnon
b460b7834c Add more azdata tests (#12902)
* Add more azdata tests

* fix build

* comments
2020-10-15 10:48:09 -07:00
Tony Xia
c7e4cf7ca4 Componenet -> Component (#12934) 2020-10-15 10:47:47 -07:00
Tony Xia
607447365d Fixed minor typos (#12936) 2020-10-15 09:55:00 -07:00
Tony Xia
f72453fc59 Registery -> Registry (#12935) 2020-10-15 09:46:15 -07:00
Vladimir Chernov
a88669677f handling array as a table data and headerFilter.plugin (#12926) 2020-10-15 19:20:59 +03:00
Karl Burtram
52e7bcdf09 Bump SQL Tools to 3.0.0-release.42 (#12929) 2020-10-15 08:55:04 -07:00
Aditya Bist
581774d413 Update README.md (#12927)
fix deb link typo
2020-10-14 16:17:59 -07:00
Aditya Bist
34079d1612 update extension versions (#12920) 2020-10-14 11:34:19 -07:00
Aditya Bist
08219b2d8e update readme and changelog (#12915) 2020-10-14 10:09:17 -07:00
Alex Ma
ee0b87544b High contrast and dark border top added for results panel in query editor. (#12892)
* border top themes added for tabbed panel

* removed unnecessary space

* border styles added to tabbed panel

* removed unnecessary field
2020-10-14 08:45:39 -07:00
Alex Ma
729378b2d7 Null check for updatedetail (#12900)
* added null check for options

* simplification
2020-10-14 08:45:25 -07:00
Lucy Zhang
7bc26cc493 Add simple notebook smoke test (#12898)
* add simple notebook smoke test

* add id for notebook dropdown elements
2020-10-14 05:58:32 -07:00
Alan Ren
a0c03784f2 introduce CODEOWNERS file (#12913)
* introduce CODEOWNERS file

* comments
2020-10-13 17:33:55 -07:00
Charles Gagnon
35e79f7173 Clean up sqlClusterLookup (#12910) 2020-10-13 16:22:32 -07:00
Anthony Dresser
9fdb5037bc Connection Browse Tab (#12222)
* add browse tab and flat tree provider for azure

* fix tests

* add comment

* fix build errors

* fix test cases

Co-authored-by: Alan Ren <alanren@microsoft.com>
2020-10-13 14:58:09 -07:00
Charles Gagnon
3251b26317 Fix HDFS node for Integrated auth (#12906)
* Fix some HDFS issues

* Undo other changes
2020-10-13 14:35:22 -07:00
rajeshka
50ec75ec57 Conditional Exclusion of Non-Windows Build (#12895)
* Conditional Exclusion of Non-Windows Build

* Update sql-product-build.yml
2020-10-13 09:45:57 -07:00
Charles Gagnon
1c279675c8 Add support for low-privileged user to run spark notebooks (#12883)
* Add support for low-privileged user to run spark notebooks

* error

* fix test
2020-10-13 09:29:00 -07:00
Arvind Ranasaria
10d3a6b2ba throw when onOk, onGenerateScript errors (#12838)
* throw when onOk, onGenerateScript errors

* pr feedback

* remove try catch in onOk && onGenerateScript
2020-10-12 17:08:52 -07:00
Alan Ren
bc3527d310 API improvement: make registerConnectionEventProvider return disposable (#11880)
* promote api to official

* add comments

* disposable

* move getConnection out

* comment for connection namespace

* remove extra line

* also fix registerQueryInfoHandler
2020-10-12 14:29:48 -07:00
Cory Rivera
108891ba2e Place current release at the top of version dropdown in Manage Packages dialog. (#12884)
* Also improved sorting for version numbers with non-numeric components.
2020-10-12 12:20:33 -07:00
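The Manage Packages change above improves sorting for version numbers with non-numeric components; a hedged sketch of a comparator along those lines (not the dialog's actual implementation):

```typescript
// Compare two version strings segment by segment, comparing numerically when
// both segments are numbers and lexically otherwise. Illustrative only.
function compareVersions(a: string, b: string): number {
	const segA = a.split('.');
	const segB = b.split('.');
	const len = Math.max(segA.length, segB.length);
	for (let i = 0; i < len; i++) {
		const x = segA[i] ?? '0';
		const y = segB[i] ?? '0';
		const nx = Number(x);
		const ny = Number(y);
		if (!Number.isNaN(nx) && !Number.isNaN(ny)) {
			if (nx !== ny) { return nx - ny; }
		} else if (x !== y) {
			return x < y ? -1 : 1;
		}
	}
	return 0;
}

// Sort newest first so the current release can sit at the top of the dropdown.
console.log(['1.2.0', '1.10.0', '1.9.0'].sort((a, b) => compareVersions(b, a)));
// -> ["1.10.0", "1.9.0", "1.2.0"]
```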
Aasim Khan
f61ffae15c Updated Fix for Deployment defaulting to first item when filter filters out all items (#12876)
* Fix where update table is not called

* Clearing table when no resource is selected
2020-10-12 11:53:28 -07:00
Monica Gupta
82726a9119 update sqltoolsversion to 3.0.0-release.41 (#12864)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-12 10:19:51 -07:00
Vasu Bhog
53f00bee12 Fix connection when changing kernel from Kusto to SQL (#12881)
* Fix Kusto to SQL kernel connection change

* Updated Fix - removes kernel alias mapping while ensuring multi kusto notebooks work properly

* Fix tests
2020-10-10 00:10:33 -05:00
Alex Ma
0f6bb683d6 sql db deployments into main (WIP) (#12767)
* added my resource-deployment

* changed notebook message

* Add more advanced properties for spark job submission dialog (#12732)

* Add more advanced properties for spark job submission dialog

* Add queue

* Revert "Add more advanced properties for spark job submission dialog (#12732)"

This reverts commit e6a7e86ddbe70b39660098a8ebd9ded2a1c5530c.

* Changes made for simplification

* changed error messages

* tags added

* tags removed due to redundancy

* Update package.json

* Update resourceTypePickerDialog.ts

* changes based on feedback

* activaterealtimevalidation removed

Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
2020-10-09 15:46:41 -07:00
Kim Santiago
ef8e86a78d fix publish button not always enabling correctly (#12869) 2020-10-09 15:33:51 -07:00
Aasim Khan
daf40393d9 Fix for Deployment defaults to first item when filter filters out all items (#12871)
* disabled ok button when there are no resources

* added missing space
2020-10-09 14:42:43 -07:00
Charles Gagnon
c04823e2d8 Fix BDC mount status error message string (#12870) 2020-10-09 14:02:25 -07:00
Charles Gagnon
846ed9cd26 vBump arc and azdata ext (#12868) 2020-10-09 13:58:26 -07:00
Charles Gagnon
653b442b60 Remove unneeded descriptions from MIAA deploy (#12863) 2020-10-09 13:22:59 -07:00
Charles Gagnon
9389a92896 Update HDFS mount path (#12865) 2020-10-09 13:17:27 -07:00
Arvind Ranasaria
d9ca879533 disableControlButtons disables generateScriptButton (#12835) 2020-10-09 12:19:09 -07:00
Amir Omidi
45cbaca31f Verify the token belongs to the proper user. (#11593)
* Verify the user signed into the correct account

* Add and fix tests

* Fix tests
2020-10-09 10:27:12 -07:00
Charles Gagnon
ba85d370d6 Remove arc preview setting (#12850) 2020-10-09 10:15:45 -07:00
Alan Ren
5a49b99b8a Fixes #106829 (#12849)
Co-authored-by: Eric Amodio <eamodio@microsoft.com>
2020-10-09 10:00:59 -07:00
Benjin Dubishar
0be752b64c Replace literal with constant (#12848)
* Include database-scoped credentials when deploying with default options

* Changing int literal to constant

* replace int literal with named const
2020-10-09 00:17:26 -07:00
Benjin Dubishar
1935ce1adb Include database-scoped credentials when deploying with default options (#12840) 2020-10-08 21:44:23 -07:00
Leila Lali
2820fb4f15 Fixed a bug with deleting models from list in import model wizard (#12798) 2020-10-08 16:46:26 -07:00
Lucy Zhang
b910bf2f33 Notebooks: Fix strict compile errors (#12591)
* strict compile for sqlSessionManager.ts

* start clientSession.ts fixes

* strict compile for clientSession.ts

* strict compile for notebookModel.ts

* add display name

* clean up code

* clean up code

* initialize string to empty string

* address PR comments

* address PR comments

* address PR comments

* remove errorMessage check
2020-10-08 15:39:57 -07:00
Arvind Ranasaria
3e0135b6b3 capture 'this' to use retrieveVariable as callback (#12828)
* capture 'this' to use retrieveVariable as callback

* remove change not needed for #12082
2020-10-08 14:50:47 -07:00
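The fix above captures 'this' so retrieveVariable can be used as a callback; a small sketch of the underlying issue and a bind-style fix (the surrounding class and fields are hypothetical):

```typescript
// Passing a class method directly as a callback loses its "this" binding,
// so lookups inside it fail at runtime. Illustrative sketch only.
class VariableProvider {
	private variables = new Map<string, string>([['x', '42']]);

	retrieveVariable(name: string): string | undefined {
		return this.variables.get(name);
	}
}

const provider = new VariableProvider();

// Broken: "this" inside retrieveVariable would be undefined when invoked later.
// const callback = provider.retrieveVariable;

// Fixed: capture "this" with bind (or an arrow wrapper) before handing it off.
const callback = provider.retrieveVariable.bind(provider);
console.log(callback('x')); // "42"
```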
Barbara Valdez
5fb7c879ed replace pip in notebook (#12808) 2020-10-08 14:16:44 -07:00
Chris LaFreniere
e61cc474a3 default to relative links in images and links (#12802) 2020-10-08 10:50:34 -07:00
Kim Santiago
bd56c49538 fix duplicate db references being allowed (#12795) 2020-10-08 09:55:51 -07:00
Chris LaFreniere
18f87ab03c Prune notebooks package.json, upgrade node-fetch (#12785)
* Prune notebooks package.json, upgrade node-fetch

* wip
2020-10-08 09:48:56 -07:00
Charles Gagnon
27b69b36f7 Update SqlToolsService to pick up fixes for Notebook Convert Service (#12790) 2020-10-07 19:32:35 -07:00
Leila Lali
56d2a66ac5 Fixed a regression in predict map columns page (#12799) 2020-10-07 17:08:49 -07:00
Kim Santiago
acc58a0d1c Stop using deprecated declarative table data in publish dialog (#12782)
* stop using deprecated declarative table data in publish dialog

* fix reloading values from project
2020-10-07 13:50:46 -07:00
Aasim Khan
280a9d20f9 Added categories and search based filtering to the resource dialog. (#12658)
* added filtering to the resource type along with a new component.

* -Added caching of cards
-Removed unused component props
-localized tags
-limited the scope of list items

* Made some changes in the PR

* - Added Iot Category to SQL edge
- Moved category names to constants
- Moved localization strings to localized constants
- Made filtering logic more concise
- Changed how category list is generated
--Category list can now be ordered
-Added back event generation for selectedCard

* Fixed bugs, and some additional changes
-Fixed radiogroup height to avoid the movement of options below it
-Restoring the focus back to the search and listview components
- Added focus behaviour for listview
- Fixed a typo in comment

* Made categories an Enum

* Added localized string

* localized category string
converted categories to enum.

* made the filtering logic more concise.

* returning string if no localized string formed
removed unnecessary returns

* fixed the filtering tag logic
resetting search when category is changed

* removing the iot tag from sql edge deployment

* made filtering logic more concise
made enum const

* added vscode list

* some cleanup

* Some PR changes
- Made PR camelcase
- added comments to SQL
- removed unnecessary export

* -Some PR related changes
-Removing unsupported style property
-scoping down css and removing unused ones.

* Fixed a comment text

* Fixed typings for listview event
2020-10-07 13:38:12 -07:00
Monica Gupta
830cef06db Bump kusto extension version (#12791)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-07 12:37:55 -07:00
Chris LaFreniere
23e2a5dd12 Notebooks: WYSIWYG Add Redo, Fix Shortcuts (#12752)
* Add redo and out/indent

* Check for active cell before doing shortcut

* PR feedback

* Remove unnecessary parameter
2020-10-06 21:47:07 -07:00
Alan Ren
97cd9dba9f post publish extension vbump (#12772) 2020-10-06 16:58:51 -07:00
Alan Ren
adc3fde2e4 auto updated when extension viewlet (#12765) 2020-10-06 13:34:44 -07:00
Alan Ren
c1267a9387 use undefined instead of home directory (#12763)
* use undefined instead of home directory

* fix error
2020-10-06 13:34:16 -07:00
Barbara Valdez
825663fd77 Fix search for pinned notebooks (#12719)
* fix search for pinned notebooks

* fix filtering when verifying that a search folder is not a subdirectory of the current folder's query path

* Show book node on pinned notebooks search results

* fix parent node on pinned notebooks search results

* fix search for pinned notebook and modify how pinned notebooks are stored in workspace

* update format of pinned notebooks for users that used the september release version

* removed unused functions

* Address PR comments

* fix parent node for legacy version of jupyter books

* remove cast from book path
2020-10-06 11:54:42 -07:00
Charles Gagnon
288288df03 Add arc tree data provider tests (#12758)
* Add arc tree data provider tests

* Generic ResourceModel

* no message

* undo

* Fix compile error
2020-10-06 11:26:56 -07:00
Charles Gagnon
1dea5f8f7b Add line endings gitattribute (#12760) 2020-10-06 10:41:09 -07:00
Charles Gagnon
f633e07ed1 Add more advanced properties for spark job submission dialog (#12732)
* Add more advanced properties for spark job submission dialog

* Add queue
2020-10-06 09:50:32 -07:00
Vasu Bhog
726f0cef0e Add Notebook tests for Kernel Alias connections (#12722)
* More Generic tests for kernel alias connections

* Kernel Alias tests for the Notebook Model

* Updated titles of tests
2020-10-05 23:17:41 -05:00
Alan Ren
253ed55412 conditional env var MSSQL_PACKAGE (#12757) 2020-10-05 20:01:22 -07:00
Arvind Ranasaria
c679d5e1f0 allow registering options source providers to resource-deployment (#12712)
* first draft

* compile fixes

* uncomment code

* waitForAzdataToolDisovery added to azdata api

* missed change in last commit

* remove switchReturn

* contributeOptionsSource renamed

* remove switchReturn reference

* create optionSourceService

* azdataTool usage more reliable

* package.json fixes and cleanup

* cleanup

* revert 4831a6e6b8b08684488b2c9e18092fa252e3057f

* pr feedback

* pr feedback

* pr feedback

* cleanup

* cleanup

* fix eulaAccepted check

* fix whitespace in doc comments.
2020-10-05 19:29:40 -07:00
Karl Burtram
362605cea0 Bump insider to 1.24 release (#12756) 2020-10-05 16:18:18 -07:00
Karl Burtram
2af8982640 Update smoke tests to fix merge breaks (#12751)
* Update azuredatastudio-sqlite path

* Disable web smoke tests
2020-10-05 15:16:29 -07:00
Karl Burtram
753d785076 Fix sorting of extension list in build script (#12733) 2020-10-05 12:55:00 -07:00
Christopher Suh
6ff1e3866b Merge from vscode fcf3346a8e9f5ee1e00674461d9e2c2292a14ee3 (#12295)
* Merge from vscode fcf3346a8e9f5ee1e00674461d9e2c2292a14ee3

* Fix test build break

* Update distro

* Fix build errors

* Update distro

* Update REH build file

* Update build task names for REL

* Fix product build yaml

* Fix product REH task name

* Fix typo in task name

* Update linux build step

* Update windows build tasks

* Turn off server publish

* Disable REH

* Fix typo

* Bump distro

* Update vscode tests

* Bump distro

* Fix typo in distro

* Bump distro

* Turn off docker build

* Remove docker step from release

Co-authored-by: ADS Merger <andresse@microsoft.com>
Co-authored-by: Karl Burtram <karlb@microsoft.com>
2020-10-03 11:42:05 -07:00
Alan Ren
58d02b76db update preview feature notification (#12723) 2020-10-02 19:33:45 -07:00
Charles Gagnon
572310b906 Fix HDFS node to only show up for BDC connections (#12714) 2020-10-02 11:42:56 -07:00
Kim Santiago
6dd7c7d0f0 Fix error message typo (#12708)
add missing '
2020-10-02 11:09:28 -07:00
dependabot[bot]
07d4579c19 Bump @actions/core from 1.2.3 to 1.2.6 in /build/actions (#12696)
Bumps [@actions/core](https://github.com/actions/toolkit/tree/HEAD/packages/core) from 1.2.3 to 1.2.6.
- [Release notes](https://github.com/actions/toolkit/releases)
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/core/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/core)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2020-10-02 08:22:11 -07:00
Benjin Dubishar
9652007266 Bumping sqltoolsservice (#12709) 2020-10-01 21:43:11 -07:00
Charles Gagnon
1df74e8c4a Add additional logging to spark command failures (#12706) 2020-10-01 17:13:20 -07:00
Monica Gupta
b574f2aa32 Fix notebook issue when creating Kusto notebooks 2nd time after launching ADS (#12700)
* Fix notebook issue

* Removed not required code

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-10-01 16:27:41 -07:00
Charles Gagnon
8bf1859d36 Fix checkbox change event not firing on enter press (#12703)
* Fix checkbox change event not firing

* Add comment
2020-10-01 16:23:52 -07:00
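The checkbox fix above makes the change event fire on Enter; a hedged sketch of the kind of handler that achieves this (not the component's actual code):

```typescript
// Checkboxes toggle on Space natively, but not on Enter, so no change event
// fires for Enter. Listen for Enter, toggle the box, and dispatch change manually.
function enableEnterToggle(checkbox: HTMLInputElement): void {
	checkbox.addEventListener('keydown', (e: KeyboardEvent) => {
		if (e.key === 'Enter') {
			e.preventDefault();
			checkbox.checked = !checkbox.checked;
			checkbox.dispatchEvent(new Event('change'));
		}
	});
}

// Example usage (assumes a checkbox element exists in the page):
// enableEnterToggle(document.querySelector<HTMLInputElement>('#myCheckbox')!);
```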
Kim Santiago
253f7774de Fix can't find SQLCMD variable error (#12672)
* remove sqlcmd variable deletion verification

* check if sqlcmd variable exists first
2020-10-01 15:26:27 -07:00
Cory Rivera
9b7a1d7a26 Add more unit tests for JupyterServerInstallation (#12675) 2020-10-01 13:54:12 -07:00
Hale Rankin
4965f3de1a ML extension - revised button component (#12674)
* Revert "Revert "ML extension updates  (#11817)" (#12645)"

This reverts commit 34a6200a47.

* Modified button template and renamed infoButton ElementRef

* fix rendering issue

* Minor code cleanup.

* add clean up previous button logic

Co-authored-by: Alan Ren <alanren@microsoft.com>
2020-10-01 13:41:48 -07:00
Charles Gagnon
2da0fafcd5 Update arc dashboards to use logs/metrics links from status (#12673)
* Update arc dashboards to use logs/metrics links from status

* remove unused

* Fix loading
2020-10-01 10:20:19 -07:00
Arvind Ranasaria
39e43d2401 use selected subscriptions (#12691)
* working version

* pr feedback
2020-10-01 09:08:04 -07:00
Charles Gagnon
02afd2dc39 Fix error when clicking empty dropdown (#12684)
* vBump notebooks to get latest CU6 version of book

* Don't throw error when clicking on empty select box

* undo
2020-10-01 08:51:34 -07:00
Alan Ren
54b24d935c asde deployment update (#12662)
* repository update

* plan id

* package update

* replace dacpac with package
2020-09-30 23:53:51 -07:00
Charles Gagnon
8e30b9a6e3 Add arc and azdata to recommended ext and vBump (#12690)
* Add arc and azdata to recommended ext and vBump

* remove bdc
2020-09-30 21:33:19 -07:00
Aasim Khan
c9ec18670c version bump of flat file services (#12686) 2020-09-30 15:10:39 -07:00
Charles Gagnon
fac642de1e Only allow one arc controller connection (#12685) 2020-09-30 15:06:22 -07:00
Arvind Ranasaria
7bfea07b9b Fetch arc dc config profile list from azdata (#12678) 2020-09-30 14:25:15 -07:00
Aasim Khan
ab85028a94 Fixing import getting stuck on step 4 (#12677)
* Getting the proper attribute during column modification
Exposing errors of change column settings and stopping import if they occur

* removing extra space

* Added a comment for error handling

* Fixed a test error that was caused by insufficient null checks.

* removing unnecessary return
2020-09-30 14:08:21 -07:00
Charles Gagnon
1cd94518a8 vBump notebooks to get latest CU6 version of book (#12683) 2020-09-30 13:07:27 -07:00
Aditya Bist
0583e1db8a update readme links and changelog after release (#12681) 2020-09-30 11:03:16 -07:00
Kim Santiago
5681d6b9e3 set includeCompositeObjects to true if there are database references (#12641) 2020-09-30 10:58:35 -07:00
Charles Gagnon
b3578d058f Save username/password for BDC HDFS connections (#12667)
* Save username/password for BDC HDFS connections

* comment
2020-09-30 08:40:20 -07:00
Alex Ma
82648fab3e connection tests for connectionDialogWidget (#11160)
* Added providerRegistered test

* test for EditConnectionDialog

* changed wording

* added test for connectionInfo

* utils.ts tests added

* hasRegisteredServers test

* commented out editconnection tests, addl. tests

* added onConnectionChangedNotification test

* added changeGroupId tests

* Added connection profile changes

* added connectIfNotConnected test

* added delete connection test

* isRecent and disconnect editor tests

* Add CodeQL Analysis workflow (#10195)

* Add CodeQL Analysis workflow

* Fix path

* added registerIconProvider test

* Fix for ensureDefaultLanguageFlavor test

* added a few tests

* utils prefix test updated

* added utils tests

* disconnect tests added

* Added additional get connection info tests

* added some more tests

* minor additions to tests

* again another commit

* another change

* connectionManagementService finalized

* connectionDialogWidget test WIP

* wip connectiondialogwidget test

* added working connectionDialogWidget test

* added more tests

* current connectionDialogWidget tests

* undid space

* hanging promise addressed

* added open test

* finished connectionDialogWidget tests

* Added showDialog test

* mockConnectionDialogService added

* added accessorConnectionDialogService

* removed accessor service

* added openDialogAndWait test

* added fake error to test

* added error tests

* Added comments to test

* more coverage

* async to await change

* registerCapabilities test added

* connectionDialogService tests finished

* undefined added

* Added views for tests

* tslint disable added

* error catchers added

* added empty connectioninfo

Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>
2020-09-30 08:39:54 -07:00
Charles Gagnon
3c4df5332e Fix azdata network test failure (#12670)
* Fix azdata network test failure

* Add comment
2020-09-30 08:38:56 -07:00
Arvind Ranasaria
b8de69dfac GenerateToNotebook and Deploy buttons for Notebook Wizards (#12656)
* enable userChooses how to run notebook

* arc ext changes

* nb fixes

* working version

* revert unneeded changes

* fix comments

* Update interfaces.ts

* fix comments

* fix comments

* fix comments

* runAllCells instead of background execute

* pr feedback

* PR feedback

* pr feedback

* arc ext changes for new WizardInfo syntax

* fix doc comments

* pr feedback
2020-09-29 18:12:30 -07:00
Aasim Khan
fd5acf7ab1 Added awaits to change column setting (#12314) 2020-09-29 15:21:04 -07:00
Charles Gagnon
5de1c10dd1 Fix disabled azdata tests (#12659)
* Fix successful install

* fix more tests

* Fix stubs

* Don't throw

* fix check
2020-09-29 14:23:27 -07:00
Charles Gagnon
a91b965a33 Allow non-admin BDC connections to see BDC features (#12663)
* Add handling for non-admin BDC users

* Bump STS

* Fix HDFS root node commands

* remove nested awaits

* colon
2020-09-29 14:02:01 -07:00
Drew Skwiers-Koballa
a2552c1cc1 updates import readme for ga (#12614) 2020-09-29 14:01:44 -07:00
Charles Gagnon
253aa78650 Fix root group name check (#12660) 2020-09-29 10:03:55 -07:00
Monica Gupta
2376e7b384 Hotfix version update for kusto extension (#12655)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-29 09:57:26 -07:00
Lucy Zhang
cf9754f627 revert grid streaming changes (#12650) 2020-09-28 18:34:58 -07:00
Julie Koesmarno
44416abe6e Clarifying the instructions to install (#12306)
1. Updated the link to the download page to the correct one.
2. Clarified in the Insiders section that the instructions are the same as for the standard build.
2020-09-28 14:17:43 -07:00
Kim Santiago
5718e6b471 Change target platform of project (#12639)
* Add quick pick to select target platform for a project

* add test

* show current target platform and info message for new target platform

* fix test failing
2020-09-28 13:28:34 -07:00
Justin M
c79cfd709a 3644 Kusto Token Refresh (#12576)
* 3644 Added RequestSecurityTokenParams, RequestSecurityTokenResponse, and SecurityTokenRequest to Kusto/contracts.ts. Added AccountFeature to features.ts. Registered feature in kustoServer.ts

* 3644 Removed TryCatch in kusto features > getToken

* 3644 Added AccountId to Kusto > RequestSecurityTokenParams. Refactored kusto features getToken to use the accountId for the query window.

* 3644 Removed unused AccountQuickPickItem
2020-09-28 11:59:16 -07:00
Alan Ren
34a6200a47 Revert "ML extension updates (#11817)" (#12645)
This reverts commit 037d638927.
2020-09-26 13:46:09 -07:00
Alan Ren
4ec5991a13 update welcome page to use context menu service and some code clean up (#12643)
* use existing menu service and cleanup code

* fix mac issue

* button width
2020-09-25 19:53:14 -07:00
Justin M
5396ed855c 3707 Kusto Icon Fix (#12621)
* 3707 Changed getIconpath in serverTreeRenderer to always set iconPath if iconId is undefined. Changed renderConnection to always renderServerIcon

* 3707 Added default flag to KustoIcon in package.json

* fix caching issue

* 3707 Changed default to optional variable. Updated asyncServerTreeRenderer > getIconPath to check for default

* 3707 Changed logic for setting iconPath in getIconPath

Co-authored-by: chgagnon <chgagnon@microsoft.com>
2020-09-25 16:49:33 -07:00
Hale Rankin
037d638927 ML extension updates (#11817)
* dashboardWidget to use updated button component - added an enum for buttonType.

* Code fixes.

* Casting ButtonType to string

* Leaving default description value as empty string. Testing logic for buttonType.

* Moved ifFile check into buttonType getter. Hard coded button type returns to test on the front end.

* revised buttonType getter to return Normal by default if not specified in properties.

* Extended sql Button as InfoButton for use in dashboardWidget.

* Added InfoButton to ngAfterViewInit

* Updated how infoButton element is built. Exposed properties for use in button.component.

* Experiment: Added interface and imported iconUtils for getting iconClass.

* infoButton updates

* Some modifications

* Defined HTMLElements and populated each with properties from dashboardWidget.

* Rewrote elements and passed in properties from dashboardWidget to define button contents and layout.

* Added missing delineator.

* Corrected width getter.

* Code cleanup

* Styled header and button copy. Corrected button and header dimensions.

* Button and welcome page background style adjustments.

* infoButton: Added element reference for container coming in via super so I could apply styles to it. Corrected how button renders.

* Addressed PR feedback - removed async functionality where not needed. Modified syntax that is being deprecated. Made extended IButtonOptions all optional.

* Formatted azdata.proposed. Hygiene check flagged it.

* Removing ? and undefined from button properties as all are being passed from dashboard Widget

* Initialized private vars representing options from custom interface IInfoButtonOptions.

* Addressed PR feedback - added stylesheet for infoButton. Removed iconPath from infoButton context. Cleaned up overall implementation.

* Simplified linear-gradient behind main image so it works across all three themes.

* Style cleanup.

* Removed block notation from element creation in constructor.

* Fixed type signature of IInfoButtonOptions.

* Removed comments. Shifted infoButton style properties into stylesheet.

* Set background color for infoButton when active. This fixes an issue where it becomes invisible while selected.

* Corrected styles. Removed hard-coded font-colors. Removed unnecessary styles.

* CSS - added hover styles to side panel items per Figma comps. Cleaned up CSS and removed code comments.

* remove unused splash screen and fix issue reporter path (#12218)

* data workspace extension batch 2 (#12208)

* work in progress

* load projects in view and test cases

* update scope

* make the sql proj menu available in workspace view

* add extension unit test

* address comments

* fix errors

* Add reference to another sql project (#12186)

* add projects to add database reference dialog

* able to add project references

* check for circular dependency

* only allow adding reference to project in the same workspace

* fix location dropdown when project reference is enabled

* add tests

* more tests

* cleanup

* fix flakey test

* addressing comments

* fix focus order (#12233)

* Notebook Text Cell Highlight Improvements (#12197)

* 1st iteration

* works but multiple highlights

* remove comment

* Works but multiple selects

* wip

* cleanup

* cleanup

* Update TPN

* Add mark.js to remote + web

* PR feedback

* Tweak workbench html files

* Add vcore limit support (#12212)

* Quick tweaks to resource viewer (v1) (#12210)

* Hackathon - Better Markdown Editor  (#11540)

* Hackathon - better markdown editor - modified Bold to wrap selection in HTML. Split Image button into two new options: embed and link. Made preview container contentEditable.

* Removed the new dropdown from Image button -- it is not necessary since we are adding a context panel instead.

* Modified preview icons

* Set code-component dimensions so it is not visible. It is still used to pass markdown changes through, however.

* add turndown and save markdown

* update model on UI when source changes

* Added conditional that sets element attribute contentEditable when it is in edit mode.

* Added textView component that can be used for editing.

* update source on MD view not on every keystroke

* Added markdown editor buttons that allow user to swap editor, preview views.

* Cleaning up implementation

* Setting base value of _showPreview to false.

* don't allow html edit on split view

* Update editor automagically

* Add an image picking dialog to notebook toolbar.

* Await transformText()

* revert pushEditOperations to fix cursor issue

* Implemented radio buttons for three view toggles.

* Added new, optional properties to radioButton: name, icon class and tooltip. This allows for display as toggleable icon. Updated styles and theme accordingly.

* Style tweaks.

* Added new ViewAction file where the RadioButton action will reside.

* Removed radio button implementation in exchange for native button instantiation. Adjusted CSS and theme accordingly.

* Styles, component and template changes to handle view toggle between text, markdown and splitview. Includes reverting of radioButton as this is no longer used.

* WYSIWYG 3 Modes

* Ensure one action active at a time

* Setting Text View button active by default. Cleaned up styles. Moved toolbar element to prevent code cell layout overflow.

* Ensure we respect editMode, add showMarkdown

* hiding overflow on code-cell

* Empty text container needs 100% width. Eliminates weird selection border too.

* Initialize _previewMode

* Actions Compatibility

* Further toolbar enhancements

* Update yarn lock after merge

* Slim down changes

* Remove commented out code

* Added margins around notebook-preview container for more visual space for text

* Add turndown to workbench html

* Tweak import

* Add types/turndown

* Remove workbench.html fix

* Import cjs modules directly for turndown

* Leverage solution from github

* browser umd

* non browser umd

* welp dependency

* Modified updatePreview to insert a p tag only when text cell is empty.

* add listener for undo

* add turndown to remote and web

* Fix workbench, check in plugin

* PR comment

Co-authored-by: maddydev <makoripa@microsoft.com>
Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: Cory Rivera <corivera@microsoft.com>
Co-authored-by: Lucy Zhang <luczhan@microsoft.com>

* Update external repo links (#12226)

* Update external repo links

* Update yarn files

* make schema compare test unstable (#12234)

* make schema compare test unstable

* also make Standalone database context menu test unstable

* Fix missing package update (#12235)

* Resource Deployment UX Refresh (#12173)

* adding new card to styles

* renamed property, removed unnecessary css

* Fixed to match new props

* added horizontal class

* merged from master

* Make SandDance work generically for Kusto (#12229)

* Make SandDance work generically for Kusto and Postgres

* Addressed comments

* removed param

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* Set items in cell model (#12237)

* Change tables to make them work for our scenario (#12193)

* Change tables to make them work for our scenario

* Comments & deprecate API

* Disable selections by default

* Have default values in add database dialog input boxes (#12155)

* show default values in text boxes

* add sqlcmd formatting

* add tests

* Add some sqlcmd variable name validation

* Addressing comments

* fixes after merge

* fix test

* don't localize OtherServer

* fix for windows

* one more fix

* fix test

* fix bug where double-click enabled was not registered on new notebook contexts (#12239)

* Notebooks: Hide link and image buttons in text cell toolbar in WYSIWYG mode (#12240)

* hide link and image buttons in WYSIWYG mode

* defined taskbar actions

* rename arrays

* Notebooks: Add setting for default text cell edit behavior (#12245)

* Add setting for default text cell edit behavior

* string updates

* Fix pinned notebooks navigator (#12246)

* Added loading to dropdowns (#12214)

* Added loading prop to dropdowns

* Added property for setting loading text message

* removed unnecessary sets

* changed code to match new changes

* Changed the dropdown loader to use select component instead of edit.
Added missing props in dropdown loader

* Address comments jupyter create book notebook (#12250)

* Remove toc file link and add warning for windows users

* Bump node-fetch from 2.6.0 to 2.6.1 in /build/actions (#12219)

Bumps [node-fetch](https://github.com/bitinn/node-fetch) from 2.6.0 to 2.6.1.
- [Release notes](https://github.com/bitinn/node-fetch/releases)
- [Changelog](https://github.com/node-fetch/node-fetch/blob/master/docs/CHANGELOG.md)
- [Commits](https://github.com/bitinn/node-fetch/compare/v2.6.0...v2.6.1)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Bump node-fetch from 2.6.0 to 2.6.1 in /build (#12220)

Bumps [node-fetch](https://github.com/bitinn/node-fetch) from 2.6.0 to 2.6.1.
- [Release notes](https://github.com/bitinn/node-fetch/releases)
- [Changelog](https://github.com/node-fetch/node-fetch/blob/master/docs/CHANGELOG.md)
- [Commits](https://github.com/bitinn/node-fetch/compare/v2.6.0...v2.6.1)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Fix Notebook Kusto Kernel Consistency  (#12256)

* fix kusto notebook consistency

* Address undefined

* add existing project to workspace feature (#12249)

* add existing project to workspace feature

* update file name

* new test and use URI

* handle workspace with no folder

* add more validation

* and more tests

* use forward slash

* Use correct resource name (#12288)

* Add headingStyle atx option (#12286)

* Disable resource viewer (#12291)

* Disable resource viewer

* comment

* Remove unused

* adding icon for add new and open project (#12265)

* remove project feature (#12297)

* remove project feature

* update test

* Fix notebook cancel query bug (#12300)

* fix undefined query runner error

* store connection id

* revert sqlSessionManager change

* Update PG Deployment with new fields (#12187)

* Added 5 missing field options

* Missing apostrophe

* Change extensions label

* Update volume size descriptions

* Change volume size labels

* Reorder required tools

* Reorder required tools

* Argument fixes

* Removed tabs

* Rearrange option order

* Rearrange option order

* Rearrange option order VS

* Added validation to fields. VS accepts only integer and appends Gi

* Changed Dusky deployment title

* Removed text validation for VS and Mem. Changed to Number field type

* Min set to 1 for VS and Mem

* Min set to 1 for Cores

* Memory limit must be at least 256Mi

* the casing of 'preview' should remain consistent

* Removed empty line

* Portal links for main branch (#12319)

* first commit

* json field added

* message genericized

* order changed

* removed summary page text

* fixed url

* controller dropdown field to SQL MIAA and Postgres deployment. (#12217)

* saving first draft

* throw if no controllers

* cleanup

* bug fixes

* bug fixes and caching controller access

* pr comments and bug fixes.

* fixes

* fixes

* comment fix

* remove debug prints

* comment fixes

* remove debug logs

* inputValueTransformer returns string|Promise

* PR feedback

* pr fixes

* remove _ from protected fields

* anonymous to full methods

* small fixes

* new postgres model (#12305)

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Nasc/delete instance code removal (#12307)

* Formatted page

* Removed ResourceHealthPage from the dashboard

* Deleted files that no longer applies to the public preview backend

* shouldn't be able to open the postgres dashboard

* Add new deployment options for MIAA (#12325)

* Use custom dialog for prompting MIAA connection info (#12316)

* Use custom dialog for prompting MIAA connection info

* disable inputs

* Update strings

* remove data-workspace dependency (#12321)

* Arc public preview updates (#12329)

* Arc public preview updates

* disable PG dashboards again

* Stop watching for sqlproj updates after the file is closed (#12311)

* stop watching for sqlproj updates after the file is closed

* remove watcher if project is closed

* Fix component items in declarative table not showing (#12330)

* fix maximize bug (#12334)

* 12284 Removed custom CSS that positioned editor text beneath overlapping layers. Text is now selectable. (#12312)

* Add newline after caption (#12276)

* Watch for on load event (#12309)

* Update deletion strings to refer to instances instead of resources (#12332)

* Update deletion strings to refer to instances instead of resources

* one more

* Remove unused

* More

* Fix DT linting issues (#12290)

* vbump sql-db-proj extension (#12336)

* update sqlproj dependency version (#12359)

* Fix highlight issue (#12278)

* Fix highlight issue

* Address PR comments

* remove a import unit test (#12358)

* SQL VM deployments (#12144)

* Added sql vm deployment option

* Added more fields for sql vm deployments

* created basic sqlvm deployment. Mostly hardcoded

* added string to package.nls

* added poc deployments for sql vm

* Made some changes in the notebook that was mentioned in PR

* Added scaffolding for azure sql vm wizard.

* code cleanups

* added some async logic

* added loading component

* fixed loader code

* completed page2 of wizard

* added some more required fields.

* added some more fields

* added network settings page

* added sql server settings page

* added azure signin support and sql server settings page

* added some helper methods in wizard code

* added some fixes

* fixed azure and vm setting page
added validation in azure setting page

* added changes for the notebook variable

* validations and other bug fixes

* commenting sql storage optimization dropdown

* cleanedup wizard base page

* reversing vm image list to display newer images first

* cleaning model code

* added validations for network setting

* Completed summary page
fixed the code position
some additional field validations

* fixed networking page

* - fixed an error with vm size model variable
- removed byol images because it was not working with az sql vm
- Fixed vm size display names in dropdown

* added double quotes to some localized strings

* added some space inside strings

* -Added live validations
-Restyled network component
-Added required to regions
-Some bug fixes

* -redesigned summary page
-localized some strings

* Fixed summary page section titles

* -Fixed validations on sql server settings page
-Fixed some fields on Summary Page

* corrected onleave validation
using array for error messages
using Promise.all

* Fixed bug on network settings dropdowns when user does not have existing resource to populate them

* Change resource deployment display name
Added Ninar's iteration of the notebook
Changed RDP check box label
Surfacing API errors to user
Filtering regions based on Azure VM regions and user's subscription region
Made form validation async
Displaying new checkbox on network page when dropdowns empty
Fixed a small bug in SQL auth form validation
Made summary single item per row and fixed the gaps in spacing
Fixed validations in vm page
Checking if vm name already exists on azure

* Fixed sql vm eula
Fixed sql vm description
Added hyperlink for more info on vm sizes

* Replaced loading component with dropdown loaders.

* localized string
Fixed a bug in network settings page

* Added additonal filtering

* added reverse to vm image list

* Fixing some merge related issues

* Update arc regions for public preview (#12366)

* Fix duplicate arc instance nodes (#12381)

* Add "No instances available" node for empty arc controllers (#12374)

* Remove MIAA Port Deploy Option (#12388)

* fix option sources (#12387)

* Fix manage action for arc view (#12389)

* 12360 Notebook UI - Mac/Win fix for Select all. (#12383)

* 12360 Notebook UI - Mac/Win fix for Select all.

* Fix for ctrl key selecting all in windows

* Fix undo as well

* preventDefault to prevent confusing behavior

Co-authored-by: chlafreniere <hichise@gmail.com>

* Fix notebook table rendering with multiple code cells (#12363)

* create unique query runner for each cell

* use cellUri instead of cellId to identify runner

* disconnect each query runner connection

* remove queryrunners size check

* Remove azdata eula acceptance from arc deployments (#12292)

* saving to switch tasks

* activate to exports in extApi

* working version - cleanup pending

* improve messages

* apply pr feedback from a different review

* remove unneeded strings

* redo apiService

* remove async from getVersionFromOutput

* remove _ prefix from protected fields

* error message fix

* throw specif errors from azdata extension

* arrow methods to regular methods

* pr feedback

* expand azdata extension api

* pr feedback

* remove unused var

* pr feedback

* Notebooks: Fast update WYSIWYG support for source update (#12289)

* Fast update WYSIWYG support for source update

* Do bracket matching over hardcoding line offsets

* Update Windows command and minor update to installation cell (#12361)

* Fix windows command and minor update to installation cell

* Add expand_section field on the first section of the book

* marking intermittent test failure as unstable (#12402)

* Fix connection dialog indentation (#12401)

* fix connection dialog indentation

* indent tab body

* Remove container registry from arc control deploy (#12392)

* Fix error when clicking on header for tables with no rows (#12408)

* Change default Select query label to "Take 10" for Kusto tables (#12396)

* Change default label to "Take 10" for Kusto tables

* Addressed comments

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* Fix resource deployment text field validation (#12421)

* Fix arc deployment regions and remove docker summary (#12430)

* start with eulaCheckButton hidden (#12427)

* start with eulaCheckButton hidden

* reset buttons on card select

* remove testcode

* fix the legacy card style issue (#12428)

* fix the legacy card style issue

* replace the card class

* Fix Spark kernel connections and switch from Kusto to Spark kernels (#12436)

* Fix connection dialog for Spark and issue when switching from Kusto to Spark

* Address comments

* Add Arc MIAA username configuration (#12429)

* Add Arc MIAA username configuration

* username -> userName

* fix postgres product name (#12443)

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* vBump azdata extension (#12452)

* Arc - Enable Postgres dashboard (#12439)

* get overview, conn strings, properties pages working

* hook up password reset, azure link, scale configuration

* fix comments

* enable opening postgres dashboard from controller dashboard

* minor fixes

Co-authored-by: Brian Bergeron <brberger@microsoft.com>
Co-authored-by: chgagnon <chgagnon@microsoft.com>

* Remove ItemGroup from sqlproj if node being removed is the last one (#12398)

* remove ItemGroup if node being removed is the only one

* fix for if ItemGroup has elements with different tag names

* fix for ItemGroups not at the end of the sqlproj

* fix the extension dependency issue (#12347)

* Bump arc/azdata extension versions (#12463)

* Add preview to Arc controller deployment (#12465)

* Update azdata extension icon (#12469)

* small optimization for select (#12419)

* Revert BDC deployment back to using old azdata check (#12470)

* remove/use unused strings (#12460)

* Activate arc extension with resource deployment command (#12472)

* change to allow refresh and delete correctly (#12477)

* update resultSet in data provider (#12478)

* Add SQL instance name validation (#12480)

* Add SQL instance name validation

* Move -

* Update PG validation

* Fix regex

* simplify

* add table name to models that are imported (#12445)

* add table name to models

* adding null check for safety

* As per PR comment

* Arc/Azdata string updates (#12485)

* Arc/Azdata string updates

* more updates

(cherry picked from commit 2c6f7ac4472e0197650be299ec899388bb495fd8)

* couple more fixes

* more

* update sql database projects readme (#12481)

* remove option to add reference to same database (#12495)

* Update default values and example text when dropdown value changes (#12493)

* Fix PySpark kernel connection change (#12494)

* Notebooks: Fix Grids Not Rendering when Unsaved Notebook Reloaded (#12483)

* Clear Output and fix output change

* Fix tests after forced clear + append output

* Add warning message for users using the new version of jupyter book (#12496)

* Add warning message for users

* Address pr comments

* Arc Postgres - Add Azure params to overview page, update notebook (#12482)

* add azure params to pg overview page, update troubleshooting notebook, string changes

* no default pg version for notebook

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Arc good ARC bad (#12499)

* default to 0 workers (#12506)

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Add progress indicator for arc instance deletion (#12510)

* Fix core and memory request MIAA deploy (#12505)

* Fix core and memory request MIAA deploy

* Memory request/limit as 2Gi for MIAA

* In-Viewlet Notebooks Search  (#12455)

* fix search

* Add sql carbon tags to vs files

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>

* Replace vCores property with state for arc controller dashboard (#12512)

* Bump extensions (#12516)

* fix the reference error due to extra $ in default variable (#12523)

* use GB instead of MB for postgres memory (#12528)

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* data workspace review feedback implementation (#12489)

* add a view to handle no workspace scenario

* text update

* project type filter improvement

* fix the project level context menu issue

* update strings

* Add timestamps to azdata output channel output (#12530)

* Fix arc controller ns/name validation (#12525)

* Fix arc controller ns/name validation

* Rename control plane references

* Fix validation

* new download location of azdata.msi (#12466)

* new download location of azdata.msi

* refactor

* Disable tests

Co-authored-by: chgagnon <chgagnon@microsoft.com>

* move eula prompt post azdata discovery (#12532)

* new download location of azdata.msi

* move eula prompt post azdata discovery

* unacceptEula - test change

* Revert "unacceptEula - test change"

This reverts commit f84a3f5e41797de25b38f87143d66f7041b5c4ec.

* Remove Direct connectivity mode option (#12533)

* Remove Direct connectivity mode option

* remove option completely

* fix string

* Arc - Update Postgres deployment field labels and descriptions (#12537)

* update help text strings

* update field descriptions to match help text

* update cpu/memory field descriptions

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Add troubleshoot button to arc controller/MIAA dashboard (#12534)

* Add troubleshoot button to arc controller dashboard

* Add MIAA button

* Fix links

* Change azdata output channel to Azure Data CLI (#12545)

(cherry picked from commit cdd80c66764bddb2f5ed79045fbd8a0606d1d084)

Co-authored-by: chgagnon <chgagnon@microsoft.com>

* Fix windows azdata install (#12542)

* Fix windows azdata install

* skip failing tests

* Update/release docs (#12544)

* update changelog for 1.22

* update fwlinks

* fix format

* escape the value for display (#12547)

* Add new profile (#12556)

* Add new profile

* version

* move

* fix readonly summary page field widths (#12558)

* Arc - Enable Postgres support request link (#12560)

Co-authored-by: Brian Bergeron <brberger@microsoft.com>

* Add test for notebook result grid streaming (#12539)

* start testing convertAllData

* add test for convertAllData method

* clean up code

* bump ads and extensions (#12550)

* bump ads and extensions

* bump azdata

* bump asde deploy

* Remove command link from deployment error (#12573)

* [Kusto extension] Updated links (#12569)

* Add CodeQL Analysis workflow (#10195)

* Add CodeQL Analysis workflow

* Fix path

* updated links

* edit line 31

* edit lines 11 and 31

* edit line 11 again

Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>

* Don't use deprecated param (#12574)

* add event.preventDefault() (#12564)

* strict compile for sql/workbench/contrib/queryHistory (#12579)

* clone the object to be modified (#12583)

* Strict compile for sql/workbench/services/dialog (#12578)

* Strict compile for sql/workbench/services/dialog

* fix errors

* strict compile for queryResultsView (#12581)

* Add telemetry for ModelView wizards (#12596)

* Add telemetry for ModelView wizards

* Remove unnecessary params

* Fix compile error

* Improved behavior for accepting EULA. (#12453)

* working version of overloading "select" button

* promptForEula to use showErrorMessage

* make parameter optional in promptForEula

* remove test code

* PR feedback

* eula to EULA

* minor fix

* Remove arc deployment extension check (#12598)

* add role for history tab (#12608)

* add title and tab-index for X button (#12605)

* set aria-hidden for a non-visible control (#12607)

* Convert MIAA and Postgres deploy from Dialog to NotebookWizard (#12609)

* dialog to NotebookWizard

* move fields

* fix 12599

* pr feedback

* add missing page titles

* Fix undefined error in server tree data source (#12616)

* Fix undefined error in server tree data source

* Add comment

* Add ModelViewEditorOpened event (#12597)

* Add ModelViewEditorOpened event

* fix

* Fix compile

* Update sqltoolservice release for kusto extension (#12622)

Co-authored-by: Monica Gupta <mogupt@microsoft.com>

* Delete database reference (#12531)

* remove ItemGroup if node being removed is the only one

* fix for if ItemGroup has elements with different tag names

* fix for ItemGroups not at the end of the sqlproj

* add delete for db references

* fix failing tests

* add test

* cleanup

* Addressing comments and fixing a string

* simplify the preview information feature (#12606)

* dashboardWidget to use updated button component - added an enum for buttonType.

* Code fixes.

* Leaving default description value as empty string. Testing logic for buttonType.

* revised buttonType getter to return Normal by default if not specified in properties.

* Extended sql Button as InfoButton for use in dashboardWidget.

* Some modifications

* Rewrote elements and passed in properties from dashboardWidget to define button contents and layout.

* Styled header and button copy. Corrected button and header dimensions.

* Merged latest from main. Resolve conflicts.

Co-authored-by: Amir Omidi <amomidi@microsoft.com>
Co-authored-by: Aditya Bist <adbist@microsoft.com>
Co-authored-by: Alan Ren <alanren@microsoft.com>
Co-authored-by: Kim Santiago <31145923+kisantia@users.noreply.github.com>
Co-authored-by: Chris LaFreniere <40371649+chlafreniere@users.noreply.github.com>
Co-authored-by: maddydev <makoripa@microsoft.com>
Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: Cory Rivera <corivera@microsoft.com>
Co-authored-by: Lucy Zhang <luczhan@microsoft.com>
Co-authored-by: Karl Burtram <karlb@microsoft.com>
Co-authored-by: Aasim Khan <aasimkhan30@gmail.com>
Co-authored-by: Monica Gupta <scorpio90m@gmail.com>
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
Co-authored-by: Kartik Arora <33497301+ktech99@users.noreply.github.com>
Co-authored-by: Barbara Valdez <34872381+barbaravaldez@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Vasu Bhog <vabhog@microsoft.com>
Co-authored-by: Charles Gagnon <chgagnon@microsoft.com>
Co-authored-by: Udeesha Gautam <46980425+udeeshagautam@users.noreply.github.com>
Co-authored-by: nasc17 <69922333+nasc17@users.noreply.github.com>
Co-authored-by: Alex Ma <alma1@microsoft.com>
Co-authored-by: Arvind Ranasaria <ranasaria@outlook.com>
Co-authored-by: Brian Bergeron <brian.e.bergeron@gmail.com>
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
Co-authored-by: Mark Ghanayem <22989000+markingmyname@users.noreply.github.com>
Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>
2020-09-25 16:16:26 -07:00
Alan Ren
62947a3e8a bump sts version (#12636) 2020-09-25 13:44:07 -07:00
Lucy Zhang
56c989ac7b Fix notebook toolbar screen reader reading order (#12252)
* add presentation role to selectContainer

* make nb toolbar dropdown arrow unfocusable

* remove empty string content change
2020-09-25 12:49:42 -07:00
Alan Ren
7e74465fb1 fix show all link color (#12625) 2020-09-25 10:55:35 -07:00
Alan Ren
94d0e1972e use h3 instead of div (#12626) 2020-09-25 10:55:19 -07:00
Alan Ren
e1a9ed0d1e add aria-label for images (#12627) 2020-09-25 10:55:03 -07:00
Alan Ren
3220f8a0b4 update new button role on welcome page (#12630) 2020-09-25 10:54:47 -07:00
Alan Ren
28c7188984 simplify the preview information feature (#12606) 2020-09-24 21:58:29 -07:00
Kim Santiago
49de1f80cf Delete database reference (#12531)
* remove ItemGroup if node being removed is the only one

* fix for if ItemGroup has elements with different tag names

* fix for ItemGroups not at the end of the sqlproj

* add delete for db references

* fix failing tests

* add test

* cleanup

* Addressing comments and fixing a string
2020-09-24 17:27:13 -07:00
Monica Gupta
9780eebb12 Update sqltoolservice release for kusto extension (#12622)
Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-24 16:00:54 -07:00
Charles Gagnon
bf9646ba98 Add ModelViewEditorOpened event (#12597)
* Add ModelViewEditorOpened event

* fix

* Fix compile
2020-09-24 12:53:28 -07:00
Charles Gagnon
1ea33d83bf Fix undefined error in server tree data source (#12616)
* Fix undefined error in server tree data source

* Add comment
2020-09-24 12:51:07 -07:00
Arvind Ranasaria
23e1141484 Convert MIAA and Postgres deploy from Dialog to NotebookWizard (#12609)
* dialog to NotebookWizard

* move fields

* fix 12599

* pr feedback

* add missing page titles
2020-09-24 11:47:34 -07:00
Alan Ren
96ea3d8273 set aria-hidden for a non-visible control (#12607) 2020-09-24 10:13:40 -07:00
Alan Ren
2673eb2c1d add title and tab-index for X button (#12605) 2020-09-24 09:31:56 -07:00
Alan Ren
47532f1287 add role for history tab (#12608) 2020-09-24 09:30:47 -07:00
Charles Gagnon
dd22b195bd Remove arc deployment extension check (#12598) 2020-09-24 09:05:59 -07:00
Arvind Ranasaria
f47c5dcc75 Improved behavior for accepting EULA. (#12453)
* working version of overloading "select" button

* promptForEula to use showErrorMessage

* make parameter optional in promptForEula

* remove test code

* PR feedback

* eula to EULA

* minor fix
2020-09-23 18:36:11 -07:00
Charles Gagnon
807a4ae8c4 Add telemetry for ModelView wizards (#12596)
* Add telemetry for ModelView wizards

* Remove unnecessary params

* Fix compile error
2020-09-23 17:08:22 -07:00
Charles Gagnon
8bc7079d78 strict compile for queryResultsView (#12581) 2020-09-23 13:17:23 -07:00
Charles Gagnon
4e07685588 Strict compile for sql/workbench/services/dialog (#12578)
* Strict compile for sql/workbench/services/dialog

* fix errors
2020-09-23 13:15:59 -07:00
Alan Ren
e1235a7346 clone the object to be modified (#12583) 2020-09-23 10:54:36 -07:00
Charles Gagnon
4a089131fc strict compile for sql/workbench/contrib/queryHistory (#12579) 2020-09-23 10:36:00 -07:00
Lucy Zhang
f8eb203643 add event.preventDefault() (#12564) 2020-09-23 07:10:36 -07:00
Charles Gagnon
c26963e848 Don't use deprecated param (#12574) 2020-09-22 15:47:11 -07:00
Mark Ghanayem
cf8283a2b0 [Kusto extension] Updated links (#12569)
* Add CodeQL Analysis workflow (#10195)

* Add CodeQL Analysis workflow

* Fix path

* updated links

* edit line 31

* edit lines 11 and 31

* edit line 11 again

Co-authored-by: Justin Hutchings <jhutchings1@users.noreply.github.com>
2020-09-22 14:24:01 -07:00
Charles Gagnon
4f433772af Remove command link from deployment error (#12573) 2020-09-22 14:16:40 -07:00
Aditya Bist
95752a7de5 bump ads and extensions (#12550)
* bump ads and extensions

* bump azdata

* bump asde deploy
2020-09-22 13:31:41 -07:00
Lucy Zhang
3fd92adf78 Add test for notebook result grid streaming (#12539)
* start testing convertAllData

* add test for convertAllData method

* clean up code
2020-09-22 12:15:33 -07:00
Brian Bergeron
3bc4178ea8 Arc - Enable Postgres support request link (#12560)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-22 11:19:49 -07:00
Arvind Ranasaria
0f9b1a7d0c fix readonly summary page field widths (#12558) 2020-09-22 09:02:31 -07:00
Charles Gagnon
2dd9aafb83 Add new profile (#12556)
* Add new profile

* version

* move
2020-09-22 00:42:31 -07:00
Alan Ren
cb97072ae2 escape the value for display (#12547) 2020-09-21 22:41:00 -07:00
Aditya Bist
a91bf1cecc Update/release docs (#12544)
* update changelog for 1.22

* update fwlinks

* fix format
2020-09-21 21:02:22 -07:00
Charles Gagnon
5f4546488c Fix windows azdata install (#12542)
* Fix windows azdata install

* skip failing tests
2020-09-21 18:14:49 -07:00
Chris LaFreniere
fe3d1ff48e Change azdata output channel to Azure Data CLI (#12545)
(cherry picked from commit cdd80c66764bddb2f5ed79045fbd8a0606d1d084)

Co-authored-by: chgagnon <chgagnon@microsoft.com>
2020-09-21 18:10:47 -07:00
Charles Gagnon
41ace44f32 Add troubleshoot button to arc controller/MIAA dashboard (#12534)
* Add troubleshoot button to arc controller dashboard

* Add MIAA button

* Fix links
2020-09-21 15:03:23 -07:00
Brian Bergeron
50dbe65e1d Arc - Update Postgres deployment field labels and descriptions (#12537)
* update help text strings

* update field descriptions to match help text

* update cpu/memory field descriptions

Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-21 13:53:53 -07:00
Charles Gagnon
1c21c2956d Remove Direct connectivity mode option (#12533)
* Remove Direct connectivity mode option

* remove option completely

* fix string
2020-09-21 13:39:14 -07:00
Arvind Ranasaria
21d9485ca7 move eula prompt post azdata discovery (#12532)
* new download location of azdata.msi

* move eula prompt post azdata discovery

* unacceptEula - test change

* Revert "unacceptEula - test change"

This reverts commit f84a3f5e41797de25b38f87143d66f7041b5c4ec.
2020-09-21 12:56:03 -07:00
Arvind Ranasaria
db902beb24 new download location of azdata.msi (#12466)
* new download location of azdata.msi

* refactor

* Disable tests

Co-authored-by: chgagnon <chgagnon@microsoft.com>
2020-09-21 11:18:05 -07:00
Charles Gagnon
f9e9cc76ea Fix arc controller ns/name validation (#12525)
* Fix arc controller ns/name validation

* Rename control plane references

* Fix validation
2020-09-21 10:51:51 -07:00
Charles Gagnon
92a8147c8d Add timestamps to azdata output channel output (#12530) 2020-09-21 10:51:08 -07:00
Alan Ren
1054164533 data workspace review feedback implementation (#12489)
* add a view to handle no workspace scenario

* text update

* project type filter improvement

* fix the project level context menu issue

* update strings
2020-09-21 10:22:21 -07:00
Brian Bergeron
9e29c7ab19 use GB instead of MB for postgres memory (#12528)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-21 10:22:11 -07:00
Udeesha Gautam
e892723c58 fix the reference error due to extra $ in default variable (#12523) 2020-09-21 10:04:03 -07:00
Charles Gagnon
3f77b371b3 Bump extensions (#12516) 2020-09-20 07:40:30 -07:00
Charles Gagnon
5bc817f25b Replace vCores property with state for arc controller dashboard (#12512) 2020-09-19 22:05:20 -07:00
Barbara Valdez
b110e4dea1 In-Viewlet Notebooks Search (#12455)
* fix search

* Add sql carbon tags to vs files

Co-authored-by: chlafreniere <hichise@gmail.com>
Co-authored-by: abist <adbist@microsoft.com>
2020-09-19 14:38:27 -07:00
Chris LaFreniere
384f593c80 Fix core and memory request MIAA deploy (#12505)
* Fix core and memory request MIAA deploy

* Memory request/limit as 2Gi for MIAA
2020-09-19 13:21:47 -07:00
Charles Gagnon
ca4663001b Add progress indicator for arc instance deletion (#12510) 2020-09-19 12:48:03 -07:00
Brian Bergeron
2feaca5537 default to 0 workers (#12506)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-19 10:10:39 -07:00
Chris LaFreniere
82749989e6 Arc good ARC bad (#12499) 2020-09-19 10:09:30 -07:00
Brian Bergeron
ffbb1b3917 Arc Postgres - Add Azure params to overview page, update notebook (#12482)
* add azure params to pg overview page, update troubleshooting notebook, string changes

* no default pg version for notebook

Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-18 19:34:56 -07:00
Barbara Valdez
e8624f2de7 Add warning message for users using the new version of jupyter book (#12496)
* Add warning message for users

* Address pr comments
2020-09-18 18:49:55 -07:00
Chris LaFreniere
117ecfebd1 Notebooks: Fix Grids Not Rendering when Unsaved Notebook Reloaded (#12483)
* Clear Output and fix output change

* Fix tests after forced clear + append output
2020-09-18 18:40:21 -07:00
Vasu Bhog
63adfd4d38 Fix PySpark kernel connection change (#12494) 2020-09-18 20:35:25 -05:00
Kim Santiago
b05cbe5356 Update default values and example text when dropdown value changes (#12493) 2020-09-18 18:32:09 -07:00
Kim Santiago
6ae0fc9aef remove option to add reference to same database (#12495) 2020-09-18 18:31:44 -07:00
Kim Santiago
7c57a82589 update sql database projects readme (#12481) 2020-09-18 17:38:41 -07:00
Charles Gagnon
087f4a260e Arc/Azdata string updates (#12485)
* Arc/Azdata string updates

* more updates

(cherry picked from commit 2c6f7ac4472e0197650be299ec899388bb495fd8)

* couple more fixes

* more
2020-09-18 17:21:28 -07:00
Udeesha Gautam
e06980a664 add table name to models that are imported (#12445)
* add table name to models

* adding null check for safety

* As per PR comment
2020-09-18 16:54:54 -07:00
Charles Gagnon
bf49788296 Add SQL instance name validation (#12480)
* Add SQL instance name validation

* Move -

* Update PG validation

* Fix regex

* simplify
2020-09-18 16:43:04 -07:00
Lucy Zhang
d00f94f51c update resultSet in data provider (#12478) 2020-09-18 16:41:41 -07:00
Udeesha Gautam
026f59285a change to allow refresh and delete correctly (#12477) 2020-09-18 16:11:38 -07:00
Charles Gagnon
54ed8bb2a8 Activate arc extension with resource deployment command (#12472) 2020-09-18 15:08:41 -07:00
Arvind Ranasaria
4b37ca7d53 remove/use unused strings (#12460) 2020-09-18 14:48:04 -07:00
Charles Gagnon
1355431a36 Revert BDC deployment back to using old azdata check (#12470) 2020-09-18 14:36:18 -07:00
Alex Ma
949f40d07e small optimization for select (#12419) 2020-09-18 14:21:01 -07:00
Charles Gagnon
de6089609a Update azdata extension icon (#12469) 2020-09-18 13:52:08 -07:00
Charles Gagnon
67f317ec93 Add preview to Arc controller deployment (#12465) 2020-09-18 12:56:11 -07:00
Charles Gagnon
81b9b98250 Bump arc/azdata extension versions (#12463) 2020-09-18 10:58:41 -07:00
Alan Ren
54b5390d03 fix the extension dependency issue (#12347) 2020-09-18 09:49:18 -07:00
Kim Santiago
2162e2f50c Remove ItemGroup from sqlproj if node being removed is the last one (#12398)
* remove ItemGroup if node being removed is the only one

* fix for if ItemGroup has elements with different tag names

* fix for ItemGroups not at the end of the sqlproj
2020-09-18 09:35:00 -07:00
Brian Bergeron
c50067b6d2 Arc - Enable Postgres dashboard (#12439)
* get overview, conn strings, properties pages working

* hook up password reset, azure link, scale configuration

* fix comments

* enable opening postgres dashboard from controller dashboard

* minor fixes

Co-authored-by: Brian Bergeron <brberger@microsoft.com>
Co-authored-by: chgagnon <chgagnon@microsoft.com>
2020-09-18 08:49:54 -07:00
Charles Gagnon
19566e0d9a vBump azdata extension (#12452) 2020-09-18 07:43:17 -07:00
Brian Bergeron
89d5c5febc fix postgres product name (#12443)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-18 07:37:47 -07:00
Charles Gagnon
cac14ff181 Add Arc MIAA username configuration (#12429)
* Add Arc MIAA username configuration

* username -> userName
2020-09-18 07:33:58 -07:00
Vasu Bhog
eea35d4920 Fix Spark kernel connections and switch from Kusto to Spark kernels (#12436)
* Fix connection dialog for Spark and issue when switching from Kusto to Spark

* Address comments
2020-09-17 22:16:42 -05:00
Alan Ren
a9f78694ee fix the legacy card style issue (#12428)
* fix the legacy card style issue

* replace the card class
2020-09-17 19:54:41 -07:00
Arvind Ranasaria
36d78242f7 start with eulaCheckButton hidden (#12427)
* start with eulaCheckButton hidden

* reset buttons on card select

* remove testcode
2020-09-17 19:24:42 -07:00
Charles Gagnon
1f93992736 Fix arc deployment regions and remove docker summary (#12430) 2020-09-17 17:28:52 -07:00
Charles Gagnon
068ba5b4f4 Fix resource deployment text field validation (#12421) 2020-09-17 17:08:15 -07:00
Monica Gupta
2aa00eba80 Change default Select query label to "Take 10" for Kusto tables (#12396)
* Change default label to "Take 10" for Kusto tables

* Addressed comments

Co-authored-by: Monica Gupta <mogupt@microsoft.com>
2020-09-17 17:00:33 -07:00
Charles Gagnon
cac686b9e6 Fix error when clicking on header for tables with no rows (#12408) 2020-09-17 14:54:58 -07:00
Charles Gagnon
ee523fb512 Remove container registry from arc control deploy (#12392) 2020-09-17 13:59:25 -07:00
Aditya Bist
a0886b9152 Fix connection dialog indentation (#12401)
* fix connection dialog indentation

* indent tab body
2020-09-17 13:31:29 -07:00
Udeesha Gautam
09cb0764d7 marking intermittent test failure as unstable (#12402) 2020-09-17 12:41:30 -07:00
Barbara Valdez
77f28083e3 Update Windows command and minor update to installation cell (#12361)
* Fix windows command and minor update to installation cell

* Add expand_section field on the first section of the book
2020-09-17 11:56:25 -07:00
Chris LaFreniere
da8963d1e5 Notebooks: Fast update WYSIWYG support for source update (#12289)
* Fast update WYSIWYG support for source update

* Do bracket matching over hardcoding line offsets
2020-09-17 11:52:30 -07:00
Arvind Ranasaria
ba44a2f02e Remove azdata eula acceptance from arc deployments (#12292)
* saving to switch tasks

* activate to exports in extApi

* working version - cleanup pending

* improve messages

* apply pr feedback from a different review

* remove unneeded strings

* redo apiService

* remove async from getVersionFromOutput

* remove _ prefix from protected fields

* error message fix

* throw specific errors from azdata extension

* arrow methods to regular methods

* pr feedback

* expand azdata extension api

* pr feedback

* remove unused var

* pr feedback
2020-09-17 11:20:32 -07:00
Lucy Zhang
945e04ed92 Fix notebook table rendering with multiple code cells (#12363)
* create unique query runner for each cell

* use cellUri instead of cellId to identify runner

* disconnect each query runner connection

* remove queryrunners size check
2020-09-17 09:44:10 -07:00
Hale Rankin
1ff815fe5a 12360 Notebook UI - Mac/Win fix for Select all. (#12383)
* 12360 Notebook UI - Mac/Win fix for Select all.

* Fix for ctrl key selecting all in windows

* Fix undo as well

* preventDefault to prevent confusing behavior

Co-authored-by: chlafreniere <hichise@gmail.com>
2020-09-17 09:40:02 -07:00
Charles Gagnon
451bbe890b Fix manage action for arc view (#12389) 2020-09-17 07:09:39 -07:00
Charles Gagnon
fca8b85a72 fix option sources (#12387) 2020-09-16 23:00:50 -07:00
Chris LaFreniere
be1e0b3c8d Remove MIAA Port Deploy Option (#12388) 2020-09-16 22:56:51 -07:00
Charles Gagnon
0cd242cd0c Add "No instances available" node for empty arc controllers (#12374) 2020-09-16 16:25:41 -07:00
Charles Gagnon
8b9103208d Fix duplicate arc instance nodes (#12381) 2020-09-16 16:01:12 -07:00
Charles Gagnon
c4859f665b Update arc regions for public preview (#12366) 2020-09-16 14:39:05 -07:00
Aasim Khan
f62020e1ec SQL VM deployments (#12144)
* Added sql vm deployment option

* Added more fields for sql vm deployments

* created basic sqlvm deployment. Mostly hardcoded

* added string to package.nls

* added poc deployments for sql vm

* Made some changes in the notebook that was mentioned in PR

* Added scaffolding for azure sql vm wizard.

* code cleanups

* added some async logic

* added loading component

* fixed loader code

* completed page2 of wizard

* added some more required fields.

* added some more fields

* added network settings page

* added sql server settings page

* added azure signin support and sql server settings page

* added some helper methods in wizard code

* added some fixes

* fixed azure and vm setting page
added validation in azure setting page

* added changes for the notebook variable

* validations and other bug fixes

* commenting sql storage optimization dropdown

* cleanedup wizard base page

* reversing vm image list to display newer images first

* cleaning model code

* added validations for network setting

* Completed summary page
fixed the code position
some additional field validations

* fixed networking page

* - fixed an error with vm size model variable
- removed byol images because they were not working with az sql vm
- Fixed vm size display names in dropdown

* added double quotes to some localized strings

* added some space inside strings

* -Added live validations
-Restyled network component
-Added required to regions
-Some bug fixes

* -redesigned summary page
-localized some strings

* Fixed summary page section titles

* -Fixed validations on sql server settings page
-Fixed some fields on Summary Page

* corrected onleave validation
using array for error messages
using Promises.all

* Fixed bug on network settings dropdowns when user does not have existing resource to populate them

* Change resource deployment display name
Added Ninar's iteration of the notebook
Changed RDP check box label
Surfacing API errors to user
Filtering regions based on Azure VM regions and user's subscription region
Made form validation async
Displaying new checkbox on network page when dropdowns empty
Fixed a small bug in SQL auth form validation
Made summary single item per row and fixed the gaps in spacing
Fixed validations in vm page
Checking if vm name already exists on azure

* Fixed sql vm eula
Fixed sql vm description
Added hyperlink for more info on vm sizes

* Replaced loading component with dropdown loaders.

* localized string
Fixed a bug in network settings page

* Added additional filtering

* added reverse to image images

* Fixing some merge related issues
2020-09-16 14:02:03 -07:00
Alan Ren
58252bcf97 remove a import unit test (#12358) 2020-09-16 13:32:37 -07:00
Barbara Valdez
ed65a5124e Fix highlight issue (#12278)
* Fix highlight issue

* Address PR comments
2020-09-16 12:01:46 -07:00
Udeesha Gautam
9a32f1a816 update sqlproj dependency version (#12359) 2020-09-16 11:44:20 -07:00
Alan Ren
d793910306 vbump sql-db-proj extension (#12336) 2020-09-16 10:06:15 -07:00
Charles Gagnon
dcc8ef54b9 Fix DT linting issues (#12290) 2020-09-16 07:59:06 -07:00
Charles Gagnon
78de48391d Update deletion strings to refer to instances instead of resources (#12332)
* Update deletion strings to refer to instances instead of resources

* one more

* Remove unused

* More
2020-09-16 07:48:46 -07:00
Chris LaFreniere
ffb81d88fd Watch for on load event (#12309) 2020-09-15 23:27:05 -07:00
Chris LaFreniere
7e76f8cb20 Add newline after caption (#12276) 2020-09-15 21:57:02 -07:00
Hale Rankin
69f5bc1725 12284 Removed custom CSS that positioned editor text beneath overlapping layers. Text is now selectable. (#12312) 2020-09-15 21:30:53 -07:00
Aditya Bist
8aee87e211 fix maximize bug (#12334) 2020-09-15 20:33:17 -07:00
Charles Gagnon
4dd04cb250 Fix component items in declarative table not showing (#12330) 2020-09-15 18:08:56 -07:00
Kim Santiago
eaaaae0a83 Stop watching for sqlproj updates after the file is closed (#12311)
* stop watching for sqlproj updates after the file is closed

* remove watcher if project is closed
2020-09-15 17:28:04 -07:00
Charles Gagnon
47e86e6133 Arc public preview updates (#12329)
* Arc public preview updates

* disable PG dashboards again
2020-09-15 17:15:50 -07:00
Alan Ren
19519a6d7c remove data-workspace dependency (#12321) 2020-09-15 16:01:54 -07:00
Charles Gagnon
4cc9cbb0c5 Use custom dialog for prompting MIAA connection info (#12316)
* Use custom dialog for prompting MIAA connection info

* disable inputs

* Update strings
2020-09-15 16:00:27 -07:00
Chris LaFreniere
233a1aefce Add new deployment options for MIAA (#12325) 2020-09-15 15:59:52 -07:00
nasc17
14b534eb64 Nasc/delete instance code removal (#12307)
* Formatted page

* Removed ResourceHealthPage from the dashboard

* Deleted files that no longer apply to the public preview backend

* shouldn't be able to open the postgres dashboard
2020-09-15 14:53:00 -07:00
Brian Bergeron
f79ff99d0b new postgres model (#12305)
Co-authored-by: Brian Bergeron <brberger@microsoft.com>
2020-09-15 14:52:04 -07:00
Arvind Ranasaria
9cf80113fc controller dropdown field to SQL MIAA and Postgres deployment. (#12217)
* saving first draft

* throw if no controllers

* cleanup

* bug fixes

* bug fixes and caching controller access

* pr comments and bug fixes.

* fixes

* fixes

* comment fix

* remove debug prints

* comment fixes

* remove debug logs

* inputValueTransformer returns string|Promise

* PR feedback

* pr fixes

* remove _ from protected fields

* anonymous to full methods

* small fixes
2020-09-15 14:47:49 -07:00
Alex Ma
92ed830564 Portal links for main branch (#12319)
* first commit

* json field added

* message genericized

* order changed

* removed summary page text

* fixed url
2020-09-15 14:41:10 -07:00
nasc17
caeb33248e Update PG Deployment with new fields (#12187)
* Added 5 missing field options

* Missing apostrophe

* Change extensions label

* Update volume size descriptions

* Change volume size labels

* Reorder required tools

* Reorder required tools

* Argument fixes

* Removed tabs

* Rearrange option order

* Rearrange option order

* Rearrange option order VS

* Added validation to fields. VS accepts only integer and appends Gi

* Changed Dusky deployment title

* Removed text validation for VS and Mem. Changed to Number field type

* Min set to 1 for VS and Mem

* Min set to 1 for Cores

* Memory limit must be at least 256Mi

* the casing of 'preview' should remain consistent

* Removed empty line
2020-09-15 12:43:12 -07:00
Lucy Zhang
0be5f67621 Fix notebook cancel query bug (#12300)
* fix undefined query runner error

* store connection id

* revert sqlSessionManager change
2020-09-15 11:31:31 -07:00
Alan Ren
908a15d6a8 remove project feature (#12297)
* remove project feature

* update test
2020-09-15 11:12:30 -07:00
Udeesha Gautam
725e1b2ee3 adding icon for add new and open project (#12265) 2020-09-15 09:37:12 -07:00
Charles Gagnon
95b76f08f2 Disable resource viewer (#12291)
* Disable resource viewer

* comment

* Remove unused
2020-09-15 07:31:49 -07:00
Chris LaFreniere
e75b3d69f6 Add headingStyle atx option (#12286) 2020-09-14 18:31:11 -07:00
Charles Gagnon
8d76985276 Use correct resource name (#12288) 2020-09-14 18:21:56 -07:00
Alan Ren
23c16ebfb3 add existing project to workspace feature (#12249)
* add existing project to workspace feature

* update file name

* new test and use URI

* handle workspace with no folder

* add more validation

* and more tests

* use forward slash
2020-09-14 15:43:29 -07:00
Vasu Bhog
7a524d7a35 Fix Notebook Kusto Kernel Consistency (#12256)
* fix kusto notebook consistency

* Address undefined
2020-09-14 14:18:53 -05:00
1500 changed files with 52421 additions and 19537 deletions


@@ -73,6 +73,7 @@ RUN apt-get update \
libnss3 \
libxss1 \
libasound2 \
libgbm1 \
xfonts-base \
xfonts-terminus \
fonts-noto \

.github/CODEOWNERS (new file, 11 lines added)

@@ -0,0 +1,11 @@
# Lines starting with '#' are comments.
# Each line is a file pattern followed by one or more owners.
# Syntax can be found here: https://docs.github.com/free-pro-team@latest/github/creating-cloning-and-archiving-repositories/about-code-owners#codeowners-syntax
/src/sql/*.d.ts @alanrenmsft @Charles-Gagnon @ranasaria
/extensions/resource-deployment/ @ranasaria
/extensions/arc/ @ranasaria
/extensions/azdata/ @ranasaria
/extensions/dacpac/ @kisantia
/extensions/schema-compare/ @kisantia
/extensions/sql-database-projects/ @Benjin @kisantia

.github/subscribers.json (new file, 7 lines added)

@@ -0,0 +1,7 @@
{
"label-to-subscribe-to": [
"list of usernames to subscribe",
"such as:",
"JacksonKearl"
]
}


@@ -31,7 +31,10 @@ jobs:
with:
node-version: 10
# TODO: cache node modules
- run: yarn --frozen-lockfile
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron
@@ -79,7 +82,10 @@ jobs:
- uses: actions/setup-python@v1
with:
python-version: '2.x'
- run: yarn --frozen-lockfile
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron
name: Download Electron
@@ -112,7 +118,10 @@ jobs:
- uses: actions/setup-node@v1
with:
node-version: 10
- run: yarn --frozen-lockfile
# Increase timeout to get around latency issues when fetching certain packages
- run: |
yarn config set network-timeout 300000
yarn --frozen-lockfile
name: Install Dependencies
- run: yarn electron x64
name: Download Electron


@@ -0,0 +1,50 @@
name: "Deep Classifier: Runner"
on:
schedule:
- cron: 0 * * * *
repository_dispatch:
types: [trigger-deep-classifier-runner]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
ref: v35
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/apply/fetch-sources
with:
# slightly overlapping to protect against issues slipping through the cracks if a run is delayed
from: 80
until: 5
configPath: classifier
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
- name: Set up Python 3.7
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install --upgrade numpy scipy scikit-learn joblib nltk simpletransformers torch torchvision
- name: "Run Classifier: Generator"
run: python ./actions/classifier-deep/apply/generate-labels/main.py
- name: "Run Classifier: Labeler"
uses: ./actions/classifier-deep/apply/apply-labels
with:
configPath: classifier
allowLabels: "needs more info|new release"
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -0,0 +1,27 @@
name: "Deep Classifier: Scraper"
on:
repository_dispatch:
types: [trigger-deep-classifier-scraper]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
ref: v35
path: ./actions
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Additional Dependencies
# Pulls in a bunch of other packages that aren't needed for the rest of the actions
run: npm install @azure/storage-blob@12.1.1
- name: "Run Classifier: Scraper"
uses: ./actions/classifier-deep/train/fetch-issues
with:
blobContainerName: vscode-issue-classifier
blobStorageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
token: ${{secrets.ISSUE_SCRAPER_TOKEN}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}


@@ -0,0 +1,27 @@
name: Latest Release Monitor
on:
schedule:
- cron: 0/5 * * * *
repository_dispatch:
types: [trigger-latest-release-monitor]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: 'microsoft/vscode-github-triage-actions'
path: ./actions
ref: v35
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Install Storage Module
run: npm install @azure/storage-blob@12.1.1
- name: Run Latest Release Monitor
uses: ./actions/latest-release-monitor
with:
storageKey: ${{secrets.AZURE_BLOB_STORAGE_CONNECTION_STRING}}
appInsightsKey: ${{secrets.TRIAGE_ACTIONS_APP_INSIGHTS}}
token: ${{secrets.VSCODE_ISSUE_TRIAGE_BOT_PAT}}


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"August 2020\"",
"value": "$repo=repo:microsoft/vscode\n$milestone=milestone:\"September 2020\"",
"editable": true
},
{


@@ -8,7 +8,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"August 2020\"",
"value": "// list of repos we work in\n$repos=repo:microsoft/vscode repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks\n\n// current milestone name\n$milestone=milestone:\"September 2020\"",
"editable": true
},
{


@@ -14,7 +14,7 @@
{
"kind": 2,
"language": "github-issues",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"July 2020\"",
"value": "$repos=repo:microsoft/vscode repo:microsoft/vscode-internalbacklog repo:microsoft/vscode-remote-release repo:microsoft/vscode-js-debug repo:microsoft/vscode-pull-request-github repo:microsoft/vscode-github-issue-notebooks \n$milestone=milestone:\"September 2020\"",
"editable": true
},
{


@@ -0,0 +1,194 @@
# Query: .innerHTML =
# Flags: CaseSensitive WordMatch
# Including: src/vs/**/*.{t,j}s
# Excluding: *.test.ts
# ContextLines: 3
22 results - 14 files
src/vs/base/browser/markdownRenderer.ts:
161 const strValue = values[0];
162 const span = element.querySelector(`div[data-code="${id}"]`);
163 if (span) {
164: span.innerHTML = strValue;
165 }
166 }).catch(err => {
167 // ignore
243 return true;
244 }
245
246: element.innerHTML = insane(renderedMarkdown, {
247 allowedSchemes,
248 // allowedTags should included everything that markdown renders to.
249 // Since we have our own sanitize function for marked, it's possible we missed some tag so let insane make sure.
src/vs/base/browser/ui/contextview/contextview.ts:
157 this.shadowRootHostElement = DOM.$('.shadow-root-host');
158 this.container.appendChild(this.shadowRootHostElement);
159 this.shadowRoot = this.shadowRootHostElement.attachShadow({ mode: 'open' });
160: this.shadowRoot.innerHTML = `
161 <style>
162 ${SHADOW_ROOT_CSS}
163 </style>
src/vs/code/electron-sandbox/issue/issueReporterMain.ts:
57 const platformClass = platform.isWindows ? 'windows' : platform.isLinux ? 'linux' : 'mac';
58 addClass(document.body, platformClass); // used by our fonts
59
60: document.body.innerHTML = BaseHtml();
61 const issueReporter = new IssueReporter(configuration);
62 issueReporter.render();
63 document.body.style.display = 'block';
src/vs/code/electron-sandbox/processExplorer/processExplorerMain.ts:
320 content.push(`.highest { color: ${styles.highlightForeground}; }`);
321 }
322
323: styleTag.innerHTML = content.join('\n');
324 if (document.head) {
325 document.head.appendChild(styleTag);
326 }
src/vs/editor/browser/view/domLineBreaksComputer.ts:
107 allCharOffsets[i] = tmp[0];
108 allVisibleColumns[i] = tmp[1];
109 }
110: containerDomNode.innerHTML = sb.build();
111
112 containerDomNode.style.position = 'absolute';
113 containerDomNode.style.top = '10000';
src/vs/editor/browser/view/viewLayer.ts:
507 private _finishRenderingNewLines(ctx: IRendererContext<T>, domNodeIsEmpty: boolean, newLinesHTML: string, wasNew: boolean[]): void {
508 const lastChild = <HTMLElement>this.domNode.lastChild;
509 if (domNodeIsEmpty || !lastChild) {
510: this.domNode.innerHTML = newLinesHTML;
511 } else {
512 lastChild.insertAdjacentHTML('afterend', newLinesHTML);
513 }
525 private _finishRenderingInvalidLines(ctx: IRendererContext<T>, invalidLinesHTML: string, wasInvalid: boolean[]): void {
526 const hugeDomNode = document.createElement('div');
527
528: hugeDomNode.innerHTML = invalidLinesHTML;
529
530 for (let i = 0; i < ctx.linesLength; i++) {
531 const line = ctx.lines[i];
src/vs/editor/browser/widget/diffEditorWidget.ts:
2157
2158 let domNode = document.createElement('div');
2159 domNode.className = `view-lines line-delete ${MOUSE_CURSOR_TEXT_CSS_CLASS_NAME}`;
2160: domNode.innerHTML = sb.build();
2161 Configuration.applyFontInfoSlow(domNode, fontInfo);
2162
2163 let marginDomNode = document.createElement('div');
2164 marginDomNode.className = 'inline-deleted-margin-view-zone';
2165: marginDomNode.innerHTML = marginHTML.join('');
2166 Configuration.applyFontInfoSlow(marginDomNode, fontInfo);
2167
2168 return {
src/vs/editor/standalone/browser/colorizer.ts:
40 let text = domNode.firstChild ? domNode.firstChild.nodeValue : '';
41 domNode.className += ' ' + theme;
42 let render = (str: string) => {
43: domNode.innerHTML = str;
44 };
45 return this.colorize(modeService, text || '', mimeType, options).then(render, (err) => console.error(err));
46 }
src/vs/editor/standalone/browser/standaloneThemeServiceImpl.ts:
212 if (!this._globalStyleElement) {
213 this._globalStyleElement = dom.createStyleSheet();
214 this._globalStyleElement.className = 'monaco-colors';
215: this._globalStyleElement.innerHTML = this._css;
216 this._styleElements.push(this._globalStyleElement);
217 }
218 return Disposable.None;
221 private _registerShadowDomContainer(domNode: HTMLElement): IDisposable {
222 const styleElement = dom.createStyleSheet(domNode);
223 styleElement.className = 'monaco-colors';
224: styleElement.innerHTML = this._css;
225 this._styleElements.push(styleElement);
226 return {
227 dispose: () => {
291 ruleCollector.addRule(generateTokensCSSForColorMap(colorMap));
292
293 this._css = cssRules.join('\n');
294: this._styleElements.forEach(styleElement => styleElement.innerHTML = this._css);
295
296 TokenizationRegistry.setColorMap(colorMap);
297 this._onColorThemeChange.fire(theme);
src/vs/editor/test/browser/controller/imeTester.ts:
55 let content = this._model.getModelLineContent(i);
56 r += content + '<br/>';
57 }
58: output.innerHTML = r;
59 }
60 }
61
69 let title = document.createElement('div');
70 title.className = 'title';
71
72: title.innerHTML = description + '. Type <strong>' + inputStr + '</strong>';
73 container.appendChild(title);
74
75 let startBtn = document.createElement('button');
src/vs/workbench/contrib/notebook/browser/view/renderers/cellRenderer.ts:
454
455 private getMarkdownDragImage(templateData: MarkdownCellRenderTemplate): HTMLElement {
456 const dragImageContainer = DOM.$('.cell-drag-image.monaco-list-row.focused.markdown-cell-row');
457: dragImageContainer.innerHTML = templateData.container.outerHTML;
458
459 // Remove all rendered content nodes after the
460 const markdownContent = dragImageContainer.querySelector('.cell.markdown')!;
611 return null;
612 }
613
614: editorContainer.innerHTML = richEditorText;
615
616 return dragImageContainer;
617 }
src/vs/workbench/contrib/notebook/browser/view/renderers/webviewPreloads.ts:
375 addMouseoverListeners(outputNode, outputId);
376 const content = data.content;
377 if (content.type === RenderOutputType.Html) {
378: outputNode.innerHTML = content.htmlContent;
379 cellOutputContainer.appendChild(outputNode);
380 domEval(outputNode);
381 } else {
src/vs/workbench/contrib/webview/browser/pre/main.js:
386 // apply default styles
387 const defaultStyles = newDocument.createElement('style');
388 defaultStyles.id = '_defaultStyles';
389: defaultStyles.innerHTML = defaultCssRules;
390 newDocument.head.prepend(defaultStyles);
391
392 applyStyles(newDocument, newDocument.body);
src/vs/workbench/contrib/welcome/walkThrough/browser/walkThroughPart.ts:
281
282 const content = model.main.textEditorModel.getValue(EndOfLinePreference.LF);
283 if (!strings.endsWith(input.resource.path, '.md')) {
284: this.content.innerHTML = content;
285 this.updateSizeClasses();
286 this.decorateContent();
287 this.contentDisposables.push(this.keybindingService.onDidUpdateKeybindings(() => this.decorateContent()));
303 const innerContent = document.createElement('div');
304 innerContent.classList.add('walkThroughContent'); // only for markdown files
305 const markdown = this.expandMacros(content);
306: innerContent.innerHTML = marked(markdown, { renderer });
307 this.content.appendChild(innerContent);
308
309 model.snippets.forEach((snippet, i) => {


@@ -2,43 +2,31 @@
# Flags: CaseSensitive WordMatch
# ContextLines: 2
14 results - 4 files
12 results - 4 files
src/vs/base/browser/dom.ts:
81 };
82
83: /** @deprecated ES6 - use classList*/
84 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
83 };
84
85: /** @deprecated ES6 - use classList*/
86 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
86 export const hasClass: (node: HTMLElement | SVGElement, className: string) => boolean = _classList.hasClass.bind(_classList);
87: /** @deprecated ES6 - use classList*/
88 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
88 export const addClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.addClass.bind(_classList);
89: /** @deprecated ES6 - use classList*/
90 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
90 export const addClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.addClasses.bind(_classList);
91: /** @deprecated ES6 - use classList*/
92 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
92 export const removeClass: (node: HTMLElement | SVGElement, className: string) => void = _classList.removeClass.bind(_classList);
93: /** @deprecated ES6 - use classList*/
94 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
95
94 export const removeClasses: (node: HTMLElement | SVGElement, ...classNames: string[]) => void = _classList.removeClasses.bind(_classList);
95: /** @deprecated ES6 - use classList*/
96 export const toggleClass: (node: HTMLElement | SVGElement, className: string, shouldHaveIt?: boolean) => void = _classList.toggleClass.bind(_classList);
97
src/vs/base/common/arrays.ts:
401
402 /**
403: * @deprecated ES6: use `Array.findIndex`
403: * @deprecated ES6: use `Array.find`
404 */
405 export function firstIndex<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean): number {
417
418 /**
419: * @deprecated ES6: use `Array.find`
420 */
421 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
568
569 /**
570: * @deprecated ES6: use `Array.find`
571 */
572 export function find<T>(arr: ArrayLike<T>, predicate: (value: T, index: number, arr: ArrayLike<T>) => any): T | undefined {
405 export function first<T>(array: ReadonlyArray<T>, fn: (item: T) => boolean, notFoundValue: T): T;
src/vs/base/common/objects.ts:
115
@@ -66,8 +54,8 @@ src/vs/base/common/strings.ts:
170 */
171 export function endsWith(haystack: string, needle: string): boolean {
861
862 /**
863: * @deprecated ES6
864 */
865 export function repeat(s: string, count: number): string {
857
858 /**
859: * @deprecated ES6
860 */
861 export function repeat(s: string, count: number): string {
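The deprecation comments above all point at direct ES6/DOM replacements. A minimal sketch of those replacements (illustrative TypeScript, not code from this repo; the sample values are made up):

// classList replaces the deprecated addClass/removeClass/hasClass/toggleClass helpers
const node = document.createElement('div');
node.classList.add('active');
node.classList.remove('active');
const isActive = node.classList.contains('active');
node.classList.toggle('highlight', isActive);

// Array.prototype.find/findIndex replace the deprecated first/firstIndex/find helpers
const items = [1, 2, 3, 4];
const firstEven = items.find(n => n % 2 === 0);           // 2
const firstEvenIndex = items.findIndex(n => n % 2 === 0); // 1

// native string methods replace the deprecated strings.endsWith/strings.repeat helpers
const isMarkdown = 'README.md'.endsWith('.md');
const divider = '-'.repeat(40);

console.log(isActive, firstEven, firstEvenIndex, isMarkdown, divider);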


@@ -2,18 +2,52 @@
# Flags: RegExp
# ContextLines: 2
2 results - 2 files
8 results - 4 files
src/vs/base/browser/ui/tree/asyncDataTree.ts:
243 } : () => 'treeitem',
244 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
245: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
246 } : undefined,
247 getAriaLabel(e) {
241 } : () => 'treeitem',
242 isChecked: options.accessibilityProvider!.isChecked ? (e) => {
243: return !!(options.accessibilityProvider?.isChecked!(e.element as T));
244 } : undefined,
245 getAriaLabel(e) {
src/vs/workbench/contrib/debug/browser/debugConfigurationManager.ts:
254
255 return debugDynamicExtensions.map(e => {
256: const type = e.contributes?.debuggers![0].type!;
257 return {
258 label: this.getDebuggerLabel(type)!,
src/vs/platform/list/browser/listService.ts:
463
464 if (typeof options?.openOnSingleClick !== 'boolean' && options?.configurationService) {
465: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
466 this._register(options?.configurationService.onDidChangeConfiguration(() => {
467: this.openOnSingleClick = options?.configurationService!.getValue(openModeSettingKey) !== 'doubleClick';
468 }));
469 } else {
src/vs/workbench/contrib/notebook/browser/notebookEditorWidget.ts:
1526
1527 await this._ensureActiveKernel();
1528: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, undefined);
1529 }
1530
1535
1536 await this._ensureActiveKernel();
1537: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, undefined);
1538 }
1539
1553
1554 await this._ensureActiveKernel();
1555: await this._activeKernel?.cancelNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1556 }
1557
1567
1568 await this._ensureActiveKernel();
1569: await this._activeKernel?.executeNotebookCell!(this._notebookViewModel!.uri, cell.handle);
1570 }
1571
src/vs/workbench/contrib/webview/electron-browser/iframeWebviewElement.ts:
89 .then(() => this._resourceRequestManager.ensureReady())
90 .then(() => {
91: this.element?.contentWindow!.postMessage({ channel, args: data }, '*');
92 });
93 }
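This query file tracks call sites that mix optional chaining with a non-null assertion, like the ones listed above. A minimal sketch of what that pattern does (illustrative TypeScript, not code from this repo):

interface AccessibilityProvider<T> {
	isChecked?: (element: T) => boolean;
}

function resolveChecked<T>(provider: AccessibilityProvider<T> | undefined, element: T): boolean {
	// provider?.isChecked!(element):
	//   - if provider is undefined, the optional chain short-circuits and yields undefined;
	//   - otherwise the ! assertion tells the compiler isChecked is defined, and it is called.
	// The assertion is not checked at runtime: a defined provider without isChecked would
	// still throw, which is presumably why these sites are tracked in this query.
	return !!(provider?.isChecked!(element));
}

console.log(resolveChecked({ isChecked: (n: number) => n > 0 }, 5)); // true
console.log(resolveChecked(undefined, 5));                           // false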


@@ -1,3 +1,3 @@
disturl "https://atom.io/download/electron"
target "9.2.1"
target "9.3.0"
runtime "electron"


@@ -1,5 +1,66 @@
# Change Log
## Version 1.24.0
* Release date: November 12, 2020
* Release status: General Availability
* SQL Project improvements
* Notebook improvements, including WYSIWYG editor enhancements
* Azure Arc improvements
* Azure SQL Deployment UX improvements
* Azure Browse Connections Preview
* Bug Fixes
## Version 1.23.0
* Release date: October 14, 2020
* Release status: General Availability
* Added deployments of Azure SQL DB and VM
* Added PowerShell kernel results streaming support
* Added improvements to SQL Database Projects extension
* Bug Fixes
* Extension Updates:
* SQL Server Import
* Machine Learning
* Schema Compare
* Kusto
* SQL Assessment
* SQL Database Projects
* Azure Arc
* azdata
## Version 1.22.1
* Release date: September 30, 2020
* Release status: General Availability
* Fix bug #12615 Active connection filter doesn't untoggle | [#12615](https://github.com/microsoft/azuredatastudio/issues/12615)
* Fix bug #12572 Edit Data grid doesn't escape special characters | [#12572](https://github.com/microsoft/azuredatastudio/issues/12572)
* Fix bug #12570 Dashboard Explorer table doesn't escape special characters | [#12570](https://github.com/microsoft/azuredatastudio/issues/12570)
* Fix bug #12582 Delete row on Edit Data fails | [#12582](https://github.com/microsoft/azuredatastudio/issues/12582)
* Fix bug #12646 SQL Notebooks: Cells being treated isolated | [#12646](https://github.com/microsoft/azuredatastudio/issues/12646)
## Version 1.22.0
* Release date: September 22, 2020
* Release status: General Availability
* New Notebook Features
* Supports brand new text cell editing experience based on rich text formatting and seamless conversion to markdown, also known as WYSIWYG toolbar (What You See Is What You Get)
* Supports Kusto kernel
* Supports pinning of notebooks
* Added support for new version of Jupyter Books
* Improved Jupyter Shortcuts
* Introduced perf loading improvements
* Added Azure Arc extension - Users can try out Azure Arc public preview through Azure Data Studio. This includes:
* Deploy data controller
* Deploy Postgres
* Deploy Managed Instance for Azure Arc
* Connect to data controller
* Access data service dashboards
* Azure Arc Jupyter Book
* Added new deployment options
* Azure SQL Database Edge
* (Edge will require Azure SQL Edge Deployment Extension)
* Added SQL Database Projects extension - The SQL Database Projects extension brings project-based database development to Azure Data Studio. In this preview release, SQL projects can be created and published from Azure Data Studio.
* Added Kusto (KQL) extension - Brings native Kusto experiences in Azure Data Studio for data exploration and data analytics against massive amounts of real-time streaming data stored in Azure Data Explorer. This preview release supports connecting to and browsing Azure Data Explorer clusters, writing KQL queries, as well as authoring notebooks with the Kusto kernel.
* SQL Server Import extension GA - Announcing the GA of the SQL Server Import extension, features no longer in preview. This extension facilitates importing csv/txt files. Learn more about the extension in [this article](sql-server-import-extension.md).
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22September+2020+Release%22+is%3Aclosed).
## Version 1.21.0
* Release date: August 12, 2020
* Release status: General Availability


@@ -19,7 +19,7 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
| [Linux DEB][linux-deb] |
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## Try out the latest insiders build from `main`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
@@ -29,6 +29,8 @@ Go to our [download page](https://aka.ms/azuredatastudio) for more specific inst
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/main/CHANGELOG.md) for additional details of what's in this release.
Go to our [download page](https://aka.ms/getazuredatastudio) for more specific instructions.
## **Feature Highlights**
@@ -129,10 +131,10 @@ Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the [Source EULA](LICENSE.txt).
[win-user]: https://go.microsoft.com/fwlink/?linkid=2138608
[win-system]: https://go.microsoft.com/fwlink/?linkid=2138704
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2138705
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2138609
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2138706
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2138507
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2138508
[win-user]: https://go.microsoft.com/fwlink/?linkid=2148607
[win-system]: https://go.microsoft.com/fwlink/?linkid=2148907
[win-zip]: https://go.microsoft.com/fwlink/?linkid=2148908
[osx-zip]: https://go.microsoft.com/fwlink/?linkid=2148710
[linux-zip]: https://go.microsoft.com/fwlink/?linkid=2148708
[linux-rpm]: https://go.microsoft.com/fwlink/?linkid=2148709
[linux-deb]: https://go.microsoft.com/fwlink/?linkid=2148806

build/.gitattributes (new file, 1 line added)

@@ -0,0 +1 @@
* text eol=lf


@@ -15,7 +15,7 @@
"keywords": [],
"author": "",
"dependencies": {
"@actions/core": "^1.2.3",
"@actions/core": "^1.2.6",
"@actions/github": "^2.1.1",
"axios": "^0.19.2",
"ts-node": "^8.6.2",


@@ -2,10 +2,10 @@
# yarn lockfile v1
"@actions/core@^1.2.3":
version "1.2.3"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.3.tgz#e844b4fa0820e206075445079130868f95bfca95"
integrity sha512-Wp4xnyokakM45Uuj4WLUxdsa8fJjKVl1fDTsPbTEcTcuu0Nb26IPQbOtjmnfaCPGcaoPOOqId8H9NapZ8gii4w==
"@actions/core@^1.2.6":
version "1.2.6"
resolved "https://registry.yarnpkg.com/@actions/core/-/core-1.2.6.tgz#a78d49f41a4def18e88ce47c2cac615d5694bf09"
integrity sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA==
"@actions/github@^2.1.1":
version "2.1.1"


@@ -53,7 +53,7 @@ async function uploadBlob(blobService: azure.BlobService, quality: string, blobN
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(quality, blobName, filePath, blobOptions, err => err ? e(err) : c()));
}
function getEnv(name: string): string {


@@ -17,7 +17,7 @@ const fileNames = [
];
async function assertContainer(blobService: azure.BlobService, container: string): Promise<void> {
await new Promise((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
await new Promise<void>((c, e) => blobService.createContainerIfNotExists(container, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesBlobExist(blobService: azure.BlobService, container: string, blobName: string): Promise<boolean | undefined> {
@@ -33,7 +33,7 @@ async function uploadBlob(blobService: azure.BlobService, container: string, blo
}
};
await new Promise((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
await new Promise<void>((c, e) => blobService.createBlockBlobFromLocalFile(container, blobName, file, blobOptions, err => err ? e(err) : c()));
}
async function publish(commit: string, files: readonly string[]): Promise<void> {
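Both hunks above add an explicit Promise&lt;void&gt; type argument to promises that resolve with no value. A minimal sketch of why the annotation matters (illustrative TypeScript, not code from this repo), assuming a newer TypeScript where the Promise constructor's resolve callback requires an argument unless the promise is typed as void:

// Without the explicit <void>, a bare resolve() fails to type-check under the stricter typings.
function waitForDone(register: (done: (err?: Error) => void) => void): Promise<void> {
	return new Promise<void>((resolve, reject) => {
		register(err => (err ? reject(err) : resolve()));
	});
}

// hypothetical usage
waitForDone(done => setTimeout(() => done(), 100)).then(() => console.log('done'));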


@@ -43,6 +43,7 @@ function createDefaultConfig(quality: string): Config {
}
function getConfig(quality: string): Promise<Config> {
console.log(`Getting config for quality ${quality}`);
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
@@ -52,13 +53,13 @@ function getConfig(quality: string): Promise<Config> {
]
};
return new Promise<Config>((c, e) => {
return retry(() => new Promise<Config>((c, e) => {
client.queryDocuments(collection, query, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err && err.code !== 409) { return e(err); }
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0] as any as Config);
});
});
}));
}
interface Asset {
@@ -86,6 +87,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
updateTries++;
return new Promise<void>((c, e) => {
console.log(`Querying existing documents to update...`);
client.queryDocuments(collection, updateQuery, { enableCrossPartitionQuery: true }).toArray((err, results) => {
if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); }
@@ -101,6 +103,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
release.updates[platform] = type;
}
console.log(`Replacing existing document with updated version`);
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) { return c(update()); }
if (err) { return e(err); }
@@ -112,7 +115,8 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
});
}
return new Promise<void>((c, e) => {
return retry(() => new Promise<void>((c, e) => {
console.log(`Attempting to create document`);
client.createDocument(collection, release, err => {
if (err && err.code === 409) { return c(update()); }
if (err) { return e(err); }
@@ -120,7 +124,7 @@ function createOrUpdate(commit: string, quality: string, platform: string, type:
console.log('Build successfully published.');
c();
});
});
}));
}
async function assertContainer(blobService: azure.BlobService, quality: string): Promise<void> {
@@ -188,7 +192,6 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await uploadBlob(blobService, quality, blobName, file);
@@ -247,6 +250,22 @@ async function publish(commit: string, quality: string, platform: string, type:
await createOrUpdate(commit, quality, platform, type, release, asset, isUpdate);
}
const RETRY_TIMES = 10;
async function retry<T>(fn: () => Promise<T>): Promise<T> {
for (let run = 1; run <= RETRY_TIMES; run++) {
try {
return await fn();
} catch (err) {
if (!/ECONNRESET/.test(err.message)) {
throw err;
}
console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
}
}
throw new Error('Retried too many times');
}
function main(): void {
const commit = process.env['BUILD_SOURCEVERSION'];
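The hunk above wraps the DocumentDB calls in a new generic retry helper that retries only on ECONNRESET. A minimal self-contained sketch of the same pattern (illustrative TypeScript, not code from this repo; flakyCall is a made-up stand-in for the real queries):

const RETRY_TIMES = 10;

async function retry<T>(fn: () => Promise<T>): Promise<T> {
	for (let run = 1; run <= RETRY_TIMES; run++) {
		try {
			return await fn();
		} catch (err) {
			// only connection resets are treated as transient; anything else fails immediately
			if (!/ECONNRESET/.test((err as Error).message)) {
				throw err;
			}
			console.log(`Caught error ${err} - ${run}/${RETRY_TIMES}`);
		}
	}
	throw new Error('Retried too many times');
}

// hypothetical flaky operation that sometimes fails with a connection reset
async function flakyCall(): Promise<string> {
	if (Math.random() < 0.5) {
		throw new Error('read ECONNRESET');
	}
	return 'ok';
}

retry(flakyCall).then(console.log, console.error);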


@@ -87,10 +87,6 @@ steps:
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-darwin-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
- script: |


@@ -96,8 +96,6 @@ steps:
set -e
yarn gulp package-rebuild-extensions
yarn gulp vscode-darwin-min-ci
yarn gulp vscode-reh-darwin-min-ci
yarn gulp vscode-reh-web-darwin-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -125,19 +123,19 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-darwin
APP_NAME="`ls $APP_ROOT | head -n 1`"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots"
yarn smoketest --build "$APP_ROOT/$APP_NAME" --screenshots "$(build.artifactstagingdirectory)/smokeshots" --log "$(build.artifactstagingdirectory)/logs/darwin/smoke.log"
displayName: Run smoke tests (Electron)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e
node ./node_modules/playwright/install.js
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
displayName: Run smoke tests (Browser)
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
# - script: |
# set -e
# node ./node_modules/playwright/install.js
# VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/azuredatastudio-reh-web-darwin" \
# yarn smoketest --web --headless --screenshots "$(build.artifactstagingdirectory)/smokeshots"
# displayName: Run smoke tests (Browser)
# continueOnError: true
# condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
- script: |
set -e


@@ -31,10 +31,10 @@ steps:
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git checkout origin/electron-x.y.z
git checkout origin/electron-11.x.y
git merge origin/master
# Push master branch into exploration branch
git push origin HEAD:electron-x.y.z
git push origin HEAD:electron-11.x.y
displayName: Sync & Merge Exploration


@@ -52,21 +52,25 @@ steps:
git merge $(node -p "require('./package.json').distro")
displayName: Merge distro
- script: |
echo -n $VSCODE_ARCH > .build/arch
displayName: Prepare arch cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
- script: |
set -e
CHILD_CONCURRENCY=1 yarn --frozen-lockfile
CHILD_CONCURRENCY=1 npm_config_arch=$(NPM_ARCH) yarn --frozen-lockfile
displayName: Install dependencies
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheRestored'], 'true'))
@@ -85,64 +89,64 @@ steps:
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-linux-x64-min-ci
yarn gulp vscode-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-linux-x64-min-ci
yarn gulp vscode-reh-linux-$(VSCODE_ARCH)-min-ci
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
yarn gulp vscode-reh-web-linux-x64-min-ci
yarn gulp vscode-reh-web-linux-$(VSCODE_ARCH)-min-ci
displayName: Build
- script: |
set -e
service xvfb start
displayName: Start xvfb
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
displayName: Run unit tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
DISPLAY=:10 yarn test-browser --build --browser chromium --tfs "Browser Unit Tests"
displayName: Run unit tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
# Figure out the full absolute path of the product we just built
# including the remote server and configure the integration tests
# to run with these builds instead of running out of sources.
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-x64" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-web-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-web-integration.sh --browser chromium
displayName: Run integration tests (Browser)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- script: |
set -e
APP_ROOT=$(agent.builddirectory)/VSCode-linux-x64
APP_ROOT=$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-x64" \
VSCODE_REMOTE_SERVER_PATH="$(agent.builddirectory)/vscode-reh-linux-$(VSCODE_ARCH)" \
DISPLAY=:10 ./resources/server/test/test-remote-integration.sh
displayName: Run remote integration tests (Electron)
condition: and(succeeded(), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'), eq(variables['VSCODE_STEP_ON_IT'], 'false'))
- task: PublishPipelineArtifact@0
inputs:
artifactName: crash-dump-linux
artifactName: 'crash-dump-linux-$(VSCODE_ARCH)'
targetPath: .build/crashes
displayName: 'Publish Crash Reports'
continueOnError: true
@@ -157,15 +161,26 @@ steps:
- script: |
set -e
yarn gulp "vscode-linux-x64-build-deb"
yarn gulp "vscode-linux-x64-build-rpm"
yarn gulp "vscode-linux-x64-prepare-snap"
displayName: Build packages
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-deb"
yarn gulp "vscode-linux-$(VSCODE_ARCH)-build-rpm"
displayName: Build deb, rpm packages
- script: |
set -e
yarn gulp "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
displayName: Prepare snap package
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
# needed for code signing
- task: UseDotNet@2
displayName: 'Install .NET Core SDK 2.x'
inputs:
version: 2.x
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '.build/linux/rpm/x86_64'
FolderPath: '.build/linux/rpm'
Pattern: '*.rpm'
signConfigType: inlineSignParams
inlineOperation: |
@@ -186,14 +201,16 @@ steps:
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_ARCH="$(VSCODE_ARCH)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- task: PublishPipelineArtifact@0
displayName: 'Publish Pipeline Artifact'
inputs:
artifactName: snap-x64
artifactName: 'snap-$(VSCODE_ARCH)'
targetPath: .build/linux/snap-tarball
condition: and(succeeded(), eq(variables['VSCODE_ARCH'], 'x64'))
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'


@@ -4,11 +4,10 @@ REPO="$(pwd)"
ROOT="$REPO/.."
# Publish tarball
PLATFORM_LINUX="linux-x64"
PLATFORM_LINUX="linux-$VSCODE_ARCH"
BUILDNAME="VSCode-$PLATFORM_LINUX"
BUILD="$ROOT/$BUILDNAME"
BUILD_VERSION="$(date +%s)"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$VSCODE_ARCH-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$VSCODE_ARCH-$BUILD_VERSION.tar.gz"
TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
rm -rf $ROOT/code-*.tar.*
@@ -28,24 +27,36 @@ rm -rf $ROOT/vscode-server-*.tar.*
node build/azure-pipelines/common/createAsset.js "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$SERVER_TARBALL_PATH"
# Publish DEB
PLATFORM_DEB="linux-deb-x64"
DEB_ARCH="amd64"
case $VSCODE_ARCH in
x64) DEB_ARCH="amd64" ;;
*) DEB_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
node build/azure-pipelines/common/createAsset.js "$PLATFORM_DEB" package "$DEB_FILENAME" "$DEB_PATH"
# Publish RPM
PLATFORM_RPM="linux-rpm-x64"
RPM_ARCH="x86_64"
case $VSCODE_ARCH in
x64) RPM_ARCH="x86_64" ;;
armhf) RPM_ARCH="armv7hl" ;;
arm64) RPM_ARCH="aarch64" ;;
*) RPM_ARCH="$VSCODE_ARCH" ;;
esac
PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
node build/azure-pipelines/common/createAsset.js "$PLATFORM_RPM" package "$RPM_FILENAME" "$RPM_PATH"
# Publish Snap
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-x64.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
if [ "$VSCODE_ARCH" == "x64" ]; then
# Publish Snap
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
fi

View File

@@ -91,8 +91,7 @@ steps:
- script: |
set -e
yarn gulp vscode-linux-x64-min-ci
yarn gulp vscode-reh-linux-x64-min-ci
yarn gulp vscode-reh-web-linux-x64-min-ci
yarn gulp vscode-web-min-ci
displayName: Build
env:
VSCODE_MIXIN_PASSWORD: $(github-distro-mixin-password)
@@ -134,7 +133,8 @@ steps:
set -e
APP_ROOT=$(agent.builddirectory)/azuredatastudio-linux-x64
APP_NAME=$(node -p "require(\"$APP_ROOT/resources/app/product.json\").applicationName")
INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
export INTEGRATION_TEST_ELECTRON_PATH="$APP_ROOT/$APP_NAME" \
export NO_CLEANUP=1
DISPLAY=:10 node ./scripts/test-extensions-unit.js ${{ extension }}
displayName: 'Run ${{ extension }} Stable Extension Unit Tests'
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))
@@ -149,6 +149,15 @@ steps:
continueOnError: true
condition: and(succeeded(), eq(variables['RUN_UNSTABLE_TESTS'], 'true'))
- script: |
set -e
mkdir -p $(Build.ArtifactStagingDirectory)/logs/linux-x64
cd /tmp
tar -czvf $(Build.ArtifactStagingDirectory)/logs/linux-x64/logs-linux-x64.tar.gz adsuser*
displayName: Archive Logs
continueOnError: true
condition: succeededOrFailed()
- script: |
set -e
yarn gulp vscode-linux-x64-build-deb
@@ -221,6 +230,7 @@ steps:
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: drop'
condition: succeededOrFailed()
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'

View File

@@ -13,6 +13,12 @@ resources:
- container: vscode-x64
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
endpoint: VSCodeHub
- container: vscode-arm64
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-arm64
endpoint: VSCodeHub
- container: vscode-armhf
image: vscodehub.azurecr.io/vscode-linux-build-agent:stretch-armhf
endpoint: VSCodeHub
- container: snapcraft
image: snapcore/snapcraft:stable
@@ -64,6 +70,9 @@ stages:
- job: Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: vscode-x64
variables:
VSCODE_ARCH: x64
NPM_ARCH: x64
steps:
- template: linux/product-build-linux.yml
@@ -72,22 +81,28 @@ stages:
- Linux
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX'], 'true'))
container: snapcraft
variables:
VSCODE_ARCH: x64
steps:
- template: linux/snap-build-linux.yml
- job: LinuxArmhf
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARMHF'], 'true'))
container: vscode-armhf
variables:
VSCODE_ARCH: armhf
NPM_ARCH: armv7l
steps:
- template: linux/product-build-linux-multiarch.yml
- template: linux/product-build-linux.yml
- job: LinuxArm64
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ARM64'], 'true'))
container: vscode-arm64
variables:
VSCODE_ARCH: arm64
NPM_ARCH: arm64
steps:
- template: linux/product-build-linux-multiarch.yml
- template: linux/product-build-linux.yml
- job: LinuxAlpine
condition: and(succeeded(), eq(variables['VSCODE_BUILD_LINUX_ALPINE'], 'true'))

View File

@@ -52,9 +52,13 @@ steps:
displayName: Merge distro
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
- script: |
echo -n $VSCODE_ARCH > .build/arch
displayName: Prepare arch cache flag
- task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))
@@ -67,7 +71,7 @@ steps:
- task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
inputs:
keyfile: 'build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
keyfile: '.build/arch, build/.cachesalt, .yarnrc, remote/.yarnrc, **/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
vstsFeed: 'npm-vscode'
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'), ne(variables['CacheRestored'], 'true'))
@@ -112,8 +116,8 @@ steps:
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
displayName: Compile
condition: and(succeeded(), ne(variables['CacheExists-Compilation'], 'true'))

View File
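A note on the cache change above: the "Prepare arch cache flag" step writes VSCODE_ARCH into .build/arch, and that file is added to the cache keyfile. Assuming the RestoreCache/SaveCache tasks derive their key from the contents of the listed files (an assumption, not something this diff states), the effect is that x64, armhf and arm64 builds stop sharing one node_modules cache. A rough TypeScript sketch of that idea:

```ts
import { createHash } from 'crypto';
import { readFileSync } from 'fs';

// Illustrative only: assuming the cache key is a hash over the contents of the
// files listed in `keyfile`, writing the arch into .build/arch makes the
// node_modules cache arch-specific, exactly like a yarn.lock change would.
function cacheKey(files: string[]): string {
	const hash = createHash('sha256');
	for (const file of files) {
		hash.update(readFileSync(file));
	}
	return hash.digest('hex');
}

// Different .build/arch contents -> different key -> separate cache entry.
console.log(cacheKey(['.build/arch', 'build/.cachesalt', '.yarnrc']));
```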

@@ -17,7 +17,7 @@ jobs:
- template: sql-product-compile.yml
- job: macOS
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
pool:
vmImage: macOS-latest
dependsOn:
@@ -27,7 +27,7 @@ jobs:
timeoutInMinutes: 180
- job: macOS_Signing
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_MACOS'], 'true'), eq(variables['signed'], true), ne(variables['VSCODE_QUALITY'], 'saw'))
pool:
vmImage: macOS-latest
dependsOn:
@@ -50,7 +50,7 @@ jobs:
timeoutInMinutes: 70
- job: LinuxWeb
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'))
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WEB'], 'true'), ne(variables['VSCODE_QUALITY'], 'saw'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
@@ -61,15 +61,15 @@ jobs:
steps:
- template: web/sql-product-build-web.yml
- job: Docker
condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
pool:
vmImage: 'Ubuntu-16.04'
container: linux-x64
dependsOn:
- Linux
steps:
- template: docker/sql-product-build-docker.yml
# - job: Docker
# condition: and(succeeded(), eq(variables['VSCODE_BUILD_DOCKER'], 'true'))
# pool:
# vmImage: 'Ubuntu-16.04'
# container: linux-x64
# dependsOn:
# - Linux
# steps:
# - template: docker/sql-product-build-docker.yml
- job: Windows
condition: and(succeeded(), eq(variables['VSCODE_BUILD_WIN32'], 'true'))
@@ -98,7 +98,7 @@ jobs:
dependsOn:
- macOS
- Linux
- Docker
# - Docker
- Windows
- Windows_Test
- LinuxWeb

View File

@@ -96,8 +96,8 @@ steps:
yarn gulp compile-build
yarn gulp compile-extensions-build
yarn gulp minify-vscode
yarn gulp minify-vscode-reh
yarn gulp minify-vscode-reh-web
yarn gulp vscode-reh-linux-x64-min
yarn gulp vscode-reh-web-linux-x64-min
displayName: Compile
condition: and(succeeded(), ne(variables['CacheRestored-Compilation'], 'true'))

View File

@@ -12,9 +12,9 @@ $ServerZipLocation = "$Repo\.build\win32-$Arch\server"
$ServerZip = "$ServerZipLocation\azuredatastudio-server-win32-$Arch.zip"
# Create server archive
New-Item $ServerZipLocation -ItemType Directory # this will throw even on success, which is why we don't wrap it in exec
# New-Item $ServerZipLocation -ItemType Directory # this will throw even on success, which is why we don't wrap it in exec
$global:LASTEXITCODE = 0
exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
# exec { Rename-Item -Path $LegacyServer -NewName $ServerName } "Rename Item"
# exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r } "Zip Server"
exec { node build/azure-pipelines/common/copyArtifacts.js } "Copy Artifacts"

View File

@@ -95,8 +95,8 @@ steps:
$ErrorActionPreference = "Stop"
exec { yarn gulp "package-rebuild-extensions" }
exec { yarn gulp "vscode-win32-x64-min-ci" }
exec { yarn gulp "vscode-reh-win32-x64-min-ci" }
exec { yarn gulp "vscode-reh-web-win32-x64-min-ci" }
exec { yarn gulp "vscode-reh-win32-x64-min" }
exec { yarn gulp "vscode-reh-web-win32-x64-min" }
exec { yarn gulp "vscode-win32-x64-code-helper" }
exec { yarn gulp "vscode-win32-x64-inno-updater" }
displayName: Build
@@ -131,7 +131,7 @@ steps:
$AppRoot = "$(agent.builddirectory)\azuredatastudio-win32-x64"
$AppProductJson = Get-Content -Raw -Path "$AppRoot\resources\app\product.json" | ConvertFrom-Json
$AppNameShort = $AppProductJson.nameShort
exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\azuredatastudio-reh-win32-x64"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
# exec { $env:INTEGRATION_TEST_ELECTRON_PATH = "$AppRoot\$AppNameShort.exe"; $env:VSCODE_REMOTE_SERVER_PATH = "$(agent.builddirectory)\azuredatastudio-reh-win32-x64"; .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests (Electron)
condition: and(succeeded(), eq(variables['RUN_TESTS'], 'true'))

View File

@@ -104,6 +104,7 @@ const indentationFilter = [
'!extensions/admin-tool-ext-win/ssmsmin/**',
'!extensions/resource-deployment/notebooks/**',
'!extensions/mssql/notebooks/**',
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/integration-tests/testData/**',
'!extensions/arc/src/controller/generated/**',
'!extensions/sql-database-projects/resources/templates/*.xml',
@@ -178,7 +179,9 @@ const copyrightFilter = [
'!extensions/mssql/src/prompts/**',
'!extensions/kusto/src/prompts/**',
'!extensions/notebook/resources/jupyter_config/**',
'!extensions/azurehybridtoolkit/notebooks/**',
'!extensions/query-history/images/**',
'!extensions/sql/build/update-grammar.js',
'!**/*.gif',
'!**/*.xlf',
'!**/*.dacpac',

View File

@@ -261,7 +261,7 @@ function packageTask(platform, arch, sourceFolderName, destinationFolderName, op
.pipe(fileLengthFilter.restore)
.pipe(util.skipDirectories())
.pipe(util.fixWin32DirectoryPermissions())
.pipe(electron(_.extend({}, config, { platform, arch, ffmpegChromium: true })))
.pipe(electron(_.extend({}, config, { platform, arch: arch === 'armhf' ? 'arm' : arch, ffmpegChromium: true })))
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version'], { dot: true }));
if (platform === 'linux') {
@@ -345,7 +345,7 @@ const BUILD_TARGETS = [
{ platform: 'darwin', arch: null, opts: { stats: true } },
{ platform: 'linux', arch: 'ia32' },
{ platform: 'linux', arch: 'x64' },
{ platform: 'linux', arch: 'arm' },
{ platform: 'linux', arch: 'armhf' },
{ platform: 'linux', arch: 'arm64' },
];
BUILD_TARGETS.forEach(buildTarget => {

View File

@@ -23,7 +23,7 @@ const commit = util.getVersion(root);
const linuxPackageRevision = Math.floor(new Date().getTime() / 1000);
function getDebPackageArch(arch) {
return { x64: 'amd64', arm: 'armhf', arm64: 'arm64' }[arch];
return { x64: 'amd64', armhf: 'armhf', arm64: 'arm64' }[arch];
}
function prepareDebPackage(arch) {
@@ -53,6 +53,11 @@ function prepareDebPackage(arch) {
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const workspaceMime = gulp.src('resources/linux/code-workspace.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('usr/share/mime/packages/' + product.applicationName + '-workspace.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('usr/share/pixmaps/' + product.linuxIconName + '.png'));
@@ -96,7 +101,7 @@ function prepareDebPackage(arch) {
.pipe(replace('@@UPDATEURL@@', product.updateUrl || '@@UPDATEURL@@'))
.pipe(rename('DEBIAN/postinst'));
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, icon, bash_completion, zsh_completion, code);
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, workspaceMime, icon, bash_completion, zsh_completion, code);
return all.pipe(vfs.dest(destination));
};
@@ -116,7 +121,7 @@ function getRpmBuildPath(rpmArch) {
}
function getRpmPackageArch(arch) {
return { x64: 'x86_64', arm: 'armhf', arm64: 'arm64' }[arch];
return { x64: 'x86_64', armhf: 'armv7hl', arm64: 'aarch64' }[arch];
}
function prepareRpmPackage(arch) {
@@ -145,6 +150,11 @@ function prepareRpmPackage(arch) {
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const workspaceMime = gulp.src('resources/linux/code-workspace.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('BUILD/usr/share/mime/packages/' + product.applicationName + '-workspace.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('BUILD/usr/share/pixmaps/' + product.linuxIconName + '.png'));
@@ -175,7 +185,7 @@ function prepareRpmPackage(arch) {
const specIcon = gulp.src('resources/linux/rpm/code.xpm', { base: '.' })
.pipe(rename('SOURCES/' + product.applicationName + '.xpm'));
const all = es.merge(code, desktops, appdata, icon, bash_completion, zsh_completion, spec, specIcon);
const all = es.merge(code, desktops, appdata, workspaceMime, icon, bash_completion, zsh_completion, spec, specIcon);
return all.pipe(vfs.dest(getRpmBuildPath(rpmArch)));
};
@@ -249,33 +259,23 @@ function buildSnapPackage(arch) {
const BUILD_TARGETS = [
{ arch: 'x64' },
{ arch: 'arm' },
{ arch: 'armhf' },
{ arch: 'arm64' },
];
BUILD_TARGETS.forEach((buildTarget) => {
const arch = buildTarget.arch;
BUILD_TARGETS.forEach(({ arch }) => {
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
{
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
// gulp.task(prepareDebTask);
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
}
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
{
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
// gulp.task(prepareRpmTask);
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
}
{
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
}
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
});

View File
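The package-arch mapping changed above is easy to misread in diff form; here is a minimal self-contained TypeScript sketch of what getDebPackageArch and getRpmPackageArch return after this change (illustrative only; the real functions live in build/gulpfile.vscode.linux.js):

```ts
// Sketch of the deb/rpm package-arch mapping after this change.
type VSCodeArch = 'x64' | 'armhf' | 'arm64';

function getDebPackageArch(arch: VSCodeArch): string {
	// Debian uses 'amd64' for x64; armhf and arm64 keep their names.
	return { x64: 'amd64', armhf: 'armhf', arm64: 'arm64' }[arch];
}

function getRpmPackageArch(arch: VSCodeArch): string {
	// RPM spells the same targets differently.
	return { x64: 'x86_64', armhf: 'armv7hl', arm64: 'aarch64' }[arch];
}

// Example: the output paths the pipeline signs and publishes.
const arch: VSCodeArch = 'armhf';
console.log(`.build/linux/deb/${getDebPackageArch(arch)}`); // .build/linux/deb/armhf
console.log(`.build/linux/rpm/${getRpmPackageArch(arch)}`); // .build/linux/rpm/armv7hl
```

Electron itself still only understands 'arm' for 32-bit ARM, which is why the packaging code in the nearby gulpfile hunks maps arch === 'armhf' back to 'arm' when fetching Electron.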

@@ -55,7 +55,7 @@ function getElectron(arch) {
return () => {
const electronOpts = _.extend({}, exports.config, {
platform: process.platform,
arch,
arch: arch === 'armhf' ? 'arm' : arch,
ffmpegChromium: true,
keepDefaultApp: true
});

View File

@@ -61,7 +61,7 @@ function getElectron(arch: string): () => NodeJS.ReadWriteStream {
return () => {
const electronOpts = _.extend({}, config, {
platform: process.platform,
arch,
arch: arch === 'armhf' ? 'arm' : arch,
ffmpegChromium: true,
keepDefaultApp: true
});

View File

@@ -207,25 +207,25 @@ const externalExtensions = [
// they get packaged separately. Adding an extension name here will make the build create
// a separate vsix package for the extension, and the extension will be excluded from the main package.
// Any extension not included here will be installed by default.
'admin-pack',
'admin-tool-ext-win',
'agent',
'arc',
'asde-deployment',
'azdata',
'import',
'profiler',
'admin-pack',
'dacpac',
'schema-compare',
'azurehybridtoolkit',
'cms',
'query-history',
'dacpac',
'import',
'kusto',
'liveshare',
'sql-database-projects',
'machine-learning',
'profiler',
'query-history',
'schema-compare',
'sql-assessment',
'asde-deployment',
'sql-database-projects',
'sql-migration',
'data-workspace'
];
// extensions that require a rebuild since they have native parts
const rebuildExtensions = [

View File
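As the comment above explains, extensions listed in externalExtensions are built as standalone VSIX packages and left out of the main build. A hypothetical sketch of that split (the splitExtensions helper and the sample input are illustrative, not the repository's actual packaging code):

```ts
// Hypothetical sketch: extensions named in externalExtensions become separate
// VSIXes and are excluded from the main package; everything else ships in-box.
const externalExtensions = ['admin-pack', 'admin-tool-ext-win', 'agent', /* ... */ 'sql-migration', 'data-workspace'];

function splitExtensions(allExtensions: string[]) {
	const external = allExtensions.filter(name => externalExtensions.includes(name));
	const bundled = allExtensions.filter(name => !externalExtensions.includes(name));
	return { external, bundled };
}

const { external, bundled } = splitExtensions(['agent', 'mssql', 'dacpac', 'notebook']);
console.log(external); // ['agent', 'dacpac'] -> packaged as separate VSIXes
console.log(bundled);  // ['mssql', 'notebook'] -> installed by default
```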

@@ -241,25 +241,25 @@ const externalExtensions = [
// they get packaged separately. Adding an extension name here will make the build create
// a separate vsix package for the extension, and the extension will be excluded from the main package.
// Any extension not included here will be installed by default.
'admin-pack',
'admin-tool-ext-win',
'agent',
'arc',
'asde-deployment',
'azdata',
'import',
'profiler',
'admin-pack',
'dacpac',
'schema-compare',
'azurehybridtoolkit',
'cms',
'query-history',
'dacpac',
'import',
'kusto',
'liveshare',
'sql-database-projects',
'machine-learning',
'profiler',
'query-history',
'schema-compare',
'sql-assessment',
'asde-deployment',
'sql-database-projects',
'sql-migration',
'data-workspace'
];
// extensions that require a rebuild since they have native parts

View File

@@ -206,6 +206,10 @@
"name": "vs/workbench/contrib/webview",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/webviewPanel",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/customEditor",
"project": "vscode-workbench"

View File

@@ -1004,7 +1004,7 @@ function createResource(project: string, slug: string, xlfFile: File, apiHostnam
* https://dev.befoolish.co/tx-docs/public/projects/updating-content#what-happens-when-you-update-files
*/
function updateResource(project: string, slug: string, xlfFile: File, apiHostname: string, credentials: string): Promise<any> {
return new Promise((resolve, reject) => {
return new Promise<void>((resolve, reject) => {
const data = JSON.stringify({ content: xlfFile.contents.toString() });
const options = {
hostname: apiHostname,

View File

@@ -53,6 +53,13 @@ const CORE_TYPES = [
'trimLeft',
'trimRight'
];
// Types that are defined in a common layer but are known to be only
// available in native environments should not be allowed in browser
const NATIVE_TYPES = [
'NativeParsedArgs',
'INativeEnvironmentService',
'INativeWindowConfiguration'
];
const RULES = [
// Tests: skip
{
@@ -68,6 +75,37 @@ const RULES = [
'MessageEvent',
'data'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/argv.ts
{
target: '**/{vs,sql}/platform/environment/common/argv.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/environment.ts
{
target: '**/{vs,sql}/platform/environment/common/environment.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
]
},
// Common: vs/platform/windows/common/windows.ts
{
target: '**/{vs,sql}/platform/windows/common/windows.ts',
disallowedTypes: [ /* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -81,6 +119,7 @@ const RULES = [
// Safe access to global
'global'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -90,6 +129,7 @@ const RULES = [
{
target: '**/{vs,sql}/**/common/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts',
'@types/node' // no node.js
@@ -99,6 +139,7 @@ const RULES = [
{
target: '**/{vs,sql}/**/browser/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -107,6 +148,7 @@ const RULES = [
{
target: '**/src/{vs,sql}/editor/contrib/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -132,7 +174,7 @@ const RULES = [
},
// Electron (sandbox)
{
target: '**/vs/**/electron-sandbox/**',
target: '**/{vs,sql}/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
@@ -162,7 +204,7 @@ let hasErrors = false;
function checkFile(program, sourceFile, rule) {
checkNode(sourceFile);
function checkNode(node) {
var _a;
var _a, _b;
if (node.kind !== ts.SyntaxKind.Identifier) {
return ts.forEachChild(node, checkNode); // recurse down
}
@@ -170,6 +212,12 @@ function checkFile(program, sourceFile, rule) {
if ((_a = rule.allowedTypes) === null || _a === void 0 ? void 0 : _a.some(allowed => allowed === text)) {
return; // override
}
if ((_b = rule.disallowedTypes) === null || _b === void 0 ? void 0 : _b.some(disallowed => disallowed === text)) {
const { line, character } = sourceFile.getLineAndCharacterOfPosition(node.getStart());
console.log(`[build/lib/layersChecker.ts]: Reference to '${text}' violates layer '${rule.target}' (${sourceFile.fileName} (${line + 1},${character + 1})`);
hasErrors = true;
return;
}
const checker = program.getTypeChecker();
const symbol = checker.getSymbolAtLocation(node);
if (symbol) {

View File

@@ -55,6 +55,14 @@ const CORE_TYPES = [
'trimRight'
];
// Types that are defined in a common layer but are known to be only
// available in native environments should not be allowed in browser
const NATIVE_TYPES = [
'NativeParsedArgs',
'INativeEnvironmentService',
'INativeWindowConfiguration'
];
const RULES = [
// Tests: skip
@@ -73,6 +81,40 @@ const RULES = [
'MessageEvent',
'data'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/argv.ts
{
target: '**/{vs,sql}/platform/environment/common/argv.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/environment/common/environment.ts
{
target: '**/{vs,sql}/platform/environment/common/environment.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
]
},
// Common: vs/platform/windows/common/windows.ts
{
target: '**/{vs,sql}/platform/windows/common/windows.ts',
disallowedTypes: [/* Ignore native types that are defined from here */],
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -88,6 +130,7 @@ const RULES = [
// Safe access to global
'global'
],
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -98,6 +141,7 @@ const RULES = [
{
target: '**/{vs,sql}/**/common/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'lib.dom.d.ts', // no DOM
'@types/node' // no node.js
@@ -108,6 +152,7 @@ const RULES = [
{
target: '**/{vs,sql}/**/browser/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -117,6 +162,7 @@ const RULES = [
{
target: '**/src/{vs,sql}/editor/contrib/**',
allowedTypes: CORE_TYPES,
disallowedTypes: NATIVE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
]
@@ -145,7 +191,7 @@ const RULES = [
// Electron (sandbox)
{
target: '**/vs/**/electron-sandbox/**',
target: '**/{vs,sql}/**/electron-sandbox/**',
allowedTypes: CORE_TYPES,
disallowedDefinitions: [
'@types/node' // no node.js
@@ -181,6 +227,7 @@ interface IRule {
skip?: boolean;
allowedTypes?: string[];
disallowedDefinitions?: string[];
disallowedTypes?: string[];
}
let hasErrors = false;
@@ -199,6 +246,14 @@ function checkFile(program: ts.Program, sourceFile: ts.SourceFile, rule: IRule)
return; // override
}
if (rule.disallowedTypes?.some(disallowed => disallowed === text)) {
const { line, character } = sourceFile.getLineAndCharacterOfPosition(node.getStart());
console.log(`[build/lib/layersChecker.ts]: Reference to '${text}' violates layer '${rule.target}' (${sourceFile.fileName} (${line + 1},${character + 1})`);
hasErrors = true;
return;
}
const checker = program.getTypeChecker();
const symbol = checker.getSymbolAtLocation(node);
if (symbol) {

View File
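To make the new disallowedTypes rule concrete, here is a simplified, self-contained sketch of the check added above; the real checker in build/lib/layersChecker.ts walks the TypeScript AST, whereas this sketch treats identifiers as plain strings:

```ts
// Simplified sketch of the disallowedTypes check. Identifiers are plain strings
// here instead of ts.Node instances.
const NATIVE_TYPES = ['NativeParsedArgs', 'INativeEnvironmentService', 'INativeWindowConfiguration'];

interface IRule {
	target: string;
	allowedTypes?: string[];
	disallowedTypes?: string[];
}

function checkIdentifiers(identifiers: string[], rule: IRule): string[] {
	const violations: string[] = [];
	for (const text of identifiers) {
		if (rule.allowedTypes?.some(allowed => allowed === text)) {
			continue; // explicitly allowed, overrides everything else
		}
		if (rule.disallowedTypes?.some(disallowed => disallowed === text)) {
			violations.push(`Reference to '${text}' violates layer '${rule.target}'`);
		}
	}
	return violations;
}

// A browser-layer file must not reference native-only types.
const browserRule: IRule = { target: '**/{vs,sql}/**/browser/**', disallowedTypes: NATIVE_TYPES };
console.log(checkIdentifiers(['URI', 'INativeEnvironmentService'], browserRule));
// -> ["Reference to 'INativeEnvironmentService' violates layer '**/{vs,sql}/**/browser/**'"]
```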

@@ -15,7 +15,7 @@ const yarn = process.platform === 'win32' ? 'yarn.cmd' : 'yarn';
const rootDir = path.resolve(__dirname, '..', '..');
function runProcess(command: string, args: ReadonlyArray<string> = []) {
return new Promise((resolve, reject) => {
return new Promise<void>((resolve, reject) => {
const child = spawn(command, args, { cwd: rootDir, stdio: 'inherit', env: process.env });
child.on('exit', err => !err ? resolve() : process.exit(err ?? 1));
child.on('error', reject);

View File

@@ -60,12 +60,12 @@
"git": {
"name": "electron",
"repositoryUrl": "https://github.com/electron/electron",
"commitHash": "03c7a54dc534ce1867d4393b9b1a6989d4a7e005"
"commitHash": "fb03807cd21915ddc3aa2521ba4f5ba14597bd7e"
}
},
"isOnlyProductionDependency": true,
"license": "MIT",
"version": "9.2.1"
"version": "9.3.0"
},
{
"component": {

View File

@@ -2,7 +2,7 @@
"name": "agent",
"displayName": "SQL Server Agent",
"description": "Manage and troubleshoot SQL Server Agent jobs",
"version": "0.48.0",
"version": "0.49.0",
"publisher": "Microsoft",
"preview": true,
"license": "https://raw.githubusercontent.com/Microsoft/azuredatastudio/main/LICENSE.txt",

View File

@@ -2,7 +2,7 @@
Welcome to Microsoft Azure Arc Extension for Azure Data Studio!
**This extension is only applicable to customers in the Azure Arc data services private preview.**
**This extension is only applicable to customers in the Azure Arc data services public preview.**
## Overview

View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8.7 7.9L15.8 15L15 15.8L7.9 8.7L0.8 15.8L0 15L7.1 7.9L0 0.8L0.8 0L7.9 7.1L15 0L15.8 0.8L8.7 7.9Z" fill="#0078D4"/>
</svg>


View File

@@ -0,0 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 2048 2048" width="16" height="16">
<path d="M960 1920q-133 0-255-34t-230-96-194-150-150-195-97-229T0 960q0-133 34-255t96-230 150-194 195-150 229-97T960 0q133 0 255 34t230 96 194 150 150 195 97 229 34 256q0 133-34 255t-96 230-150 194-195 150-229 97-256 34zm0-1792q-115 0-221 30t-198 84-169 130-130 168-84 199-30 221q0 114 30 220t84 199 130 169 168 130 199 84 221 30q114 0 220-30t199-84 169-130 130-168 84-199 30-221q0-114-30-220t-84-199-130-169-168-130-199-84-221-30zm-64 640h128v640H896V768zm0-256h128v128H896V512z" />
</svg>


View File

@@ -0,0 +1,3 @@
<svg width="16" height="14" viewBox="0 0 16 14" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M14 0H14.4L14.7 0.2L14.9 0.5C14.9524 0.570883 14.9885 0.652432 15.0058 0.738849C15.023 0.825265 15.0211 0.914429 15 1V14H2.8L0.999997 12.2V1C0.985033 0.85904 1.02046 0.717335 1.1 0.6L1.3 0.3L1.6 0.1H14V0ZM14 1H13V7H3V1H2V11.8L3.2 13H4V9H11V13H14V1ZM4 6H12V1H4V6ZM10 10H5V13H6V11H7V13H10V10Z" fill="#0078D4"/>
</svg>


View File

@@ -8,5 +8,5 @@
not_numbered: true
expand_sections: true
sections:
- title: TSG100 - The Azure Arc Postgres troubleshooter
- title: TSG100 - The Azure Arc enabled PostgreSQL Hyperscale troubleshooter
url: postgres/tsg100-troubleshoot-postgres

View File

@@ -3,5 +3,5 @@
- This chapter contains notebooks for troubleshooting Postgres on Azure Arc
## Notebooks in this Chapter
- [TSG100 - The Azure Arc Postgres troubleshooter](tsg100-troubleshoot-postgres.ipynb)
- [TSG100 - The Azure Arc enabled PostgreSQL Hyperscale troubleshooter](tsg100-troubleshoot-postgres.ipynb)

View File

@@ -3,5 +3,5 @@
not_numbered: true
expand_sections: true
sections:
- title: TSG100 - The Azure Arc Postgres troubleshooter
- title: TSG100 - The Azure Arc enabled PostgreSQL Hyperscale troubleshooter
url: postgres/tsg100-troubleshoot-postgres

View File

@@ -4,13 +4,14 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"TSG100 - The Azure Arc Postgres troubleshooter\n",
"==============================================\n",
"TSG100 - The Azure Arc enabled PostgreSQL Hyperscale troubleshooter\n",
"===================================================================\n",
"\n",
"Description\n",
"-----------\n",
"\n",
"Follow these steps to troubleshoot an Azure Arc Postgres Server.\n",
"Follow these steps to troubleshoot an Azure Arc enabled PostgreSQL\n",
"Hyperscale Server.\n",
"\n",
"Steps\n",
"-----\n",
@@ -34,6 +35,7 @@
"# the user will be prompted to select a server.\n",
"namespace = os.environ.get('POSTGRES_SERVER_NAMESPACE')\n",
"name = os.environ.get('POSTGRES_SERVER_NAME')\n",
"version = os.environ.get('POSTGRES_SERVER_VERSION')\n",
"\n",
"tail_lines = 50"
]
@@ -143,7 +145,7 @@
" if cmd.startswith(\"kubectl \") and \"AZDATA_OPENSHIFT\" in os.environ:\n",
" cmd_actual[0] = cmd_actual[0].replace(\"kubectl\", \"oc\")\n",
"\n",
" # To aid supportabilty, determine which binary file will actually be executed on the machine\n",
" # To aid supportability, determine which binary file will actually be executed on the machine\n",
" #\n",
" which_binary = None\n",
"\n",
@@ -400,11 +402,11 @@
"import math\n",
"\n",
"# If a server was provided, get it\n",
"if namespace and name:\n",
" server = json.loads(run(f'kubectl get dbs -n {namespace} {name} -o json', return_output=True))\n",
"if namespace and name and version:\n",
" server = json.loads(run(f'kubectl get postgresql-{version} -n {namespace} {name} -o json', return_output=True))\n",
"else:\n",
" # Otherwise prompt the user to select a server\n",
" servers = json.loads(run(f'kubectl get dbs --all-namespaces -o json', return_output=True))['items']\n",
" servers = json.loads(run(f'kubectl get postgresqls --all-namespaces -o json', return_output=True))['items']\n",
" if not servers:\n",
" raise Exception('No Postgres servers found')\n",
"\n",
@@ -425,6 +427,7 @@
" server = servers[i-1]\n",
" namespace = server['metadata']['namespace']\n",
" name = server['metadata']['name']\n",
" version = server['kind'][len('postgresql-'):]\n",
" break\n",
"\n",
"display(Markdown(f'#### Got server {namespace}.{name}'))"
@@ -446,10 +449,10 @@
"uid = server['metadata']['uid']\n",
"\n",
"display(Markdown(f'#### Server summary'))\n",
"run(f'kubectl get dbs -n {namespace} {name}')\n",
"run(f'kubectl get postgresql-{version} -n {namespace} {name}')\n",
"\n",
"display(Markdown(f'#### Resource summary'))\n",
"run(f'kubectl get pods,pvc,svc,ep -n {namespace} -l dusky.microsoft.com/serviceId={uid}')"
"run(f'kubectl get sts,pods,pvc,svc,ep -n {namespace} -l postgresqls.arcdata.microsoft.com/cluster-id={uid}')"
]
},
{
@@ -466,7 +469,7 @@
"outputs": [],
"source": [
"display(Markdown(f'#### Troubleshooting server {namespace}.{name}'))\n",
"run(f'kubectl describe dbs -n {namespace} {name}')"
"run(f'kubectl describe postgresql-{version} -n {namespace} {name}')"
]
},
{
@@ -482,7 +485,7 @@
"metadata": {},
"outputs": [],
"source": [
"pods = json.loads(run(f'kubectl get pods -n {namespace} -l dusky.microsoft.com/serviceId={uid} -o json', return_output=True))['items']\n",
"pods = json.loads(run(f'kubectl get pods -n {namespace} -l postgresqls.arcdata.microsoft.com/cluster-id={uid} -o json', return_output=True))['items']\n",
"\n",
"# Summarize and describe each pod\n",
"for pod in pods:\n",
@@ -529,8 +532,7 @@
" con_restarts = con_status.get('restartCount', 0)\n",
"\n",
" display(Markdown(f'#### Troubleshooting container {namespace}.{pod_name}/{con_name} ({i+1}/{len(cons)})\\n'\n",
" f'#### {\"S\" if con_started else \"Not s\"}tarted and '\n",
" f'{\"\" if con_ready else \"not \"}ready with {con_restarts} restarts'))\n",
" f'#### {\"R\" if con_ready else \"Not r\"}eady with {con_restarts} restarts'))\n",
"\n",
" run(f'kubectl logs -n {namespace} {pod_name} {con_name} --tail {tail_lines}')\n",
"\n",
@@ -554,7 +556,7 @@
"outputs": [],
"source": [
"display(Markdown(f'#### Troubleshooting PersistentVolumeClaims'))\n",
"run(f'kubectl describe pvc -n {namespace} -l dusky.microsoft.com/serviceId={uid}')"
"run(f'kubectl describe pvc -n {namespace} -l postgresqls.arcdata.microsoft.com/cluster-id={uid}')"
]
},
{

View File

@@ -47,7 +47,7 @@
"|Tools|Description|Installation|\n",
"|---|---|---|\n",
"|kubectl | Command-line tool for monitoring the underlying Kubernetes cluster | [Installation](https://kubernetes.io/docs/tasks/tools/install-kubectl/#install-kubectl-binary-using-native-package-management) |\n",
"|azdata | Command-line tool for installing and managing resources in an Azure Arc cluster |[Installation](https://github.com/microsoft/Azure-data-services-on-Azure-Arc/blob/master/scenarios/001-install-client-tools.md) |"
"|Azure Data CLI (azdata) | Command-line tool for installing and managing resources in an Azure Arc cluster |[Installation](https://docs.microsoft.com/sql/azdata/install/deploy-install-azdata) |"
],
"metadata": {
"azdata_cell_guid": "714582b9-10ee-409e-ab12-15a4825c9471"
@@ -65,13 +65,7 @@
{
"cell_type": "code",
"source": [
"import pandas,sys,os,json,html,getpass,time, tempfile\n",
"pandas_version = pandas.__version__.split('.')\n",
"pandas_major = int(pandas_version[0])\n",
"pandas_minor = int(pandas_version[1])\n",
"pandas_patch = int(pandas_version[2])\n",
"if not (pandas_major > 0 or (pandas_major == 0 and pandas_minor > 24) or (pandas_major == 0 and pandas_minor == 24 and pandas_patch >= 2)):\n",
" sys.exit('Please upgrade the Notebook dependency before you can proceed, you can do it by running the \"Reinstall Notebook dependencies\" command in command palette (View menu -> Command Palette…).')\n",
"import sys,os,json,html,getpass,time, tempfile\n",
"def run_command(command):\n",
" print(\"Executing: \" + command)\n",
" !{command}\n",
@@ -90,7 +84,7 @@
"cell_type": "markdown",
"source": [
"### **Set variables**\n",
"Generated by Azure Data Studio using the values collected in the Azure Arc Data controller create wizard"
"Generated by Azure Data Studio using the values collected in the 'Create Azure Arc data controller' wizard."
],
"metadata": {
"azdata_cell_guid": "4b266b2d-bd1b-4565-92c9-3fc146cdce6d"
@@ -129,18 +123,22 @@
{
"cell_type": "code",
"source": [
"if \"AZDATA_NB_VAR_ARC_DOCKER_PASSWORD\" in os.environ:\n",
" arc_docker_password = os.environ[\"AZDATA_NB_VAR_ARC_DOCKER_PASSWORD\"]\n",
"if \"AZDATA_NB_VAR_ARC_ADMIN_PASSWORD\" in os.environ:\n",
" arc_admin_password = os.environ[\"AZDATA_NB_VAR_ARC_ADMIN_PASSWORD\"]\n",
"else:\n",
" if arc_admin_password == \"\":\n",
" arc_admin_password = getpass.getpass(prompt = 'Azure Arc Data controller password')\n",
" arc_admin_password = getpass.getpass(prompt = 'Azure Arc Data Controller password')\n",
" if arc_admin_password == \"\":\n",
" sys.exit(f'Password is required.')\n",
" confirm_password = getpass.getpass(prompt = 'Confirm password')\n",
" if arc_admin_password != confirm_password:\n",
" sys.exit(f'Passwords do not match.')"
" sys.exit(f'Passwords do not match.')\n",
"\n",
"os.environ[\"SPN_CLIENT_ID\"] = sp_client_id\n",
"os.environ[\"SPN_TENANT_ID\"] = sp_tenant_id\n",
"if \"AZDATA_NB_VAR_SP_CLIENT_SECRET\" in os.environ:\n",
" os.environ[\"SPN_CLIENT_SECRET\"] = os.environ[\"AZDATA_NB_VAR_SP_CLIENT_SECRET\"]\n",
"os.environ[\"SPN_AUTHORITY\"] = \"https://login.microsoftonline.com\""
],
"metadata": {
"azdata_cell_guid": "e7e10828-6cae-45af-8c2f-1484b6d4f9ac",
@@ -175,7 +173,7 @@
{
"cell_type": "markdown",
"source": [
"### **Create Azure Arc Data controller**"
"### **Create Azure Arc Data Controller**"
],
"metadata": {
"azdata_cell_guid": "efe78cd3-ed73-4c9b-b586-fdd6c07dd37f"
@@ -184,16 +182,14 @@
{
"cell_type": "code",
"source": [
"print (f'Creating Azure Arc controller: {arc_data_controller_name} using configuration {arc_cluster_context}')\n",
"print (f'Creating Azure Arc Data Controller: {arc_data_controller_name} using configuration {arc_cluster_context}')\n",
"os.environ[\"ACCEPT_EULA\"] = 'yes'\n",
"os.environ[\"AZDATA_USERNAME\"] = arc_admin_username\n",
"os.environ[\"AZDATA_PASSWORD\"] = arc_admin_password\n",
"os.environ[\"DOCKER_USERNAME\"] = arc_docker_username\n",
"os.environ[\"DOCKER_PASSWORD\"] = arc_docker_password\n",
"if os.name == 'nt':\n",
" print(f'If you don\\'t see output produced by azdata, you can run the following command in a terminal window to check the deployment status:\\n\\t {os.environ[\"AZDATA_NB_VAR_KUBECTL\"]} get pods -n {arc_data_controller_namespace}')\n",
"run_command(f'azdata arc dc create --connectivity-mode {arc_data_controller_connectivity_mode} -n {arc_data_controller_name} -ns {arc_data_controller_namespace} -s {arc_subscription} -g {arc_resource_group} -l {arc_data_controller_location} -sc {arc_data_controller_storage_class} --profile-name {arc_profile}')\n",
"print(f'Azure Arc Data controller cluster: {arc_data_controller_name} created.') "
"print(f'Azure Arc Data Controller: {arc_data_controller_name} created.') "
],
"metadata": {
"azdata_cell_guid": "373947a1-90b9-49ee-86f4-17a4c7d4ca76",
@@ -205,7 +201,7 @@
{
"cell_type": "markdown",
"source": [
"### **Setting context to created Azure Arc Data controller**"
"### **Setting context to created Azure Arc Data Controller**"
],
"metadata": {
"azdata_cell_guid": "a3ddc701-811d-4058-b3fb-b7295fcf50ae"
@@ -214,7 +210,7 @@
{
"cell_type": "code",
"source": [
"# Setting context to data controller.\n",
"# Setting context to Data Controller.\n",
"#\n",
"run_command(f'kubectl config set-context --current --namespace {arc_data_controller_namespace}')"
],
@@ -227,7 +223,7 @@
{
"cell_type": "markdown",
"source": [
"### **Login to the data controller.**\n"
"### **Login to the Data Controller.**\n"
],
"metadata": {
"azdata_cell_guid": "9376b2ab-0edf-478f-9e3c-5ff46ae3501a"
@@ -236,9 +232,9 @@
{
"cell_type": "code",
"source": [
"# Login to the data controller.\n",
"# Login to the Data Controller.\n",
"#\n",
"run_command(f'azdata login -n {arc_data_controller_namespace}')"
"run_command(f'azdata login --namespace {arc_data_controller_namespace}')"
],
"metadata": {
"azdata_cell_guid": "9aed0c5a-2c8a-4ad7-becb-60281923a196"
@@ -247,4 +243,4 @@
"execution_count": null
}
]
}
}

View File

@@ -25,12 +25,12 @@
"source": [
"![Microsoft](https://raw.githubusercontent.com/microsoft/azuredatastudio/main/extensions/arc/images/microsoft-small-logo.png)\n",
" \n",
"## Deploy a PostgreSQL server group on an existing Azure Arc data cluster\n",
"## Create a PostgreSQL Hyperscale - Azure Arc on an existing Azure Arc Data Controller\n",
" \n",
"This notebook walks through the process of deploying a PostgreSQL server group on an existing Azure Arc data cluster.\n",
"This notebook walks through the process of creating a PostgreSQL Hyperscale - Azure Arc on an existing Azure Arc Data Controller.\n",
" \n",
"* Follow the instructions in the **Prerequisites** cell to install the tools if not already installed.\n",
"* Make sure you have the target Azure Arc data cluster already created.\n",
"* Make sure you have the target Azure Arc Data Controller already created.\n",
"\n",
"<span style=\"color:red\"><font size=\"3\">Please press the \"Run All\" button to run the notebook</font></span>"
],
@@ -41,7 +41,21 @@
{
"cell_type": "markdown",
"source": [
"### **Check prerequisites**"
"### **Prerequisites** \n",
"Ensure the following tools are installed and added to PATH before proceeding.\n",
" \n",
"|Tools|Description|Installation|\n",
"|---|---|---|\n",
"|Azure Data CLI (azdata) | Command-line tool for installing and managing resources in an Azure Arc cluster |[Installation](https://docs.microsoft.com/sql/azdata/install/deploy-install-azdata) |"
],
"metadata": {
"azdata_cell_guid": "20fe3985-a01e-461c-bce0-235f7606cc3c"
}
},
{
"cell_type": "markdown",
"source": [
"### **Setup and Check Prerequisites**"
],
"metadata": {
"azdata_cell_guid": "68531b91-ddce-47d7-a1d8-2ddc3d17f3e7"
@@ -75,80 +89,20 @@
{
"cell_type": "markdown",
"source": [
"#### **Ensure Postgres Server Group name and password exist**"
"### **Set variables**\n",
"\n",
"#### \n",
"\n",
"Generated by Azure Data Studio using the values collected in the 'Deploy PostgreSQL Hyperscale - Azure Arc instance' wizard"
],
"metadata": {
"azdata_cell_guid": "68ec0760-27d1-4ded-9a9f-89077c40b8bb"
}
},
{
"cell_type": "code",
"source": [
"# Required Values\n",
"env_var = \"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME\" in os.environ\n",
"if env_var:\n",
" server_group_name = os.environ[\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME was not defined. Exiting\\n')\n",
"env_var = \"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PASSWORD\" in os.environ\n",
"if env_var:\n",
" postgres_password = os.environ[\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PASSWORD\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PASSWORD was not defined. Exiting\\n') \n",
"env_var = \"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_DATA\" in os.environ\n",
"if env_var:\n",
" postgres_storage_class_data = os.environ[\"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_DATA\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_DATA was not defined. Exiting\\n') \n",
"env_var = \"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_LOGS\" in os.environ\n",
"if env_var:\n",
" postgres_storage_class_logs = os.environ[\"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_LOGS\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_LOGS was not defined. Exiting\\n') \n",
"env_var = \"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_BACKUPS\" in os.environ\n",
"if env_var:\n",
" postgres_storage_class_backups = os.environ[\"AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_BACKUPS\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_POSTGRES_STORAGE_CLASS_BACKUPS was not defined. Exiting\\n') \n",
""
],
"metadata": {
"azdata_cell_guid": "53769960-e1f8-4477-b4cf-3ab1ea34348b",
"tags": []
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
"#### **Get optional parameters for the PostgreSQL server group**"
],
"metadata": {
"azdata_cell_guid": "68ec0760-27d1-4ded-9a9f-89077c40b8bb"
}
},
{
"cell_type": "code",
"source": [
"server_group_workers = os.environ[\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_WORKERS\"]\n",
"server_group_port = os.environ.get(\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PORT\")\n",
"server_group_cores_request = os.environ.get(\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_REQUEST\")\n",
"server_group_cores_limit = os.environ.get(\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CORES_LIMIT\")\n",
"server_group_memory_request = os.environ.get(\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_REQUEST\")\n",
"server_group_memory_limit = os.environ.get(\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_LIMIT\")"
],
"metadata": {
"azdata_cell_guid": "53769960-e1f8-4477-b4cf-3ab1ea34348b",
"tags": []
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
"### **Installing PostgreSQL server group**"
"### **Creating the PostgreSQL Hyperscale - Azure Arc instance**"
],
"metadata": {
"azdata_cell_guid": "90b0e162-2987-463f-9ce6-12dda1267189"
@@ -157,17 +111,37 @@
{
"cell_type": "code",
"source": [
"print (f'Creating a PostgreSQL server group on Azure Arc')\n",
"# Login to the data controller.\n",
"#\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_CONTROLLER_PASSWORD\"]\n",
"cmd = f'azdata login -e {controller_endpoint} -u {controller_username}'\n",
"out=run_command()"
],
"metadata": {
"azdata_cell_guid": "71366399-5963-4e24-b2f2-6bb5bffba4ec"
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "code",
"source": [
"print (f'Creating the PostgreSQL Hyperscale - Azure Arc instance')\n",
"\n",
"workers_option = f' -w {server_group_workers}' if server_group_workers else \"\"\n",
"port_option = f' --port \"{server_group_port}\"' if server_group_port else \"\"\n",
"cores_request_option = f' -cr \"{server_group_cores_request}\"' if server_group_cores_request else \"\"\n",
"cores_limit_option = f' -cl \"{server_group_cores_limit}\"' if server_group_cores_limit else \"\"\n",
"memory_request_option = f' -mr \"{server_group_memory_request}Mi\"' if server_group_memory_request else \"\"\n",
"memory_limit_option = f' -ml \"{server_group_memory_limit}Mi\"' if server_group_memory_limit else \"\"\n",
"workers_option = f' -w {postgres_server_group_workers}' if postgres_server_group_workers else \"\"\n",
"port_option = f' --port \"{postgres_server_group_port}\"' if postgres_server_group_port else \"\"\n",
"engine_version_option = f' -ev {postgres_server_group_engine_version}' if postgres_server_group_engine_version else \"\"\n",
"extensions_option = f' --extensions \"{postgres_server_group_extensions}\"' if postgres_server_group_extensions else \"\"\n",
"volume_size_data_option = f' -vsd {postgres_server_group_volume_size_data}Gi' if postgres_server_group_volume_size_data else \"\"\n",
"volume_size_logs_option = f' -vsl {postgres_server_group_volume_size_logs}Gi' if postgres_server_group_volume_size_logs else \"\"\n",
"volume_size_backups_option = f' -vsb {postgres_server_group_volume_size_backups}Gi' if postgres_server_group_volume_size_backups else \"\"\n",
"cores_request_option = f' -cr \"{postgres_server_group_cores_request}\"' if postgres_server_group_cores_request else \"\"\n",
"cores_limit_option = f' -cl \"{postgres_server_group_cores_limit}\"' if postgres_server_group_cores_limit else \"\"\n",
"memory_request_option = f' -mr \"{postgres_server_group_memory_request}Gi\"' if postgres_server_group_memory_request else \"\"\n",
"memory_limit_option = f' -ml \"{postgres_server_group_memory_limit}Gi\"' if postgres_server_group_memory_limit else \"\"\n",
"\n",
"os.environ[\"AZDATA_PASSWORD\"] = postgres_password\n",
"cmd = f'azdata arc postgres server create -n {server_group_name} -scd {postgres_storage_class_data} -scl {postgres_storage_class_logs} -scb {postgres_storage_class_backups}{workers_option}{port_option}{cores_request_option}{cores_limit_option}{memory_request_option}{memory_limit_option}'\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PASSWORD\"]\n",
"cmd = f'azdata arc postgres server create -n {postgres_server_group_name} -scd {postgres_storage_class_data} -scl {postgres_storage_class_logs} -scb {postgres_storage_class_backups}{workers_option}{port_option}{engine_version_option}{extensions_option}{volume_size_data_option}{volume_size_logs_option}{volume_size_backups_option}{cores_request_option}{cores_limit_option}{memory_request_option}{memory_limit_option}'\n",
"out=run_command()"
],
"metadata": {
@@ -177,4 +151,4 @@
"execution_count": null
}
]
}
}

View File

@@ -25,12 +25,12 @@
"source": [
"![Microsoft](https://raw.githubusercontent.com/microsoft/azuredatastudio/main/extensions/arc/images/microsoft-small-logo.png)\n",
" \n",
"## Deploy Azure SQL managed instance on an existing Azure Arc data cluster\n",
"## Create SQL managed instance - Azure Arc on an existing Azure Arc Data Controller\n",
" \n",
"This notebook walks through the process of deploying a <a href=\"https://docs.microsoft.com/azure/sql-database/sql-database-managed-instance\">Azure SQL managed instance</a> on an existing Azure Arc data cluster.\n",
"This notebook walks through the process of creating a <a href=\"https://docs.microsoft.com/azure/sql-database/sql-database-managed-instance\">SQL managed instance - Azure Arc</a> on an existing Azure Arc Data Controller.\n",
" \n",
"* Follow the instructions in the **Prerequisites** cell to install the tools if not already installed.\n",
"* Make sure you have the target Azure Arc data cluster already created.\n",
"* Make sure you have the target Azure Arc Data Controller already created.\n",
"\n",
"<span style=\"color:red\"><font size=\"3\">Please press the \"Run All\" button to run the notebook</font></span>"
],
@@ -41,7 +41,21 @@
{
"cell_type": "markdown",
"source": [
"### **Check prerequisites**"
"### **Prerequisites** \n",
"Ensure the following tools are installed and added to PATH before proceeding.\n",
" \n",
"|Tools|Description|Installation|\n",
"|---|---|---|\n",
"|Azure Data CLI (azdata) | Command-line tool for installing and managing resources in an Azure Arc cluster |[Installation](https://docs.microsoft.com/sql/azdata/install/deploy-install-azdata) |"
],
"metadata": {
"azdata_cell_guid": "d1c8258e-9efd-4380-a48c-cd675423ed2f"
}
},
{
"cell_type": "markdown",
"source": [
"### **Setup and Check Prerequisites**"
],
"metadata": {
"azdata_cell_guid": "68531b91-ddce-47d7-a1d8-2ddc3d17f3e7"
@@ -75,49 +89,20 @@
{
"cell_type": "markdown",
"source": [
"#### **Ensure SQL instance name, username and password exist**"
"### **Set variables**\n",
"\n",
"#### \n",
"\n",
"Generated by Azure Data Studio using the values collected in the 'Deploy Azure SQL managed instance - Azure Arc' wizard"
],
"metadata": {
"azdata_cell_guid": "68ec0760-27d1-4ded-9a9f-89077c40b8bb"
}
},
{
"cell_type": "code",
"source": [
"# Required Values\n",
"env_var = \"AZDATA_NB_VAR_SQL_INSTANCE_NAME\" in os.environ\n",
"if env_var:\n",
" mssql_instance_name = os.environ[\"AZDATA_NB_VAR_SQL_INSTANCE_NAME\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_SQL_INSTANCE_NAME was not defined. Exiting\\n')\n",
"env_var = \"AZDATA_NB_VAR_SQL_PASSWORD\" in os.environ\n",
"if env_var:\n",
" mssql_password = os.environ[\"AZDATA_NB_VAR_SQL_PASSWORD\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_SQL_PASSWORD was not defined. Exiting\\n') \n",
"env_var = \"AZDATA_NB_VAR_SQL_STORAGE_CLASS_DATA\" in os.environ\n",
"if env_var:\n",
" mssql_storage_class_data = os.environ[\"AZDATA_NB_VAR_SQL_STORAGE_CLASS_DATA\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_SQL_STORAGE_CLASS_DATA was not defined. Exiting\\n') \n",
"env_var = \"AZDATA_NB_VAR_SQL_STORAGE_CLASS_LOGS\" in os.environ\n",
"if env_var:\n",
" mssql_storage_class_logs = os.environ[\"AZDATA_NB_VAR_SQL_STORAGE_CLASS_LOGS\"]\n",
"else:\n",
" sys.exit(f'environment variable: AZDATA_NB_VAR_SQL_STORAGE_CLASS_LOGS was not defined. Exiting\\n') \n",
""
],
"metadata": {
"azdata_cell_guid": "53769960-e1f8-4477-b4cf-3ab1ea34348b",
"tags": []
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "markdown",
"source": [
"### **Installing Managed SQL Instance**"
"### **Creating the SQL managed instance - Azure Arc instance**"
],
"metadata": {
"azdata_cell_guid": "90b0e162-2987-463f-9ce6-12dda1267189"
@@ -126,10 +111,31 @@
{
"cell_type": "code",
"source": [
"print (f'Creating Managed SQL Server instance on Azure Arc')\n",
"# Login to the data controller.\n",
"#\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_CONTROLLER_PASSWORD\"]\n",
"cmd = f'azdata login -e {controller_endpoint} -u {controller_username}'\n",
"out=run_command()"
],
"metadata": {
"azdata_cell_guid": "1437c536-17e8-4a7f-80c1-aa43ad02686c"
},
"outputs": [],
"execution_count": null
},
{
"cell_type": "code",
"source": [
"print (f'Creating the SQL managed instance - Azure Arc instance')\n",
"\n",
"os.environ[\"AZDATA_PASSWORD\"] = mssql_password\n",
"cmd = f'azdata arc sql mi create -n {mssql_instance_name} -scd {mssql_storage_class_data} -scl {mssql_storage_class_logs}'\n",
"cores_request_option = f' -cr \"{sql_cores_request}\"' if sql_cores_request else \"\"\n",
"cores_limit_option = f' -cl \"{sql_cores_limit}\"' if sql_cores_limit else \"\"\n",
"memory_request_option = f' -mr \"{sql_memory_request}Gi\"' if sql_memory_request else \"\"\n",
"memory_limit_option = f' -ml \"{sql_memory_limit}Gi\"' if sql_memory_limit else \"\"\n",
"\n",
"os.environ[\"AZDATA_USERNAME\"] = sql_username\n",
"os.environ[\"AZDATA_PASSWORD\"] = os.environ[\"AZDATA_NB_VAR_SQL_PASSWORD\"]\n",
"cmd = f'azdata arc sql mi create -n {sql_instance_name} -scd {sql_storage_class_data} -scl {sql_storage_class_logs}{cores_request_option}{cores_limit_option}{memory_request_option}{memory_limit_option}'\n",
"out=run_command()"
],
"metadata": {
@@ -139,4 +145,4 @@
"execution_count": null
}
]
}
}

File diff suppressed because it is too large

View File

@@ -12,75 +12,87 @@
"command.editConnection.title": "Edit Connection",
"arc.openDashboard": "Manage",
"resource.type.azure.arc.display.name": "Azure Arc data controller",
"resource.type.azure.arc.display.name": "Azure Arc data controller (preview)",
"resource.type.azure.arc.description": "Creates an Azure Arc data controller",
"arc.control.plane.new.wizard.title": "Create Azure Arc data controller",
"arc.control.plane.cluster.environment.title": "What is your target existing Kubernetes cluster environment?",
"arc.control.plane.select.cluster.title": "Select from existing Kubernetes clusters",
"arc.control.plane.kube.cluster.context": "Cluster context",
"arc.control.plane.container.registry.title": "Container registry details",
"arc.control.plane.container.registry.name": "Container registry login",
"arc.control.plane.container.registry.password": "Container registry password",
"arc.control.plane.cluster.config.profile.title": "Choose the config profile",
"arc.control.plane.cluster.config.profile": "Config profile",
"arc.control.plane.data.controller.create.title": "Provide details to create Azure Arc data controller",
"arc.control.plane.project.details.title": "Project details",
"arc.control.plane.project.details.description": "Select the subscription to manage deployed resources and costs. Use resource groups like folders to organize and manage all your resources.",
"arc.control.plane.data.controller.details.title": "Data controller details",
"arc.control.plane.data.controller.details.description": "Provide an Azure region and a name for your Azure Arc data controller. This name will be used to identify your Arc location for remote management and monitoring.",
"arc.control.plane.arc.data.controller.connectivity.mode": "Data controller connectivity mode",
"arc.control.plane.arc.data.controller.namespace": "Data controller namespace",
"arc.control.plane.arc.data.controller.namespace.validation.description": "Data controller namespace (lower case letters, digits and - only)",
"arc.control.plane.arc.data.controller.name": "Data controller name",
"arc.control.plane.arc.data.controller.name.validation.description": "Data controller name (lower case letters, digits and - only)",
"arc.control.plane.arc.data.controller.location": "Location",
"arc.control.plane.admin.account.title": "Administrator account",
"arc.control.plane.admin.account.name": "Data controller login",
"arc.control.plane.admin.account.password": "Password",
"arc.control.plane.admin.account.confirm.password": "Confirm password",
"arc.control.plane.data.controller.create.summary.title": "Review your configuration",
"arc.control.plane.summary.arc.data.controller": "Azure Arc data controller",
"arc.control.plane.summary.estimated.cost.per.month": "Estimated cost per month",
"arc.control.plane.summary.arc.by.microsoft" : "by Microsoft",
"arc.control.plane.summary.free" : "Free",
"arc.control.plane.summary.arc.terms.of.use" : "Terms of use",
"arc.control.plane.summary.arc.terms.separator" : "|",
"arc.control.plane.summary.arc.terms.privacy.policy" : "Privacy policy",
"arc.control.plane.summary.terms" : "Terms",
"arc.control.plane.summary.terms.description": "By clicking 'Script to notebook', I (a) agree to the legal terms and privacy statement(s) associated with the Marketplace offering(s) listed above; (b) authorize Microsoft to bill my current payment method for the fees associated with the offering(s), with the same billing frequency as my Azure subscription; and (c) agree that Microsoft may share my contact, usage and transactional information with the provider(s) of the offering(s) for support, billing and other transactional activities. Microsoft does not provide rights for third-party offerings. For additional details see {0}.",
"arc.control.plane.summary.terms.link.text": "Azure Marketplace Terms",
"arc.control.plane.summary.kubernetes": "Kubernetes",
"arc.control.plane.summary.kube.config.file.path": "Kube config file path",
"arc.control.plane.summary.cluster.context": "Cluster context",
"arc.control.plane.summary.profile": "Config profile",
"arc.control.plane.summary.username": "Username",
"arc.control.plane.summary.docker.username": "Docker username",
"arc.control.plane.summary.azure": "Azure",
"arc.control.plane.summary.subscription": "Subscription",
"arc.control.plane.summary.resource.group": "Resource group",
"arc.control.plane.summary.data.controller.connectivity.mode": "Data controller connectivity mode",
"arc.control.plane.summary.data.controller.name": "Data controller name",
"arc.control.plane.summary.data.controller.namespace": "Data controller namespace",
"arc.control.plane.summary.location": "Location",
"arc.control.plane.arc.data.controller.agreement": "I accept {0} and {1}.",
"arc.data.controller.new.wizard.title": "Create Azure Arc data controller",
"arc.data.controller.cluster.environment.title": "What is your target existing Kubernetes cluster environment?",
"arc.data.controller.select.cluster.title": "Select from existing Kubernetes clusters",
"arc.data.controller.kube.cluster.context": "Cluster context",
"arc.data.controller.cluster.config.profile.title": "Choose the config profile",
"arc.data.controller.cluster.config.profile": "Config profile",
"arc.data.controller.create.azureconfig.title": "Azure and Connectivity Configuration",
"arc.data.controller.connectivitymode.description": "Select the connectivity mode for the controller.",
"arc.data.controller.create.controllerconfig.title": "Controller Configuration",
"arc.data.controller.project.details.title": "Azure details",
"arc.data.controller.project.details.description": "Select the subscription to manage deployed resources and costs. Use resource groups like folders to organize and manage all your resources.",
"arc.data.controller.details.title": "Data controller details",
"arc.data.controller.details.description": "Provide a namespace, name and storage class for your Azure Arc data controller. This name will be used to identify your Arc instance for remote management and monitoring.",
"arc.data.controller.namespace": "Data controller namespace",
"arc.data.controller.namespace.validation.description": "Namespace must consist of lower case alphanumeric characters or '-', start/end with an alphanumeric character, and be 63 characters or fewer in length.",
"arc.data.controller.name": "Data controller name",
"arc.data.controller.name.validation.description": "Name must consist of lower case alphanumeric characters, '-' or '.', start/end with an alphanumeric character and be 253 characters or less in length.",
"arc.data.controller.location": "Location",
"arc.data.controller.admin.account.title": "Administrator account",
"arc.data.controller.admin.account.name": "Data controller login",
"arc.data.controller.admin.account.password": "Password",
"arc.data.controller.admin.account.confirm.password": "Confirm password",
"arc.data.controller.connectivitymode": "Connectivity Mode",
"arc.data.controller.direct": "Direct",
"arc.data.controller.indirect": "Indirect",
"arc.data.controller.serviceprincipal.description": "When deploying a controller in direct connected mode a Service Principal is required for uploading metrics to Azure. {0} about how to create this Service Principal and assign it the correct roles.",
"arc.data.controller.spclientid": "Service Principal Client ID",
"arc.data.controller.spclientid.description": "The Application (client) ID of the created Service Principal",
"arc.data.controller.spclientid.validation.description": "The client ID must be a GUID in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"arc.data.controller.spclientsecret": "Service Principal Client Secret",
"arc.data.controller.spclientsecret.description": "The password generated during creation of the Service Principal",
"arc.data.controller.sptenantid": "Service Principal Tenant ID",
"arc.data.controller.sptenantid.description": "The Tenant ID of the Service Principal. This must be the same as the Tenant ID of the subscription selected to create this controller for.",
"arc.data.controller.sptenantid.validation.description": "The tenant ID must be a GUID in the format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"arc.data.controller.create.summary.title": "Review your configuration",
"arc.data.controller.summary.arc.data.controller": "Azure Arc data controller",
"arc.data.controller.summary.estimated.cost.per.month": "Estimated cost per month",
"arc.data.controller.summary.arc.by.microsoft" : "by Microsoft",
"arc.data.controller.summary.free" : "Free",
"arc.data.controller.summary.arc.terms.of.use" : "Terms of use",
"arc.data.controller.summary.arc.terms.separator" : "|",
"arc.data.controller.summary.arc.terms.privacy.policy" : "Privacy policy",
"arc.data.controller.summary.terms" : "Terms",
"arc.data.controller.summary.terms.description": "By clicking 'Script to notebook', I (a) agree to the legal terms and privacy statement(s) associated with the Marketplace offering(s) listed above; (b) authorize Microsoft to bill my current payment method for the fees associated with the offering(s), with the same billing frequency as my Azure subscription; and (c) agree that Microsoft may share my contact, usage and transactional information with the provider(s) of the offering(s) for support, billing and other transactional activities. Microsoft does not provide rights for third-party offerings. For additional details see {0}.",
"arc.data.controller.summary.terms.link.text": "Azure Marketplace Terms",
"arc.data.controller.summary.kubernetes": "Kubernetes",
"arc.data.controller.summary.kube.config.file.path": "Kube config file path",
"arc.data.controller.summary.cluster.context": "Cluster context",
"arc.data.controller.summary.profile": "Config profile",
"arc.data.controller.summary.username": "Username",
"arc.data.controller.summary.azure": "Azure",
"arc.data.controller.summary.subscription": "Subscription",
"arc.data.controller.summary.resource.group": "Resource group",
"arc.data.controller.summary.data.controller.name": "Data controller name",
"arc.data.controller.summary.data.controller.namespace": "Data controller namespace",
"arc.data.controller.summary.controller": "Controller",
"arc.data.controller.summary.location": "Location",
"arc.data.controller.agreement": "I accept {0} and {1}.",
"arc.data.controller.readmore": "Read more",
"microsoft.agreement.privacy.statement":"Microsoft Privacy Statement",
"arc.agreement.azdata.eula":"azdata license terms",
"deploy.arc.control.plane.action":"Script to notebook",
"deploy.script.action":"Script to notebook",
"deploy.done.action":"Deploy",
"resource.type.arc.sql.display.name": "Azure SQL managed instance - Azure Arc (preview)",
"resource.type.arc.postgres.display.name": "PostgreSQL server groups - Azure Arc (preview)",
"resource.type.arc.postgres.display.name": "PostgreSQL Hyperscale server groups - Azure Arc (preview)",
"resource.type.arc.sql.description": "Managed SQL Instance service for app developers in a customer-managed environment",
"resource.type.arc.postgres.description": "Deploy PostgreSQL server groups into an Azure Arc environment",
"resource.type.picker.display.name": "Resource Type",
"sql.managed.instance.display.name": "Azure SQL managed instance - Azure Arc",
"postgres.server.group.display.name": "PostgreSQL server groups - Azure Arc",
"arc.sql.new.dialog.title": "Deploy Azure SQL managed instance - Azure Arc (preview)",
"arc.sql.settings.section.title": "SQL Connection information",
"resource.type.arc.postgres.description": "Deploy PostgreSQL Hyperscale server groups into an Azure Arc environment",
"arc.controller": "Target Azure Arc Controller",
"arc.sql.wizard.title": "Deploy Azure SQL managed instance - Azure Arc (preview)",
"arc.sql.wizard.page1.title": "Provide Azure SQL managed instance parameters",
"arc.sql.connection.settings.section.title": "SQL Connection information",
"arc.sql.instance.settings.section.title": "SQL Instance settings",
"arc.azure.section.title": "Azure information",
"arc.sql.instance.name": "Instance name (lower case letters and digits only)",
"arc.sql.instance.name": "Instance name",
"arc.sql.username": "Username",
"arc.sql.invalid.username": "sa username is disabled, please choose another username",
"arc.sql.invalid.instance.name": "Instance name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 13 characters or fewer in length.",
"arc.storage-class.dc.label": "Storage Class",
"arc.sql.storage-class.dc.description": "The storage class to be used for all data and logs persistent volumes for all data controller pods that require them.",
"arc.storage-class.data.label": "Storage Class (Data)",
@@ -90,6 +102,14 @@
"arc.sql.storage-class.logs.description": "The storage class to be used for logs (/var/log)",
"arc.postgres.storage-class.logs.description": "The storage class to be used for logs persistent volumes",
"arc.storage-class.backups.label": "Storage Class (Backups)",
"arc.cores-limit.label": "Cores Limit",
"arc.sql.cores-limit.description": "The cores limit of the managed instance as an integer.",
"arc.cores-request.label": "Cores Request",
"arc.sql.cores-request.description": "The request for cores of the managed instance as an integer.",
"arc.memory-limit.label": "Memory Limit",
"arc.sql.memory-limit.description": "The limit of the capacity of the managed instance as an integer.",
"arc.memory-request.label": "Memory Request",
"arc.sql.memory-request.description": "The request for the capacity of the managed instance as an integer amount of memory in GBs.",
"arc.postgres.storage-class.backups.description": "The storage class to be used for backup persistent volumes",
"arc.password": "Password",
"arc.confirm.password": "Confirm password",
@@ -97,19 +117,39 @@
"arc.azure.subscription": "Azure subscription",
"arc.azure.resource.group": "Azure resource group",
"arc.azure.location": "Azure location",
"arc.postgres.new.dialog.title": "Deploy a PostgreSQL server group on Azure Arc (preview)",
"arc.postgres.settings.section.title": "PostgreSQL server group settings",
"arc.postgres.settings.resource.title": "PostgreSQL server group resource settings",
"arc.postgres.wizard.title": "Deploy an Azure Arc enabled PostgreSQL Hyperscale server group (Preview)",
"arc.postgres.wizard.page1.title": "Provide Azure enabled PostgreSQL Hyperscale server group parameters",
"arc.postgres.settings.section.title": "General settings",
"arc.postgres.settings.resource.title": "Resource settings",
"arc.postgres.settings.storage.title": "Storage settings",
"arc.postgres.server.group.name": "Server group name",
"arc.postgres.server.group.name.validation.description": "Server group name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 10 characters or fewer in length.",
"arc.postgres.server.group.workers": "Number of workers",
"arc.postgres.server.group.name.validation.description": "Server group name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 12 characters or fewer in length.",
"arc.postgres.server.group.workers.label": "Number of workers",
"arc.postgres.server.group.workers.description": "The number of worker nodes to provision in a sharded cluster, or zero (the default) for single-node Postgres.",
"arc.postgres.server.group.port": "Port",
"arc.postgres.server.group.cores.request": "Min CPU cores (per node) to reserve",
"arc.postgres.server.group.cores.limit": "Max CPU cores (per node) to allow",
"arc.postgres.server.group.memory.request": "Min memory MB (per node) to reserve",
"arc.postgres.server.group.memory.limit": "Max memory MB (per node) to allow",
"arc.agreement": "I accept {0}, {1} and {2}.",
"arc.agreement.sql.terms.conditions":"Azure SQL managed instance - Azure Arc terms and conditions",
"arc.agreement.postgres.terms.conditions":"PostgreSQL server groups - Azure Arc terms and conditions",
"arc.deploy.action":"Deploy"
"arc.postgres.server.group.engine.version": "Engine Version",
"arc.postgres.server.group.extensions.label": "Extensions",
"arc.postgres.server.group.extensions.description": "A comma-separated list of the Postgres extensions that should be loaded on startup. Please refer to the postgres documentation for supported values.",
"arc.postgres.server.group.volume.size.data.label": "Volume Size GB (Data)",
"arc.postgres.server.group.volume.size.data.description": "The size of the storage volume to be used for data in GB.",
"arc.postgres.server.group.volume.size.logs.label": "Volume Size GB (Logs)",
"arc.postgres.server.group.volume.size.logs.description": "The size of the storage volume to be used for logs in GB.",
"arc.postgres.server.group.volume.size.backups.label": "Volume Size GB (Backups)",
"arc.postgres.server.group.volume.size.backups.description": "The size of the storage volume to be used for backups in GB.",
"arc.postgres.server.group.cores.request.label": "CPU request (cores per node)",
"arc.postgres.server.group.cores.request.description": "The minimum number of CPU cores that must be available per node to schedule the service. Fractional cores are supported.",
"arc.postgres.server.group.cores.limit.label": "CPU limit (cores per node)",
"arc.postgres.server.group.cores.limit.description": "The maximum number of CPU cores for the Postgres instance that can be used per node. Fractional cores are supported.",
"arc.postgres.server.group.memory.request.label": "Memory request (GB per node)",
"arc.postgres.server.group.memory.request.description": "The memory request of the Postgres instance per node in GB.",
"arc.postgres.server.group.memory.limit.label": "Memory limit (GB per node)",
"arc.postgres.server.group.memory.limit.description": "The memory limit of the Postgres instance per node in GB.",
"arc.agreement": "I accept {0} and {1}.",
"arc.agreement.sql.terms.conditions": "Azure SQL managed instance - Azure Arc terms and conditions",
"arc.agreement.postgres.terms.conditions": "Azure Arc enabled PostgreSQL Hyperscale terms and conditions",
"should.be.integer": "Value must be an integer",
"requested.cores.less.than.or.equal.to.cores.limit": "Requested cores must be less than or equal to cores limit",
"cores.limit.greater.than.or.equal.to.requested.cores": "Cores limit must be greater than or equal to requested cores",
"requested.memory.less.than.or.equal.to.memory.limit": "Requested memory must be less than or equal to memory limit",
"memory.limit.greater.than.or.equal.to.requested.memory": "Memory limit must be greater than or equal to requested memory"
}


@@ -0,0 +1,41 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as arc from 'arc';
import { PasswordToControllerDialog } from '../ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
import { ControllerTreeNode } from '../ui/tree/controllerTreeNode';
import { UserCancelledError } from './utils';
export function arcApi(treeDataProvider: AzureArcTreeDataProvider): arc.IExtension {
return {
getRegisteredDataControllers: () => getRegisteredDataControllers(treeDataProvider),
getControllerPassword: (controllerInfo: arc.ControllerInfo) => getControllerPassword(treeDataProvider, controllerInfo),
reacquireControllerPassword: (controllerInfo: arc.ControllerInfo) => reacquireControllerPassword(treeDataProvider, controllerInfo)
};
}
export async function reacquireControllerPassword(treeDataProvider: AzureArcTreeDataProvider, controllerInfo: arc.ControllerInfo): Promise<string> {
const dialog = new PasswordToControllerDialog(treeDataProvider);
dialog.showDialog(controllerInfo);
const model = await dialog.waitForClose();
if (!model) {
throw new UserCancelledError();
}
return model.password;
}
export async function getControllerPassword(treeDataProvider: AzureArcTreeDataProvider, controllerInfo: arc.ControllerInfo): Promise<string> {
return await treeDataProvider.getPassword(controllerInfo);
}
export async function getRegisteredDataControllers(treeDataProvider: AzureArcTreeDataProvider): Promise<arc.DataController[]> {
return (await treeDataProvider.getChildren())
.filter(node => node instanceof ControllerTreeNode)
.map(node => ({
label: (node as ControllerTreeNode).model.label,
info: (node as ControllerTreeNode).model.info
}));
}
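For orientation, here is a rough consumer-side sketch (not part of this diff) of how another extension could use the API surface that arcApi() exports. The extension identifier 'Microsoft.arc' and the activation flow are illustrative assumptions.

import * as arc from 'arc';
import * as vscode from 'vscode';

// Hypothetical consumer: list the registered data controllers and fetch the stored
// password for the first one, falling back to re-prompting the user if that fails.
async function getFirstControllerPassword(): Promise<string | undefined> {
	const arcExtension = vscode.extensions.getExtension<arc.IExtension>('Microsoft.arc'); // assumed extension id
	const api = await arcExtension?.activate();
	if (!api) {
		return undefined;
	}
	const controllers = await api.getRegisteredDataControllers();
	if (controllers.length === 0) {
		return undefined;
	}
	try {
		return await api.getControllerPassword(controllers[0].info);
	} catch {
		// reacquireControllerPassword shows the PasswordToControllerDialog defined above.
		return await api.reacquireControllerPassword(controllers[0].info);
	}
}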


@@ -0,0 +1,83 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { Deferred } from './promise';
const enum Status {
notStarted,
inProgress,
done
}
interface State<T> {
entry?: T,
error?: Error,
status: Status,
id: number,
pendingOperation: Deferred<void>
}
/**
* A cache manager that ensures only one call to populate a cache miss is pending at a given time.
* All other retrieval calls for the same key wait until the in-progress call finishes and are then
* resolved with the value from the cache.
*/
export class CacheManager<K, T> {
private _cache = new Map<K, State<T>>();
private _id = 0;
public async getCacheEntry(key: K, retrieveEntry: (key: K) => Promise<T>): Promise<T> {
const cacheHit: State<T> | undefined = this._cache.get(key);
// each branch either throws or returns the cached entry.
if (cacheHit === undefined) {
// populate a new state entry and add it to the cache
const state: State<T> = {
status: Status.notStarted,
id: this._id++,
pendingOperation: new Deferred<void>()
};
this._cache.set(key, state);
// now that the state entry is initialized, retry fetching the cache entry
let returnValue: T = await this.getCacheEntry(key, retrieveEntry);
await state.pendingOperation;
return returnValue!;
} else {
switch (cacheHit.status) {
case Status.notStarted: {
cacheHit.status = Status.inProgress;
// retrieve and populate the missed cache hit.
try {
cacheHit.entry = await retrieveEntry(key);
} catch (error) {
cacheHit.error = error;
} finally {
cacheHit.status = Status.done;
// We do not reject here even in the error case because we do not want our awaits on pendingOperation to throw.
// We track our own error state and, once everything is done, throw if an error happened. This results
// in the rejection of the promise returned by this method.
cacheHit.pendingOperation.resolve();
}
return await this.getCacheEntry(key, retrieveEntry);
}
case Status.inProgress: {
await cacheHit.pendingOperation;
return await this.getCacheEntry(key, retrieveEntry);
}
case Status.done: {
if (cacheHit.error !== undefined) {
await cacheHit.pendingOperation;
throw cacheHit.error;
}
else {
await cacheHit.pendingOperation;
return cacheHit.entry!;
}
}
}
}
}
}
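As a usage sketch (not part of this diff), the de-duplication behaviour can be demonstrated by issuing several concurrent lookups for the same key: the retrieval callback runs once and every caller resolves with the cached value. The key and callback below are purely illustrative.

async function cacheManagerExample(): Promise<void> {
	const cache = new CacheManager<string, string>();
	let retrievals = 0;
	const retrievePassword = async (key: string): Promise<string> => {
		retrievals++; // stand-in for an expensive lookup, e.g. prompting the user
		return `password-for-${key}`;
	};
	const results = await Promise.all([
		cache.getCacheEntry('controller-1', retrievePassword),
		cache.getCacheEntry('controller-1', retrievePassword),
		cache.getCacheEntry('controller-1', retrievePassword)
	]);
	// All three callers resolve to 'password-for-controller-1' while retrievals === 1.
	console.log(results, retrievals);
}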


@@ -0,0 +1,39 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as os from 'os';
import * as path from 'path';
import * as yamljs from 'yamljs';
import * as loc from '../localizedConstants';
import { throwUnless } from './utils';
export interface KubeClusterContext {
name: string;
isCurrentContext: boolean;
}
export function getKubeConfigClusterContexts(configFile: string): Promise<KubeClusterContext[]> {
const config: any = yamljs.load(configFile);
const rawContexts = <any[]>config['contexts'];
throwUnless(rawContexts && rawContexts.length, loc.noContextFound(configFile));
const currentContext = <string>config['current-context'];
throwUnless(currentContext, loc.noCurrentContextFound(configFile));
const contexts: KubeClusterContext[] = [];
rawContexts.forEach(rawContext => {
const name = <string>rawContext['name'];
throwUnless(name, loc.noNameInContext(configFile));
if (name) {
contexts.push({
name: name,
isCurrentContext: name === currentContext
});
}
});
return Promise.resolve(contexts);
}
export function getDefaultKubeConfigPath(): string {
return path.join(os.homedir(), '.kube', 'config');
}
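A short usage sketch (not part of this diff): the two helpers combine to read the contexts from the user's default kube config and pick out the current one. Note that yamljs.load is handed a file path here, so it reads and parses the file itself.

async function getCurrentKubeContextName(): Promise<string | undefined> {
	// Reads ~/.kube/config and returns the context marked as current-context, if any.
	const contexts = await getKubeConfigClusterContexts(getDefaultKubeConfigPath());
	return contexts.find(context => context.isCurrentContext)?.name;
}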


@@ -67,7 +67,7 @@ export function getResourceTypeIcon(resourceType: string | undefined): IconPath
/**
* Returns the text to display for known connection modes
* @param connectionMode The string repsenting the connection mode
* @param connectionMode The string representing the connection mode
*/
export function getConnectionModeDisplayText(connectionMode: string | undefined): string {
connectionMode = connectionMode ?? '';
@@ -148,15 +148,15 @@ async function promptInputBox(title: string, options: vscode.InputBoxOptions): P
}
/**
* Opens an input box prompting the user to enter in the name of a resource to delete
* @param name The name of the resource to delete
* Opens an input box prompting the user to enter in the name of an instance to delete
* @param name The name of the instance to delete
* @returns Promise resolving to true if the user confirmed the name, false if the input box was closed for any other reason
*/
export async function promptForResourceDeletion(name: string): Promise<boolean> {
const title = loc.resourceDeletionWarning(name);
export async function promptForInstanceDeletion(name: string): Promise<boolean> {
const title = loc.instanceDeletionWarning(name);
const options: vscode.InputBoxOptions = {
placeHolder: name,
validateInput: input => input !== name ? loc.invalidResourceDeletionName(name) : ''
validateInput: input => input !== name ? loc.invalidInstanceDeletionName(name) : ''
};
return await promptInputBox(title, options) !== undefined;
@@ -189,28 +189,15 @@ export async function promptAndConfirmPassword(validate: (input: string) => stri
/**
* Gets the message to display for a given error object that may be a variety of types.
* @param error The error object
* @param useMessageWithLink Whether to use the messageWithLink - if available
*/
export function getErrorMessage(error: any): string {
export function getErrorMessage(error: any, useMessageWithLink: boolean = false): string {
if (useMessageWithLink && error.messageWithLink) {
return error.messageWithLink;
}
return error.message ?? error;
}
/**
* Parses an instance name from the controller. An instance name will either be just its name
* e.g. myinstance or namespace_name e.g. mynamespace_my-instance.
* @param instanceName The instance name in one of the formats described
*/
export function parseInstanceName(instanceName: string | undefined): string {
instanceName = instanceName ?? '';
const parts: string[] = instanceName.split('_');
if (parts.length === 2) {
instanceName = parts[1];
}
else if (parts.length > 2) {
throw new Error(`Cannot parse resource '${instanceName}'. Acceptable formats are 'namespace_name' or 'name'.`);
}
return instanceName;
}
/**
* Parses an address into its separate ip and port values. Address must be in the form <ip>:<port>
* @param address The address to parse
@@ -225,3 +212,88 @@ export function parseIpAndPort(address: string): { ip: string, port: string } {
port: sections[1]
};
}
export function createCredentialId(controllerId: string, resourceType: string, instanceName: string): string {
return `${controllerId}::${resourceType}::${instanceName}`;
}
/**
* Calculates the gibibyte (GiB) conversion of a quantity that could currently be represented by a range
* of SI suffixes (E, P, T, G, M, K, m) or their power-of-two equivalents (Ei, Pi, Ti, Gi, Mi, Ki)
* @param value The string of a quantity to be converted
* @returns String of GiB conversion
*/
export function convertToGibibyteString(value: string): string {
if (!value) {
throw new Error(`Value provided is not a valid Kubernetes resource quantity`);
}
let base10ToBase2Multiplier;
let floatValue = parseFloat(value);
let splitValue = value.split(String(floatValue));
let unit = splitValue[1];
if (unit === 'K') {
base10ToBase2Multiplier = 1000 / 1024;
floatValue = (floatValue * base10ToBase2Multiplier) / Math.pow(1024, 2);
} else if (unit === 'M') {
base10ToBase2Multiplier = Math.pow(1000, 2) / Math.pow(1024, 2);
floatValue = (floatValue * base10ToBase2Multiplier) / 1024;
} else if (unit === 'G') {
base10ToBase2Multiplier = Math.pow(1000, 3) / Math.pow(1024, 3);
floatValue = floatValue * base10ToBase2Multiplier;
} else if (unit === 'T') {
base10ToBase2Multiplier = Math.pow(1000, 4) / Math.pow(1024, 4);
floatValue = (floatValue * base10ToBase2Multiplier) * 1024;
} else if (unit === 'P') {
base10ToBase2Multiplier = Math.pow(1000, 5) / Math.pow(1024, 5);
floatValue = (floatValue * base10ToBase2Multiplier) * Math.pow(1024, 2);
} else if (unit === 'E') {
base10ToBase2Multiplier = Math.pow(1000, 6) / Math.pow(1024, 6);
floatValue = (floatValue * base10ToBase2Multiplier) * Math.pow(1024, 3);
} else if (unit === 'm') {
floatValue = (floatValue / 1000) / Math.pow(1024, 3);
} else if (unit === '') {
floatValue = floatValue / Math.pow(1024, 3);
} else if (unit === 'Ki') {
floatValue = floatValue / Math.pow(1024, 2);
} else if (unit === 'Mi') {
floatValue = floatValue / 1024;
} else if (unit === 'Gi') {
floatValue = floatValue;
} else if (unit === 'Ti') {
floatValue = floatValue * 1024;
} else if (unit === 'Pi') {
floatValue = floatValue * Math.pow(1024, 2);
} else if (unit === 'Ei') {
floatValue = floatValue * Math.pow(1024, 3);
} else {
throw new Error(`${value} is not a valid Kubernetes resource quantity`);
}
return String(floatValue);
}
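// A few worked conversions (illustrative only, not part of this diff): power-of-two
// suffixes are scaled by factors of 1024, while SI suffixes are first multiplied by the
// base-10/base-2 ratio for their magnitude. The exact decimal output comes from String(floatValue).
// convertToGibibyteString('2048Mi') === '2'     (2048 MiB / 1024)
// convertToGibibyteString('1Gi')    === '1'     (already expressed in GiB)
// convertToGibibyteString('1Ti')    === '1024'  (1 TiB = 1024 GiB)
// convertToGibibyteString('1G')     is roughly '0.93' (10^9 bytes / 2^30)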
/*
* Throws an Error with given {@link message} unless {@link condition} is true.
* This also tells the typescript compiler that the condition is 'truthy' in the remainder of the scope
* where this function was called.
*
* @param condition
* @param message
*/
export function throwUnless(condition: any, message?: string): asserts condition {
if (!condition) {
throw new Error(message);
}
}
export async function tryExecuteAction<T>(action: () => T | PromiseLike<T>): Promise<{ result: T | undefined, error: any }> {
let error: any, result: T | undefined;
try {
result = await action();
} catch (e) {
error = e;
}
return { result, error };
}
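A small usage sketch (not part of this diff): throwUnless narrows an optional value for the rest of the scope, and tryExecuteAction turns a call that may throw into a { result, error } pair so callers can branch without their own try/catch. The endpoint parameter below is illustrative.

function requireEndpoint(endpoint: string | undefined): string {
	throwUnless(endpoint, 'No external endpoint is configured');
	// From here on the compiler treats endpoint as string rather than string | undefined.
	return endpoint;
}

async function endpointExample(): Promise<void> {
	const { result, error } = await tryExecuteAction(() => requireEndpoint(undefined));
	if (error) {
		console.warn(`Could not resolve endpoint: ${error}`);
	} else {
		console.log(`Endpoint: ${result}`);
	}
}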


@@ -7,6 +7,11 @@ import * as vscode from 'vscode';
export const refreshActionId = 'arc.refresh';
export const credentialNamespace = 'arcCredentials';
export const controllerTroubleshootDocsUrl = 'https://aka.ms/arc-data-tsg';
export const miaaTroubleshootDocsUrl = 'https://aka.ms/miaa-tsg';
export interface IconPath {
dark: string;
light: string;
@@ -35,7 +40,10 @@ export class IconPathHelper {
public static controller: IconPath;
public static health: IconPath;
public static success: IconPath;
public static save: IconPath;
public static discard: IconPath;
public static fail: IconPath;
public static information: IconPath;
public static setExtensionContext(context: vscode.ExtensionContext) {
IconPathHelper.context = context;
@@ -111,10 +119,22 @@ export class IconPathHelper {
light: context.asAbsolutePath('images/success.svg'),
dark: context.asAbsolutePath('images/success.svg'),
};
IconPathHelper.save = {
light: context.asAbsolutePath('images/save.svg'),
dark: context.asAbsolutePath('images/save.svg'),
};
IconPathHelper.discard = {
light: context.asAbsolutePath('images/discard.svg'),
dark: context.asAbsolutePath('images/discard.svg'),
};
IconPathHelper.fail = {
light: context.asAbsolutePath('images/fail.svg'),
dark: context.asAbsolutePath('images/fail.svg'),
};
IconPathHelper.information = {
light: context.asAbsolutePath('images/information.svg'),
dark: context.asAbsolutePath('images/information.svg'),
};
}
}


@@ -4,9 +4,12 @@
*--------------------------------------------------------------------------------------------*/
import * as arc from 'arc';
import * as rd from 'resource-deployment';
import * as vscode from 'vscode';
import { arcApi } from './common/api';
import { IconPathHelper, refreshActionId } from './constants';
import * as loc from './localizedConstants';
import { ArcControllersOptionsSourceProvider } from './providers/arcControllersOptionsSourceProvider';
import { ConnectToControllerDialog } from './ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from './ui/tree/azureArcTreeDataProvider';
import { ControllerTreeNode } from './ui/tree/controllerTreeNode';
@@ -25,6 +28,14 @@ export async function activate(context: vscode.ExtensionContext): Promise<arc.IE
});
vscode.commands.registerCommand('arc.connectToController', async () => {
const nodes = await treeDataProvider.getChildren();
if (nodes.length > 0) {
const response = await vscode.window.showErrorMessage(loc.onlyOneControllerSupported, loc.yes, loc.no);
if (response !== loc.yes) {
return;
}
await treeDataProvider.removeController(nodes[0] as ControllerTreeNode);
}
const dialog = new ConnectToControllerDialog(treeDataProvider);
dialog.showDialog();
const model = await dialog.waitForClose();
@@ -54,28 +65,12 @@ export async function activate(context: vscode.ExtensionContext): Promise<arc.IE
}
});
await checkArcDeploymentExtension();
// register option sources
const rdApi = <rd.IExtension>vscode.extensions.getExtension(rd.extension.name)?.exports;
rdApi.registerOptionsSourceProvider(new ArcControllersOptionsSourceProvider(treeDataProvider));
return {
getRegisteredDataControllers: async () => {
return (await treeDataProvider.getChildren())
.filter(node => node instanceof ControllerTreeNode)
.map(node => (node as ControllerTreeNode).model.info);
}
};
return arcApi(treeDataProvider);
}
export function deactivate(): void {
}
async function checkArcDeploymentExtension(): Promise<void> {
const version = vscode.extensions.getExtension('Microsoft.arcdeployment')?.packageJSON.version;
if (version && version !== '0.3.2') {
// If we have an older version of the deployment extension installed then uninstall it now since it's replaced
// by this extension. (the latest version of the Arc Deployment extension will uninstall itself so don't do
// anything here if that's already updated)
await vscode.commands.executeCommand('workbench.extensions.uninstallExtension', 'Microsoft.arcdeployment');
vscode.window.showInformationMessage(loc.arcDeploymentDeprecation);
}
}


@@ -8,13 +8,13 @@ import { getErrorMessage } from './common/utils';
const localize = nls.loadMessageBundle();
export const arcDeploymentDeprecation = localize('arc.arcDeploymentDeprecation', "The Arc Deployment extension has been replaced by the Arc extension and has been uninstalled.");
export function arcControllerDashboard(name: string): string { return localize('arc.controllerDashboard', "Azure Arc Controller Dashboard (Preview) - {0}", name); }
export function miaaDashboard(name: string): string { return localize('arc.miaaDashboard', "Managed Instance Dashboard (Preview) - {0}", name); }
export function postgresDashboard(name: string): string { return localize('arc.postgresDashboard', "Postgres Dashboard (Preview) - {0}", name); }
export function arcControllerDashboard(name: string): string { return localize('arc.controllerDashboard', "Azure Arc Data Controller Dashboard (Preview) - {0}", name); }
export function miaaDashboard(name: string): string { return localize('arc.miaaDashboard', "SQL managed instance - Azure Arc Dashboard (Preview) - {0}", name); }
export function postgresDashboard(name: string): string { return localize('arc.postgresDashboard', "PostgreSQL Hyperscale - Azure Arc Dashboard (Preview) - {0}", name); }
export const dataControllersType = localize('arc.dataControllersType', "Azure Arc Data Controller");
export const pgSqlType = localize('arc.pgSqlType', "PostgreSQL Server group - Azure Arc");
export const miaaType = localize('arc.miaaType', "SQL instance - Azure Arc");
export const pgSqlType = localize('arc.pgSqlType', "PostgreSQL Hyperscale - Azure Arc");
export const miaaType = localize('arc.miaaType', "SQL managed instance - Azure Arc");
export const overview = localize('arc.overview', "Overview");
export const connectionStrings = localize('arc.connectionStrings', "Connection Strings");
@@ -32,6 +32,8 @@ export const resourceHealth = localize('arc.resourceHealth', "Resource health");
export const newInstance = localize('arc.createNew', "New Instance");
export const deleteText = localize('arc.delete', "Delete");
export const saveText = localize('arc.save', "Save");
export const discardText = localize('arc.discard', "Discard");
export const resetPassword = localize('arc.resetPassword', "Reset Password");
export const openInAzurePortal = localize('arc.openInAzurePortal', "Open in Azure Portal");
export const resourceGroup = localize('arc.resourceGroup', "Resource Group");
@@ -59,6 +61,10 @@ export const yes = localize('arc.yes', "Yes");
export const no = localize('arc.no', "No");
export const feedback = localize('arc.feedback', "Feedback");
export const selectConnectionString = localize('arc.selectConnectionString', "Select from available client connection strings below.");
export const addingWokerNodes = localize('arc.addingWokerNodes', "adding worker nodes");
export const workerNodesDescription = localize('arc.workerNodesDescription', "Expand your server group and scale your database by adding worker nodes.");
export const postgresConfigurationInformation = localize('arc.postgres.configurationInformation', "You can configure the number of CPU cores and storage size that will apply to both worker nodes and coordinator node. Each worker node will have the same configuration. Adjust the number of CPU cores and memory settings for your server group.");
export const workerNodesInformation = localize('arc.workerNodeInformation', "In preview it is not possible to reduce the number of worker nodes. Please refer to documentation linked above for more information.");
export const vCores = localize('arc.vCores', "vCores");
export const ram = localize('arc.ram', "RAM");
export const refresh = localize('arc.refresh', "Refresh");
@@ -72,8 +78,12 @@ export const direct = localize('arc.direct', "Direct");
export const indirect = localize('arc.indirect', "Indirect");
export const loading = localize('arc.loading', "Loading...");
export const refreshToEnterCredentials = localize('arc.refreshToEnterCredentials', "Refresh node to enter credentials");
export const noInstancesAvailable = localize('arc.noInstancesAvailable', "No instances available");
export const connectToController = localize('arc.connectToController', "Connect to Existing Controller");
export function connectToSql(name: string): string { return localize('arc.connectToSql', "Connect to SQL managed instance - Azure Arc ({0})", name); }
export const passwordToController = localize('arc.passwordToController', "Provide Password to Controller");
export const controllerUrl = localize('arc.controllerUrl', "Controller URL");
export const serverEndpoint = localize('arc.serverEndpoint', "Server Endpoint");
export const controllerName = localize('arc.controllerName', "Name");
export const defaultControllerName = localize('arc.defaultControllerName', "arc-dc");
export const username = localize('arc.username', "Username");
@@ -81,6 +91,7 @@ export const password = localize('arc.password', "Password");
export const rememberPassword = localize('arc.rememberPassword', "Remember Password");
export const connect = localize('arc.connect', "Connect");
export const cancel = localize('arc.cancel', "Cancel");
export const ok = localize('arc.ok', "Ok");
export const notConfigured = localize('arc.notConfigured', "Not Configured");
// Database States - see https://docs.microsoft.com/sql/relational-databases/databases/database-states
@@ -109,9 +120,28 @@ export const databaseName = localize('arc.databaseName', "Database name");
export const enterNewPassword = localize('arc.enterNewPassword', "Enter a new password");
export const confirmNewPassword = localize('arc.confirmNewPassword', "Confirm the new password");
export const learnAboutPostgresClients = localize('arc.learnAboutPostgresClients', "Learn more about Azure PostgreSQL Hyperscale client interfaces");
export const scalingCompute = localize('arc.scalingCompute', "scaling compute vCores and memory.");
export const postgresComputeAndStorageDescriptionPartOne = localize('arc.postgresComputeAndStorageDescriptionPartOne', "You can scale your Azure Arc enabled");
export const miaaComputeAndStorageDescriptionPartOne = localize('arc.miaaComputeAndStorageDescriptionPartOne', "You can scale your Azure SQL managed instance - Azure Arc by");
export const postgresComputeAndStorageDescriptionPartTwo = localize('arc.postgres.computeAndStorageDescriptionPartTwo', "PostgreSQL Hyperscale server group by");
export const computeAndStorageDescriptionPartThree = localize('arc.computeAndStorageDescriptionPartThree', "without downtime and by");
export const computeAndStorageDescriptionPartFour = localize('arc.computeAndStorageDescriptionPartFour', "Before doing so, you need to ensure");
export const computeAndStorageDescriptionPartFive = localize('arc.computeAndStorageDescriptionPartFive', "there are sufficient resources available");
export const computeAndStorageDescriptionPartSix = localize('arc.computeAndStorageDescriptionPartSix', "in your Kubernetes cluster to honor this configuration.");
export const node = localize('arc.node', "node");
export const nodes = localize('arc.nodes', "nodes");
export const workerNodes = localize('arc.workerNodes', "Worker Nodes");
export const storagePerNode = localize('arc.storagePerNode', "storage per node");
export const workerNodeCount = localize('arc.workerNodeCount', "Worker node count:");
export const configurationPerNode = localize('arc.configurationPerNode', "Configuration (per node)");
export const coresLimit = localize('arc.coresLimit', "CPU limit:");
export const coresRequest = localize('arc.coresRequest', "CPU request:");
export const memoryLimit = localize('arc.memoryLimit', "Memory limit (in GB):");
export const memoryRequest = localize('arc.memoryRequest', "Memory request (in GB):");
export const workerValidationErrorMessage = localize('arc.workerValidationErrorMessage', "The number of workers cannot be decreased.");
export const coresValidationErrorMessage = localize('arc.coresValidationErrorMessage', "Valid CPU resource quantities are strictly positive.");
export const memoryRequestValidationErrorMessage = localize('arc.memoryRequestValidationErrorMessage', "Memory request must be at least 0.25Gib");
export const memoryLimitValidationErrorMessage = localize('arc.memoryLimitValidationErrorMessage', "Memory limit must be at least 0.25Gib");
export const arcResources = localize('arc.arcResources', "Azure Arc Resources");
export const enterANonEmptyPassword = localize('arc.enterANonEmptyPassword', "Enter a non empty password or press escape to exit.");
export const thePasswordsDoNotMatch = localize('arc.thePasswordsDoNotMatch', "The passwords do not match. Confirm the password or press escape to exit.");
@@ -121,9 +151,13 @@ export const condition = localize('arc.condition', "Condition");
export const details = localize('arc.details', "Details");
export const lastUpdated = localize('arc.lastUpdated', "Last updated");
export const noExternalEndpoint = localize('arc.noExternalEndpoint', "No External Endpoint has been configured so this information isn't available.");
export const podsReady = localize('arc.podsReady', "pods ready");
export function databaseCreated(name: string): string { return localize('arc.databaseCreated', "Database {0} created", name); }
export function resourceDeleted(name: string): string { return localize('arc.resourceDeleted', "Resource '{0}' deleted", name); }
export function deletingInstance(name: string): string { return localize('arc.deletingInstance', "Deleting instance '{0}'...", name); }
export function updatingInstance(name: string): string { return localize('arc.updatingInstance', "Updating instance '{0}'...", name); }
export function instanceDeleted(name: string): string { return localize('arc.instanceDeleted', "Instance '{0}' deleted", name); }
export function instanceUpdated(name: string): string { return localize('arc.instanceUpdated', "Instance '{0}' updated", name); }
export function copiedToClipboard(name: string): string { return localize('arc.copiedToClipboard', "{0} copied to clipboard", name); }
export function clickTheTroubleshootButton(resourceType: string): string { return localize('arc.clickTheTroubleshootButton', "Click the troubleshoot button to open the Azure Arc {0} troubleshooting notebook.", resourceType); }
export function numVCores(vCores: string | undefined): string {
@@ -144,15 +178,30 @@ export const connectionRequired = localize('arc.connectionRequired', "A connecti
export const couldNotFindControllerRegistration = localize('arc.couldNotFindControllerRegistration', "Could not find controller registration.");
export function refreshFailed(error: any): string { return localize('arc.refreshFailed', "Refresh failed. {0}", getErrorMessage(error)); }
export function openDashboardFailed(error: any): string { return localize('arc.openDashboardFailed', "Error opening dashboard. {0}", getErrorMessage(error)); }
export function resourceDeletionFailed(name: string, error: any): string { return localize('arc.resourceDeletionFailed', "Failed to delete resource {0}. {1}", name, getErrorMessage(error)); }
export function instanceDeletionFailed(name: string, error: any): string { return localize('arc.instanceDeletionFailed', "Failed to delete instance {0}. {1}", name, getErrorMessage(error)); }
export function instanceUpdateFailed(name: string, error: any): string { return localize('arc.instanceUpdateFailed', "Failed to update instance {0}. {1}", name, getErrorMessage(error)); }
export function pageDiscardFailed(error: any): string { return localize('arc.pageDiscardFailed', "Failed to discard user input. {0}", getErrorMessage(error)); }
export function databaseCreationFailed(name: string, error: any): string { return localize('arc.databaseCreationFailed', "Failed to create database {0}. {1}", name, getErrorMessage(error)); }
export function connectToControllerFailed(url: string, error: any): string { return localize('arc.connectToControllerFailed', "Could not connect to controller {0}. {1}", url, getErrorMessage(error)); }
export function connectToSqlFailed(serverName: string, error: any): string { return localize('arc.connectToSqlFailed', "Could not connect to SQL managed instance - Azure Arc Instance {0}. {1}", serverName, getErrorMessage(error)); }
export function fetchConfigFailed(name: string, error: any): string { return localize('arc.fetchConfigFailed', "An unexpected error occurred retrieving the config for '{0}'. {1}", name, getErrorMessage(error)); }
export function fetchEndpointsFailed(name: string, error: any): string { return localize('arc.fetchEndpointsFailed', "An unexpected error occurred retrieving the endpoints for '{0}'. {1}", name, getErrorMessage(error)); }
export function fetchRegistrationsFailed(name: string, error: any): string { return localize('arc.fetchRegistrationsFailed', "An unexpected error occurred retrieving the registrations for '{0}'. {1}", name, getErrorMessage(error)); }
export function fetchDatabasesFailed(name: string, error: any): string { return localize('arc.fetchDatabasesFailed', "An unexpected error occurred retrieving the databases for '{0}'. {1}", name, getErrorMessage(error)); }
export function resourceDeletionWarning(name: string): string { return localize('arc.resourceDeletionWarning', "Warning! Deleting a resource is permanent and cannot be undone. To delete the resource '{0}' type the name '{0}' below to proceed.", name); }
export function invalidResourceDeletionName(name: string): string { return localize('arc.invalidResourceDeletionName', "The value '{0}' does not match the instance name. Try again or press escape to exit", name); }
export function instanceDeletionWarning(name: string): string { return localize('arc.instanceDeletionWarning', "Warning! Deleting an instance is permanent and cannot be undone. To delete the instance '{0}' type the name '{0}' below to proceed.", name); }
export function invalidInstanceDeletionName(name: string): string { return localize('arc.invalidInstanceDeletionName', "The value '{0}' does not match the instance name. Try again or press escape to exit", name); }
export function couldNotFindAzureResource(name: string): string { return localize('arc.couldNotFindAzureResource', "Could not find Azure resource for {0}", name); }
export function passwordResetFailed(error: any): string { return localize('arc.passwordResetFailed', "Failed to reset password. {0}", getErrorMessage(error)); }
export function errorConnectingToController(error: any): string { return localize('arc.errorConnectingToController', "Error connecting to controller. {0}", getErrorMessage(error)); }
export function errorConnectingToController(error: any): string { return localize('arc.errorConnectingToController', "Error connecting to controller. {0}", getErrorMessage(error, true)); }
export function passwordAcquisitionFailed(error: any): string { return localize('arc.passwordAcquisitionFailed', "Failed to acquire password. {0}", getErrorMessage(error)); }
export const invalidPassword = localize('arc.invalidPassword', "The password did not work, try again.");
export function errorVerifyingPassword(error: any): string { return localize('arc.errorVerifyingPassword', "Error encountered while verifying password. {0}", getErrorMessage(error)); }
export const onlyOneControllerSupported = localize('arc.onlyOneControllerSupported', "Only one controller connection is currently supported at this time. Do you wish to remove the existing connection and add a new one?");
export const noControllersConnected = localize('noControllersConnected', "No Azure Arc controllers are currently connected. Please run the command: 'Connect to Existing Azure Arc Controller' and then try again");
export const variableValueFetchForUnsupportedVariable = (variableName: string) => localize('getVariableValue.unknownVariableName', "Attempt to get variable value for unknown variable:{0}", variableName);
export const isPasswordFetchForUnsupportedVariable = (variableName: string) => localize('getIsPassword.unknownVariableName', "Attempt to get isPassword for unknown variable:{0}", variableName);
export const noControllerInfoFound = (name: string) => localize('noControllerInfoFound', "Controller Info could not be found with name: {0}", name);
export const noPasswordFound = (controllerName: string) => localize('noPasswordFound', "Password could not be retrieved for controller: {0} and user did not provide a password. Please retry later.", controllerName);
export const noContextFound = (configFile: string) => localize('noContextFound', "No 'contexts' found in the config file: {0}", configFile);
export const noCurrentContextFound = (configFile: string) => localize('noCurrentContextFound', "No context is marked as 'current-context' in the config file: {0}", configFile);
export const noNameInContext = (configFile: string) => localize('noNameInContext', "No name field was found in a cluster context in the config file: {0}", configFile);


@@ -6,7 +6,7 @@
import { ControllerInfo, ResourceType } from 'arc';
import * as azdataExt from 'azdata-ext';
import * as vscode from 'vscode';
import { parseInstanceName, UserCancelledError } from '../common/utils';
import { UserCancelledError } from '../common/utils';
import * as loc from '../localizedConstants';
import { ConnectToControllerDialog } from '../ui/dialogs/connectControllerDialog';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
@@ -20,7 +20,6 @@ export type Registration = {
export class ControllerModel {
private readonly _azdataApi: azdataExt.IExtension;
private _endpoints: azdataExt.DcEndpointListResult[] = [];
private _namespace: string = '';
private _registrations: Registration[] = [];
private _controllerConfig: azdataExt.DcConfigShowResult | undefined = undefined;
@@ -93,7 +92,7 @@ export class ControllerModel {
}
public async refresh(showErrors: boolean = true, promptReconnect: boolean = false): Promise<void> {
await this.azdataLogin(promptReconnect);
this._registrations = [];
const newRegistrations: Registration[] = [];
await Promise.all([
this._azdataApi.azdata.arc.dc.config.show().then(result => {
this._controllerConfig = result.result;
@@ -125,7 +124,7 @@ export class ControllerModel {
}),
Promise.all([
this._azdataApi.azdata.arc.postgres.server.list().then(result => {
this._registrations.push(...result.result.map(r => {
newRegistrations.push(...result.result.map(r => {
return {
instanceName: r.name,
state: r.state,
@@ -134,7 +133,7 @@ export class ControllerModel {
}));
}),
this._azdataApi.azdata.arc.sql.mi.list().then(result => {
this._registrations.push(...result.result.map(r => {
newRegistrations.push(...result.result.map(r => {
return {
instanceName: r.name,
state: r.state,
@@ -143,6 +142,7 @@ export class ControllerModel {
}));
})
]).then(() => {
this._registrations = newRegistrations;
this.registrationsLastUpdated = new Date();
this._onRegistrationsUpdated.fire(this._registrations);
})
@@ -157,10 +157,6 @@ export class ControllerModel {
return this._endpoints.find(e => e.name === name);
}
public get namespace(): string {
return this._namespace;
}
public get registrations(): Registration[] {
return this._registrations;
}
@@ -171,19 +167,10 @@ export class ControllerModel {
public getRegistration(type: ResourceType, name: string): Registration | undefined {
return this._registrations.find(r => {
return r.instanceType === type && parseInstanceName(r.instanceName) === name;
return r.instanceType === type && r.instanceName === name;
});
}
public async deleteRegistration(_type: ResourceType, _name: string) {
/* TODO chgagnon
if (r && !r.isDeleted && r.customObjectName) {
const r = this.getRegistration(type, name);
await this._registrationRouter.apiV1RegistrationNsNameIsDeletedDelete(this._namespace, r.customObjectName, true);
}
*/
}
/**
* Property used as a display label for this controller
*/


@@ -3,13 +3,15 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { ResourceInfo } from 'arc';
import { MiaaResourceInfo } from 'arc';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as vscode from 'vscode';
import { Deferred } from '../common/promise';
import { UserCancelledError } from '../common/utils';
import { createCredentialId, parseIpAndPort, UserCancelledError } from '../common/utils';
import { credentialNamespace } from '../constants';
import * as loc from '../localizedConstants';
import { ConnectToSqlDialog } from '../ui/dialogs/connectSqlDialog';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
import { ControllerModel, Registration } from './controllerModel';
import { ResourceModel } from './resourceModel';
@@ -35,8 +37,8 @@ export class MiaaModel extends ResourceModel {
private _refreshPromise: Deferred<void> | undefined = undefined;
constructor(private _controllerModel: ControllerModel, info: ResourceInfo, registration: Registration, private _treeDataProvider: AzureArcTreeDataProvider) {
super(info, registration);
constructor(private _controllerModel: ControllerModel, private _miaaInfo: MiaaResourceInfo, registration: Registration, private _treeDataProvider: AzureArcTreeDataProvider) {
super(_miaaInfo, registration);
this._azdataApi = <azdataExt.IExtension>vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
}
@@ -155,83 +157,58 @@ export class MiaaModel extends ResourceModel {
if (this._connectionProfile) {
return;
}
let connection: azdata.connection.ConnectionProfile | azdata.connection.Connection | undefined;
const ipAndPort = parseIpAndPort(this.config?.status.externalEndpoint || '');
let connectionProfile: azdata.IConnectionProfile | undefined = {
serverName: `${ipAndPort.ip},${ipAndPort.port}`,
databaseName: '',
authenticationType: 'SqlLogin',
providerName: 'MSSQL',
connectionName: '',
userName: this._miaaInfo.userName || '',
password: '',
savePassword: true,
groupFullName: undefined,
saveProfile: true,
id: '',
groupId: undefined,
options: {}
};
// If we have the ID stored then try to retrieve the password from previous connections
if (this.info.connectionId) {
try {
const connections = await azdata.connection.getConnections();
const existingConnection = connections.find(conn => conn.connectionId === this.info.connectionId);
if (existingConnection) {
const credentials = await azdata.connection.getCredentials(this.info.connectionId);
if (credentials) {
existingConnection.options['password'] = credentials.password;
connection = existingConnection;
} else {
// We need the password so prompt the user for it
const connectionProfile: azdata.IConnectionProfile = {
serverName: existingConnection.options['serverName'],
databaseName: existingConnection.options['databaseName'],
authenticationType: existingConnection.options['authenticationType'],
providerName: 'MSSQL',
connectionName: '',
userName: existingConnection.options['user'],
password: '',
savePassword: false,
groupFullName: undefined,
saveProfile: true,
id: '',
groupId: undefined,
options: existingConnection.options
};
connection = await azdata.connection.openConnectionDialog(['MSSQL'], connectionProfile);
const credentialProvider = await azdata.credentials.getProvider(credentialNamespace);
const credentials = await credentialProvider.readCredential(createCredentialId(this._controllerModel.info.id, this.info.resourceType, this.info.name));
if (credentials.password) {
// Try to connect to verify credentials are still valid
connectionProfile.password = credentials.password;
// If we don't have a username for some reason then just continue on and we'll prompt for the username below
if (connectionProfile.userName) {
const result = await azdata.connection.connect(connectionProfile, false, false);
if (!result.connected) {
vscode.window.showErrorMessage(loc.connectToSqlFailed(connectionProfile.serverName, result.errorMessage));
const connectToSqlDialog = new ConnectToSqlDialog(this._controllerModel, this);
connectToSqlDialog.showDialog(connectionProfile);
connectionProfile = await connectToSqlDialog.waitForClose();
}
}
}
} catch (err) {
// ignore - the connection may not necessarily exist anymore and in that case we'll just reprompt for a connection
console.warn(`Unexpected error fetching password for MIAA instance ${err}`);
// ignore - something happened fetching the password so just reprompt
}
}
if (!connection) {
// We need the password so prompt the user for it
const connectionProfile: azdata.IConnectionProfile = {
// TODO chgagnon fill in external IP and port
// serverName: (this.registration.externalIp && this.registration.externalPort) ? `${this.registration.externalIp},${this.registration.externalPort}` : '',
serverName: '',
databaseName: '',
authenticationType: 'SqlLogin',
providerName: 'MSSQL',
connectionName: '',
userName: 'sa',
password: '',
savePassword: true,
groupFullName: undefined,
saveProfile: true,
id: '',
groupId: undefined,
options: {}
};
// Weren't able to load the existing connection so prompt user for new one
connection = await azdata.connection.openConnectionDialog(['MSSQL'], connectionProfile);
if (!connectionProfile?.userName || !connectionProfile?.password) {
// Need to prompt user for password since we don't have one stored
const connectToSqlDialog = new ConnectToSqlDialog(this._controllerModel, this);
connectToSqlDialog.showDialog(connectionProfile);
connectionProfile = await connectToSqlDialog.waitForClose();
}
if (connection) {
const profile = {
// The option name might be different here based on where it came from
serverName: connection.options['serverName'] || connection.options['server'],
databaseName: connection.options['databaseName'] || connection.options['database'],
authenticationType: connection.options['authenticationType'],
providerName: 'MSSQL',
connectionName: '',
userName: connection.options['user'],
password: connection.options['password'],
savePassword: false,
groupFullName: undefined,
saveProfile: true,
id: connection.connectionId,
groupId: undefined,
options: connection.options
};
this.updateConnectionProfile(profile);
if (connectionProfile) {
this.updateConnectionProfile(connectionProfile);
} else {
throw new UserCancelledError();
}
@@ -240,6 +217,7 @@ export class MiaaModel extends ResourceModel {
private async updateConnectionProfile(connectionProfile: azdata.IConnectionProfile): Promise<void> {
this._connectionProfile = connectionProfile;
this.info.connectionId = connectionProfile.id;
this._miaaInfo.userName = connectionProfile.userName;
await this._treeDataProvider.saveControllers();
}
}

View File

@@ -4,278 +4,103 @@
*--------------------------------------------------------------------------------------------*/
import { ResourceInfo } from 'arc';
import * as azdataExt from 'azdata-ext';
import * as vscode from 'vscode';
import * as loc from '../localizedConstants';
import { Registration } from './controllerModel';
import { ControllerModel, Registration } from './controllerModel';
import { ResourceModel } from './resourceModel';
export enum PodRole {
Monitor,
Router,
Shard
}
export interface V1Pod {
'apiVersion'?: string;
'kind'?: string;
'metadata'?: any; // V1ObjectMeta;
'spec'?: any; // V1PodSpec;
'status'?: V1PodStatus;
}
export interface V1PodStatus {
'conditions'?: any[]; // Array<V1PodCondition>;
'containerStatuses'?: Array<V1ContainerStatus>;
'ephemeralContainerStatuses'?: any[]; // Array<V1ContainerStatus>;
'hostIP'?: string;
'initContainerStatuses'?: any[]; // Array<V1ContainerStatus>;
'message'?: string;
'nominatedNodeName'?: string;
'phase'?: string;
'podIP'?: string;
'podIPs'?: any[]; // Array<V1PodIP>;
'qosClass'?: string;
'reason'?: string;
'startTime'?: Date | null;
}
export interface V1ContainerStatus {
'containerID'?: string;
'image'?: string;
'imageID'?: string;
'lastState'?: any; // V1ContainerState;
'name'?: string;
'ready'?: boolean;
'restartCount'?: number;
'started'?: boolean | null;
'state'?: any; // V1ContainerState;
}
export interface DuskyObjectModelsDatabaseService {
'apiVersion'?: string;
'kind'?: string;
'metadata'?: any; // V1ObjectMeta;
'spec'?: any; // DuskyObjectModelsDatabaseServiceSpec;
'status'?: any; // DuskyObjectModelsDatabaseServiceStatus;
'arc'?: any; // DuskyObjectModelsDatabaseServiceArcPayload;
}
export interface V1Status {
'apiVersion'?: string;
'code'?: number | null;
'details'?: any; // V1StatusDetails;
'kind'?: string;
'message'?: string;
'metadata'?: any; // V1ListMeta;
'reason'?: string;
'status'?: string;
'hasObject'?: boolean;
}
export interface DuskyObjectModelsDatabase {
'name'?: string;
'owner'?: string;
'sharded'?: boolean | null;
}
import { Deferred } from '../common/promise';
import { parseIpAndPort } from '../common/utils';
export class PostgresModel extends ResourceModel {
private _service?: DuskyObjectModelsDatabaseService;
private _pods?: V1Pod[];
private readonly _onServiceUpdated = new vscode.EventEmitter<DuskyObjectModelsDatabaseService>();
private readonly _onPodsUpdated = new vscode.EventEmitter<V1Pod[]>();
public onServiceUpdated = this._onServiceUpdated.event;
public onPodsUpdated = this._onPodsUpdated.event;
public serviceLastUpdated?: Date;
public podsLastUpdated?: Date;
private _config?: azdataExt.PostgresServerShowResult;
private readonly _azdataApi: azdataExt.IExtension;
constructor(info: ResourceInfo, registration: Registration) {
private readonly _onConfigUpdated = new vscode.EventEmitter<azdataExt.PostgresServerShowResult>();
public onConfigUpdated = this._onConfigUpdated.event;
public configLastUpdated?: Date;
private _refreshPromise?: Deferred<void>;
constructor(private _controllerModel: ControllerModel, info: ResourceInfo, registration: Registration) {
super(info, registration);
this._azdataApi = <azdataExt.IExtension>vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
}
/** Returns the service's Kubernetes namespace */
public get namespace(): string | undefined {
return ''; // TODO chgagnon return this.info.namespace;
/** Returns the configuration of Postgres */
public get config(): azdataExt.PostgresServerShowResult | undefined {
return this._config;
}
/** Returns the service's name */
public get name(): string {
return this.info.name;
/** Returns the major version of Postgres */
public get engineVersion(): string | undefined {
const kind = this._config?.kind;
return kind
? kind.substring(kind.lastIndexOf('-') + 1)
: undefined;
}
/** Returns the service's fully qualified name in the format namespace.name */
public get fullName(): string {
return `${this.namespace}.${this.name}`;
/** Returns the IP address and port of Postgres */
public get endpoint(): { ip: string, port: string } | undefined {
return this._config?.status.externalEndpoint
? parseIpAndPort(this._config.status.externalEndpoint)
: undefined;
}
/** Returns the service's spec */
public get service(): DuskyObjectModelsDatabaseService | undefined {
return this._service;
}
/** Returns the scale configuration of Postgres e.g. '3 nodes, 1.5 vCores, 1Gi RAM, 2Gi storage per node' */
public get scaleConfiguration(): string | undefined {
if (!this._config) {
return undefined;
}
/** Returns the service's pods */
public get pods(): V1Pod[] | undefined {
return this._pods;
}
const cpuLimit = this._config.spec.scheduling?.default?.resources?.limits?.cpu;
const ramLimit = this._config.spec.scheduling?.default?.resources?.limits?.memory;
const cpuRequest = this._config.spec.scheduling?.default?.resources?.requests?.cpu;
const ramRequest = this._config.spec.scheduling?.default?.resources?.requests?.memory;
const storage = this._config.spec.storage?.data?.size;
/** Refreshes the model */
public async refresh() {
await Promise.all([
/* TODO enable
this._databaseRouter.getDuskyDatabaseService(this.info.namespace || 'test', this.info.name).then(response => {
this._service = response.body;
this.serviceLastUpdated = new Date();
this._onServiceUpdated.fire(this._service);
}),
this._databaseRouter.getDuskyPods(this.info.namespace || 'test', this.info.name).then(response => {
this._pods = response.body;
this.podsLastUpdated = new Date();
this._onPodsUpdated.fire(this._pods!);
})
*/
]);
}
/**
* Updates the service
* @param func A function of modifications to apply to the service
*/
public async update(_func: (service: DuskyObjectModelsDatabaseService) => void): Promise<DuskyObjectModelsDatabaseService> {
return <any>undefined;
/*
// Get the latest spec of the service in case it has changed
const service = (await this._databaseRouter.getDuskyDatabaseService(this.info.namespace || 'test', this.info.name)).body;
service.status = undefined; // can't update the status
func(service);
return await this._databaseRouter.updateDuskyDatabaseService(this.namespace || 'test', this.name, service).then(r => {
this._service = r.body;
return this._service;
});
*/
}
/** Deletes the service */
public async delete(): Promise<V1Status> {
return <any>undefined;
// return (await this._databaseRouter.deleteDuskyDatabaseService(this.info.namespace || 'test', this.info.name)).body;
}
/** Creates a SQL database in the service */
public async createDatabase(_db: DuskyObjectModelsDatabase): Promise<DuskyObjectModelsDatabase> {
return <any>undefined;
// return (await this._databaseRouter.createDuskyDatabase(this.namespace || 'test', this.name, db)).body;
}
/**
* Returns the IP address and port of the service, preferring external IP over
* internal IP. If either field is not available it will be set to undefined.
*/
public get endpoint(): { ip?: string, port?: number } {
const externalIp = this._service?.status?.externalIP;
const internalIp = this._service?.status?.internalIP;
const externalPort = this._service?.status?.externalPort;
const internalPort = this._service?.status?.internalPort;
return externalIp ? { ip: externalIp, port: externalPort ?? undefined }
: internalIp ? { ip: internalIp, port: internalPort ?? undefined }
: { ip: undefined, port: undefined };
}
/** Returns the service's configuration e.g. '3 nodes, 1.5 vCores, 1GiB RAM, 2GiB storage per node' */
public get configuration(): string {
// TODO: Resource requests and limits can be configured per role. Figure out how
// to display that in the UI. For now, only show the default configuration.
const cpuLimit = this._service?.spec?.scheduling?._default?.resources?.limits?.['cpu'];
const ramLimit = this._service?.spec?.scheduling?._default?.resources?.limits?.['memory'];
const cpuRequest = this._service?.spec?.scheduling?._default?.resources?.requests?.['cpu'];
const ramRequest = this._service?.spec?.scheduling?._default?.resources?.requests?.['memory'];
const storage = this._service?.spec?.storage?.volumeSize;
const nodes = this.pods?.length;
// scale.shards was renamed to scale.workers. Check both for backwards compatibility.
const scale = this._config.spec.scale;
const nodes = (scale?.workers ?? scale?.shards ?? 0) + 1; // An extra node for the coordinator
let configuration: string[] = [];
if (nodes) {
configuration.push(`${nodes} ${nodes > 1 ? loc.nodes : loc.node}`);
}
configuration.push(`${nodes} ${nodes > 1 ? loc.nodes : loc.node}`);
// Prefer limits if they're provided, otherwise use requests if they're provided
if (cpuLimit || cpuRequest) {
configuration.push(`${this.formatCores(cpuLimit ?? cpuRequest!)} ${loc.vCores}`);
configuration.push(`${cpuLimit ?? cpuRequest!} ${loc.vCores}`);
}
if (ramLimit || ramRequest) {
configuration.push(`${this.formatMemory(ramLimit ?? ramRequest!)} ${loc.ram}`);
configuration.push(`${ramLimit ?? ramRequest!} ${loc.ram}`);
}
if (storage) {
configuration.push(`${this.formatMemory(storage)} ${loc.storagePerNode}`);
configuration.push(`${storage} ${loc.storagePerNode}`);
}
return configuration.join(', ');
}
/** Given a V1Pod, returns its PodRole or undefined if the role isn't known */
public static getPodRole(pod: V1Pod): PodRole | undefined {
const name = pod.metadata?.name;
const role = name?.substring(name.lastIndexOf('-'))[1];
switch (role) {
case 'm': return PodRole.Monitor;
case 'r': return PodRole.Router;
case 's': return PodRole.Shard;
default: return undefined;
/** Refreshes the model */
public async refresh() {
// Only allow one refresh to be happening at a time
if (this._refreshPromise) {
return this._refreshPromise.promise;
}
}
this._refreshPromise = new Deferred();
/** Given a PodRole, returns its localized name */
public static getPodRoleName(role?: PodRole): string {
switch (role) {
case PodRole.Monitor: return loc.monitor;
case PodRole.Router: return loc.coordinator;
case PodRole.Shard: return loc.worker;
default: return '';
try {
await this._controllerModel.azdataLogin();
this._config = (await this._azdataApi.azdata.arc.postgres.server.show(this.info.name)).result;
this.configLastUpdated = new Date();
this._onConfigUpdated.fire(this._config);
this._refreshPromise.resolve();
} catch (err) {
this._refreshPromise.reject(err);
throw err;
} finally {
this._refreshPromise = undefined;
}
}
/** Given a V1Pod returns its status */
public static getPodStatus(pod: V1Pod): string {
const phase = pod.status?.phase;
if (phase !== 'Running') {
return phase ?? '';
}
// Pods can be in the running phase while some
// containers are crashing, so check those too.
for (let c of pod.status?.containerStatuses?.filter(c => !c.ready) ?? []) {
const wReason = c.state?.waiting?.reason;
const tReason = c.state?.terminated?.reason;
if (wReason) { return wReason; }
if (tReason) { return tReason; }
}
return loc.running;
}
/**
* Converts millicores to cores (600m -> 0.6 cores)
* https://kubernetes.io/docs/concepts/configuration/manage-compute-resources-container/#meaning-of-cpu
* @param cores The millicores to format e.g. 600m
*/
private formatCores(cores: string): number {
return cores?.endsWith('m') ? +cores.slice(0, -1) / 1000 : +cores;
}
/**
* Formats the memory to end with 'B' e.g:
* 1 -> 1B
* 1K -> 1KB, 1Ki -> 1KiB
* 1M -> 1MB, 1Mi -> 1MiB
* 1G -> 1GB, 1Gi -> 1GiB
* 1T -> 1TB, 1Ti -> 1TiB
* https://kubernetes.io/docs/concepts/configuration/manage-compute-resources-container/#meaning-of-memory
* @param memory The amount + unit of memory to format e.g. 1K
*/
private formatMemory(memory: string): string {
return memory && !memory.endsWith('B') ? `${memory}B` : memory;
}
}
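The refresh() method above serializes concurrent refreshes through the Deferred helper imported from '../common/promise'; that helper is not shown in this diff. As a minimal sketch (an assumed shape, not necessarily the repository's implementation), such a deferred wrapper just exposes a promise together with its settle callbacks:
// Hedged sketch of a Deferred promise wrapper; the name and shape are assumptions for illustration.
export class Deferred<T> {
	public promise: Promise<T>;
	public resolve!: (value: T | PromiseLike<T>) => void;
	public reject!: (reason?: any) => void;
	constructor() {
		// Capture the executor callbacks so callers outside the constructor can settle the promise later.
		this.promise = new Promise<T>((resolve, reject) => {
			this.resolve = resolve;
			this.reject = reject;
		});
	}
}
With this shape, a second refresh() call that arrives while one is in flight simply awaits this._refreshPromise.promise and settles together with the original request.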

View File

@@ -0,0 +1,66 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as arc from 'arc';
import * as azdata from 'azdata';
import * as rd from 'resource-deployment';
import { getControllerPassword, getRegisteredDataControllers, reacquireControllerPassword } from '../common/api';
import { CacheManager } from '../common/cacheManager';
import { throwUnless } from '../common/utils';
import * as loc from '../localizedConstants';
import { AzureArcTreeDataProvider } from '../ui/tree/azureArcTreeDataProvider';
/**
* Class that provides options sources for an Arc Data Controller
*/
export class ArcControllersOptionsSourceProvider implements rd.IOptionsSourceProvider {
private _cacheManager = new CacheManager<string, string>();
readonly optionsSourceId = 'arc.controllers';
constructor(private _treeProvider: AzureArcTreeDataProvider) { }
async getOptions(): Promise<string[] | azdata.CategoryValue[]> {
const controllers = await getRegisteredDataControllers(this._treeProvider);
throwUnless(controllers !== undefined && controllers.length !== 0, loc.noControllersConnected);
return controllers.map(ci => {
return ci.label;
});
}
private async retrieveVariable(key: string): Promise<string> {
const [variableName, controllerLabel] = JSON.parse(key);
const controller = (await getRegisteredDataControllers(this._treeProvider)).find(ci => ci.label === controllerLabel);
throwUnless(controller !== undefined, loc.noControllerInfoFound(controllerLabel));
switch (variableName) {
case 'endpoint': return controller.info.url;
case 'username': return controller.info.username;
case 'password': return this.getPassword(controller);
default: throw new Error(loc.variableValueFetchForUnsupportedVariable(variableName));
}
}
getVariableValue(variableName: string, controllerLabel: string): Promise<string> {
// capture 'this' in an arrow function object
const retrieveVariable = (key: string) => this.retrieveVariable(key);
return this._cacheManager.getCacheEntry(JSON.stringify([variableName, controllerLabel]), retrieveVariable);
}
private async getPassword(controller: arc.DataController): Promise<string> {
let password = await getControllerPassword(this._treeProvider, controller.info);
if (!password) {
password = await reacquireControllerPassword(this._treeProvider, controller.info);
}
throwUnless(password !== undefined, loc.noPasswordFound(controller.label));
return password;
}
getIsPassword(variableName: string): boolean {
switch (variableName) {
case 'endpoint': return false;
case 'username': return false;
case 'password': return true;
default: throw new Error(loc.isPasswordFetchForUnsupportedVariable(variableName));
}
}
}
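getVariableValue above funnels every lookup through CacheManager.getCacheEntry, keyed on the JSON-serialized [variableName, controllerLabel] pair, so a password prompt for a given controller is triggered at most once. The cache class itself is not part of this diff; a hedged sketch of a promise-memoizing cache with that shape (an assumption, not the actual ../common/cacheManager) could be:
// Hypothetical promise-memoizing cache; the real CacheManager may evict entries or handle failures differently.
export class CacheManager<K, V> {
	private _cache = new Map<K, Promise<V>>();
	/** Returns the cached entry for the key, invoking the retriever only on the first request. */
	public getCacheEntry(key: K, retriever: (key: K) => Promise<V>): Promise<V> {
		let entry = this._cache.get(key);
		if (!entry) {
			entry = retriever(key);
			this._cache.set(key, entry);
		}
		return entry;
	}
}
Caching the promise rather than the resolved value means concurrent callers asking for the same key share a single in-flight retrieval.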

View File

@@ -1,2 +0,0 @@
/env
/__pycache__

View File

@@ -1,21 +0,0 @@
# Tests for deploying Arc resources via Jupyter notebook
## Prerequisites
- Python >= 3.6
- Pip package manager
- Azdata CLI installed and logged into an Arc controller
## Running the tests
### 1. (Optional, recommended) Create and activate a Python virtual environment
- `python -m venv env`
- `source env/bin/activate` (Linux)
- `env\Scripts\activate.bat` (Windows)
### 2. Upgrade pip
- `pip install --upgrade pip`
### 3. Install the dependencies
- `pip install -r requirements.txt`
### 4. Run the tests
- `pytest`

View File

@@ -0,0 +1,62 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import 'mocha';
import * as path from 'path';
import * as sinon from 'sinon';
import * as yamljs from 'yamljs';
import { getDefaultKubeConfigPath, getKubeConfigClusterContexts, KubeClusterContext } from '../../common/kubeUtils';
import { tryExecuteAction } from '../../common/utils';
const kubeConfig =
{
'contexts': [
{
'context': {
'cluster': 'docker-desktop',
'user': 'docker-desktop'
},
'name': 'docker-for-desktop'
},
{
'context': {
'cluster': 'kubernetes',
'user': 'kubernetes-admin'
},
'name': 'kubernetes-admin@kubernetes'
}
],
'current-context': 'docker-for-desktop'
};
describe('KubeUtils', function (): void {
const configFile = 'kubeConfig';
afterEach('KubeUtils cleanup', () => {
sinon.restore();
});
it('getDefaultKubeConfigPath', async () => {
getDefaultKubeConfigPath().should.endWith(path.join('.kube', 'config'));
});
describe('get Kube Config Cluster Contexts', () => {
it('success', async () => {
sinon.stub(yamljs, 'load').returns(<any>kubeConfig);
const verifyContexts = (contexts: KubeClusterContext[], testName: string) => {
contexts.length.should.equal(2, `test: ${testName} failed`);
contexts[0].name.should.equal('docker-for-desktop', `test: ${testName} failed`);
contexts[0].isCurrentContext.should.be.true(`test: ${testName} failed`);
contexts[1].name.should.equal('kubernetes-admin@kubernetes', `test: ${testName} failed`);
contexts[1].isCurrentContext.should.be.false(`test: ${testName} failed`);
};
verifyContexts(await getKubeConfigClusterContexts(configFile), 'getKubeConfigClusterContexts');
});
it('throws error when unable to load config file', async () => {
const error = new Error('unknown error accessing file');
sinon.stub(yamljs, 'load').throws(error); //erroring config file load
((await tryExecuteAction(() => getKubeConfigClusterContexts(configFile))).error).should.equal(error, `test: getKubeConfigClusterContexts failed`);
});
});
});

View File

@@ -7,7 +7,7 @@ import { ResourceType } from 'arc';
import 'mocha';
import * as should from 'should';
import * as vscode from 'vscode';
import { getAzurecoreApi, getConnectionModeDisplayText, getDatabaseStateDisplayText, getErrorMessage, getResourceTypeIcon, parseEndpoint, parseInstanceName, parseIpAndPort, promptAndConfirmPassword, promptForResourceDeletion, resourceTypeToDisplayName } from '../../common/utils';
import { getAzurecoreApi, getConnectionModeDisplayText, getDatabaseStateDisplayText, getErrorMessage, getResourceTypeIcon, parseEndpoint, parseIpAndPort, promptAndConfirmPassword, promptForInstanceDeletion, resourceTypeToDisplayName, convertToGibibyteString } from '../../common/utils';
import { ConnectionMode as ConnectionMode, IconPathHelper } from '../../constants';
import * as loc from '../../localizedConstants';
import { MockInputBox } from '../stubs';
@@ -47,24 +47,6 @@ describe('parseEndpoint Method Tests', function (): void {
});
});
describe('parseInstanceName Method Tests', () => {
it('Should parse valid instanceName with namespace correctly', function (): void {
should(parseInstanceName('mynamespace_myinstance')).equal('myinstance');
});
it('Should parse valid instanceName without namespace correctly', function (): void {
should(parseInstanceName('myinstance')).equal('myinstance');
});
it('Should return empty string when undefined value passed in', function (): void {
should(parseInstanceName(undefined)).equal('');
});
it('Should return empty string when empty string value passed in', function (): void {
should(parseInstanceName('')).equal('');
});
});
describe('getAzurecoreApi Method Tests', function () {
it('Should get azurecore API correctly', function (): void {
should(getAzurecoreApi()).not.be.undefined();
@@ -140,7 +122,7 @@ describe('promptForResourceDeletion Method Tests', function (): void {
});
it('Resolves as true when value entered is correct', function (done): void {
promptForResourceDeletion('myname').then((value: boolean) => {
promptForInstanceDeletion('myname').then((value: boolean) => {
value ? done() : done(new Error('Expected return value to be true'));
});
mockInputBox.value = 'myname';
@@ -148,14 +130,14 @@ describe('promptForResourceDeletion Method Tests', function (): void {
});
it('Resolves as false when input box is closed early', function (done): void {
promptForResourceDeletion('myname').then((value: boolean) => {
promptForInstanceDeletion('myname').then((value: boolean) => {
!value ? done() : done(new Error('Expected return value to be false'));
});
mockInputBox.hide();
});
it('Validation message is set when value entered is incorrect', async function (): Promise<void> {
promptForResourceDeletion('myname');
promptForInstanceDeletion('myname');
mockInputBox.value = 'wrong value';
await mockInputBox.triggerAccept();
should(mockInputBox.validationMessage).not.be.equal('', 'Validation message should not be empty after incorrect value entered');
@@ -260,22 +242,6 @@ describe('getErrorMessage Method Tests', function () {
});
});
describe('parseInstanceName Method Tests', function () {
it('2 part name', function (): void {
const name = 'MyName';
should(parseInstanceName(`MyNamespace_${name}`)).equal(name);
});
it('1 part name', function (): void {
const name = 'MyName';
should(parseInstanceName(name)).equal(name);
});
it('Invalid name', function (): void {
should(() => parseInstanceName('Some_Invalid_Name')).throwError();
});
});
describe('parseIpAndPort', function (): void {
it('Valid address', function (): void {
const ip = '127.0.0.1';
@@ -288,3 +254,116 @@ describe('parseIpAndPort', function (): void {
should(() => parseIpAndPort(ip)).throwError();
});
});
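Together with the MiaaModel code above, which builds serverName as `${ipAndPort.ip},${ipAndPort.port}`, these cases suggest parseIpAndPort expects a comma-separated '<ip>,<port>' endpoint and throws on anything else. A hedged sketch consistent with that behavior (not necessarily the shipped ../../common/utils implementation):
// Illustrative only; the error message wording is an assumption.
export function parseIpAndPort(endpoint: string): { ip: string, port: string } {
	const parts = endpoint.split(',');
	if (parts.length !== 2 || !parts[0] || !parts[1]) {
		throw new Error(`Invalid endpoint format, expected "<ip>,<port>" but got "${endpoint}"`);
	}
	return { ip: parts[0], port: parts[1] };
}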
describe('convertToGibibyteString Method Tests', function () {
const tolerance = 0.001;
it('Value is in KB', function (): void {
const value = '44000K';
const conversion = 0.04097819;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in MB', function (): void {
const value = '1100M';
const conversion = 1.02445483;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in GB', function (): void {
const value = '1G';
const conversion = 0.931322575;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in TB', function (): void {
const value = '1T';
const conversion = 931.32257;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in PB', function (): void {
const value = '0.1P';
const conversion = 93132.25746;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in EB', function (): void {
const value = '1E';
const conversion = 931322574.6154;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in mB', function (): void {
const value = '1073741824000m';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in B', function (): void {
const value = '1073741824';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in KiB', function (): void {
const value = '1048576Ki';
const conversion = 1;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in MiB', function (): void {
const value = '256Mi';
const conversion = 0.25;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in GiB', function (): void {
const value = '1000Gi';
const conversion = 1000;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in TiB', function (): void {
const value = '1Ti';
const conversion = 1024;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in PiB', function (): void {
const value = '1Pi';
const conversion = 1048576;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is in EiB', function (): void {
const value = '1Ei';
const conversion = 1073741824;
const check = Math.abs(conversion - parseFloat(convertToGibibyteString(value)));
should(check).lessThanOrEqual(tolerance);
});
it('Value is empty', function (): void {
const value = '';
const error = new Error(`Value provided is not a valid Kubernetes resource quantity`);
should(() => convertToGibibyteString(value)).throwError(error);
});
it('Value is not a valid Kubernetes resource quantity', function (): void {
const value = '1J';
const error = new Error(`${value} is not a valid Kubernetes resource quantity`);
should(() => convertToGibibyteString(value)).throwError(error);
});
});
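The cases above pin down convertToGibibyteString's contract: decimal suffixes (m, K, M, G, T, P, E) are powers of ten, binary suffixes (Ki through Ei) are powers of 1024, a bare number is bytes, and anything else is rejected. A hedged sketch that satisfies these cases (an illustration, not the code under test) is:
// Illustrative sketch; the shipped common/utils implementation may structure this differently.
const BYTES_PER_GIBIBYTE = Math.pow(1024, 3);
const UNIT_FACTORS: { [suffix: string]: number } = {
	'm': 1e-3, 'K': 1e3, 'M': 1e6, 'G': 1e9, 'T': 1e12, 'P': 1e15, 'E': 1e18,
	'Ki': Math.pow(1024, 1), 'Mi': Math.pow(1024, 2), 'Gi': Math.pow(1024, 3),
	'Ti': Math.pow(1024, 4), 'Pi': Math.pow(1024, 5), 'Ei': Math.pow(1024, 6)
};
export function convertToGibibyteString(value: string): string {
	const match = /^([\d.]+)(m|K|M|G|T|P|E|Ki|Mi|Gi|Ti|Pi|Ei)?$/.exec(value);
	if (!match || isNaN(parseFloat(match[1]))) {
		throw new Error(`${value || 'Value provided'} is not a valid Kubernetes resource quantity`);
	}
	const bytes = parseFloat(match[1]) * (match[2] ? UNIT_FACTORS[match[2]] : 1);
	return (bytes / BYTES_PER_GIBIBYTE).toString();
}
For example, '44000K' is 44,000,000 bytes, which divided by 1024^3 gives roughly 0.0410 GiB, matching the first test case.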

View File

@@ -0,0 +1,86 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdataExt from 'azdata-ext';
/**
* Simple fake Azdata Api used to mock the API during tests
*/
export class FakeAzdataApi implements azdataExt.IAzdataApi {
public postgresInstances: azdataExt.PostgresServerListResult[] = [];
public miaaInstances: azdataExt.SqlMiListResult[] = [];
//
// API Implementation
//
public get arc() {
const self = this;
return {
dc: {
create(_namespace: string, _name: string, _connectivityMode: string, _resourceGroup: string, _location: string, _subscription: string, _profileName?: string, _storageClass?: string): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); },
endpoint: {
async list(): Promise<azdataExt.AzdataOutput<azdataExt.DcEndpointListResult[]>> { return <any>{ result: [] }; }
},
config: {
list(): Promise<azdataExt.AzdataOutput<azdataExt.DcConfigListResult[]>> { throw new Error('Method not implemented.'); },
async show(): Promise<azdataExt.AzdataOutput<azdataExt.DcConfigShowResult>> { return <any>{ result: undefined! }; }
}
},
postgres: {
server: {
delete(_name: string): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); },
async list(): Promise<azdataExt.AzdataOutput<azdataExt.PostgresServerListResult[]>> { return <any>{ result: self.postgresInstances }; },
show(_name: string): Promise<azdataExt.AzdataOutput<azdataExt.PostgresServerShowResult>> { throw new Error('Method not implemented.'); },
edit(
_name: string,
_args: {
adminPassword?: boolean,
coresLimit?: string,
coresRequest?: string,
engineSettings?: string,
extensions?: string,
memoryLimit?: string,
memoryRequest?: string,
noWait?: boolean,
port?: number,
replaceEngineSettings?: boolean,
workers?: number
},
_additionalEnvVars?: { [key: string]: string }): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); }
}
},
sql: {
mi: {
delete(_name: string): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); },
async list(): Promise<azdataExt.AzdataOutput<azdataExt.SqlMiListResult[]>> { return <any>{ result: self.miaaInstances }; },
show(_name: string): Promise<azdataExt.AzdataOutput<azdataExt.SqlMiShowResult>> { throw new Error('Method not implemented.'); },
edit(
_name: string,
_args: {
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string,
noWait?: boolean
}): Promise<azdataExt.AzdataOutput<void>> { throw new Error('Method not implemented.'); }
}
}
};
}
getPath(): Promise<string> {
throw new Error('Method not implemented.');
}
login(_endpoint: string, _username: string, _password: string): Promise<azdataExt.AzdataOutput<any>> {
return <any>undefined;
}
version(): Promise<azdataExt.AzdataOutput<string>> {
throw new Error('Method not implemented.');
}
getSemVersion(): any {
throw new Error('Method not implemented.');
}
}

View File

@@ -0,0 +1,91 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as vscode from 'vscode';
export class FakeRadioButton implements azdata.RadioButtonComponent {
private _onDidClickEmitter = new vscode.EventEmitter<any>();
onDidClick = this._onDidClickEmitter.event;
constructor(props: azdata.RadioButtonProperties) {
this.label = props.label;
this.value = props.value;
this.checked = props.checked;
this.enabled = props.enabled;
}
//#region RadioButtonProperties implementation
label?: string;
value?: string;
checked?: boolean;
//#endregion
click() {
this.checked = true;
this._onDidClickEmitter.fire(this);
}
//#region Component Implementation
id: string = '';
updateProperties(_properties: { [key: string]: any; }): Thenable<void> {
throw new Error('Method not implemented.');
}
updateProperty(_key: string, _value: any): Thenable<void> {
throw new Error('Method not implemented.');
}
updateCssStyles(_cssStyles: { [key: string]: string; }): Thenable<void> {
throw new Error('Method not implemented.');
}
onValidityChanged: vscode.Event<boolean> = <vscode.Event<boolean>>{};
valid: boolean = false;
validate(): Thenable<boolean> {
throw new Error('Method not implemented.');
}
focus(): Thenable<void> {
throw new Error('Method not implemented.');
}
ariaHidden?: boolean | undefined;
//#endregion
//#region ComponentProperties Implementation
height?: number | string;
width?: number | string;
/**
* The position CSS property. Empty by default.
* This is particularly useful if laying out components inside a FlexContainer and
* the size of the component is meant to be a fixed size. In this case the position must be
* set to 'absolute', with the parent FlexContainer having 'relative' position.
* Without this the component will fail to correctly size itself
*/
position?: azdata.PositionType;
/**
* Whether the component is enabled in the DOM
*/
enabled?: boolean;
/**
* Corresponds to the display CSS property for the element
*/
display?: azdata.DisplayType;
/**
* Corresponds to the aria-label accessibility attribute for this component
*/
ariaLabel?: string;
/**
* Corresponds to the role accessibility attribute for this component
*/
ariaRole?: string;
/**
* Corresponds to the aria-selected accessibility attribute for this component
*/
ariaSelected?: boolean;
/**
* Matches the CSS style key and its available values.
*/
CSSStyles?: { [key: string]: string };
//#endregion
}

View File

@@ -43,7 +43,7 @@ describe('ControllerModel', function (): void {
});
it('Reads password from cred store', async function (): Promise<void> {
const password = 'password123';
const password = 'password123'; // [SuppressMessage("Microsoft.Security", "CS001:SecretInline", Justification="Test password, not actually used")]
// Set up cred store to return our password
const credProviderMock = TypeMoq.Mock.ofType<azdata.CredentialProvider>();

View File

@@ -1,2 +0,0 @@
pytest==5.3.5
notebook==6.0.3

View File

@@ -3,8 +3,66 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as TypeMoq from 'typemoq';
import * as vscode from 'vscode';
export function createModelViewMock() {
const mockModelBuilder = TypeMoq.Mock.ofType<azdata.ModelBuilder>();
const mockTextBuilder = setupMockComponentBuilder<azdata.TextComponent, azdata.TextComponentProperties>();
const mockInputBoxBuilder = setupMockComponentBuilder<azdata.InputBoxComponent, azdata.InputBoxProperties>();
const mockRadioButtonBuilder = setupMockComponentBuilder<azdata.RadioButtonComponent, azdata.RadioButtonProperties>();
const mockDivBuilder = setupMockContainerBuilder<azdata.DivContainer, azdata.DivContainerProperties, azdata.DivBuilder>();
const mockLoadingBuilder = setupMockLoadingBuilder();
mockModelBuilder.setup(b => b.loadingComponent()).returns(() => mockLoadingBuilder.object);
mockModelBuilder.setup(b => b.text()).returns(() => mockTextBuilder.object);
mockModelBuilder.setup(b => b.inputBox()).returns(() => mockInputBoxBuilder.object);
mockModelBuilder.setup(b => b.radioButton()).returns(() => mockRadioButtonBuilder.object);
mockModelBuilder.setup(b => b.divContainer()).returns(() => mockDivBuilder.object);
const mockModelView = TypeMoq.Mock.ofType<azdata.ModelView>();
mockModelView.setup(mv => mv.modelBuilder).returns(() => mockModelBuilder.object);
return { mockModelView, mockModelBuilder, mockTextBuilder, mockInputBoxBuilder, mockRadioButtonBuilder, mockDivBuilder };
}
function setupMockLoadingBuilder(
loadingBuilderGetter?: (item: azdata.Component) => azdata.LoadingComponentBuilder,
mockLoadingBuilder?: TypeMoq.IMock<azdata.LoadingComponentBuilder>
): TypeMoq.IMock<azdata.LoadingComponentBuilder> {
mockLoadingBuilder = mockLoadingBuilder ?? setupMockComponentBuilder<azdata.LoadingComponent, azdata.LoadingComponentProperties, azdata.LoadingComponentBuilder>();
let item: azdata.Component;
mockLoadingBuilder.setup(b => b.withItem(TypeMoq.It.isAny())).callback((_item) => item = _item).returns(() => loadingBuilderGetter ? loadingBuilderGetter(item) : mockLoadingBuilder!.object);
return mockLoadingBuilder;
}
export function setupMockComponentBuilder<T extends azdata.Component, P extends azdata.ComponentProperties, B extends azdata.ComponentBuilder<T, P> = azdata.ComponentBuilder<T, P>>(
componentGetter?: (props: P) => T,
mockComponentBuilder?: TypeMoq.IMock<B>,
): TypeMoq.IMock<B> {
mockComponentBuilder = mockComponentBuilder ?? TypeMoq.Mock.ofType<B>();
const returnComponent = TypeMoq.Mock.ofType<T>();
// Need to set up 'then' for when a mocked object is resolved, otherwise the test will hang: https://github.com/florinn/typemoq/issues/66
returnComponent.setup((x: any) => x.then).returns(() => { });
let compProps: P;
mockComponentBuilder.setup(b => b.withProperties(TypeMoq.It.isAny())).callback((props: P) => compProps = props).returns(() => mockComponentBuilder!.object);
mockComponentBuilder.setup(b => b.component()).returns(() => {
return componentGetter ? componentGetter(compProps) : Object.assign<T, P>(Object.assign({}, returnComponent.object), compProps);
});
// For now just have these be passthrough - can hook up additional functionality later if needed
mockComponentBuilder.setup(b => b.withValidation(TypeMoq.It.isAny())).returns(() => mockComponentBuilder!.object);
return mockComponentBuilder;
}
export function setupMockContainerBuilder<T extends azdata.Container<any, any>, P extends azdata.ComponentProperties, B extends azdata.ContainerBuilder<T, any, any, any> = azdata.ContainerBuilder<T, any, any, any>>(
mockContainerBuilder?: TypeMoq.IMock<B>
): TypeMoq.IMock<B> {
mockContainerBuilder = mockContainerBuilder ?? setupMockComponentBuilder<T, P, B>();
// For now just have these be passthrough - can hook up additional functionality later if needed
mockContainerBuilder.setup(b => b.withItems(TypeMoq.It.isAny(), undefined)).returns(() => mockContainerBuilder!.object);
mockContainerBuilder.setup(b => b.withLayout(TypeMoq.It.isAny())).returns(() => mockContainerBuilder!.object);
return mockContainerBuilder;
}
export class MockInputBox implements vscode.InputBox {
private _value: string = '';
public get value(): string {

View File

@@ -1,111 +0,0 @@
##---------------------------------------------------------------------------------------------
## Copyright (c) Microsoft Corporation. All rights reserved.
## Licensed under the Source EULA. See License.txt in the project root for license information.
##--------------------------------------------------------------------------------------------
import json
import nbformat
import os
import random
import string
import sys
import uuid
from nbconvert.preprocessors import ExecutePreprocessor
from subprocess import Popen, PIPE, TimeoutExpired
## Variables
notebook_path = '../../notebooks/arcDeployment/'
## Helper functions
def generate_name(prefix, length=8):
return (prefix + '-' + ''.join(
[random.choice(string.ascii_lowercase)
for n in range(length - len(prefix) - 1)]))
def clear_env():
for k in [k for k in os.environ.keys() if k.startswith('AZDATA_NB_VAR_')]:
del os.environ[k]
def azdata(commands, timeout=None, stdin=None):
commands.insert(0, "azdata")
print('Executing command: \n', ' '.join(commands))
proc = Popen(commands, stdin=PIPE if stdin is not None else None, stdout=PIPE, stderr=PIPE, shell=os.name=='nt')
try:
(stdout, stderr) = proc.communicate(input=stdin, timeout=timeout)
except TimeoutExpired:
# https://docs.python.org/3.5/library/subprocess.html#subprocess.Popen.communicate
# The child process is not killed if the timeout expires, so in order to
# cleanup properly we should kill the child process and finish communication.
proc.kill()
(stdout, stderr) = proc.communicate(timeout=timeout)
sys.stdout.buffer.write(stdout)
sys.stderr.buffer.write(stderr)
raise
sys.stdout.buffer.write(stdout)
if proc.returncode != 0:
raise Exception(stderr)
else:
sys.stderr.buffer.write(stderr)
return (stdout.decode(sys.stdout.encoding),
stderr.decode(sys.stderr.encoding))
## Tests
def test_postgres_create():
# Load the notebook
with open(notebook_path + 'deploy.postgres.existing.arc.ipynb') as f:
nb = nbformat.read(f, as_version=nbformat.NO_CONVERT)
name = generate_name('pg')
try:
# Setup the environment
os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAME'] = name
subscription = os.environ['AZDATA_NB_VAR_ARC_SUBSCRIPTION'] = str(uuid.uuid4())
resource_group = os.environ['AZDATA_NB_VAR_ARC_RESOURCE_GROUP_NAME'] = 'test'
namespace = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_NAMESPACE'] = 'default'
workers = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_WORKERS'] = '1'
service_type = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_SERVICE_TYPE'] = 'NodePort'
data_size = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_DATA_SIZE'] = '512'
port = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_PORT'] = '5431'
extensions = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_EXTENSIONS'] = 'pg_cron,postgis'
cpu_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CPU_MIN'] = '1'
cpu_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_CPU_MAX'] = '2'
memory_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_MIN'] = '256'
memory_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_MEMORY_MAX'] = '1023'
backup_sizes = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_SIZES'] = '512,1023'
backup_full_interval = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_FULL_INTERVAL'] = '20'
backup_delta_interval = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_DELTA_INTERVAL'] = '10'
backup_retention_min = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_RETENTION_MIN'] = '1,1GB;2,2GB'
backup_retention_max = os.environ['AZDATA_NB_VAR_POSTGRES_SERVER_GROUP_BACKUP_RETENTION_MAX'] = '2,2GB;3,3GB'
# Execute the notebook that creates Postgres
ExecutePreprocessor(timeout=1200).preprocess(nb, {'metadata': {'path': notebook_path}})
# Verify that Postgres was created successfully
(out, _) = azdata(['postgres', 'server', 'show', '-n', name])
db = json.loads(out)
assert db['metadata']['name'] == name
assert db['metadata']['namespace'] == namespace
assert db['spec']['scale']['shards'] == int(workers)
assert db['spec']['service']['type'] == service_type
assert db['spec']['storage']['volumeSize'] == data_size + 'Mi'
assert db['spec']['service']['port'] == int(port)
assert [p['name'] for p in db['spec']['engine']['plugins']] == ['pg_cron' ,'postgis']
assert db['spec']['scheduling']['default']['resources']['requests']['cpu'] == cpu_min
assert db['spec']['scheduling']['default']['resources']['limits']['cpu'] == cpu_max
assert db['spec']['scheduling']['default']['resources']['requests']['memory'] == memory_min + 'Mi'
assert db['spec']['scheduling']['default']['resources']['limits']['memory'] == memory_max + 'Mi'
assert [t['storage']['volumeSize'] for t in db['spec']['backups']['tiers']] == [b + 'Mi' for b in backup_sizes.split(',')]
assert db['spec']['backups']['fullMinutes'] == int(backup_full_interval)
assert db['spec']['backups']['deltaMinutes'] == int(backup_delta_interval)
for i in range(len(db['spec']['backups']['tiers'])):
assert db['spec']['backups']['tiers'][i]['retention']['minimums'] == backup_retention_min.split(';')[i].split(',')
assert db['spec']['backups']['tiers'][i]['retention']['maximums'] == backup_retention_max.split(';')[i].split(',')
except Exception:
# Capture cell outputs to help with debugging
print([c['outputs'] for c in nb['cells'] if c.get('outputs')])
raise
finally:
clear_env()
azdata(['postgres', 'server', 'delete', '-n', name])

View File

@@ -0,0 +1,88 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as should from 'should';
import { getErrorMessage } from '../../../common/utils';
import { RadioOptionsGroup, RadioOptionsInfo } from '../../../ui/components/radioOptionsGroup';
import { FakeRadioButton } from '../../mocks/fakeRadioButton';
import { setupMockComponentBuilder, createModelViewMock } from '../../stubs';
const loadingError = new Error('Error loading options');
const radioOptionsInfo = <RadioOptionsInfo>{
values: [
'value1',
'value2'
],
defaultValue: 'value2'
};
const divItems: azdata.Component[] = [];
let radioOptionsGroup: RadioOptionsGroup;
describe('radioOptionsGroup', function (): void {
beforeEach(async () => {
const { mockModelView, mockRadioButtonBuilder, mockDivBuilder } = createModelViewMock();
mockRadioButtonBuilder.reset(); // reset any previous mock so that we can set our own.
setupMockComponentBuilder<azdata.RadioButtonComponent, azdata.RadioButtonProperties>(
(props) => new FakeRadioButton(props),
mockRadioButtonBuilder,
);
mockDivBuilder.reset(); // reset previous setups so the new setups we are about to create replace them instead of creating a recording chain
// create new setups for the DivContainer with custom behavior
setupMockComponentBuilder<azdata.DivContainer, azdata.DivContainerProperties, azdata.DivBuilder>(
() => <azdata.DivContainer>{
addItem: (item) => { divItems.push(item); },
clearItems: () => { divItems.length = 0; },
get items() { return divItems; },
},
mockDivBuilder
);
radioOptionsGroup = new RadioOptionsGroup(mockModelView.object, (_disposable) => { });
await radioOptionsGroup.load(async () => radioOptionsInfo);
});
it('verify construction and load', async () => {
should(radioOptionsGroup).not.be.undefined();
should(radioOptionsGroup.value).not.be.undefined();
radioOptionsGroup.value!.should.equal('value2', 'radio options group should be the default checked value');
// verify all the radioButtons created in the group
verifyRadioGroup();
});
it('onClick', async () => {
// click the radioButton corresponding to 'value1'
(divItems as FakeRadioButton[]).filter(r => r.value === 'value1').pop()!.click();
radioOptionsGroup.value!.should.equal('value1', 'radio options group should correspond to the radioButton that we clicked');
// verify all the radioButtons created in the group
verifyRadioGroup();
});
it('load throws', async () => {
await radioOptionsGroup.load(() => { throw loadingError; });
// In the error case the div container won't hold radio buttons but a single TextComponent whose value is the error string
divItems.length.should.equal(1, 'There should be only one element in the divContainer when a loading error happens');
const label = divItems[0] as azdata.TextComponent;
should(label.value).not.be.undefined();
label.value!.should.deepEqual(getErrorMessage(loadingError));
should(label.CSSStyles).not.be.undefined();
should(label.CSSStyles!.color).not.be.undefined();
label.CSSStyles!.color.should.equal('Red');
});
});
function verifyRadioGroup() {
const radioButtons = divItems as FakeRadioButton[];
radioButtons.length.should.equal(radioOptionsInfo.values!.length);
radioButtons.forEach(rb => {
should(rb.label).not.be.undefined();
should(rb.value).not.be.undefined();
should(rb.enabled).not.be.undefined();
rb.label!.should.equal(rb.value);
rb.enabled!.should.be.true();
});
}

View File

@@ -3,16 +3,21 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import { ControllerInfo } from 'arc';
import { ControllerInfo, ResourceType } from 'arc';
import 'mocha';
import * as should from 'should';
import * as TypeMoq from 'typemoq';
import * as sinon from 'sinon';
import { v4 as uuid } from 'uuid';
import * as vscode from 'vscode';
import * as azdataExt from 'azdata-ext';
import { ControllerModel } from '../../../models/controllerModel';
import { MiaaModel } from '../../../models/miaaModel';
import { AzureArcTreeDataProvider } from '../../../ui/tree/azureArcTreeDataProvider';
import { ControllerTreeNode } from '../../../ui/tree/controllerTreeNode';
import { MiaaTreeNode } from '../../../ui/tree/miaaTreeNode';
import { FakeControllerModel } from '../../mocks/fakeControllerModel';
import { FakeAzdataApi } from '../../mocks/fakeAzdataApi';
describe('AzureArcTreeDataProvider tests', function (): void {
let treeDataProvider: AzureArcTreeDataProvider;
@@ -84,6 +89,27 @@ describe('AzureArcTreeDataProvider tests', function (): void {
let children = await treeDataProvider.getChildren();
should(children.length).equal(0, 'After loading we should have 0 children');
});
it('should return all children of controller after loading', async function (): Promise<void> {
const mockArcExtension = TypeMoq.Mock.ofType<vscode.Extension<any>>();
const mockArcApi = TypeMoq.Mock.ofType<azdataExt.IExtension>();
mockArcExtension.setup(x => x.exports).returns(() => {
return mockArcApi.object;
});
const fakeAzdataApi = new FakeAzdataApi();
fakeAzdataApi.postgresInstances = [{ name: 'pg1', state: '', workers: 0 }];
fakeAzdataApi.miaaInstances = [{ name: 'miaa1', state: '', replicas: '', serverEndpoint: '' }];
mockArcApi.setup(x => x.azdata).returns(() => fakeAzdataApi);
sinon.stub(vscode.extensions, 'getExtension').returns(mockArcExtension.object);
const controllerModel = new ControllerModel(treeDataProvider, { id: uuid(), url: '127.0.0.1', name: 'my-arc', username: 'sa', rememberPassword: true, resources: [] }, 'mypassword');
await treeDataProvider.addOrUpdateController(controllerModel, '');
const controllerNode = treeDataProvider.getControllerNode(controllerModel);
const children = await treeDataProvider.getChildren(controllerNode);
should(children.filter(c => c.label === fakeAzdataApi.postgresInstances[0].name).length).equal(1, 'Should have a Postgres child');
should(children.filter(c => c.label === fakeAzdataApi.miaaInstances[0].name).length).equal(1, 'Should have a MIAA child');
should(children.length).equal(2, 'Should have exactly 2 children');
});
});
describe('removeController', function (): void {
@@ -104,4 +130,31 @@ describe('AzureArcTreeDataProvider tests', function (): void {
should((await treeDataProvider.getChildren()).length).equal(0, 'Removing other node again should do nothing');
});
});
describe('openResourceDashboard', function (): void {
it('Opening dashboard for nonexistent controller node throws', async function (): Promise<void> {
const controllerModel = new ControllerModel(treeDataProvider, { id: uuid(), url: '127.0.0.1', name: 'my-arc', username: 'sa', rememberPassword: true, resources: [] });
const openDashboardPromise = treeDataProvider.openResourceDashboard(controllerModel, ResourceType.sqlManagedInstances, '');
await should(openDashboardPromise).be.rejected();
});
it('Opening dashboard for nonexistent resource throws', async function (): Promise<void> {
const controllerModel = new ControllerModel(treeDataProvider, { id: uuid(), url: '127.0.0.1', name: 'my-arc', username: 'sa', rememberPassword: true, resources: [] });
await treeDataProvider.addOrUpdateController(controllerModel, '');
const openDashboardPromise = treeDataProvider.openResourceDashboard(controllerModel, ResourceType.sqlManagedInstances, '');
await should(openDashboardPromise).be.rejected();
});
it('Opening dashboard for existing resource node succeeds', async function (): Promise<void> {
const controllerModel = new ControllerModel(treeDataProvider, { id: uuid(), url: '127.0.0.1', name: 'my-arc', username: 'sa', rememberPassword: true, resources: [] });
const miaaModel = new MiaaModel(controllerModel, { name: 'miaa-1', resourceType: ResourceType.sqlManagedInstances }, undefined!, treeDataProvider);
await treeDataProvider.addOrUpdateController(controllerModel, '');
const controllerNode = treeDataProvider.getControllerNode(controllerModel)!;
const resourceNode = new MiaaTreeNode(miaaModel, controllerModel);
sinon.stub(controllerNode, 'getResourceNode').returns(resourceNode);
const showDashboardStub = sinon.stub(resourceNode, 'openDashboard');
await treeDataProvider.openResourceDashboard(controllerModel, ResourceType.sqlManagedInstances, '');
should(showDashboardStub.calledOnce).be.true('openDashboard should have been called exactly once');
});
});
});

View File

@@ -3,7 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
declare module 'arc' {
import * as vscode from 'vscode';
/**
* Covers defining what the arc extension exports to other extensions
@@ -20,6 +19,10 @@ declare module 'arc' {
sqlManagedInstances = 'sqlManagedInstances'
}
export type MiaaResourceInfo = ResourceInfo & {
userName?: string
};
export type ResourceInfo = {
name: string,
resourceType: ResourceType | string,
@@ -35,7 +38,13 @@ declare module 'arc' {
resources: ResourceInfo[]
};
export interface DataController {
label: string,
info: ControllerInfo
}
export interface IExtension {
getRegisteredDataControllers(): Promise<ControllerInfo[]>;
getRegisteredDataControllers(): Promise<DataController[]>;
getControllerPassword(controllerInfo: ControllerInfo): Promise<string>;
reacquireControllerPassword(controllerInfo: ControllerInfo, password: string, retryCount?: number): Promise<string>;
}
}

View File

@@ -8,3 +8,4 @@
/// <reference path='../../../azurecore/src/azurecore.d.ts'/>
/// <reference path='../../../../src/vs/vscode.d.ts'/>
/// <reference path='../../../azdata/src/typings/azdata-ext.d.ts'/>
/// <reference path='../../../resource-deployment/src/typings/resource-deployment.d.ts'/>

View File

@@ -9,7 +9,7 @@ export abstract class Dashboard {
private dashboard!: azdata.window.ModelViewDashboard;
constructor(protected title: string) { }
constructor(protected title: string, protected readonly name: string) { }
public async showDashboard(): Promise<void> {
this.dashboard = this.createDashboard();
@@ -17,7 +17,7 @@ export abstract class Dashboard {
}
protected createDashboard(): azdata.window.ModelViewDashboard {
const dashboard = azdata.window.createModelViewDashboard(this.title);
const dashboard = azdata.window.createModelViewDashboard(this.title, this.name);
dashboard.registerTabs(async modelView => {
return await this.registerTabs(modelView);
});

View File

@@ -0,0 +1,72 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as vscode from 'vscode';
import { getErrorMessage } from '../../common/utils';
export interface RadioOptionsInfo {
values?: string[],
defaultValue: string
}
export class RadioOptionsGroup {
static id: number = 1;
private _divContainer!: azdata.DivContainer;
private _loadingBuilder: azdata.LoadingComponentBuilder;
private _currentRadioOption!: azdata.RadioButtonComponent;
constructor(private _view: azdata.ModelView, private _onNewDisposableCreated: (disposable: vscode.Disposable) => void, private _groupName: string = `RadioOptionsGroup${RadioOptionsGroup.id++}`) {
const divBuilder = this._view.modelBuilder.divContainer();
const divBuilderWithProperties = divBuilder.withProperties<azdata.DivContainerProperties>({ clickable: false });
this._divContainer = divBuilderWithProperties.component();
const loadingComponentBuilder = this._view.modelBuilder.loadingComponent();
this._loadingBuilder = loadingComponentBuilder.withItem(this._divContainer);
}
public component(): azdata.LoadingComponent {
return this._loadingBuilder.component();
}
async load(optionsInfoGetter: () => Promise<RadioOptionsInfo>): Promise<void> {
this.component().loading = true;
this._divContainer.clearItems();
try {
const optionsInfo = await optionsInfoGetter();
const options = optionsInfo.values!;
let defaultValue: string = optionsInfo.defaultValue!;
options.forEach((option: string) => {
const radioOption = this._view!.modelBuilder.radioButton().withProperties<azdata.RadioButtonProperties>({
label: option,
checked: option === defaultValue,
name: this._groupName,
value: option,
enabled: true
}).component();
if (radioOption.checked) {
this._currentRadioOption = radioOption;
}
this._onNewDisposableCreated(radioOption.onDidClick(() => {
if (this._currentRadioOption !== radioOption) {
// Uncheck the previously selected radio option. The UI would still behave correctly without this because the buttons share a 'groupName',
// but the 'checked' property of the previous button would not be updated, so clear it explicitly to keep the component state consistent.
this._currentRadioOption.checked = false;
this._currentRadioOption = radioOption;
}
}));
this._divContainer.addItem(radioOption);
});
}
catch (e) {
const errorLabel = this._view!.modelBuilder.text().withProperties({ value: getErrorMessage(e), CSSStyles: { 'color': 'Red' } }).component();
this._divContainer.addItem(errorLabel);
}
this.component().loading = false;
}
get value(): string | undefined {
return this._currentRadioOption?.value;
}
}
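A hedged usage sketch for the group above (the option values and surrounding callback are made up for illustration): a consumer constructs the group with its ModelView, registers disposables through the callback, and calls load() with an async options getter before adding the returned loading component to a layout.
// Hypothetical consumer, e.g. inside a dialog's registerContent callback.
async function addConnectivityModePicker(view: azdata.ModelView, disposables: vscode.Disposable[]): Promise<azdata.LoadingComponent> {
	const group = new RadioOptionsGroup(view, d => disposables.push(d));
	// load() turns the loading spinner on, awaits the getter, then renders one radio button per value.
	await group.load(async () => <RadioOptionsInfo>{ values: ['direct', 'indirect'], defaultValue: 'direct' });
	console.log(`Selected connectivity mode: ${group.value}`);
	return group.component();
}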

View File

@@ -12,7 +12,7 @@ import * as loc from '../../../localizedConstants';
export class ControllerDashboard extends Dashboard {
constructor(private _controllerModel: ControllerModel) {
super(loc.arcControllerDashboard(_controllerModel.info.name));
super(loc.arcControllerDashboard(_controllerModel.info.name), 'ArcDataControllerDashboard');
}
public async showDashboard(): Promise<void> {

View File

@@ -7,8 +7,8 @@ import { ResourceType } from 'arc';
import * as azdata from 'azdata';
import * as azurecore from 'azurecore';
import * as vscode from 'vscode';
import { getConnectionModeDisplayText, getResourceTypeIcon, parseInstanceName, resourceTypeToDisplayName } from '../../../common/utils';
import { cssStyles, Endpoints, IconPathHelper, iconSize } from '../../../constants';
import { getConnectionModeDisplayText, getResourceTypeIcon, resourceTypeToDisplayName } from '../../../common/utils';
import { cssStyles, Endpoints, IconPathHelper, controllerTroubleshootDocsUrl, iconSize } from '../../../constants';
import * as loc from '../../../localizedConstants';
import { ControllerModel } from '../../../models/controllerModel';
import { DashboardPage } from '../../components/dashboardPage';
@@ -93,7 +93,7 @@ export class ControllerDashboardOverviewPage extends DashboardPage {
headerCssStyles: cssStyles.tableHeader,
rowCssStyles: cssStyles.tableRow
}, {
displayName: loc.compute,
displayName: loc.state,
valueType: azdata.DeclarativeDataType.string,
width: '34%',
isReadOnly: true,
@@ -178,18 +178,30 @@ export class ControllerDashboardOverviewPage extends DashboardPage {
this._openInAzurePortalButton.onDidClick(async () => {
const config = this._controllerModel.controllerConfig;
if (config) {
vscode.env.openExternal(vscode.Uri.parse(
await vscode.env.openExternal(vscode.Uri.parse(
`https://portal.azure.com/#resource/subscriptions/${config.spec.settings.azure.subscription}/resourceGroups/${config.spec.settings.azure.resourceGroup}/providers/Microsoft.AzureData/${ResourceType.dataControllers}/${config.metadata.name}`));
} else {
vscode.window.showErrorMessage(loc.couldNotFindControllerRegistration);
}
}));
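// Troubleshoot button: opens the data controller troubleshooting docs (controllerTroubleshootDocsUrl) in an external browser.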
const troubleshootButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.troubleshoot,
iconPath: IconPathHelper.wrench
}).component();
this.disposables.push(
troubleshootButton.onDidClick(async () => {
await vscode.env.openExternal(vscode.Uri.parse(controllerTroubleshootDocsUrl));
})
);
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems(
[
{ component: newInstance },
{ component: refreshButton, toolbarSeparatorAfter: true },
{ component: this._openInAzurePortalButton }
{ component: this._openInAzurePortalButton, toolbarSeparatorAfter: true },
{ component: troubleshootButton }
]
).component();
}
@@ -219,26 +231,18 @@ export class ControllerDashboardOverviewPage extends DashboardPage {
iconHeight: iconSize,
iconWidth: iconSize
}).component();
let nameComponent: azdata.Component;
if (r.instanceType === ResourceType.postgresInstances) {
nameComponent = this.modelView.modelBuilder.text()
.withProperties<azdata.TextComponentProperties>({
value: r.instanceName || '',
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
} else {
nameComponent = this.modelView.modelBuilder.hyperlink()
.withProperties<azdata.HyperlinkComponentProperties>({
label: r.instanceName || '',
url: ''
}).component();
(<azdata.HyperlinkComponent>nameComponent).onDidClick(async () => {
await this._controllerModel.treeDataProvider.openResourceDashboard(this._controllerModel, r.instanceType || '', parseInstanceName(r.instanceName));
});
}
// TODO chgagnon
return [imageComponent, nameComponent, resourceTypeToDisplayName(r.instanceType), '-'/* loc.numVCores(r.vCores) */];
const nameComponent = this.modelView.modelBuilder.hyperlink()
.withProperties<azdata.HyperlinkComponentProperties>({
label: r.instanceName || '',
url: ''
}).component();
this.disposables.push(nameComponent.onDidClick(async () => {
await this._controllerModel.treeDataProvider.openResourceDashboard(this._controllerModel, r.instanceType || '', r.instanceName);
}));
return [imageComponent, nameComponent, resourceTypeToDisplayName(r.instanceType), r.state];
});
this._arcResourcesLoadingComponent.loading = !this._controllerModel.registrationsLastUpdated;
}


@@ -0,0 +1,369 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as vscode from 'vscode';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as loc from '../../../localizedConstants';
import { IconPathHelper, cssStyles } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
import { convertToGibibyteString } from '../../../common/utils';
import { MiaaModel } from '../../../models/miaaModel';
export class MiaaComputeAndStoragePage extends DashboardPage {
private configurationContainer?: azdata.DivContainer;
private coresLimitBox?: azdata.InputBoxComponent;
private coresRequestBox?: azdata.InputBoxComponent;
private memoryLimitBox?: azdata.InputBoxComponent;
private memoryRequestBox?: azdata.InputBoxComponent;
private discardButton?: azdata.ButtonComponent;
private saveButton?: azdata.ButtonComponent;
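// Pending edits collected from the input boxes and passed to the azdata edit call; invalid or
// empty inputs reset the corresponding field to undefined.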
private saveArgs: {
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string
} = {};
private readonly _azdataApi: azdataExt.IExtension;
constructor(protected modelView: azdata.ModelView, private _miaaModel: MiaaModel) {
super(modelView);
this._azdataApi = vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
this.initializeConfigurationBoxes();
this.disposables.push(this._miaaModel.onConfigUpdated(
() => this.eventuallyRunOnInitialized(() => this.handleServiceUpdated())));
}
protected get title(): string {
return loc.computeAndStorage;
}
protected get id(): string {
return 'miaa-compute-and-storage';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.computeStorage;
}
protected get container(): azdata.Component {
const root = this.modelView.modelBuilder.divContainer().component();
const content = this.modelView.modelBuilder.divContainer().component();
root.addItem(content, { CSSStyles: { 'margin': '20px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorage,
CSSStyles: { ...cssStyles.title }
}).component());
const infoComputeStorage_p1 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.miaaComputeAndStorageDescriptionPartOne,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px', 'max-width': 'auto' }
}).component();
const memoryVCoreslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.scalingCompute,
url: 'https://docs.microsoft.com/azure/azure-arc/data/configure-managed-instance',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p4 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFour,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p5 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFive,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p6 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartSix,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const computeInfoAndLinks = this.modelView.modelBuilder.flexContainer()
.withLayout({ flexWrap: 'wrap' })
.withItems([
infoComputeStorage_p1,
memoryVCoreslink,
infoComputeStorage_p4,
infoComputeStorage_p5,
infoComputeStorage_p6
], { CSSStyles: { 'margin-right': '5px' } }).component();
content.addItem(computeInfoAndLinks, { CSSStyles: { 'min-height': '30px' } });
this.configurationContainer = this.modelView.modelBuilder.divContainer().component();
this.configurationContainer.addItems(this.createUserInputSection(), { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.configurationContainer, { CSSStyles: { 'margin-top': '30px' } });
this.initialized = true;
return root;
}
protected get toolbarContainer(): azdata.ToolbarContainer {
// Save Edits
this.saveButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.saveText,
iconPath: IconPathHelper.save,
enabled: false
}).component();
this.disposables.push(
this.saveButton.onDidClick(async () => {
this.saveButton!.enabled = false;
try {
await vscode.window.withProgress(
{
location: vscode.ProgressLocation.Notification,
title: loc.updatingInstance(this._miaaModel.info.name),
cancellable: false
},
async (_progress, _token): Promise<void> => {
try {
await this._azdataApi.azdata.arc.sql.mi.edit(
this._miaaModel.info.name, this.saveArgs);
} catch (err) {
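// Re-enable the save button since the edit wasn't successfully applied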
this.saveButton!.enabled = true;
throw err;
}
await this._miaaModel.refresh();
}
);
vscode.window.showInformationMessage(loc.instanceUpdated(this._miaaModel.info.name));
this.discardButton!.enabled = false;
} catch (error) {
vscode.window.showErrorMessage(loc.instanceUpdateFailed(this._miaaModel.info.name, error));
}
}));
// Discard
this.discardButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.discardText,
iconPath: IconPathHelper.discard,
enabled: false
}).component();
this.disposables.push(
this.discardButton.onDidClick(async () => {
this.discardButton!.enabled = false;
try {
this.editCores();
this.editMemory();
} catch (error) {
vscode.window.showErrorMessage(loc.pageDiscardFailed(error));
} finally {
this.saveButton!.enabled = false;
}
}));
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems([
{ component: this.saveButton },
{ component: this.discardButton }
]).component();
}
private initializeConfigurationBoxes() {
this.coresLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
validationErrorMessage: loc.coresValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresLimitBox!))) {
this.saveArgs.coresLimit = undefined;
} else {
this.saveArgs.coresLimit = this.coresLimitBox!.value;
}
})
);
this.coresRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
validationErrorMessage: loc.coresValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresRequestBox!))) {
this.saveArgs.coresRequest = undefined;
} else {
this.saveArgs.coresRequest = this.coresRequestBox!.value;
}
})
);
this.memoryLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 2,
validationErrorMessage: loc.memoryLimitValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryLimitBox!))) {
this.saveArgs.memoryLimit = undefined;
} else {
this.saveArgs.memoryLimit = this.memoryLimitBox!.value + 'Gi';
}
})
);
this.memoryRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 2,
validationErrorMessage: loc.memoryRequestValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryRequestBox!))) {
this.saveArgs.memoryRequest = undefined;
} else {
this.saveArgs.memoryRequest = this.memoryRequestBox!.value + 'Gi';
}
})
);
}
private createUserInputSection(): azdata.Component[] {
if (this._miaaModel.configLastUpdated) {
this.editCores();
this.editMemory();
}
return [
this.createConfigurationSectionContainer(loc.coresRequest, this.coresRequestBox!),
this.createConfigurationSectionContainer(loc.coresLimit, this.coresLimitBox!),
this.createConfigurationSectionContainer(loc.memoryRequest, this.memoryRequestBox!),
this.createConfigurationSectionContainer(loc.memoryLimit, this.memoryLimitBox!)
];
}
private createConfigurationSectionContainer(key: string, input: azdata.Component): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: key,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(input, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private handleOnTextChanged(component: azdata.InputBoxComponent): boolean {
if ((!component.value)) {
// No text in the input box, so there is nothing to save
return false;
} else if ((!component.valid)) {
// The value is invalid: enable the discard button so the user can
// clear all inputs, and return false
this.discardButton!.enabled = true;
return false;
} else {
// A valid value was entered: enable save and discard so the user can
// either apply the edit to the instance or clear all inputs, then return true
this.saveButton!.enabled = true;
this.discardButton!.enabled = true;
return true;
}
}
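// Reset the vCore boxes: show the current request/limit from the service config as placeholders and clear any pending edits.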
private editCores(): void {
let currentCPUSize = this._miaaModel.config?.spec?.requests?.vcores;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresRequestBox!.placeHolder = currentCPUSize;
this.coresRequestBox!.value = '';
this.saveArgs.coresRequest = undefined;
currentCPUSize = this._miaaModel.config?.spec?.limits?.vcores;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresLimitBox!.placeHolder = currentCPUSize;
this.coresLimitBox!.value = '';
this.saveArgs.coresLimit = undefined;
}
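// Reset the memory boxes: convert the current request/limit to a gibibyte string for the placeholders and clear any pending edits.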
private editMemory(): void {
let currentMemSizeConversion: string;
let currentMemorySize = this._miaaModel.config?.spec?.requests?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryRequestBox!.placeHolder = currentMemSizeConversion!;
this.memoryRequestBox!.value = '';
this.saveArgs.memoryRequest = undefined;
currentMemorySize = this._miaaModel.config?.spec?.limits?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryLimitBox!.placeHolder = currentMemSizeConversion!;
this.memoryLimitBox!.value = '';
this.saveArgs.memoryLimit = undefined;
}
private handleServiceUpdated() {
this.editCores();
this.editMemory();
}
}


@@ -91,8 +91,7 @@ export class MiaaConnectionStringsPage extends DashboardPage {
$serverName = "${externalEndpoint.ip},${externalEndpoint.port}";
$conn = sqlsrv_connect($serverName, $connectionInfo);`),
new InputKeyValue(this.modelView.modelBuilder, 'Python', `dbname='master' user='${username}' host='${externalEndpoint.ip}' password='{your_password_here}' port='${externalEndpoint.port}' sslmode='true'`),
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${externalEndpoint.ip}; user=${username} password={your_password_here} port=${externalEndpoint.port} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'Web App', `Database=master; Data Source=${externalEndpoint.ip}; User Id=${username}; Password={your_password_here}`)
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${externalEndpoint.ip}; user=${username} password={your_password_here} port=${externalEndpoint.port} sslmode=require`)
];
}


@@ -10,11 +10,12 @@ import { ControllerModel } from '../../../models/controllerModel';
import * as loc from '../../../localizedConstants';
import { MiaaConnectionStringsPage } from './miaaConnectionStringsPage';
import { MiaaModel } from '../../../models/miaaModel';
import { MiaaComputeAndStoragePage } from './miaaComputeAndStoragePage';
export class MiaaDashboard extends Dashboard {
constructor(private _controllerModel: ControllerModel, private _miaaModel: MiaaModel) {
super(loc.miaaDashboard(_miaaModel.info.name));
super(loc.miaaDashboard(_miaaModel.info.name), 'ArcMiaaDashboard');
}
public async showDashboard(): Promise<void> {
@@ -27,12 +28,14 @@ export class MiaaDashboard extends Dashboard {
protected async registerTabs(modelView: azdata.ModelView): Promise<(azdata.DashboardTab | azdata.DashboardTabGroup)[]> {
const overviewPage = new MiaaDashboardOverviewPage(modelView, this._controllerModel, this._miaaModel);
const connectionStringsPage = new MiaaConnectionStringsPage(modelView, this._controllerModel, this._miaaModel);
const computeAndStoragePage = new MiaaComputeAndStoragePage(modelView, this._miaaModel);
return [
overviewPage.tab,
{
title: loc.settings,
tabs: [
connectionStringsPage.tab
connectionStringsPage.tab,
computeAndStoragePage.tab
]
},
];


@@ -7,8 +7,8 @@ import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as azurecore from 'azurecore';
import * as vscode from 'vscode';
import { getDatabaseStateDisplayText, promptForResourceDeletion } from '../../../common/utils';
import { cssStyles, Endpoints, IconPathHelper } from '../../../constants';
import { getDatabaseStateDisplayText, promptForInstanceDeletion } from '../../../common/utils';
import { cssStyles, IconPathHelper, miaaTroubleshootDocsUrl } from '../../../constants';
import * as loc from '../../../localizedConstants';
import { ControllerModel } from '../../../models/controllerModel';
import { MiaaModel } from '../../../models/miaaModel';
@@ -198,13 +198,22 @@ export class MiaaDashboardOverviewPage extends DashboardPage {
deleteButton.onDidClick(async () => {
deleteButton.enabled = false;
try {
if (await promptForResourceDeletion(this._miaaModel.info.name)) {
await this._azdataApi.azdata.arc.sql.mi.delete(this._miaaModel.info.name);
if (await promptForInstanceDeletion(this._miaaModel.info.name)) {
await vscode.window.withProgress(
{
location: vscode.ProgressLocation.Notification,
title: loc.deletingInstance(this._miaaModel.info.name),
cancellable: false
},
(_progress, _token) => {
return this._azdataApi.azdata.arc.sql.mi.delete(this._miaaModel.info.name);
}
);
await this._controllerModel.refreshTreeNode();
vscode.window.showInformationMessage(loc.resourceDeleted(this._miaaModel.info.name));
vscode.window.showInformationMessage(loc.instanceDeleted(this._miaaModel.info.name));
}
} catch (error) {
vscode.window.showErrorMessage(loc.resourceDeletionFailed(this._miaaModel.info.name, error));
vscode.window.showErrorMessage(loc.instanceDeletionFailed(this._miaaModel.info.name, error));
} finally {
deleteButton.enabled = true;
}
@@ -248,6 +257,17 @@ export class MiaaDashboardOverviewPage extends DashboardPage {
}
}));
const troubleshootButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.troubleshoot,
iconPath: IconPathHelper.wrench
}).component();
this.disposables.push(
troubleshootButton.onDidClick(async () => {
await vscode.env.openExternal(vscode.Uri.parse(miaaTroubleshootDocsUrl));
})
);
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems(
[
{ component: deleteButton },
@@ -332,19 +352,13 @@ export class MiaaDashboardOverviewPage extends DashboardPage {
}
private refreshDashboardLinks(): void {
const kibanaEndpoint = this._controllerModel.getEndpoint(Endpoints.logsui);
if (kibanaEndpoint && this._miaaModel.config) {
const kibanaQuery = `kubernetes_namespace:"${this._miaaModel.config.metadata.namespace}" and custom_resource_name :"${this._miaaModel.config.metadata.name}"`;
const kibanaUrl = `${kibanaEndpoint.endpoint}/app/kibana#/discover?_a=(query:(language:kuery,query:'${kibanaQuery}'))`;
if (this._miaaModel.config) {
const kibanaUrl = this._miaaModel.config.status.logSearchDashboard ?? '';
this._kibanaLink.label = kibanaUrl;
this._kibanaLink.url = kibanaUrl;
this._kibanaLoading!.loading = false;
}
const grafanaEndpoint = this._controllerModel.getEndpoint(Endpoints.metricsui);
if (grafanaEndpoint && this._miaaModel.config) {
const grafanaQuery = `var-hostname=${this._miaaModel.info.name}-0`;
const grafanaUrl = grafanaEndpoint ? `${grafanaEndpoint.endpoint}/d/40q72HnGk/sql-managed-instance-metrics?${grafanaQuery}` : '';
const grafanaUrl = this._miaaModel.config.status.metricsDashboard ?? '';
this._grafanaLink.label = grafanaUrl;
this._grafanaLink.url = grafanaUrl;
this._grafanaLoading!.loading = false;


@@ -1,31 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as loc from '../../../localizedConstants';
import { IconPathHelper } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
export class PostgresBackupPage extends DashboardPage {
protected get title(): string {
return loc.backup;
}
protected get id(): string {
return 'postgres-backup';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.backup;
}
protected get container(): azdata.Component {
return this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({ value: loc.backup }).component();
}
protected get toolbarContainer(): azdata.ToolbarContainer {
return this.modelView.modelBuilder.toolbarContainer().component();
}
}


@@ -0,0 +1,500 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as vscode from 'vscode';
import * as azdata from 'azdata';
import * as azdataExt from 'azdata-ext';
import * as loc from '../../../localizedConstants';
import { IconPathHelper, cssStyles } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
import { PostgresModel } from '../../../models/postgresModel';
import { convertToGibibyteString } from '../../../common/utils';
export class PostgresComputeAndStoragePage extends DashboardPage {
private workerContainer?: azdata.DivContainer;
private workerBox?: azdata.InputBoxComponent;
private coresLimitBox?: azdata.InputBoxComponent;
private coresRequestBox?: azdata.InputBoxComponent;
private memoryLimitBox?: azdata.InputBoxComponent;
private memoryRequestBox?: azdata.InputBoxComponent;
private discardButton?: azdata.ButtonComponent;
private saveButton?: azdata.ButtonComponent;
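// Pending edits (worker count, vCores, memory) collected from the input boxes and passed to the azdata edit call.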
private saveArgs: {
workers?: number,
coresLimit?: string,
coresRequest?: string,
memoryLimit?: string,
memoryRequest?: string
} = {};
private readonly _azdataApi: azdataExt.IExtension;
constructor(protected modelView: azdata.ModelView, private _postgresModel: PostgresModel) {
super(modelView);
this._azdataApi = vscode.extensions.getExtension(azdataExt.extension.name)?.exports;
this.initializeConfigurationBoxes();
this.disposables.push(this._postgresModel.onConfigUpdated(
() => this.eventuallyRunOnInitialized(() => this.handleServiceUpdated())));
}
protected get title(): string {
return loc.computeAndStorage;
}
protected get id(): string {
return 'postgres-compute-and-storage';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.computeStorage;
}
protected get container(): azdata.Component {
const root = this.modelView.modelBuilder.divContainer().component();
const content = this.modelView.modelBuilder.divContainer().component();
root.addItem(content, { CSSStyles: { 'margin': '20px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorage,
CSSStyles: { ...cssStyles.title }
}).component());
const infoComputeStorage_p1 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.postgresComputeAndStorageDescriptionPartOne,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px', 'max-width': 'auto' }
}).component();
const infoComputeStorage_p2 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.postgresComputeAndStorageDescriptionPartTwo,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const workerNodeslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.addingWokerNodes,
url: 'https://docs.microsoft.com/azure/azure-arc/data/scale-up-down-postgresql-hyperscale-server-group-using-cli',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p3 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartThree,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const memoryVCoreslink = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.scalingCompute,
url: 'https://docs.microsoft.com/azure/azure-arc/data/scale-up-down-postgresql-hyperscale-server-group-using-cli',
CSSStyles: { 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p4 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFour,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p5 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartFive,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const infoComputeStorage_p6 = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.computeAndStorageDescriptionPartSix,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const computeInfoAndLinks = this.modelView.modelBuilder.flexContainer()
.withLayout({ flexWrap: 'wrap' })
.withItems([
infoComputeStorage_p1,
infoComputeStorage_p2,
workerNodeslink,
infoComputeStorage_p3,
memoryVCoreslink,
infoComputeStorage_p4,
infoComputeStorage_p5,
infoComputeStorage_p6
], { CSSStyles: { 'margin-right': '5px' } })
.component();
content.addItem(computeInfoAndLinks, { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.workerNodes,
CSSStyles: { ...cssStyles.title, 'margin-top': '25px' }
}).component());
this.workerContainer = this.modelView.modelBuilder.divContainer().component();
this.workerContainer.addItems(this.createUserInputSection(), { CSSStyles: { 'min-height': '30px' } });
content.addItem(this.workerContainer, { CSSStyles: { 'min-height': '30px' } });
this.initialized = true;
return root;
}
protected get toolbarContainer(): azdata.ToolbarContainer {
// Save Edits
this.saveButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.saveText,
iconPath: IconPathHelper.save,
enabled: false
}).component();
this.disposables.push(
this.saveButton.onDidClick(async () => {
this.saveButton!.enabled = false;
try {
await vscode.window.withProgress(
{
location: vscode.ProgressLocation.Notification,
title: loc.updatingInstance(this._postgresModel.info.name),
cancellable: false
},
async (_progress, _token): Promise<void> => {
try {
await this._azdataApi.azdata.arc.postgres.server.edit(
this._postgresModel.info.name, this.saveArgs);
} catch (err) {
// If an error occurs while editing the instance then re-enable the save button since
// the edit wasn't successfully applied
this.saveButton!.enabled = true;
throw err;
}
await this._postgresModel.refresh();
}
);
vscode.window.showInformationMessage(loc.instanceUpdated(this._postgresModel.info.name));
this.discardButton!.enabled = false;
} catch (error) {
vscode.window.showErrorMessage(loc.instanceUpdateFailed(this._postgresModel.info.name, error));
}
}));
// Discard
this.discardButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.discardText,
iconPath: IconPathHelper.discard,
enabled: false
}).component();
this.disposables.push(
this.discardButton.onDidClick(async () => {
this.discardButton!.enabled = false;
try {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
} catch (error) {
vscode.window.showErrorMessage(loc.pageDiscardFailed(error));
} finally {
this.saveButton!.enabled = false;
}
}));
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems([
{ component: this.saveButton },
{ component: this.discardButton }
]).component();
}
private initializeConfigurationBoxes() {
this.workerBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
validationErrorMessage: loc.workerValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.workerBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.workerBox!))) {
this.saveArgs.workers = undefined;
} else {
this.saveArgs.workers = parseInt(this.workerBox!.value!);
}
})
);
this.coresLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
validationErrorMessage: loc.coresValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresLimitBox!))) {
this.saveArgs.coresLimit = undefined;
} else {
this.saveArgs.coresLimit = this.coresLimitBox!.value;
}
})
);
this.coresRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 1,
validationErrorMessage: loc.coresValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.coresRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.coresRequestBox!))) {
this.saveArgs.coresRequest = undefined;
} else {
this.saveArgs.coresRequest = this.coresRequestBox!.value;
}
})
);
this.memoryLimitBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 0.25,
validationErrorMessage: loc.memoryLimitValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryLimitBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryLimitBox!))) {
this.saveArgs.memoryLimit = undefined;
} else {
this.saveArgs.memoryLimit = this.memoryLimitBox!.value + 'Gi';
}
})
);
this.memoryRequestBox = this.modelView.modelBuilder.inputBox().withProperties<azdata.InputBoxProperties>({
readOnly: false,
min: 0.25,
validationErrorMessage: loc.memoryRequestValidationErrorMessage,
inputType: 'number',
placeHolder: loc.loading
}).component();
this.disposables.push(
this.memoryRequestBox.onTextChanged(() => {
if (!(this.handleOnTextChanged(this.memoryRequestBox!))) {
this.saveArgs.memoryRequest = undefined;
} else {
this.saveArgs.memoryRequest = this.memoryRequestBox!.value + 'Gi';
}
})
);
}
private createUserInputSection(): azdata.Component[] {
if (this._postgresModel.configLastUpdated) {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
}
return [
this.createWorkerNodesSectionContainer(),
this.createCoresMemorySection(),
this.createConfigurationSectionContainer(loc.coresRequest, this.coresRequestBox!),
this.createConfigurationSectionContainer(loc.coresLimit, this.coresLimitBox!),
this.createConfigurationSectionContainer(loc.memoryRequest, this.memoryRequestBox!),
this.createConfigurationSectionContainer(loc.memoryLimit, this.memoryLimitBox!)
];
}
private createWorkerNodesSectionContainer(): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.workerNodeCount,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
const information = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
iconPath: IconPathHelper.information,
title: loc.workerNodesInformation,
width: '12px',
height: '12px',
enabled: false
}).component();
keyContainer.addItem(information, { CSSStyles: { 'margin-left': '5px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(this.workerBox!, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private createConfigurationSectionContainer(key: string, input: azdata.Component): azdata.FlexContainer {
const inputFlex = { flex: '0 1 150px' };
const keyFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const keyComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: key,
CSSStyles: { ...cssStyles.text, 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const keyContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
keyContainer.addItem(keyComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
flexContainer.addItem(keyContainer, keyFlex);
const inputContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
inputContainer.addItem(input, { CSSStyles: { 'margin-bottom': '15px', 'min-width': '50px', 'max-width': '225px' } });
flexContainer.addItem(inputContainer, inputFlex);
return flexContainer;
}
private handleOnTextChanged(component: azdata.InputBoxComponent): boolean {
if ((!component.value)) {
// No text in the input box, so there is nothing to save
return false;
} else if ((!component.valid)) {
// The value is invalid: enable the discard button so the user can
// clear all inputs, and return false
this.discardButton!.enabled = true;
return false;
} else {
// A valid value was entered: enable save and discard so the user can
// either apply the edit to the instance or clear all inputs, then return true
this.saveButton!.enabled = true;
this.discardButton!.enabled = true;
return true;
}
}
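// Reset the worker-count box: the current worker count becomes both the minimum allowed value and the placeholder, and any pending edit is cleared.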
private editWorkerNodeCount() {
// scale.shards was renamed to scale.workers. Check both for backwards compatibility.
let scale = this._postgresModel.config?.spec.scale;
let currentWorkers = scale?.workers ?? scale?.shards ?? 0;
this.workerBox!.min = currentWorkers;
this.workerBox!.placeHolder = currentWorkers.toString();
this.workerBox!.value = '';
this.saveArgs.workers = undefined;
}
private createCoresMemorySection(): azdata.DivContainer {
const titleFlex = { flex: `0 1 250px` };
const flexContainer = this.modelView.modelBuilder.flexContainer().withLayout({
flexWrap: 'wrap',
alignItems: 'center'
}).component();
const titleComponent = this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({
value: loc.configurationPerNode,
CSSStyles: { ...cssStyles.title, 'font-weight': 'bold', 'margin-block-start': '0px', 'margin-block-end': '0px' }
}).component();
const titleContainer = this.modelView.modelBuilder.flexContainer().withLayout({ alignItems: 'center' }).component();
titleContainer.addItem(titleComponent, { CSSStyles: { 'margin-right': '0px', 'margin-bottom': '15px' } });
const information = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
iconPath: IconPathHelper.information,
title: loc.postgresConfigurationInformation,
width: '12px',
height: '12px',
enabled: false
}).component();
titleContainer.addItem(information, { CSSStyles: { 'margin-left': '5px', 'margin-bottom': '15px' } });
flexContainer.addItem(titleContainer, titleFlex);
let configurationSection = this.modelView.modelBuilder.divContainer().component();
configurationSection.addItem(flexContainer);
return configurationSection;
}
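// Reset the vCore boxes from the scheduling defaults (requests/limits cpu) in the server group config and clear any pending edits.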
private editCores() {
let currentCPUSize = this._postgresModel.config?.spec.scheduling?.default?.resources?.requests?.cpu;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresRequestBox!.placeHolder = currentCPUSize;
this.coresRequestBox!.value = '';
this.saveArgs.coresRequest = undefined;
currentCPUSize = this._postgresModel.config?.spec.scheduling?.default?.resources?.limits?.cpu;
if (!currentCPUSize) {
currentCPUSize = '';
}
this.coresLimitBox!.placeHolder = currentCPUSize;
this.coresLimitBox!.value = '';
this.saveArgs.coresLimit = undefined;
}
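// Reset the memory boxes from the scheduling defaults (requests/limits memory), converted to gibibytes, and clear any pending edits.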
private editMemory() {
let currentMemSizeConversion: string;
let currentMemorySize = this._postgresModel.config?.spec.scheduling?.default?.resources?.requests?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryRequestBox!.placeHolder = currentMemSizeConversion!;
this.memoryRequestBox!.value = '';
this.saveArgs.memoryRequest = undefined;
currentMemorySize = this._postgresModel.config?.spec.scheduling?.default?.resources?.limits?.memory;
if (!currentMemorySize) {
currentMemSizeConversion = '';
} else {
currentMemSizeConversion = convertToGibibyteString(currentMemorySize);
}
this.memoryLimitBox!.placeHolder = currentMemSizeConversion!;
this.memoryLimitBox!.value = '';
this.saveArgs.memoryLimit = undefined;
}
private handleServiceUpdated() {
this.editWorkerNodeCount();
this.editCores();
this.editMemory();
}
}


@@ -1,31 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as azdata from 'azdata';
import * as loc from '../../../localizedConstants';
import { IconPathHelper } from '../../../constants';
import { DashboardPage } from '../../components/dashboardPage';
export class PostgresComputeStoragePage extends DashboardPage {
protected get title(): string {
return loc.computeAndStorage;
}
protected get id(): string {
return 'postgres-compute-storage';
}
protected get icon(): { dark: string; light: string; } {
return IconPathHelper.computeStorage;
}
protected get container(): azdata.Component {
return this.modelView.modelBuilder.text().withProperties<azdata.TextComponentProperties>({ value: loc.computeAndStorage }).component();
}
protected get toolbarContainer(): azdata.ToolbarContainer {
return this.modelView.modelBuilder.toolbarContainer().component();
}
}


@@ -3,7 +3,6 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as vscode from 'vscode';
import * as azdata from 'azdata';
import * as loc from '../../../localizedConstants';
import { IconPathHelper, cssStyles } from '../../../constants';
@@ -12,13 +11,12 @@ import { DashboardPage } from '../../components/dashboardPage';
import { PostgresModel } from '../../../models/postgresModel';
export class PostgresConnectionStringsPage extends DashboardPage {
private loading?: azdata.LoadingComponent;
private keyValueContainer?: KeyValueContainer;
constructor(protected modelView: azdata.ModelView, private _postgresModel: PostgresModel) {
super(modelView);
this.disposables.push(this._postgresModel.onServiceUpdated(
this.disposables.push(this._postgresModel.onConfigUpdated(
() => this.eventuallyRunOnInitialized(() => this.handleServiceUpdated())));
}
@@ -51,7 +49,7 @@ export class PostgresConnectionStringsPage extends DashboardPage {
const link = this.modelView.modelBuilder.hyperlink().withProperties<azdata.HyperlinkComponentProperties>({
label: loc.learnAboutPostgresClients,
url: 'https://docs.microsoft.com/azure/postgresql/concepts-connection-libraries',
url: 'https://docs.microsoft.com/azure/azure-arc/data/get-connection-endpoints-and-connection-strings-postgres-hyperscale',
}).component();
const infoAndLink = this.modelView.modelBuilder.flexContainer().withLayout({ flexWrap: 'wrap' }).component();
@@ -61,44 +59,20 @@ export class PostgresConnectionStringsPage extends DashboardPage {
this.keyValueContainer = new KeyValueContainer(this.modelView.modelBuilder, this.getConnectionStrings());
this.disposables.push(this.keyValueContainer);
this.loading = this.modelView.modelBuilder.loadingComponent()
.withItem(this.keyValueContainer.container)
.withProperties<azdata.LoadingComponentProperties>({
loading: !this._postgresModel.serviceLastUpdated
}).component();
content.addItem(this.loading);
content.addItem(this.keyValueContainer.container);
this.initialized = true;
return root;
}
protected get toolbarContainer(): azdata.ToolbarContainer {
const refreshButton = this.modelView.modelBuilder.button().withProperties<azdata.ButtonProperties>({
label: loc.refresh,
iconPath: IconPathHelper.refresh
}).component();
this.disposables.push(
refreshButton.onDidClick(async () => {
refreshButton.enabled = false;
try {
this.loading!.loading = true;
await this._postgresModel.refresh();
} catch (error) {
vscode.window.showErrorMessage(loc.refreshFailed(error));
} finally {
refreshButton.enabled = true;
}
}));
return this.modelView.modelBuilder.toolbarContainer().withToolbarItems([
{ component: refreshButton }
]).component();
return this.modelView.modelBuilder.toolbarContainer().component();
}
private getConnectionStrings(): KeyValue[] {
const endpoint: { ip?: string, port?: number } = this._postgresModel.endpoint;
const endpoint = this._postgresModel.endpoint;
if (!endpoint) {
return [];
}
return [
new InputKeyValue(this.modelView.modelBuilder, 'ADO.NET', `Server=${endpoint.ip};Database=postgres;Port=${endpoint.port};User Id=postgres;Password={your_password_here};Ssl Mode=Require;`),
@@ -108,13 +82,11 @@ export class PostgresConnectionStringsPage extends DashboardPage {
new InputKeyValue(this.modelView.modelBuilder, 'PHP', `host=${endpoint.ip} port=${endpoint.port} dbname=postgres user=postgres password={your_password_here} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'psql', `psql "host=${endpoint.ip} port=${endpoint.port} dbname=postgres user=postgres password={your_password_here} sslmode=require"`),
new InputKeyValue(this.modelView.modelBuilder, 'Python', `dbname='postgres' user='postgres' host='${endpoint.ip}' password='{your_password_here}' port='${endpoint.port}' sslmode='true'`),
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${endpoint.ip}; dbname=postgres user=postgres password={your_password_here} port=${endpoint.port} sslmode=require`),
new InputKeyValue(this.modelView.modelBuilder, 'Web App', `Database=postgres; Data Source=${endpoint.ip}; User Id=postgres; Password={your_password_here}`)
new InputKeyValue(this.modelView.modelBuilder, 'Ruby', `host=${endpoint.ip}; dbname=postgres user=postgres password={your_password_here} port=${endpoint.port} sslmode=require`)
];
}
private handleServiceUpdated() {
this.keyValueContainer?.refresh(this.getConnectionStrings());
this.loading!.loading = false;
}
}


@@ -10,15 +10,14 @@ import { ControllerModel } from '../../../models/controllerModel';
import { PostgresModel } from '../../../models/postgresModel';
import { PostgresOverviewPage } from './postgresOverviewPage';
import { PostgresConnectionStringsPage } from './postgresConnectionStringsPage';
import { PostgresPropertiesPage } from './postgresPropertiesPage';
import { Dashboard } from '../../components/dashboard';
import { PostgresDiagnoseAndSolveProblemsPage } from './postgresDiagnoseAndSolveProblemsPage';
import { PostgresSupportRequestPage } from './postgresSupportRequestPage';
import { PostgresResourceHealthPage } from './postgresResourceHealthPage';
import { PostgresComputeAndStoragePage } from './postgresComputeAndStoragePage';
export class PostgresDashboard extends Dashboard {
constructor(private _context: vscode.ExtensionContext, private _controllerModel: ControllerModel, private _postgresModel: PostgresModel) {
super(loc.postgresDashboard(_postgresModel.name));
super(loc.postgresDashboard(_postgresModel.info.name), 'ArcPgDashboard');
}
public async showDashboard(): Promise<void> {
@@ -32,10 +31,11 @@ export class PostgresDashboard extends Dashboard {
protected async registerTabs(modelView: azdata.ModelView): Promise<(azdata.DashboardTab | azdata.DashboardTabGroup)[]> {
const overviewPage = new PostgresOverviewPage(modelView, this._controllerModel, this._postgresModel);
const connectionStringsPage = new PostgresConnectionStringsPage(modelView, this._postgresModel);
const propertiesPage = new PostgresPropertiesPage(modelView, this._controllerModel, this._postgresModel);
const resourceHealthPage = new PostgresResourceHealthPage(modelView, this._postgresModel);
const computeAndStoragePage = new PostgresComputeAndStoragePage(modelView, this._postgresModel);
// TODO: Removed properties page while investigating bug where refreshed values don't appear in UI
// const propertiesPage = new PostgresPropertiesPage(modelView, this._controllerModel, this._postgresModel);
const diagnoseAndSolveProblemsPage = new PostgresDiagnoseAndSolveProblemsPage(modelView, this._context, this._postgresModel);
const supportRequestPage = new PostgresSupportRequestPage(modelView);
const supportRequestPage = new PostgresSupportRequestPage(modelView, this._controllerModel, this._postgresModel);
return [
overviewPage.tab,
@@ -43,13 +43,12 @@ export class PostgresDashboard extends Dashboard {
title: loc.settings,
tabs: [
connectionStringsPage.tab,
propertiesPage.tab
computeAndStoragePage.tab
]
},
{
title: loc.supportAndTroubleshooting,
tabs: [
resourceHealthPage.tab,
diagnoseAndSolveProblemsPage.tab,
supportRequestPage.tab
]


@@ -50,8 +50,9 @@ export class PostgresDiagnoseAndSolveProblemsPage extends DashboardPage {
this.disposables.push(
troubleshootButton.onDidClick(() => {
process.env['POSTGRES_SERVER_NAMESPACE'] = this._postgresModel.namespace;
process.env['POSTGRES_SERVER_NAME'] = this._postgresModel.name;
process.env['POSTGRES_SERVER_NAMESPACE'] = this._postgresModel.config?.metadata.namespace;
process.env['POSTGRES_SERVER_NAME'] = this._postgresModel.info.name;
process.env['POSTGRES_SERVER_VERSION'] = this._postgresModel.engineVersion;
vscode.commands.executeCommand('bookTreeView.openBook', this._context.asAbsolutePath('notebooks/arcDataServices'), true, 'postgres/tsg100-troubleshoot-postgres');
}));

Some files were not shown because too many files have changed in this diff.