Compare commits

...

958 Commits
1.3.9 ... 1.9.0

Author SHA1 Message Date
Cory Rivera
78a42e1d11 Check if python executable exists before querying user package directory. (#6345) 2019-07-09 18:43:32 -07:00
Cory Rivera
d2e758c0d7 Fix python install issues caused by other preexisting Python versions. (#6294)
* Remove --user option when doing pip installs for our standalone Python version.

* Use force-reinstall option when installing sparkmagic since we use a custom version.

* Use force-reinstall when installing pip packages from Manage Packages dialog so that dependencies don't get split across multiple locations.

* Update PATH after install to include additional package directories.
2019-07-09 17:13:47 -07:00
Chris LaFreniere
6f5ad3a8a3 Use in proc markdown by default (#6315) 2019-07-09 15:27:00 -07:00
Charles Gagnon
3e446980df Update CMS version for July release (#6312) 2019-07-09 14:16:46 -07:00
Alan Ren
053636af9c Update product.json (#6307) 2019-07-09 11:02:51 -07:00
Cory Rivera
e3b166846d Manually define JupyterServerInstallation execOptions field in notebook unit tests. (#6292) 2019-07-09 10:44:31 -07:00
Charles Gagnon
fcba0d1558 Bump Agent and Import package versions for July release (#6275) 2019-07-09 10:42:48 -07:00
Karl Burtram
35e3a42017 Fix inconsistencies in langpack readme files (#6285) 2019-07-09 10:41:18 -07:00
Karl Burtram
401d4b2211 Update localization resource files (#6283)
* Update localization resource files

* Remove extra space from readme headers
2019-07-09 10:40:28 -07:00
Zbyněk Sailer
be3e7e3dc1 LOC CHECKIN | Microsoft/azuredatastudio master | 20190708 (#6274) 2019-07-09 10:39:51 -07:00
Kevin Cunnane
1e12e61243 Fix #6221 notebook shortcut and use new grid in stable (#6268)
- Fix #6221 Notebooks: Keyboard Shortcut for New Notebook has Changed.
  - Use Win+Alt+N instead of Win+Shift+N
- New Grid is now "stable" (forgot to do this in last PR)
2019-07-09 10:35:42 -07:00
Kevin Cunnane
f3b12dd5ac Fix #6287 Notebook editor deserialize on reload/save is broken (#6288)
- Notebook editors have their own mode
2019-07-09 10:35:15 -07:00
Charles Gagnon
73bb5501bd Fix Agent tabs not switching
(cherry picked from commit 6f15ebcf6fe1f4976e82e3ee71cfba0d35fa2b7c)
2019-07-09 10:30:49 -07:00
Charles Gagnon
47e3761159 Fix connectionId remapping (#6269) 2019-07-05 20:42:03 +00:00
Gene Lee
cf4dd48784 Added Big Data Cluster Viewlet to ADS (#6204) 2019-07-03 21:52:19 -06:00
Aditya Bist
1404133283 Data explorer/context (#6242)
* fixed context for data explorer

* added more files

* initial servers and database context menu actions finished

* added all actions for servers and databases with correct conditions

* added nodetype and nodelabel for subtype actions

* added nodeinfo logic to oe shim

* fixed context for cms

* added all scripting actions to data explorer

* review comments

* fix import

* fix correct context key

* removed unused import

* fix typo
2019-07-03 17:33:34 -07:00
Aditya Bist
ecfcb92a89 Data explorer/context menu initial (#6264)
* fixed context for data explorer

* added more files

* initial servers and database context menu actions finished

* added all actions for servers and databases with correct conditions

* added nodetype and nodelabel for subtype actions

* added nodeinfo logic to oe shim

* fixed context for cms

* added all scripting actions to data explorer

* review comments

* fix import

* fix correct context key

* removed unused import

* separate PR for commands and menus added

* rename mssql context menu nodes

* remove command id constants
2019-07-03 16:03:09 -07:00
Kevin Cunnane
10b066d300 Share Notebook grid rendering with Query editor (#6241)
This is a staged refactor to use the exact same grid logic in the Notebook and query editors, including context menu support, font settings, and sizing logic. The goal long term is:
- As the core Query grid is updated, Notebook can benefit from the changes
- As we add in support for contributions like new buttons & actions working on the grid, we can share the logic
- Ideally, if and when we refactor things like the action bar for grid results, we can apply the changes in both places, though this is TBD.

Fixes a number of issues:
- Fixes #5755 Grids don't respond to font settings. @anthonydresser can we remove setting from each query results editor and just use Notebook Styles since these are global (not scoped) settings?
- Fixes #5501 Copy from grid settings. 
- Fixes #4938 SQL Notebook result sets are missing the actions provided for SQL File result sets. This now has the core ability to solve this, and separate work items for specific asks (serialization, charting) are tracked.

Currently hidden:
- Save as... options in context menu
- All right toolbar actions (save as, chart).

Remaining issues to address in future commits:
- Need to implement support for serialization (#5137). 
- Need to add charting support
- Need to solve the layout of buttons on the right hand side when a small number of columns are output. It doesn't look right that buttons are so far away from the results
  - Will work with UX on this. For now, mitigating this by hiding all buttons, but will need to solve in the future
- Would like to make buttons contributable via extension, but need to refactor similar to ObjectExplorer context menu so that we can serialize context menu options across to extension host while still having internal actions with full support
2019-07-03 14:34:03 -07:00
Cory Rivera
8a8cb3ab27 Clear fields on Add New Package page after getting No Valid Versions error. (#6261) 2019-07-03 14:29:26 -07:00
Anthony Dresser
f19f21d547 use file service for insights (#6248) 2019-07-03 13:45:19 -07:00
Anthony Dresser
92fbfcdac9 move edit data and query plan to their own files (#6256) 2019-07-03 13:45:04 -07:00
Cory Rivera
4189e761ff Fix bugs in selecting a system version of Python for Notebook dependencies (#6250) 2019-07-03 12:51:48 -07:00
Udeesha Gautam
a8b3f056a0 Add only non-null changes to difference dictionary to ensure index doesn't mismatch (#6236) 2019-07-03 10:08:14 -07:00
Kevin Cunnane
cc6dea0631 Add Plotly output support to notebooks
With this change, Plotly types will be successfully rendered in a Notebook.
Currently they have a default width of 700px with a scrollbar if the window size is smaller (this matches other notebook viewers).
The Plotly library is dynamically required to avoid startup time perf hits. This is something we could look at for other components too.
2019-07-02 18:08:38 -07:00
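
The commit above defers loading the Plotly library until a Plotly output actually needs it. A minimal sketch of that lazy-loading pattern, assuming a hypothetical `renderPlotlyOutput` helper and the `plotly.js-dist` package (both are assumptions, not the actual implementation):

```typescript
// Sketch only: load the heavy Plotly module on first use and cache it,
// so startup never pays its require cost.
let plotly: any;

async function renderPlotlyOutput(el: HTMLElement, data: object[], layout: object): Promise<void> {
	if (!plotly) {
		const mod = await import('plotly.js-dist'); // assumed package name
		plotly = mod.default ?? mod;                // module shape depends on packaging
	}
	await plotly.newPlot(el, data, layout);         // Plotly.newPlot renders into the element
}
```
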
Anthony Dresser
8c4f6f9e5f remove results serializer dependency on node (#6202) 2019-07-02 18:00:14 -07:00
Chris LaFreniere
708461eab5 Remove 'isMenu=true' from notebook toggle more (#6253) 2019-07-02 17:22:17 -07:00
Charles Gagnon
8ec1a05296 Change notebook to not save connections added through dropdown (#6254) 2019-07-03 00:11:02 +00:00
Chris LaFreniere
c4bf1b4180 Notebooks: Run all after/before (#6239)
* Run all above/below

* PR comments pre tests

* Added integration test
2019-07-02 16:49:12 -07:00
Anthony Dresser
495c9330f6 add event to capture state and reapply when necessary (#6246) 2019-07-02 15:52:42 -07:00
Charles Gagnon
49619e5b39 Clear Azure token if connection doesn't need it (#6244)
* Clear Azure token if connection doesn't need it

* Update function name
2019-07-02 21:01:53 +00:00
Karl Burtram
7b88800c62 Add a New File menu item for plain text files (#6240)
* Add a New File menu item for plain text files

* Correct handling of saved files

* Fix command palette text to avoid duplicate entry
2019-07-02 13:29:47 -07:00
Lucy Zhang
ecef90dc8b Book/externallink (#6215)
* show markdown preview

* open external link

* addressed Charles' comments
2019-07-02 09:30:48 -07:00
Chris LaFreniere
e8d4fba3c0 Support non-default font sizes in notebooks (#6222)
* Support non-default font sizes notebooks

* pr comments
2019-07-01 16:10:20 -07:00
Karl Burtram
384d87f84d Return an empty array from breakpoints API (#6235) 2019-07-01 15:22:58 -07:00
Charles Gagnon
8b349dbcde Change service installation messages to not steal focus (#6227)
* Change service installation messages to not steal focus

* Undo unnecessary changes to localized strings in serviceClient.ts

* Add default case for missing server event types
2019-07-01 22:11:14 +00:00
Kim Santiago
7f5e00fd81 Fix #6217 DacFx: Connection Dialog has no ConnectionProfile loaded (#6232)
* fix connection dialog not opening with connection profile

* bump extension version
2019-07-01 14:42:32 -07:00
Chris LaFreniere
e5858dee52 Make sure we sanitize the same way (#6233) 2019-07-01 14:05:50 -07:00
Chris LaFreniere
bae573453a Accessibility: Screen Reader Thinks SelectBox is a Button on Mac (#6216)
* Stop reading dropdowns as buttons for mac

* Remove role of combobox for sql selectbox
2019-07-01 13:46:06 -07:00
khoiph1
3e68c3ee0c Loc Update (#6223) 2019-07-01 12:45:29 -07:00
Maddy
e44e0a7c89 added scrollable directive to the dashboard grid container (#6206) 2019-07-01 12:20:17 -07:00
Aditya Bist
678b2737bd CMS - SQL Login (#5989)
* initial SQL Login with save password working

* fix switching auth types

* remove metadata from package file

* allow editing connections for unsaved password connections

* review comments

* change thenables to async/awaits

* review comments

* changed thenables to promises

* remove authTypeChanged bool

* removed unused import

* review comments

* removed try catches

* cr comments

* review comments
2019-07-01 11:40:11 -07:00
Kim Santiago
6b5193908c Remove failing assert from Schema Compare tests (#6229)
* remove failing assert

* add TODO comment
2019-07-01 10:45:41 -07:00
Kevin Cunnane
87f1f11509 Fix tags issue where metadata was not preserved (#6219) 2019-07-01 10:10:42 -07:00
Charles Gagnon
0503c8d8fe Initial work to update telemetry to use Common Schema (#6203)
* Update admin-tool-ext-win to use new ads-extension-telemetry package

* Add AdsTelemetryService for sending telemetry events using the ADS Common Schema

* Clean up unused import and add engineType

* Address PR comments

* Update private var names
2019-06-30 19:38:04 +00:00
Karl Burtram
bc7ac519d0 Update extension resources and ENU XLF files (#6220)
* Add extension resources

* Update enu XLF resources
2019-06-28 19:09:23 -07:00
Udeesha Gautam
00c3758d86 Fixing backup restore launch bug #5797 (and a test) (#6218)
* Fix the launch of backup dialog in server context scenario

* Adding wait to ensure sc tasks complete before test exits
2019-06-28 17:54:04 -07:00
Kim Santiago
e5256b0a61 allow table width to be specified (#6196) 2019-06-28 15:55:58 -07:00
Lucy Zhang
eb3c6cadd2 show markdown preview (#6198) 2019-06-28 15:02:17 -07:00
Charles Gagnon
d701a20cd5 Bump SqlToolsService to 1.5.0-alpha.105 (#6209) 2019-06-28 18:16:49 +00:00
Karl Burtram
095f35d07e Prevent out of bounds splitview error (#6210) 2019-06-28 10:26:07 -07:00
Kim Santiago
53cd22f142 Add more validation for DacFx tests (#6120)
* add checking for tables and data

* addressing comments
2019-06-28 10:01:19 -07:00
Chris LaFreniere
8cf4120c27 Notebooks: Support for In-Proc Markdown Renderer (#6164)
* NB improve startup using built-in markdown render

This is a sample branch showing perf improvements if we load content using built-in markdown rendering
- Has issues where images aren't correctly rendered due to sanitization, need to copy renderer code and update settings
- Moves content load up before anything to do with providers since we can render without knowing about these things

# Conflicts:
#	src/sql/workbench/parts/notebook/cellViews/textCell.component.ts

* Re-enable logging of each cell's rendering time

* Fix test issue

* Kernel loading working with new markdown renderer

# Conflicts:
#	src/sql/workbench/parts/notebook/cellViews/textCell.component.ts

* Fixed tests, cleaned up code

* markdownOutput component integration

* PR Comments

* PR feedback 2

* PR feedback again
2019-06-27 20:55:50 -07:00
Kim Santiago
b34e3cbe90 fix compare after opening scmp with dacpacs failing (#6201) 2019-06-27 18:26:02 -07:00
Karl Burtram
4ef25ecf37 Properly save and restore dynamic tab state (#6185)
* WIP

* WIP 2

* WIP 3

* Rework state capture implementation

* Break loop after setting
2019-06-27 16:14:28 -07:00
Udeesha Gautam
f5d647f05c Bug/toolbar icon revert (#6194)
* Change icon size rather than component size

* reverting the icon height impact
2019-06-27 13:53:05 -07:00
Anthony Dresser
7b6181de2a XML Formatter (#6182)
* add xml formatter extenstion

* remove unused imports
2019-06-27 12:20:19 -07:00
Alan Ren
20bbaa3fe6 Update package.json (#6187) 2019-06-27 10:58:12 -07:00
Chris LaFreniere
a2c9a0a1ae Add aria label to kernel and attach to dropdowns (#6181) 2019-06-27 10:19:22 -07:00
Lucy Zhang
98c6af628b New feature: Jupyter Books (#6095)
* Initial commit

* Fixed broken branch

* Show notebook titles in tree view

* Added  README

* sections showing in tree view

* Multiple books in treeview

* removed book extension, added to notebook

* removed book from extensions.ts

* addressed Chris' comments

* Addressed Charles' comments

* fixed spelling in readme

* added comment about same filenames

* adding vsix

* addressed Karl's comments
2019-06-27 10:10:30 -07:00
Alan Ren
f39647f243 add save/load filter feature to profiler (#6170)
* save/load profiler filter

* add role for custom buttons
2019-06-26 23:55:03 -07:00
Alan Ren
c2cec5d93f enable lang pack filter in extension manager (#6148)
* enable lang pack filter in extension manager

* fix null check issue
2019-06-26 23:17:35 -07:00
Charles Gagnon
5a11cf1a6f Filtering out some more high-frequency events (#6178) 2019-06-26 23:03:22 +00:00
Kim Santiago
b0b1b59147 Fix #6175: Schema compare doesn't always use scmp options (#6176)
* remove resetting options to dialog options before comparing

* bump azdata dependency version
2019-06-26 15:35:52 -07:00
Chris LaFreniere
77fb060fde Notebooks: Add Command + Keyboard Shortcut to Clear Outputs of Active Cell (#6169)
* Add command to clear cell output with test

* Fix typo

* PR Comments
2019-06-26 15:19:12 -07:00
Udeesha Gautam
caba5c9d26 Databases order in schema compare dialog to be same as OE (#6162)
* Database order to be same as OE

* changes to reset for source target buttons
2019-06-26 12:16:51 -07:00
Kevin Cunnane
97d36e2281 Support output renderers via Angular contributions (#6146)
- Added a registry for output components
- Refactored existing renderers to plug in via Angular
- Added Markdown renderer using new Angular contribution point
- Added support to notebook module to dynamically load new components
2019-06-26 11:32:24 -07:00
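
The commit above describes a registry that output renderers plug into, so new renderers (like the Markdown one) can be contributed without touching the notebook component itself. A generic sketch of such a registry keyed by output MIME type; the names `OutputRendererRegistry` and `registerRenderer` are illustrative, not the actual contribution point:

```typescript
// Sketch: map output MIME types to renderer factories. Illustrative names only.
type RendererFactory = (container: HTMLElement, data: unknown) => void;

class OutputRendererRegistry {
	private readonly renderers = new Map<string, RendererFactory>();

	registerRenderer(mimeType: string, factory: RendererFactory): void {
		this.renderers.set(mimeType, factory);
	}

	getRenderer(mimeType: string): RendererFactory | undefined {
		return this.renderers.get(mimeType);
	}
}

// Example: contribute a trivial text/plain renderer.
const registry = new OutputRendererRegistry();
registry.registerRenderer('text/plain', (container, data) => {
	container.textContent = String(data);
});
```
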
Maddy
32235b0cb6 Cluster management/newdashboard task (#6060)
* initial commit: added cluster status notebook and dashboard task

* following the previous naming conventions

* endpoint widget changes to accommodate naming changes

* management-proxy/mgmtproxy changes

* updates to address the comments and added the new copy image with hover text.

* added user select for making the table selectable

* localize changes

* added the final documented notebook

* reset execution_count to 0 for all cells

* style changes

* updated the url to point to private repo
2019-06-25 22:46:11 -07:00
Alan Yu
6142109bf5 Update Readme for anchor tags to Insiders Build for easy sharing (#6166) 2019-06-25 21:14:13 -07:00
Kim Santiago
144a7f941b Schema Compare open scmp file (#6118)
* can compare scmp with databases

* show error if can't connect to db

* excludes now work

* fixes after rebase and other small fixes

* Addressing comments

* fixes after rebasing

* fix excludes not always being remembered correctly

* fix switched check

* addressing comments
2019-06-25 17:30:07 -07:00
Kim Santiago
f01c318c30 Schema Compare save scmp file (#6150)
* initial changes

* send source and target excludes

* disable save scmp button until there is source and target

* addressing comments
2019-06-25 15:07:58 -07:00
Alan Ren
ac76302d6c Filter templates by supported engine type (#6133)
* Filter templates by supported engine type

* fix the azure sql db name
2019-06-24 23:37:56 -07:00
Alan Ren
a906a9c862 handle copy in all profiler tables (#6134)
* handle copy in all profiler tables

* use camel casing
2019-06-24 17:01:19 -07:00
Alan Ren
9a3daabeb4 add db name to xevent session and view (#6135) 2019-06-24 16:16:38 -07:00
Karl Burtram
9687159484 Add metadata tags to package.json (#6129) 2019-06-24 14:50:39 -07:00
Kim Santiago
6a0ffdfa60 Update dacpac and schema compare extensions to use getConnections() (#6131)
* update dacpac and schema compare extensions to use getConnections

* use more const

* make MSSQL a const

* changing name of mssql const

* add comment for name of parameter
2019-06-24 14:16:07 -07:00
Karl Burtram
e3f26e8f12 Update resource strings for 1.9.0 langpacks (#6144)
* Refresh loc resources

* Update loc strings
2019-06-24 13:34:30 -07:00
Kim Santiago
cf85bb14f5 Fix #5809: Data-tier wizard "Source Server" shouldn't show database name (#6125)
* Removing database name from server connection and adding required asterisk to database dropdowns

* also remove database name in flat file import wizard
2019-06-24 11:44:26 -07:00
Chris LaFreniere
4fe81d8449 Only set notebookEditor <a> color in scrollable portion (#6140) 2019-06-24 10:46:38 -07:00
Kim Santiago
46b8d55280 fix filepath getting regenerated every time page is entered (#6132) 2019-06-24 09:52:05 -07:00
Alan Ren
08cf731c87 expand the detail section by selecting the headers (#6130)
* expand the detail section by selecting the headers

* use methods

* address comments
2019-06-22 00:29:02 -07:00
Udeesha Gautam
4c1af148c7 Feature/schemacompare cancel (#6104)
* Schema compare cancel changes for ADS

* adding a missed change

* Merge from Master

* Updating SqltoolsService version

* trying stress with bigger runtime

* trying one more stress fix with unique operation ids

* refactoring test a bit to ensure stress run works
2019-06-21 13:55:01 -07:00
Kevin Cunnane
83410565da Increase timeout to fix notebook integration test (#6117) 2019-06-21 10:08:10 -07:00
Alan Ren
de81c37611 new connect command (#6115)
* new connect command

* address comments
2019-06-20 23:08:12 -07:00
Karl Burtram
176719000d Bump eslint dependency (#6122) 2019-06-20 17:36:26 -07:00
Karl Burtram
1411ad4503 Fix modelview webview to work in query tab (#6119)
* WIP

* Rebuild webview when switching tabs

* Remove unneeded code

* Make ready promise private

* Undo edit in sendMessage

* Add null check prior to using ready promise

* Remove extra whitespace

* Rename parameter and fix strict null check errors
2019-06-20 16:28:32 -07:00
Anthony Dresser
77b351adf3 Add query editor view state (#6018)
* add query editor view state

* change comment

* change state key name

* wip

* fix tests
2019-06-20 16:10:00 -07:00
Kevin Cunnane
b37b14eabd Improve notebook link handling (#6087)
* Improve notebook link handling
- Single click now works for links inside Output areas
- Command links in untrusted notebooks have link color
- Refactored to use directive so code is in 1 place and can be easily
  added elsewhere if needed

* Removed unneeded service from constructor
2019-06-20 11:40:12 -07:00
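
The commit above consolidates link handling into a directive so the logic lives in one place and can be reused elsewhere. A minimal Angular-style sketch of that idea; the selector name and open logic are illustrative only:

```typescript
import { Directive, HostListener } from '@angular/core';

// Sketch: intercept single clicks on anchors inside notebook output and route
// them through one shared handler. Selector name is illustrative.
@Directive({ selector: '[notebookLinkHandler]' })
export class NotebookLinkHandlerDirective {
	@HostListener('click', ['$event'])
	onClick(event: MouseEvent): void {
		const anchor = (event.target as HTMLElement).closest('a');
		if (anchor && anchor.href) {
			event.preventDefault();
			this.openLink(anchor.href); // single shared open routine
		}
	}

	private openLink(href: string): void {
		console.log(`open link: ${href}`); // placeholder for the real open logic
	}
}
```
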
Kevin Cunnane
578ac6cae5 Add notebook open protocol handler (#6093)
Adds a protocol handler for notebook open, which can be used from browsers
Uses extension-based handler support so all URIs must start with `azuredatastudio://microsoft.notebook`

Adds 2 actions:
- `/new` opens a new empty notebook
- `/open` opens an HTTP/S file as an untitled notebook or text document

Sample URL:
```
azuredatastudio://microsoft.notebook/open?url=https%3A%2F%2Fraw.githubusercontent.com%2Fkevcunnane%2Fmsbuild_ads_demo%2Fmaster%2F0_YoAzdata.ipynb
```
2019-06-20 11:00:24 -07:00
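
Given the handler description and sample URL above, a caller only needs to URL-encode the raw file link into the `url` query parameter. A small sketch; `buildNotebookOpenUri` is a hypothetical helper, not part of the product:

```typescript
// Sketch: build an azuredatastudio://microsoft.notebook/open?url=... link
// for a raw .ipynb (or text) file hosted over HTTP/S.
function buildNotebookOpenUri(fileUrl: string): string {
	return `azuredatastudio://microsoft.notebook/open?url=${encodeURIComponent(fileUrl)}`;
}

// Reproduces the sample URL from the commit message above.
const uri = buildNotebookOpenUri(
	'https://raw.githubusercontent.com/kevcunnane/msbuild_ads_demo/master/0_YoAzdata.ipynb'
);
console.log(uri);
```
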
Cory Rivera
72c3239d63 Allow user to select source package type in Manage Packages dialog. (#6092) 2019-06-20 10:51:36 -07:00
David Shiflet
99614ecc8f add --help documentation (#6112)
* add --help documentation

* hyphenate parameters
2019-06-20 13:05:38 -04:00
Aditya Bist
1ececc3035 Agent - Last step quit with success (#6034)
* turn strings to enums and allow changing step completion action

* fixed edit

* added tests for agent enums and fixed cms tests
2019-06-20 00:30:05 -07:00
Anup N. Kamath
433e5633cf Check provider type before throwing error message on cloud servers (#5948)
* check provider type in backup action

* check provider name in case of restore as well

* removed hardcoding of constant
2019-06-19 22:53:54 -07:00
Alan Ren
b9a0c9ce7e getConnections API (#5651)
* getConnections

* update

* fix the condition check

* pr feedback

* fix test cases

* add test for the new method

* address comments
2019-06-19 22:51:53 -07:00
Kevin Cunnane
47cf496c36 Add Notebook Save integration tests (#6103)
* Add Notebook Save integration tests
Wrote test to verify save behavior
Fixed issues found during testing,
specifically around how we notify dirty state change to extensions

* Improved error messages
2019-06-19 16:09:24 -07:00
Kim Santiago
32313c71e4 Add ability to change source and target to Schema Compare (#6026)
* add ability to change source and target

* addressing comments

* fixes after rebasing

* add check for user

* bump extension version
2019-06-19 15:42:46 -07:00
Chris LaFreniere
453caa92d4 Fix for standard in hovering in code cell (#6107) 2019-06-19 15:02:03 -07:00
Charles Gagnon
7a689b93db Filter out high frequency command events (#6102) 2019-06-19 19:45:11 +00:00
Aditya Bist
639efbcfad Agent: Added loading component to schedule list (#5992)
* added loading component to schedule list

* changed thenable to await

* throw error
2019-06-19 12:32:14 -07:00
Kim Santiago
1c706fdfca align differences table with source label (#6094) 2019-06-19 09:56:48 -07:00
Charles Gagnon
fab8de632d Add EXTERNAL and FIRST_ROW to the keyword list for colorizing (#6097) 2019-06-19 14:44:29 +00:00
Kevin Cunnane
27cbd53253 Fix notebook dirty state after save (#6096)
Was getting a content changed loop in the events
Fix is to ignore save events from the input since it sends them
2019-06-18 17:38:20 -07:00
Kim Santiago
d67fd038dc DacFx integration tests (#6049)
* tests working

* add bacpac

* formatting

* addressing comments

* ignore bacpacs for hygiene check

* add check for error message when checking for db creation

* adding comments
2019-06-18 17:21:52 -07:00
Kevin Cunnane
36fe725cf0 Fix RunAllCells throwing unhandled exception (#6089)
- Added await here.
- There's a 2nd entry point already doing it the right way, in the Notebook extension.
  No need to change that
2019-06-18 16:19:32 -07:00
Charles Gagnon
373c3488bb Fix Add Azure Account dialog constantly reappearing (#6048) 2019-06-18 21:48:11 +00:00
Alan Ren
58e5e095e5 add PGSQL to integration test (#6040)
* Verify providers in integration test

* include pgsql
2019-06-18 14:35:08 -07:00
Charles Gagnon
9e7282d16a Connection dialog cleanup (#6076)
* Update names to be clearer and remove some unnecessary code

* Remove unused/inaccurate CMS display name value
2019-06-18 21:33:21 +00:00
Karl Burtram
561b7575ba Fix arith abort default value (#6080) 2019-06-18 14:11:15 -07:00
Cory Rivera
cecc899949 Disable Manage Packages button if python is not installed (#6008) 2019-06-17 18:28:16 -07:00
Charles Gagnon
59b0e6737f Update provider correctly when showing the dialog (#5995) 2019-06-17 15:32:36 +00:00
Aditya Bist
449cd9ea27 added run job context menu to jobs view page (#6011) 2019-06-14 16:20:49 -07:00
Aditya Bist
256ef072df fix header options (#6036) 2019-06-14 15:45:55 -07:00
Karl Burtram
26d8b32717 Fix lang pack names and versions (#6038) 2019-06-14 14:22:47 -07:00
Anthony Dresser
7a31d66d2c ensure we set group for both editors (#6016) 2019-06-14 14:00:28 -07:00
Karl Burtram
2ed9a93bae Add initial lang pack resources (#6035)
* Initial vs code lang packs

* Update resource to merge in ADS-specific strings
2019-06-14 13:38:04 -07:00
udeeshagautam
f494c7af4e Schema compare Icon and other fixes (#6009)
* Changes to 1. Enable Icon for Schema Compare model view editor 2. Set context setting in editable drop down 3. Fix a console error

* new icons

* Changes as per PR comments

* Adding PR comments

* Fixing a spelling mistake
2019-06-14 13:28:46 -07:00
Charles Gagnon
363af2a85c Add telemetry for all commands (#6025) 2019-06-14 16:16:21 +00:00
Kevin Cunnane
6ff34d9894 Fix #6022 connection dialog broken (#6023)
Opening connection dialog was broken if you have a postgres connection
listed in the Connections view and the extension isn't installed.
The capabilities service breaks as it returns an undefined object.
Added handling, so now it does show as "loading..." forever
but doesn't crash
2019-06-13 16:50:00 -07:00
Aditya Bist
a566fa9728 Agent - fix edit step (#6010)
* made all jobs get focus and keyboard friendly

* added tooltip for steps image

* made operator dialog accessible

* change job name as soon as job name changes in text box
2019-06-13 14:08:58 -07:00
Kim Santiago
248f2f5071 Formatting diff editor title (#6013)
* add background color and more formatting to diff editor title

* also handle hc-black
2019-06-13 13:07:52 -07:00
Gene Lee
a79f1ac830 Added 'New Notebook' task on database dashboard (#5996) 2019-06-12 18:33:52 -07:00
Aditya Bist
073a372d4d Agent: Accessibility bugs 2 (#5994)
* made all jobs get focus and keyboard friendly

* added tooltip for steps image

* made operator dialog accessible

* localized tooltip
2019-06-12 14:40:57 -07:00
udeeshagautam
6c69eaef4c Make dropdown editable but not allow okay till a valid value is selected (#5991)
Make dropdown editable but not allow OK till a valid value is selected
2019-06-12 10:18:05 -07:00
Alan Ren
b0fdaedfdb fix the sqldb OE test (#6006) 2019-06-12 09:04:13 -07:00
Alan Ren
d089d6642a fix the typo (#5997) 2019-06-11 23:17:07 -07:00
Charles Gagnon
5aa730b5d4 Add logging to try and find a possible connection race condition (#5813)
* Add logging to try and find a possible connection race condition

* Fix tests

* More test fixes

* Update error message and use error level
2019-06-12 00:08:51 +00:00
Karl Burtram
0832dd2a45 Add SCHEMA to colorization list (#5993) 2019-06-11 16:57:20 -07:00
Aditya Bist
a03507c998 added option to force reload extensions (#5990) 2019-06-11 16:12:41 -07:00
Kevin Cunnane
f1e38b655e Fix failing integration test due to notebook context menu (#5986)
- Updated test baselines
- Removed duplicate 'Standard SQL DB context menu test'
 -  it's identical to Azure test
 - Standalone database context menu test covers non-Azure
2019-06-11 16:09:12 -07:00
Charles Gagnon
33a9f2e3e4 Plumb editor state change through to state object (#5931) 2019-06-11 23:05:40 +00:00
Charles Gagnon
eaa5f504e3 Remove unneeded dev dependency clean-css (#5937) 2019-06-11 23:01:01 +00:00
Karl Burtram
5a7562a37b Revert "Merge from vscode 81d7885dc2e9dc617e1522697a2966bc4025a45d (#5949)" (#5983)
This reverts commit d15a3fcc98.
2019-06-11 12:35:58 -07:00
Kim Santiago
95a50b7892 Remove DacFx deploy action page (#5942)
* remove deploy action page since generated script is opened instead of saved now

* bump sqltoolsservice version to 1.5.0-alpha.100
2019-06-11 11:16:48 -07:00
Chris LaFreniere
86a3217e98 Notebooks: Log telemetry when all markdown cells rendered (#5862)
* Log telemetry when all markdown cells rendered

* Enable referencing previous telemetry timestamps

* Fix broken unit test to do a null check

* Undo loading icon changes in textCell

* Addressing PR comments

* PR comments II
2019-06-10 21:55:49 -07:00
Chris LaFreniere
d15a3fcc98 Merge from vscode 81d7885dc2e9dc617e1522697a2966bc4025a45d (#5949)
* Merge from vscode 81d7885dc2e9dc617e1522697a2966bc4025a45d

* Fix vs unit tests and hygiene issue

* Fix strict null check issue
2019-06-10 18:27:09 -07:00
Cory Rivera
ff38bc8143 Add dialog to notebooks for managing Pip packages (#5944) 2019-06-10 17:02:31 -07:00
Kevin Cunnane
14a6bf581c Fix New Notebook issues (#5958)
* Fix New Notebook issues
- Fix #5338 New Notebook menu item should be next to New Query
- Fix #4936 Add a shortcut to create a notebook in the document well

Created a built-in New Notebook command
that routes to the existing extension-based command.
This avoided a rearchitecture that was more complex than seemed worth it.
Per VSCode patterns, used a _ modifier for the existing command so it's "private"
2019-06-10 16:38:07 -07:00
Kevin Cunnane
730ad4b814 Fix outputs constantly focusing on new output (#5959)
- Only scroll if it's the 1st output, not for subsequent ones
- Otherwise can't use notebook while a cell is running & regularly updating the outputs
2019-06-10 16:28:59 -07:00
Kevin Cunnane
f05260d95a Fix #5379 markdown Gray code makes code look wrong (#5954) 2019-06-10 14:29:29 -07:00
Charles Gagnon
673ecc3870 Change hardcoded MSSQL provider name to use constant instead (#5953) 2019-06-10 18:10:39 +00:00
Kim Santiago
97a37e6834 Disable schema compare generate script and apply buttons when no changes included (#5923)
* disable generate script and apply buttons when no changes are included

* addressing comments
2019-06-07 15:30:20 -07:00
Aditya Bist
d9b48bae80 Agent - Accessibility Bugs (WIP) (#5807)
* fix accessibility issue where tabbing would get wrong focus

* dialogs open one at a time

* get focus on filter headers

* added tool tips to proxy dialog

* added labels to step dialog
2019-06-07 09:41:00 -07:00
Charles Gagnon
cbaa0a132f Bump to 1.9.0 for July release (#5925) 2019-06-06 23:43:42 +00:00
Charles Gagnon
4ad5520568 Update azdata engine checks to include all future versions (#5924) 2019-06-06 22:36:43 +00:00
Gene Lee
43457c0184 Fixed run cell error message in Notebook (#5912) 2019-06-06 13:18:50 -07:00
Alan Ren
1150433c0a update the strings (#5904)
* update the strings

* PR comments and remove the workaround
2019-06-06 13:03:03 -07:00
Anthony Dresser
76a84a2cf4 fix dimensions of the query editor (#5910) 2019-06-06 12:31:12 -07:00
Charles Gagnon
68328f65b5 Update README and CHANGELOG for June 2019 Release (#5913)
* Update README and CHANGELOG for June 2019 Release

* Add contribution thank-you
2019-06-06 18:28:21 +00:00
Charles Gagnon
b7956c5fbf Update admin-tool-ext-win README (#5911) 2019-06-06 01:17:59 +00:00
Kevin Cunnane
44d6bb66da Add saved and executed events to notebook changed (#5848)
- Updated the notebook API to add a change kind, and support saved, executed and other simplified statuses
- Plumbed this through to the main thread classes
- Support sending the events from cell / input to the notebook model so they loop over the extension host as a content changed event
- Add executed event from the cell
2019-06-05 16:34:26 -07:00
Charles Gagnon
7d67711336 Update admin-tool-ext-win extension to require ADS 1.8.0 (#5905)
* Update admin-tool-ext-win extension to require ADS 1.8.0

* Use correct version check
2019-06-05 23:15:01 +00:00
Charles Gagnon
f320deaa73 More CMS test fixes (#5901) 2019-06-05 22:22:30 +00:00
Gene Lee
a518c4a529 Make Notebook scroll to output area only when a Notebook command is executed (#5893) 2019-06-05 12:07:31 -07:00
Chris LaFreniere
da164cec0a remove tabindex=0 for notebook component (#5851) 2019-06-05 08:28:00 -07:00
Charles Gagnon
bb470c3676 Fix CMS tests (#5897)
* Fix CMS tests

* Fix spacing and format
2019-06-05 06:07:13 +00:00
Aditya Bist
685a608518 Make sure the first Connection Dialog has correct provider. (#5884)
* force a restart for cms

* remove unneeded existing conditional

* fix for random provider when opening a connection dialog
2019-06-04 17:11:36 -07:00
Charles Gagnon
5be2121a3e Only run pipeline tests if all previous steps succeeded (#5886) 2019-06-04 23:22:22 +00:00
Gene Lee
bf643cc85f Fixed bug: Execute cell should scroll to its results (#5861) 2019-06-04 16:03:55 -07:00
Anthony Dresser
eda96c046a fix strict-null check (#5878) 2019-06-04 15:31:34 -07:00
Charles Gagnon
137c78c04e Remove VS redist license
This isn't necessary since we aren't providing the VS binaries as redist-able. VS has signed off as confirming that the main extension license is enough to cover the VS binaries as well. (#5880)
2019-06-04 22:20:43 +00:00
Karl Burtram
d9e1aa57c9 Bump SQL Tools Service to 1.5.0-alpha.99 (#5879) 2019-06-04 14:17:13 -07:00
Anthony Dresser
912c80e496 store active tab so it isn't overwritten (#5875) 2019-06-04 12:55:55 -07:00
Anthony Dresser
7390dce536 Fix handling of state in the grid panel (#5867)
* fix handling of state in the grid panel

* trigger rebuild

* trigger rebuild
2019-06-04 12:55:20 -07:00
Anthony Dresser
67859ab139 dont focus panels when they are shown; dont swallow keys for message panel (#5872) 2019-06-04 12:54:54 -07:00
Anthony Dresser
540635c54f make sure we update our sizes when we update the size of items (#5874) 2019-06-04 12:53:17 -07:00
Aditya Bist
6197279e83 remove SQL Login from CMS and add error messages (#5873) 2019-06-04 12:52:46 -07:00
Anthony Dresser
4ad226570a Add no implicit any to the strict null check (#5635)
* wip

* working through adding no implicit any
2019-06-04 09:29:40 -07:00
Charles Gagnon
50242b2c35 Add test for opening existing SQL file and typing text into it (#5816)
* Add test for opening existing SQL file and typing text into it

* Clean up

* More cleanup, remove unneeded queryEditor and add smoke test scripts

* Update comments to be clearer
2019-06-04 02:42:39 +00:00
Gene Lee
9f7d96bad3 Fixed bug: Notebooks Vertical Scrollbar is Unnecessary for Some Grid Outputs (#5847) 2019-06-03 16:47:07 -07:00
Chris LaFreniere
8d70544374 Notebooks: Adding Change Kernel API, 3 Integration Tests (#5287)
* Notebook change kernel

* Fix notifying of k change too many times add tests

* Fix broken unit test

* Deal with comment
2019-06-03 14:49:40 -07:00
Charles Gagnon
4b6214c9a4 Remove install count widget from extension gallery view until we can actually have install counts (#5828) 2019-06-03 14:48:26 -07:00
Alan Ren
aaa2ef3a97 Alanren/tool check (#5810)
* tool check

* tools table update

* buttons

* update tools table

* remove tool status check

* updates

* PR comments
2019-06-03 14:32:10 -07:00
Charles Gagnon
639bd5a550 Fix typo in setting description (#5837) 2019-06-03 14:31:24 -07:00
udeeshagautam
763080aea0 Increasing schema compare and dacfx ext versions in Source code (#5832)
* increasing schema compare and dacfx ext versions

* updating azdata dependency version
2019-06-03 12:18:03 -07:00
Cory Rivera
fb713e0762 Automatically detect existing Python/Conda installs in Configure Python dialog. (#5801) 2019-06-03 11:56:06 -07:00
Aditya Bist
a8d41a6717 updated cms version to not break with old core (#5812) 2019-06-03 09:06:55 -07:00
Chris LaFreniere
34bc0efc1c Check if sessionManager exists (#5811) 2019-05-31 17:42:12 -07:00
Anthony Dresser
23e4a30cd1 dispose and clear things out when they are not needed to free up memory (#5805) 2019-05-31 17:24:01 -07:00
Karl Burtram
ba58b0f429 Update CMS extension display name to be consistent with other extensions (#5802) 2019-05-31 16:04:13 -07:00
Charles Gagnon
89e6e062ab Add licenses for SsmsMin extension (#5806)
* Add licenses to Admin Tool Extension

* Update license link
2019-05-31 15:37:48 -07:00
Anthony Dresser
33ff8ec5a3 Fix instances of listeners not being disposed in Query Editor (#5798)
* fix instances of listeners not being disposed

* spelling mistake
2019-05-31 15:14:03 -07:00
Yurong He
9f4053d051 Fixed #4113 Added integration tests to cover cell language unit tests (#5788)
* Fixed #4113

* Resolve PR comments
2019-05-31 15:03:21 -07:00
Aditya Bist
1bb9d142f1 CMS - error check (#5796)
* cms connections don't save

* added value to enum

* remove refresh and update provider name for cms

* removed ownerUri from saved connection and contributed to array

* removed owneruri

* ownerUri not needed any more

* removed AAD from cms

* initial review

* changed comments

* add back saveProfile option for connectionProfile

* review fixes and other UI improvements

* fixed auth

* added cms integration tests

* added constants

* removed utils from apiwrapper

* changed connection type name

* review comments

* clearer code and addressed reviews

* added more error checks

* add back functions in apiwrapper
2019-05-31 14:27:27 -07:00
Aditya Bist
b10bd70d67 bump agent version for june (#5800) 2019-05-31 13:52:10 -07:00
Anthony Dresser
dc2ff2295e change message focus only on query end (#5793) 2019-05-31 13:23:17 -07:00
Anthony Dresser
f231d6945f clear indicator when a query starts (#5795) 2019-05-31 13:19:03 -07:00
Kevin Cunnane
8786caf630 Fix #4976 AAD Login will not work from Azure (#5768)
- Azure token wasn't being copied into profile so expand didn't work, it spun forever
- Fixed this by saving in password/token after the connection profile copy, & refactored to be cleaner
- We should ideally ensure the session notification resolves even on failure, so if there's a different error it'll break rather than spin forever. Punting this part of the fix for now
- Also doesn't solve issue where we always try to connect even if user + password combo isn't saved. Will investigate / work in a separate PR on this.
2019-05-31 12:47:45 -07:00
Anthony Dresser
5c520bc82e Apply title bar adjustment to edit data (#5777)
* apply title bar adjustment to edit data

* wi

* rework layout to be simpler

* fix layout
2019-05-31 12:03:52 -07:00
kisantia
1ccf408654 Fix #5702 Schema Compare Diff is editable (#5703)
* make diff readonly

* change to use ResourceEditorInput instead of UntitledEditorInput since it is readonly

* inject service

* use this instead of declaring local variables

* addressing comments
2019-05-31 11:45:04 -07:00
Anthony Dresser
76ebfe38a1 focus messages on error (#5791) 2019-05-31 11:43:15 -07:00
Anthony Dresser
364206010b clear out tabs when a input is set (#5789) 2019-05-31 11:38:52 -07:00
Anthony Dresser
559c675164 Hide results tab when there are none (#5763)
* wip

* add behavior around hiding results when there are none

* fix strict null access
2019-05-31 11:18:33 -07:00
Aditya Bist
1773dede25 CMS fit and finish (#5542)
* cms connections don't save

* added value to enum

* remove refresh and update provider name for cms

* removed ownerUri from saved connection and contributed to array

* removed owneruri

* ownerUri not needed any more

* removed AAD from cms

* initial review

* changed comments

* add back saveProfile option for connectionProfile

* review fixes and other UI improvements

* fixed auth

* added cms integration tests

* added constants

* removed utils from apiwrapper

* changed connection type name

* review comments

* clearer code and addressed reviews
2019-05-31 11:14:37 -07:00
Gene Lee
fa52478ffa Changed output channel name from 'Jupyter Notebooks' to 'Notebooks' (#5780) 2019-05-31 11:08:34 -07:00
Kevin Cunnane
ae0603c041 Improve Azure startup time and fix command palette (#5761)
* Improve Azure startup time and fix command palette
- Improvement (but not full fix) for #4732 where Azure startup is very slow.
This speeds up startup by up to 5 seconds by
  - Changing from a fixed 5-second initial wait time to a promise that checks for accounts
  - Using the accounts changed notification to send a message when accounts are ready. This is usually what will "win out" since Azure seems to load before the Account Providers are set up.
- Remove right-click actions from the command palette.
- Rename Azure actions so it's clear what they are in the command palette.

* Remove coveralls task while investigating how to make optional task

* Add void return type
2019-05-31 10:22:01 -07:00
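
The commit above replaces a fixed 5-second wait with a promise that resolves as soon as the accounts-changed notification arrives, keeping the old delay only as a fallback. A generic sketch of that pattern; `onAccountsReady` is a stand-in for the real notification, not the actual API:

```typescript
// Sketch: resolve as soon as accounts are ready, but never wait longer than the fallback.
function waitForAccounts(
	onAccountsReady: (listener: () => void) => void,
	fallbackMs = 5000
): Promise<void> {
	return new Promise<void>(resolve => {
		const timer = setTimeout(resolve, fallbackMs); // the old fixed wait, now only a fallback
		onAccountsReady(() => {
			clearTimeout(timer);
			resolve(); // usually wins, so startup no longer pays the full delay
		});
	});
}
```
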
Kevin Cunnane
a364af5c4c Fix #5765 Trusted state handling (#5775)
- Notebooks Can be Improperly Trusted After Save + Reopen
Fix was to only save if actually trusted.
Also fixed condition where it wasn't correctly skipping if in the queue
2019-05-31 10:21:30 -07:00
Anthony Dresser
6b76611c93 Handle maintaining tab state properly (#5776)
* handle maintaining tab state properly

* some code cleanup
2019-05-31 10:13:19 -07:00
udeeshagautam
d7df71c8ba Do not add generated files from Temp if file is not dirty (#5758)
* Do not add generated files from Temp if file is not dirty

* restricting the path more

* Adding tmpdir as per CR comment
2019-05-30 18:39:42 -07:00
udeeshagautam
ea464abaaf show not supported message for backup for Azure (#5762)
* show not supported message for backup for Azure

* Adding PR comments
2019-05-30 15:34:11 -07:00
Yurong He
a026b682c4 Fixed #4485 add max-width for attachTo and tooltip (#5493)
* Added tooltip of selectBox and set max width of attach to

* Undo our change, vscode fixed the issue
2019-05-30 15:19:06 -07:00
Alan Ren
8e3fa660fd fix smoke test failure (#5764) 2019-05-30 15:03:18 -07:00
kisantia
c9257822ec Disable schema compare generate script and apply buttons after applying changes (#5759)
* disable generate script and apply buttons after applying changes

* reenable buttons if apply failed
2019-05-30 14:08:47 -07:00
Charles Gagnon
783bb8bd92 Fix editor title not updating after connect (#5760)
* Fix editor title not updating after connect

* Fix disconnect to also update title
2019-05-30 13:39:16 -07:00
Anthony Dresser
bf19ab6ad9 add messages to title to help discoverability (#5733) 2019-05-30 13:31:32 -07:00
udeeshagautam
c4e59027fc Backup to open correctly from insights dialog (#5737)
* recreate modal dialog button to ensure onDidClick works properly

* fixing comment message

* Adding PR comments
2019-05-30 12:32:38 -07:00
Karl Burtram
f9b7bc26c0 Add a null check before using settings editor options (#5756) 2019-05-30 11:31:00 -07:00
Anthony Dresser
a197cd6158 Add customizable error color; fix line height in Messages (#5734)
* add customizable error color; fix line height

* ensure proper disposal of elements
2019-05-30 10:22:39 -07:00
Karl Burtram
ae5b506848 Provide a default result grid height (#5739)
* Revert "Revert "Provide a default result grid height (#5710)" (#5738)"

This reverts commit f8ab5fef78.

* Tune the sizing a bit to be closer to the May release
2019-05-30 10:21:09 -07:00
Karl Burtram
f8ab5fef78 Revert "Provide a default result grid height (#5710)" (#5738)
This reverts commit 0893ba33fc.
2019-05-29 18:23:12 -07:00
Karl Burtram
0893ba33fc Provide a default result grid height (#5710) 2019-05-29 17:38:47 -07:00
Maddy
75ce8d6c20 Maddy/properties widget for endpoints (#5654)
* changes to reuse properties widget for endpoints

* comments added

* added loader

* changes to address the review comments: code clean up and constants

* dark theme style fix and misc

* added variable for display alignment

* renamed hyperlink to isHyperlink

* added endpoints as a modelview to be used on dashboard

* added padding to fix the overlap issues.

* removed the propertieswidget changes

* formatting fixes for hygiene errors

* renamed endpoints to bdc-endpoints

* changes to address the comments

* added null check
2019-05-29 17:30:11 -07:00
Kevin Cunnane
6d56701b5b Refactor to remove controller and static instance (#5735) 2019-05-29 17:23:23 -07:00
Anthony Dresser
ea8aa92dd5 fix scrolling in messages (#5732) 2019-05-29 15:13:56 -07:00
Charles Gagnon
9fc634dfe0 Fix query files not being editable (#5731)
* Initial attempt at fix

* Use two separate containers and just swap between those
2019-05-29 14:21:06 -07:00
udeeshagautam
4b1088edbc Bug/schema compare apply confirmation (#5698)
* Ask confirmation before apply

* Adding PR comments
2019-05-29 13:30:22 -07:00
Kevin Cunnane
8e355a14e9 Fix #4529 'Use column names as labels' by default (#5723)
- Fixes charting issue
2019-05-29 13:03:31 -07:00
Anthony Dresser
ef2b6f91f1 Fix task flicker (#5694)
* adjust timer rendering

* give each element a unique id

* remove testing code

* remove unused imports

* change task timeout to 1000
2019-05-29 11:34:27 -07:00
Arvind Ranasaria
9cae7a0a49 First version of Stress (#5499)
* First version of Stress - moving over from feat/Stress1 branch

* a working version - still issues with stresssified notebook tests

* update notebooks to use new message event (#5395)

* Latest changes for notebook tests

* Stressify objectExplorer tests

* formatting changes

* removing the tracing added previously and ability to set tsc verbose option in tsconfig.json

* addressing review feedback

* addressing review feedback

* implementing runtime parameter for Stress

* addressing review feedback and moving the stress modules out to their own project outside of the azuredata source tree

* referencing adstest from the github location

* incorporating review feedback

* Review feedback

* removing uncommon entries added to .gitignore

* removing unrelated change

* replacing debug/trace statements with console.info or console.warn statements in integration-tests\main.ts
2019-05-29 11:18:20 -07:00
kisantia
9a55ca3021 Show message to recompare when schema compare options change (#5690)
* add message to recompare if options changed

* changes after rebasing

* Add compare button to notification

* remove async

* change to yes and no button
2019-05-28 16:46:44 -07:00
Charles Gagnon
e8b20c86c1 Have Install-SsmsMin task clean up all older versions (#5670) 2019-05-28 15:21:19 -07:00
Yurong He
dc78a4af88 Fixed #5631 didn't use % for variable name (#5633) 2019-05-28 15:18:15 -07:00
Gene Lee
34273907f7 Made connection icon to be remembered by memento in ConnectionManagementService (#5523) 2019-05-28 14:45:28 -07:00
Karl Burtram
ca1c5899a1 Bump vs code version metadata for June (#5697) 2019-05-28 13:37:26 -07:00
Anthony Dresser
0064f17ad4 add icon for custom title bar (#5693) 2019-05-28 12:46:10 -07:00
Anthony Dresser
f8ccafd2af Add focus logic to the tabbed panel (#5649)
* add focus logic to the tabbed panel

* change enter to be on key up

* fix  tests
2019-05-28 12:44:22 -07:00
udeeshagautam
d386311e54 Feature/schema compare: refactor options dialog (from checkbox list to table) (#5608)
* Adding code to change checkbox list to table with checkbox.

* removing some unnecessary ','

* Increasing the table a little bit

* reverting height changes

* Adding PR comments
2019-05-28 11:59:14 -07:00
Charles Gagnon
0e2475aa72 Fix agent context menu to appear in correct location (#5671)
* Fix agent context menu to appear in correct location

* Further type function params

* Just use anchor directly
2019-05-28 10:37:37 -07:00
Anthony Dresser
b4cce9f147 adjust modals for title bar (#5648) 2019-05-27 14:40:16 -07:00
Charles Gagnon
3e8f25d864 Update to latest version of SsmsMin (#5646) 2019-05-26 16:43:23 -07:00
udeeshagautam
4c68580e82 Fixing model view editor to lay out correctly with native and custom settings (#5650) 2019-05-25 16:33:25 -07:00
Charles Gagnon
3bc82c10b1 Change some of the env vars to be ADS-specific (#5636)
* Change some of the env vars to be ADS-specific

* Undo changes to ext host IPC hook

* Keep VSCODE_LOGS vars too
2019-05-24 17:17:02 -07:00
Alan Ren
c7d94055a4 add managed instance to recommended list (#5639)
* add managed instance to recommended list

* update the publisher name casing
2019-05-24 17:00:21 -07:00
kisantia
34316b0ffd fix widths changing when radio buttons are clicked (#5634) 2019-05-24 16:44:30 -07:00
kisantia
3d494dcd73 Add initial telemetry for schema compare (#5595)
* add initial telemetry to schema compare

* addressing comments
2019-05-24 15:32:23 -07:00
Anthony Dresser
85b2c4de4a disable remote contribution (#5630) 2019-05-24 13:55:15 -07:00
Anthony Dresser
d0ce6bb066 move status bar contribution to contribution file (#5609) 2019-05-24 12:22:39 -07:00
Anthony Dresser
38c6495fd8 Inital Messages Tab work (#5604)
* inital work

* iterate

* move messages to tab

* revert package changes

* remove unused properties

* format imports
2019-05-24 12:22:27 -07:00
Anthony Dresser
bcc449b524 Merge from vscode 5b9869eb02fa4c96205a74d05cad9164dfd06d60 (#5607) 2019-05-24 12:20:30 -07:00
Alan Ren
361ada4963 update engine name from sqlops to azdata (#5605) 2019-05-23 22:08:03 -07:00
Alan Ren
2e8d62d0ca remove api used only by us (#5588)
* remove api used only by us

* fix the exthost crash issue
2019-05-23 22:02:28 -07:00
Karl Burtram
fb97bf6041 Don't fail build with vsix directory exists (#5593)
* Don't fail build with vsix directory exists

* Add a !fs.exists call prior to mkdir
2019-05-23 16:55:37 -07:00
Charles Gagnon
6d9c95720d Fix extensions to swallow exceptions when sending telemetry (#5600)
* Fix extensions to swallow exceptions when sending telemetry

* Remove reference to GDPR

* Clean up now-unused code
2019-05-23 16:24:55 -07:00
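
The commit above makes extension telemetry calls swallow exceptions so a reporting failure can never break the feature that triggered it. A generic sketch of that guard; `send` here is a stand-in for whatever reporter the extension uses:

```typescript
// Sketch: telemetry is best-effort, so failures are logged-and-forgotten at most.
function safeSendEvent(
	send: (eventName: string, properties?: Record<string, string>) => void,
	eventName: string,
	properties?: Record<string, string>
): void {
	try {
		send(eventName, properties);
	} catch (err) {
		// Intentionally swallowed: never surface telemetry errors to the caller.
		console.warn(`telemetry event '${eventName}' failed`, err);
	}
}
```
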
Anthony Dresser
db5a0a892a fix issue with empty connection (#5602) 2019-05-23 16:07:28 -07:00
udeeshagautam
7b01a6ca61 Fix for Schema compare Options dialog double scroll (#5584)
* setting height through vh removes outer scroll (% does not work)

* Changing auto to scroll for overflow for better viewing
2019-05-23 15:51:52 -07:00
Charles Gagnon
fbbf767700 Clean up extension telemetry (#5596) 2019-05-23 15:29:28 -07:00
Yurong He
776e2cf6e7 Fixed #4567 adding open connectionDialog when no connection is available (#5581)
* Fixed #4567 adding open connectionDialog when no connection is available
2019-05-23 15:11:10 -07:00
Anthony Dresser
34ca0e8671 fix editors being disposed multiple times (#5594) 2019-05-23 14:08:37 -07:00
kisantia
0d37047c37 Get rid of schema compare double scrollbar (#5571)
* hide split view overflow to get rid of double scrollbar

* change overflow to hidden
2019-05-23 13:41:59 -07:00
Aditya Bist
25c8f60e6e Committer week - Agent fixes and refactor (#5547)
* refactored job actions to wait for job objects to load

* sorted the jobs in descending order

* decreased steps table height in jobs dialog

* cannot open multiple instances of dialogs now

* fixed edit job not showing up sometimes when reopening history page

* fix broken icon

* fix a bunch of stuff and refactored code

* added isopen prop to dialog
2019-05-23 12:35:50 -07:00
Anthony Dresser
aae1480e4f Allow data explorer to use connect (#5564)
* wip

* handle save password; get correct profile

* ensure resolve is being called

* fix tests

* fix more tests
2019-05-23 11:44:18 -07:00
Anthony Dresser
5e8a52bcc0 Query Editor Refactor (#5528)
* query editor changes

* query editor changes

* finish converting query editor

* fix merge issue

* remove unused code

* fix tests

* fix tests

* remove editor context key class

* edit tests to test input state
2019-05-23 11:43:59 -07:00
Anthony Dresser
cf8f8907ee Merge from vscode 8aa90d444f5d051984e8055f547c4252d53479b3 (#5587)
* Merge from vscode 8aa90d444f5d051984e8055f547c4252d53479b3

* pipeline errors

* fix build
2019-05-23 11:16:03 -07:00
Anthony Dresser
ca36f20c6b fix bug with previous classes staying applied (#5586) 2019-05-23 11:15:18 -07:00
kisantia
bdae02e51e fix default server not being chosen (#5583) 2019-05-22 16:42:02 -07:00
udeeshagautam
2d7eb0dcb5 Feature/schemacompare include exclude checkboxes (#5548)
* Adding include exclude boxes

* Adding a generic column to the table component and remembering state.

* converged custom action and select checkboxes, moved sc checkbox to middle, can have multiple checkboxes and can remember state now

* adding action on column as a common column property

* Taking PR comments

* Changing Arg name as per CR comment

* Taking a PR comment
2019-05-22 13:53:50 -07:00
Karl Burtram
2e0756ad8d Customize mixin strings for insiders build (#5569)
* Add test product-insider.json file

* Update some GUID

* Update mixin file

* Add more mixin customizations

* Update the mixin strings a little more

* Reorder code blocks to shorten a bit
2019-05-22 12:33:23 -07:00
kisantia
ef2bbce34b Widen schema compare dialog dropdowns and textboxes (#5563)
* widen dialog dropdowns and textboxes

* change padding

* add min-width for labels to match connection pane
2019-05-22 11:55:12 -07:00
Charles Gagnon
65e59dc57d Fix callback to display error message instead of {1} (#5577) 2019-05-22 11:45:31 -07:00
Anthony Dresser
2ca39b571a removes unused vscode changes (#5533) 2019-05-21 21:48:57 -07:00
udeeshagautam
48a6157efb Test fixes based on recent changes (#5570)
* Test fixes based on recent changes and allowing toolbar to stay at top

* removing scroll changes to be in separate PR
2019-05-21 19:56:44 -07:00
Yurong He
ab3a64604a Fixed #5546. Add missing line for vscode merge and wait for untitledM… (#5567)
* Fixed #5546. Add missing line for vscode merge and wait for untitledModel.load
2019-05-21 18:56:28 -07:00
kisantia
e9ddf43c6c only generate filename if the database name is set (#5568) 2019-05-21 16:03:54 -07:00
udeeshagautam
2549d91ddf Fixing column plugin to not have extra whitespace (#5565) 2019-05-21 15:09:49 -07:00
Anthony Dresser
81ae86ff79 Revert "Revert "Merge from vscode ada4bddb8edc69eea6ebaaa0e88c5f903cbd43d8 (#5529)" (#5553)" (#5562)
This reverts commit c9a4f8f664.
2019-05-21 14:19:32 -07:00
udeeshagautam
7670104e4d DisableSchema compare dialog's Okay button if source/target isn't selected (#5557)
* Enable schema compare ok button only on filling source and target

* Nit fixes
2019-05-21 12:48:43 -07:00
udeeshagautam
3fc2ad5bc9 Schema Compare tests addition (#5136)
* initial tests for schema compare

* Adding schema compare integration test

* Adding code to fix some build issues

* DB compare test

* Adding some CR comments

* db creation and deletion as per CR comments
2019-05-21 11:17:52 -07:00
kisantia
c84367e2ee Open schema compare script (#5515)
* don't prompt for script file name anymore

* bump sqltoolsservice version so the generated script gets opened
2019-05-21 10:30:18 -07:00
kisantia
77413ad25c Schema compare server dropdown changes (#5552)
* remove duplicate server entries in server dropdown

* change server dropdown to show most recent connections first
2019-05-21 09:23:43 -07:00
Yurong He
c1f73255b5 Fixed #3936 display cancel msg when interrupting run cell (#5518) 2019-05-20 21:52:36 -07:00
Yurong He
fe7ec76cd5 Fixed #4289 setLoading set to false too early (#5503) 2019-05-20 21:52:07 -07:00
Alan Ren
0bc9849ad8 more test cases (#5545)
* toolsservice test

* resource type tests

* pr comments

* comments
2019-05-20 17:09:10 -07:00
Anthony Dresser
c9a4f8f664 Revert "Merge from vscode ada4bddb8edc69eea6ebaaa0e88c5f903cbd43d8 (#5529)" (#5553)
This reverts commit 5d44b6a6a7.
2019-05-20 17:07:32 -07:00
Karl Burtram
1315b8e42a Bump Azure Data Studio to 1.8.0 (#5554) 2019-05-20 16:27:08 -07:00
Cory Rivera
7a03da42ec Revert user settings changes after running Existing Python Install integration test. (#5551) 2019-05-20 15:44:14 -07:00
Charles Gagnon
8b9bb2a8fc Fix/add OE Context Menu tests (#5543)
* Fix/add OE Context Menu tests

* Add return types for helper functions
2019-05-20 13:48:49 -07:00
Kevin Cunnane
162dfbaab0 Add SQL settings into the settings editor (#5524)
- Initial stab at a hierarchy for all Azure Data Studio settings
- Will revisit later, but this covers all built-in settings
2019-05-20 12:27:14 -07:00
Charles Gagnon
71b6e35231 Update installer warning to refer to ADS instead of VS Code (#5540) 2019-05-20 08:19:01 -07:00
Karl Burtram
9268513128 Detect when resultset length changes and force buffer recreation (#5522) 2019-05-20 08:14:12 -07:00
Anthony Dresser
5d44b6a6a7 Merge from vscode ada4bddb8edc69eea6ebaaa0e88c5f903cbd43d8 (#5529) 2019-05-19 18:52:19 -07:00
Alan Ren
586fe10525 resource deployment ext implementation -wip (#5508)
* resource types

* implement the dialog

* remove unused method

* fix issues

* formatting

* 5-17

* address comments and more tests
2019-05-17 20:24:02 -07:00
Cory Rivera
a59d1d3c05 Add functionality to use an existing Python installation for Notebook dependencies (#5228) 2019-05-17 14:39:44 -07:00
kisantia
1fce604a11 add css styling for text component (#5491) 2019-05-17 14:17:14 -07:00
Kevin Cunnane
8ea831c845 Fix #3439 Trusted doesn't get saved in Notebooks (#5507)
* Fix #3439 Trusted doesn't get saved in Notebooks
The main fix is to add a memento to store trust information. This is only needed for saved files - untitled files are always trusted as the user created them.
On clicking trusted or saving a file, the trusted state is cached. In the future, we will also handle code execution here by sending a notification on snapshot state.
I found issue #5506 during testing - an existing issue where we should track trusted state changing on run. In the case where all cells are run, the whole notebook should become trusted.

Finally, I did a decent amount of refactoring to move more logic to the model - removing unnecessary calls from components which duplicated model behavior, moving trust notification to the model or at least the notebook service completely.

Added tests and logging for catch handling
2019-05-17 11:56:47 -07:00
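
The commit above caches trusted state in a memento for saved files (untitled files are always trusted). A minimal sketch of that idea using the VS Code-style `Memento` get/update API; the key name and cache shape are assumptions:

```typescript
import type { Memento } from 'vscode';

const TRUSTED_NOTEBOOKS_KEY = 'notebooks.trustedUris'; // assumed key name

// Record or clear the trusted flag for a saved notebook URI.
function cacheTrustedState(memento: Memento, notebookUri: string, trusted: boolean): Thenable<void> {
	const current = memento.get<string[]>(TRUSTED_NOTEBOOKS_KEY, []);
	const next = trusted
		? Array.from(new Set([...current, notebookUri]))
		: current.filter(uri => uri !== notebookUri);
	return memento.update(TRUSTED_NOTEBOOKS_KEY, next);
}

// Check the cache when a saved notebook is reopened.
function isTrusted(memento: Memento, notebookUri: string): boolean {
	return memento.get<string[]>(TRUSTED_NOTEBOOKS_KEY, []).includes(notebookUri);
}
```
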
Charles Gagnon
94061fa634 Fixes/improvements for SsmsMin extension (#5495)
* Fix server name in URN for Azure servers and update menu when clauses

* Use setStatusBarMessage instead of constructing our own. Remove server name from URN - it's not necessary.

* Clean up unused code
2019-05-17 09:38:15 -07:00
Karl Burtram
fe17955fa1 Add support for setting query options in settings and through API (#5484)
Initial query execution options PR. This is the second in a set of query editor extensibility improvements I'm making. The PRs include (1) a bug fix for the webview in the query tab, (2) dynamic toolbars, and (3) a fix for query event sequencing and event metadata info.
2019-05-16 16:31:30 -07:00
kisantia
3158e9f63a Change schema compare generated script filepath (#5496)
* generated script filepath defaults to folder last opened in file dialog

* use string interpolation
2019-05-16 14:55:08 -07:00
Karl Burtram
fcb2b53bf8 Update yarn.lock files (#5489) 2019-05-15 13:15:58 -07:00
udeeshagautam
5470702c16 Adding reset for options (#5483)
Adding reset for schema compare options (fix for bug #5483)
2019-05-15 10:43:13 -07:00
kisantia
8c05c6e122 fix schema compare diff script formatting (#5486) 2019-05-15 09:03:14 -07:00
Charles Gagnon
cc397a012f Revert "Switch to vscodetestcover for running extension tests (#5458)" (#5487)
This reverts commit dbac187b44.
2019-05-14 17:16:01 -07:00
Charles Gagnon
e12827de3f Add pipeline task to push coverage results to coveralls (#5461) 2019-05-14 17:14:04 -07:00
Gene Lee
02a646c7ea Fixed bug: wrong handle usage in updateWizardPageInfo() in extHostModelViewDialog (#5485) 2019-05-14 16:17:59 -07:00
Charles Gagnon
86986efb15 Fix test cleanup to remove folders regardless if tests passed or failed (#5482) 2019-05-14 16:09:58 -07:00
Stephen
a718fb3cae Corrected Keyboard Shortcut Execution Issue (#5480) 2019-05-14 15:41:12 -07:00
kisantia
48ba9ce175 Sanitize db name for dacpac/bacpac file names (#5479)
* Sanitize db name for filename

* Add unit tests

* lower timeout to 60 seconds

* add extra coverageConfig.json and missing character check
2019-05-14 15:00:28 -07:00
udeeshagautam
be60ad6766 Bug fix Backup dialog list box to allow selecting file (#5472)
* Fixing list box to allow selection and not show dropdown

* Adding PR comments
2019-05-14 13:45:25 -07:00
Alan Ren
b1b58f2550 resource deployment extension (#5464)
* initial checkin

* exclude from default ads package

* keep extensions.js in sync

* address review feedback

* PR comments
2019-05-13 22:38:59 -07:00
Gene Lee
99d00e2057 Differentiated server icons by server type: box, big data cluster, cloud... (#5241) 2019-05-13 14:52:56 -07:00
Chris LaFreniere
7da0dddaa9 Notebooks: Prevent Outputs from Rendering Too Many Times on Layout Change Events (#5366)
* Prevent output rendering too many times

* went from 250ms to 50ms without any perf penalties
2019-05-13 13:41:23 -07:00
Yurong He
3ea45e4ef5 Fixed #5463 Enable stricter compile options on extensions #5044 missed mkdir line (#5469) 2019-05-13 13:28:32 -07:00
Charles Gagnon
dbac187b44 Switch to vscodetestcover for running extension tests (#5458)
* Update vscode dependency (needed for test runner) to pull latest 1.x version

* Update to use 1.1.5 instead

* Undo change to vscode engine version

* Switch to using vscodetestcover for running tests

* Switch to cobertura output for coverage reports

* Remove vscode dependency from profiler extension

* Update yarn.lock with combined changes
2019-05-13 11:39:09 -07:00
Anthony Dresser
43293b98c0 Merge from vscode fa77b52b5e2067798006aaff8e463a2b425509d5 (#5453) 2019-05-13 10:57:51 -07:00
Charles Gagnon
34eef8e97d Add download task for SsmsMin (#5460)
* Add SsmsMin files

* no message

* Add script to install extension
2019-05-10 14:08:43 -07:00
kisantia
5a48fd80cd Fix #4460: Add dacfx wizard to server and database folder context menus (#4989)
* add dacfx wizard to server and database folder context  menus

* update object explorer test
2019-05-10 12:03:26 -07:00
Alan Yu
5260afc15d Added Gif to Schema Compare readme (#5424) 2019-05-10 10:14:51 -07:00
Alan Ren
66b4e08026 produce the vsix files during windows build (#5454)
* vsix packages

* update path
2019-05-10 09:57:13 -07:00
Charles Gagnon
2b8e0cc6c4 Bundle SsmsMin directly into extension (#5450) 2019-05-09 18:28:49 -07:00
Karl Burtram
a501214bfa Adjust estimated scrollbar height to avoid unneeded scrollbar (#5452) 2019-05-09 18:19:07 -07:00
Anthony Dresser
2b9a8b9136 Merge from vscode 8ef8aa6b3cb5b96870660fdd3bb8d0755e62fe51 (#5422) 2019-05-09 11:10:57 -07:00
Chris LaFreniere
1cc3cb5408 Notebook Port Test Fix (#5433)
* Test fixes to not rely on 7100

* PR comments
2019-05-08 13:16:09 -07:00
Karl Burtram
7df793f208 Update readme\changelog for May release (#5421) 2019-05-08 10:03:21 -07:00
Charles Gagnon
ca2b7cc4bc Remove unnecessary 'use strict' lines and add hygiene check (#5363)
* Remove unnecessary 'use strict' lines and add hygiene check for them

* Move check to under tslint filters to reduce number of filters needed

* Only take first 10 lines of file
2019-05-07 22:57:08 -07:00
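
A tiny sketch of the kind of hygiene check described above: scan only the first 10 lines of each file for a now-redundant 'use strict' directive. The real check runs inside the repo's gulp hygiene task; this standalone version is purely illustrative.

```ts
import * as fs from 'fs';

// Illustrative standalone version of the hygiene rule: flag files whose first 10 lines
// still contain a 'use strict' directive.
function hasUseStrict(filePath: string): boolean {
	const firstLines = fs.readFileSync(filePath, 'utf8').split(/\r?\n/, 10);
	return firstLines.some(line => /^\s*['"]use strict['"];?\s*$/.test(line));
}

for (const file of process.argv.slice(2)) {
	if (hasUseStrict(file)) {
		console.error(`${file}: unnecessary 'use strict' found in the first 10 lines`);
		process.exitCode = 1;
	}
}
```
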
Aditya Bist
9e0a74da3d Committer work: Import fixes and improvements (#5357)
* can make imports, but need to change success and failure views

* fixed error handling UX

* removed unused function

* cr comments

* fix typo

* requested change
2019-05-07 17:25:58 -07:00
Charles Gagnon
da5076a4dc Clean up extension packaging (#5416)
* Clean up extension packaging

* Add sql image
2019-05-07 16:52:00 -07:00
Alan Yu
005808f003 Adding CMS readme (#5417) 2019-05-07 16:16:54 -07:00
Alan Ren
1dd1919200 bump the version of schema compare extension (#5411) 2019-05-07 14:45:28 -07:00
Aditya Bist
0a9db55dc4 Fix agent css in packaged builds (#5409)
* fix agent css in packaged

* fix agent css
2019-05-07 14:16:52 -07:00
Anthony Dresser
4551ba5b7c update notebooks to use new message event (#5395) 2019-05-07 13:03:58 -07:00
Karl Burtram
0f25ad5676 Bump agent, import, profiler extension versions (#5405) 2019-05-07 12:55:26 -07:00
Anthony Dresser
c6564c0d84 Fix css to fix new dashboard tab dialog (#5385)
* fix css to fix new dashboard tab dialog

* add readonly
2019-05-06 17:25:51 -07:00
kisantia
a7e94d433f bump dacpac extension version (#5387) 2019-05-06 17:10:51 -07:00
Karl Burtram
64480a35ac Bump the VS Code version metadata (#5392) 2019-05-06 17:02:51 -07:00
Alan Ren
4f96d5caf2 fix profiler formatter issue (#5372) 2019-05-06 15:39:44 -07:00
Maddy
7cca1b9f48 Maddy/newline overrides edit data (#5364)
* formatting it before displaying

* replace with space

* added comments and method to handle the line break space conversion
2019-05-06 14:20:08 -07:00
Yurong He
56ebbedbfd Add log for output verification (#5381)
* Add log for output
2019-05-06 14:15:51 -07:00
Anthony Dresser
4e5b8ce875 add an applyStyle function with safety checks in header filter (#5380) 2019-05-06 13:43:54 -07:00
Anthony Dresser
5c5ee50983 fix error icon (#5373) 2019-05-06 12:33:38 -07:00
Chris LaFreniere
0a67488447 Don't scroll when running code cell if in view (#5365) 2019-05-06 11:36:06 -07:00
Kevin Cunnane
022761aa4b Fix #4089 Linked account cancel (#5347)
VSCode serialization changed rejection handling to only serialize errors, which caused things to break
- Changed to return either account info or a cancel message in the resolve
- Rewrote to use promises. Tracking how to return canceled through 4+ thenables was way trickier than just using a promise
- Updated unit tests to handle new scenario
- Tested integration tests, realized they a) didn't run and b) didn't pass.
  - Added vscode dev dependency to fix run issue
  - Fixed tests to account for behavior changes in tree state.
2019-05-06 09:13:03 -07:00
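
Because the serialization layer only round-trips real errors on rejection, the fix resolves with either the account or a cancellation marker instead of rejecting. A rough sketch of that pattern (type and function names are hypothetical, not the actual ADS API):

```ts
// Hypothetical sketch: resolve with a union result instead of rejecting on cancel,
// since non-Error rejection values no longer survive serialization across the RPC boundary.
interface AccountInfo { id: string; displayName: string; }

type AddAccountResult =
	| { kind: 'account'; account: AccountInfo }
	| { kind: 'canceled'; reason: string };

async function promptAddAccount(showDialog: () => Promise<AccountInfo | undefined>): Promise<AddAccountResult> {
	const account = await showDialog();
	if (!account) {
		// Cancellation travels through resolve, so callers don't need try/catch chains.
		return { kind: 'canceled', reason: 'User closed the add-account dialog' };
	}
	return { kind: 'account', account };
}
```
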
Anthony Dresser
b9d985b663 remove event emitter and use event and emitter instead (#5361) 2019-05-06 00:27:55 -07:00
udeeshagautam
08ed9d285e Changing the start for Schema compare and adding some text (#5356)
* Dialog OK will not start Schema Compare; explicit compare will. Adding wait text.

* Fixing variable name and spelling
2019-05-05 11:22:32 -07:00
Anthony Dresser
ab0cd71d10 Remove logging and clone utilities (#5309)
* remove log utility functions; remove custom mixin

* fix tests

* add log service as required by telemetry utils

* remove unused code

* replace some console.logs with logservice
2019-05-04 22:37:15 -07:00
Anthony Dresser
df7645e4e5 Merge from vscode da3c97f3668393ebfcb9f8208d7616018d6d1859 (#5360) 2019-05-03 21:59:40 -07:00
Aditya Bist
39f9c72390 fixed packaged icons for azure (#5354) 2019-05-03 18:47:45 -07:00
Anthony Dresser
0023714884 Fix menu entry for tasks (#5355)
* fix menu entry for tasks

* change wording for action
2019-05-03 18:20:51 -07:00
kisantia
ec9fdc517f Fix schema compare dialog font sizes (#5353)
* make font sizes consistent

* fix order of source dialog options

* update fontsize to 13

* make fontsize a constant
2019-05-04 10:48:25 +12:00
Anthony Dresser
db387eb770 Fixing the layering in the base folder (#5308)
* removes more builder references

* remove builder from profiler

* formatting

* fix profiler dialog

* remove builder from oauth dialog

* remove the rest of builder references

* formatting

* add more strict null checks to base

* enable strict tslint rules

* code layering of base

* wip

* working through changes to table data view

* fix tests

* update editabledropdown to not use layout service

* wip

* fix imports

* fix import

* fix compile error

* add more localization

* add comments to changes to import patterns

* add more import comments
2019-05-03 14:49:18 -07:00
kisantia
354ed22706 Fix #5314: schema compare doesn't always open with correct default connection (#5315)
* fix schema compare default connection when there are multiple connections
2019-05-04 09:40:53 +12:00
Aditya Bist
a69f194d8b Fixed CMS connection auth (#5306)
* fixed auth change to reflect in UI

* added comments

* fix bug where cms shows in dropdown

* added parameter to parent call
2019-05-03 13:26:04 -07:00
Aditya Bist
f5d13319a2 Remove CMS from packaged builds (#5349)
* remove cms from packaged builds

* add repo to package file

* added cms to recommendations
2019-05-03 13:25:47 -07:00
Yurong He
9a0e691635 Change toolbar icon back to black (#5345)
* Change it back to black icon

* Install package icon
2019-05-03 13:11:44 -07:00
kisantia
6c7cb185a1 group source and target components into formcomponentgroups (#5329) 2019-05-04 05:51:40 +12:00
Charles Gagnon
ab22b93ce0 SsmsMin integration clean-up and improvements (#5339)
* SsmsMin integration clean-up and improvements

* Fix tests
2019-05-03 08:44:21 -07:00
Karl Burtram
c7f5278430 Remove 'Try backup or restore task' from task message (#5340) 2019-05-02 18:52:11 -07:00
Anthony Dresser
56ad0dbaf2 revert changes to common file and move icons to specific folder (#5335) 2019-05-02 18:14:13 -07:00
Anthony Dresser
74c92cd460 disable page scrolling in slickgrid (#5333) 2019-05-02 18:13:49 -07:00
Alan Ren
16ebb4322a fix insights dialog issue (#5336) 2019-05-02 18:13:33 -07:00
kisantia
fccd026812 Remove database name from schema compare source/target server field (#5327)
* remove database name from source/target server field
2019-05-03 11:20:28 +12:00
Charles Gagnon
b6e49f2bc0 Fix connections from editor not saving to MRU (#5328) 2019-05-02 16:08:46 -07:00
Karl Burtram
188ccf849d Fix nullref exception in connection profile (#5330)
* Fix nullref exception in connection profile

* Fix condition
2019-05-02 16:04:03 -07:00
Anthony Dresser
1bfdce9642 fix icons for various status; move menu entry to the other panels (#5319) 2019-05-02 14:31:58 -07:00
Karl Burtram
495254b0be Remove package-lock.json from extension sample (#5324) 2019-05-02 13:41:48 -07:00
Charles Gagnon
d209ff7b9a Fix broken CMS test and add to test-extensions-unit scripts (#5301) 2019-05-02 11:13:30 -07:00
Yurong He
98eeb50060 Fixed #4953 provider create file action when file is deleted (#5283)
* Fixed #4953: following query editor behavior, provide a create-file option when the file name doesn't exist

* Check that the active editor is the correct one before closing

* close editor

* Removed unneeded code

* Check error null
2019-05-02 10:59:46 -07:00
Chris LaFreniere
8a68f0aaf9 Improvements to notebook editor code auto height (#5291) 2019-05-02 10:23:26 -07:00
Anthony Dresser
039859213c Replace observable references with just promises (#5278)
* replace observable references with just promises

* add tests for searching in dataview

* add comments

* work through respecting max matches

* fix tests

* fix strict null checks
2019-05-02 00:06:28 -07:00
Alan Ren
a3c022aebf Update dropdown.ts (#5290)
* Update dropdown.ts

fix the issue that the filtering feature is not working

* format the doc

* comments
2019-05-01 22:55:53 -07:00
Anthony Dresser
3a9b32b6e8 Task item look update (#5299)
* wip

* wip

* wip
2019-05-01 13:25:18 -07:00
Karl Burtram
c1cb9000a9 Various results grid scrolling fixes (#5285) 2019-05-01 12:36:16 -07:00
Kevin Cunnane
104b99ffa0 Fix #4553 Azure account prompt (#5303)
"Are you sure" prompt on removing Azure account does not respect "No"
2019-05-01 12:28:36 -07:00
Charles Gagnon
a89d7f327a Update SsmsMin to 15.0.18120.0 (#5293) 2019-05-01 11:29:36 -07:00
Charles Gagnon
91bc4bde3c Change default timeout value to 30sec to match SSMS (#5289) 2019-05-01 11:19:40 -07:00
Charles Gagnon
80da7ad496 Fix broken CMS commands (#5296) 2019-05-01 10:55:58 -07:00
Charles Gagnon
64bf211a45 Fix failing OE Tests caused by changes in connection API defaults (#5295)
* Fix failing OE Tests caused by changes in connection API defaults

* More spelling fixes
2019-05-01 10:18:23 -07:00
Anthony Dresser
2558a66a48 Merge from vscode 473af338e1bd9ad4d9853933da1cd9d5d9e07dc9 (#5286) 2019-04-30 21:53:52 -07:00
kisantia
df22eab4ec fix diff editor title disappearing after swap (#5280) 2019-05-01 11:30:03 +12:00
Aditya Bist
c2678cf818 Committer work - Agent tests (#5034)
* added couple tests

* added unit tests for agent extension

* added agent test in test script
2019-04-30 16:25:16 -07:00
Yurong He
f9af34b103 Notebook toolbar UI improvement (#5282)
Fixed #5236
- Change the icon to a blue image. Existing icons are not deleted; we will use them when the icons are moved to be the secondary actions.
- Change the font size to 13px
- Change the height to 21px
- Move Add actions to the left
2019-04-30 16:18:48 -07:00
Charles Gagnon
97cab22e00 Add more SsmsMin interactions (#5267)
* Add support for more SsmsMin property dialogs and the Generate Script Wizard. Also fixed bug with ExtHostObjectExplorerNode getParent function

* Localize package.json entries

* Fix localization tokens

* Address PR comments

* Fix regex and getParent
2019-04-30 16:10:11 -07:00
Kevin Cunnane
64416e05c1 Notebook StdIn support to fix #5231 (#5232)
Fixes #5231 
- Add stdin handling. Has to be at UI level so add plumb through handling
- Add unit tests
- Add new StdIn component.

Testing:
Unit Tests and manual testing of following:
- Prompt for password using `getpass` in python.
   - Password prompt is hidden since "password" is true.
   - Hit enter, it completes
- prompt, stop cell running, StdIn disappears
- prompt, hit escape, stdIn disappears and stdIn request is handled.

Issues: focus isn't always set to the input even though we call focus.
Will investigate this further.
2019-04-30 14:57:27 -07:00
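
For reference, Jupyter stdin handling works by answering an input_request message on the stdin channel with an input_reply. A simplified sketch of the UI-level plumbing described above; the message shape follows the Jupyter protocol, while the handler and prompt hooks are hypothetical:

```ts
// Simplified sketch of a notebook stdin handler (Jupyter protocol shapes; UI hooks are hypothetical).
interface InputRequest { prompt: string; password: boolean; }

interface StdInUi {
	// Renders the StdIn component; masks input when password is true; resolves undefined on Escape/cancel.
	prompt(request: InputRequest): Promise<string | undefined>;
}

async function handleInputRequest(
	request: InputRequest,
	ui: StdInUi,
	sendInputReply: (value: string) => void
): Promise<void> {
	const value = await ui.prompt(request);
	// On Escape or cell interruption the request is still answered so the kernel isn't left waiting.
	sendInputReply(value ?? '');
}
```
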
Kevin Cunnane
b21125ff2d Fix markdown security and enable most CSS (#5263)
* Fix markdown security and enable most CSS

Stops using our sanitizer and instead disables HTML in markdown engine
- This was blocking Note because it converted > to &gt;
- It's slightly more strict in that it fully disables HTML unless trusted. Will need to improve handling of Trusted to support this in a future PR

Adds in correct CSS, both from .css file in markdown extension and from built-into all webviews global CSS
- Fix #3765 standard markdown support
- Fix Support of Notes by bringing correct styles
- Fix code block colorization
- Fix link handling so it's not bolded / gets underlined on hover
- Fixes table rendering (for markdown and HTML tables)

* Reduce scope of CSS changes
- Removed some CSS that wasn't needed or caused issues
- Scoped most things under the preview section not the whole component

* Avoid markdown html block by sanitizing after render

* Fix pre node not overflowing
- This was a bug in existing implementation too
2019-04-30 14:21:20 -07:00
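
The approach described above - let the markdown engine drop raw HTML unless the notebook is trusted, then sanitize the rendered output - looks roughly like this with a markdown-it-style engine. The sanitizeHtml hook is a stand-in for whatever sanitizer the renderer actually uses; this is a sketch, not the shipped code.

```ts
import MarkdownIt from 'markdown-it';

// Sketch only: render markdown with raw HTML disabled for untrusted notebooks,
// then run the rendered output through a sanitizer as a second line of defense.
function renderMarkdownCell(
	source: string,
	trusted: boolean,
	sanitizeHtml: (html: string) => string // stand-in for the real sanitizer
): string {
	const md = new MarkdownIt({
		html: trusted,   // untrusted notebooks cannot inject raw HTML blocks
		linkify: true
	});
	const rendered = md.render(source);
	// Sanitizing after render avoids mangling markdown syntax such as '>' into &gt; before parsing.
	return trusted ? rendered : sanitizeHtml(rendered);
}
```
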
Chris LaFreniere
e72d0d03ed Notebooks: More Jupyter Server Hardening (#5264)
* Do not rely on same starting port every time, misc

* put back to 5 seconds for process kill timeout

* extHostNotebook shutdown manager handle
2019-04-30 14:13:04 -07:00
Anthony Dresser
0f12d15020 Refactor connection store (#5024)
* various clean ups

* formatting

* remove linting

* formatting

* IConfigurationService is even better

* messing with connection config tests

* update tests

* formatting

* formatting

* remove unused code

* add more tests

* working through tests

* formatting

* more factoring of connection store and increase code coverage

* formatting

* fix tests

* change use of state service to storage service

* remove unused files

* fix strict null errors

* formatting
2019-04-30 13:32:32 -07:00
Yurong He
44a2d009c0 Fixed #4169 shows notification when not active code cell is selected when hit F5 (#5259)
* Refine the msg
2019-04-30 13:25:11 -07:00
Anthony Dresser
48682bacde rework panel to not need platform (#5270) 2019-04-30 13:19:21 -07:00
Aditya Bist
f70369c2a6 Fix object explorer session fails (#5256)
* only start session if response successful

* added return type

* fixed OE tests

* format doc
2019-04-30 11:45:52 -07:00
udeeshagautam
f7fc94520a Fixing schema compare object type enum to follow PascalCasing (#5266) 2019-04-30 11:07:46 -07:00
udeeshagautam
77c6f5c9a2 Reverting the checkboxes because of scrollbar issue 5230 (#5271) 2019-04-30 11:07:11 -07:00
kisantia
dffa47301b fix path to dark icon of options button (#5274) 2019-05-01 05:53:55 +12:00
Charles Gagnon
08f47e7e14 Fix "Open File Location" for CSV Export (#5273)
Fix #5177 - File path wasn't being parsed correctly, URI.file should be used for full file paths.
2019-04-30 10:41:19 -07:00
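
The underlying distinction is that vscode.Uri.parse treats its input as a URI string while vscode.Uri.file treats it as a filesystem path, so Windows paths only survive the latter. A small illustration (the path and the revealFileInOS usage are just examples):

```ts
import * as vscode from 'vscode';

// For full filesystem paths, Uri.file is the correct constructor:
const csvPath = 'C:\\exports\\results.csv';
const good = vscode.Uri.file(csvPath);   // file:///c%3A/exports/results.csv

// Uri.parse would misread "C:" as a URI scheme ("c") and mangle the path:
const bad = vscode.Uri.parse(csvPath);

// Revealing the exported file in the OS file manager then works as expected:
void vscode.commands.executeCommand('revealFileInOS', good);
```
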
Anthony Dresser
56342af140 Rework editableDropdown to not need platform (#5189)
* rework editableDropdown to not need platform

* rework editable dropdown to not depend on platform

* fix compile

* fix focus bluring
2019-04-30 09:25:04 -07:00
Anthony Dresser
aacc0eca67 Merge from vscode 071a5cf16fc999727acc31c790d78f750fa4b166 (#5261) 2019-04-30 07:54:56 -07:00
Kevin Cunnane
02916aeffa Fix #3440 link in Markdown requires cmd+click (#5239)
- Copied over click handler from VSCode markdown renderer
- Added logic to only support command links if trusted

Now clicking on a link will do an action or open a file.
Created #5238 to track the need to support relative links, which don't work
2019-04-29 19:15:39 -07:00
udeeshagautam
e42bfada9d Feature/schemacompare options (#5143)
* extension now working

* make button messages better

* fix diff editor title disappearing and remove border from source and target name boxes

* redoing a bunch of stuff that disappeared after rebasing

* add images and add to extensions.ts

* moving a few changes to the right place after rebase

* formatting

* Initial schema compare options working code

* Adding description.icon etc.

* Enabling disabling options button

* Name change: SchemaCompareOptions to DeploymentOptions, to reflect the SqlToolsService-side parameters

* Adding sorting and correct sql tools version

* Adding options button themes

* Formatting fix

* Adding get default options call to get options from tools service

* Exclude/Include changes - first commit

* Adding border to checkboxes

* Taking PR comments

* Updating to latest sqltools with schema compare options
2019-04-29 18:11:48 -07:00
Karl Burtram
72fb114dec Load MSSQL extension first during eager load phase (#5255) 2019-04-29 17:46:30 -07:00
Karl Burtram
9ba1561386 Bump SQL Tools Service to 1.5.0-alpha.91 (#5260) 2019-04-29 17:38:19 -07:00
Aditya Bist
39772c2dbe CMS Extension - 2 (#4908)
* first set of changes to experiment the registration of cms related apis

* Adding cms service entry to workbench

* Adding basic functionality for add remove reg servers and group

* Returning relative path as part of RegServerResult as string

* initial extension

* cleaned building with connecting to server

* get list of registered servers

* progress with registered servers tree

* cms base node with server selection

* removed unused services

* replaced azure stuff with cms

* removed cmsResourceService

* list servers progress

* Removing the cms apis from core. Having mssql extension expose them for cms extension

* create server working fine

* initial expansion and nodes

* Propagating the backend name changes to apis

* initial cms extension working

* cached connection needs change in api

* connect without dashboard in proposed

* Fixing some missing sqlops references

* add registered server bug found

* added refresh context menu option

* added payload

* server description not disabled after reject connection

* added more context actions and action icons

* added empty resource and error when same name server is added

* fixed connection issues with cms and normal connections

* added initial tests

* added cms icons

* removed azure readme

* test script revert

* fix build tests

* added more cms tests

* fixed test script

* fixed silent error when expanding servers

* added more cms tests

* removed cmsdialog from api

* cms dialog without object

* fixed theming issues

* initial connection dialog done

* can make connections

* PM asks for strings and icons

* removed search

* removed unused code and fixed 1 test

* fix connection management tests

* changed icons

* format file

* fixed hygiene

* initial cr comments

* refactored cms connection dialog

* fixed bug when switching dialogs

* localized connection provider options

* fixed cms provider name

* code review comments

* localized options in cms and mssql

* localized more options
2019-04-29 15:16:59 -07:00
David Shiflet
cbf3ca726f Connect opened query editors to the server (#5207)
* Connect opened query editors to the server

* only show firewall rule error for one file

* remove unused imports

* sync to latest

* add comment

* one more comment
2019-04-29 13:10:19 -04:00
Yurong He
f1e21ebe9d Fix driverlog uri in output (#5212)
* Fixed #4813

* Changed to lowercase for comparison
2019-04-26 20:25:55 -07:00
Yurong He
49c36cc040 Fixed #4947 Increase the size of the notebook command icons and space between icon and button (#5235) 2019-04-26 19:57:40 -07:00
Anthony Dresser
9b90400abd Merge from vscode 63d257f78a36951ab7e821170ba675b11dc06d48 (#5240) 2019-04-26 17:24:54 -07:00
Karl Burtram
8cda364210 Fix product name in error message (#5233)
* Fix product name in error message

* fix typo
2019-04-26 15:32:30 -07:00
Anthony Dresser
ca98ef879d Last of the layering (#5187)
* layer query

* update imports
2019-04-26 15:30:41 -07:00
Chris LaFreniere
bb9c85cd8f Improve Cleanup of Jupyter processes on Notebook and/or ADS Close (#5142)
* Close jupyter and python

* Ensure we stop jupyter correctly on process end

* dont stopServer from clientSession shutdown

* PR comments

* close notebook after each test
2019-04-26 15:28:26 -07:00
Anthony Dresser
91b946bf3d rework listbox to not require platform (#5192) 2019-04-26 15:02:03 -07:00
kisantia
64377000c6 Change default folder in dacpac and schema compare extensions (#5215)
* change default folder in dacpac and schema compare extensions

* move getting rootpath to a method

* change method name
2019-04-27 04:27:03 +12:00
Yurong He
23f4931a1d Fixed #5210 (#5211)
* Fixed #5210

* Added function to check error to ensure get string
2019-04-25 19:45:33 -07:00
Anthony Dresser
3625834028 Merge from vscode 63655183ba5305b70ffaf1327b8a4708f0a79bd9 (#5221) 2019-04-25 19:18:04 -07:00
Alan Ren
705e7b30bc enable 'New Notebook' entry points for PG SQL (#5194)
* enable 'New Notebook' entry points for PG SQL

* fix comment

* permanent fix for context menu entry point
2019-04-25 16:23:01 -07:00
udeeshagautam
6528c0817d quick fix for column name (#5214) 2019-04-25 15:51:54 -07:00
Aditya Bist
f3d7392af3 Localize connection options for mssql (#5209)
* localize connection options for mssql

* added all strings
2019-04-25 15:13:07 -07:00
Yurong He
5d5f44ba11 Fixed #3961 Removed hardcoded value in notebook and use QueryEditor Result RowHight to keep it consistent (#5208) 2019-04-25 12:59:13 -07:00
Anthony Dresser
34457880c7 Merge from vscode 0fde6619172c9f04c41f2e816479e432cc974b8b (#5199) 2019-04-24 22:26:02 -07:00
kisantia
d63f07d29a Schema compare publish (#5127)
* adding publish command
2019-04-25 10:37:40 +12:00
Anthony Dresser
5c2cbc9d29 fix task contribution (#5188) 2019-04-24 15:05:11 -07:00
Karl Burtram
8dbfa10646 Bump npm packages (#5185) 2019-04-24 14:55:36 -07:00
Anthony Dresser
9e804089e0 Move Tasks Panel (#5162)
* moving tasks panel

* fix css styling

* clean up code

* do a bunch of renaming
2019-04-24 13:54:58 -07:00
v-mdriml
51145903aa LOC CHECKIN | Microsoft/azuredatastudio master | 20190424 (#5179) 2019-04-24 13:46:50 -07:00
Anthony Dresser
036c49f398 Modify coverage process to reduce time (#5164)
* modify coverage process to reduce time

* disable coverage on linux
2019-04-24 13:39:30 -07:00
Chris LaFreniere
46b85ebc6b Fix casing on welcome page to be consistent (#5169) 2019-04-24 11:18:33 -07:00
Yurong He
07bc5e2de9 Fixed #3244 (#5166) 2019-04-24 09:04:15 -07:00
Yurong He
e822091907 Only when the connection is connected, then do disconnect. (#5170) 2019-04-23 19:03:49 -07:00
Chris LaFreniere
7d46e77922 Notebooks: Show keyboard shortcut for run cell (#5097)
* Show keyboard shortcut for run cell

* PR comment
2019-04-23 17:33:15 -07:00
Karl Burtram
ad528ad3d5 Update Azure Data Studio to 1.7.0 for May release (#5167) 2019-04-23 17:24:24 -07:00
Yurong He
e8b4c03770 Fixed #4772 create unique connection for each notebook by using noteb… (#5163)
* Fixed #4772 create unique connection for each notebook by using notebook path as uri
Disconnect sqlConnection from SqlKernel to ensure no connection left after the notebook is closed.

* SqlSessionManager is ADS level manager

* Moved path to SqlKernel constructor
2019-04-23 16:42:28 -07:00
Chris LaFreniere
33aacc1798 Improve Notebook Context Menu Casing (#5145) 2019-04-23 11:35:46 -07:00
Anthony Dresser
58959ef35e Enable stricter compile options on extensions (#5044)
* enable stricter compile settings in extensions

* more strict compile

* formatting

* formatting

* revert some changes

* formatting

* formatting
2019-04-23 11:18:00 -07:00
Charles Gagnon
c66b349cec Show correct release notes for command (#5108) 2019-04-23 10:55:09 -07:00
Aditya Bist
5c90df092b Switch on code coverage (#5120)
* testing code coverage for /sql/

* fix test result path

* fixed coverage task

* added coverage to linux and mac as well

* fix script for mac and linux

* bash script cmd

* fixed scripts

* remove warnings from summary

* remove coverage from linux

* added job action tests

* added more tests
2019-04-23 10:47:03 -07:00
Karl Burtram
32374f264f Adjust default fonts (#5146)
* Adjust default fonts

* Adjust default ratio
2019-04-22 17:30:16 -07:00
Anthony Dresser
5e62229f25 Merge from vscode 2b87545500dbc7899a493d69199aa4e061414ea0 (#5148) 2019-04-22 16:57:13 -07:00
Cory Rivera
1b24dff738 Show an Install Skipped message when Python already exists at the specified install location. (#5141)
* Also fixed a bug where future installs would get blocked after doing an install that gets skipped.
2019-04-22 14:28:59 -07:00
Aditya Bist
161135cd90 remove old unused splash screen (#5112) 2019-04-22 13:39:27 -07:00
Aditya Bist
5fb583da06 uncheck desktop icon by default (#5111) 2019-04-22 13:38:51 -07:00
Kevin Cunnane
8b40d20eab Typings reference failing in external extension (#5129)
This fixes an issue where if you copy azdata.proposed.d.ts to
an extension project, it would fail to compile using default
tsconfig.json settings due to missing return type
2019-04-19 16:41:23 -07:00
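
Under a default tsconfig (noImplicitAny enabled), an ambient declaration that omits its return type fails with TS7010. A hypothetical before/after of the kind of declaration the fix annotates; the name is illustrative, not the actual azdata API:

```ts
// Illustrative .d.ts excerpt (not the actual azdata API). With "noImplicitAny" on,
// a signature that omits its return type fails to compile in consuming extension projects:
declare namespace example {
	// export function registerSomething(id: string);      // error TS7010: implicitly has an 'any' return type
	export function registerSomething(id: string): void;   // fix: explicit return type
}
```
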
Chris LaFreniere
432034d2cb Notebooks: Pressing Escape key in markdown cell exits edit mode (#5094)
* Pressing Escape key in markdown exits edit mode

* text + code cells inactive when pressing escape
2019-04-19 16:19:16 -07:00
Alan Ren
0e168e36fc Merge from vscode 12cb89c82e88e035f4ab630f1b9fcebac338dc03 (#5125) 2019-04-19 10:26:20 -07:00
Cory Rivera
f248260584 Add flag to skip python install integration tests. (#5119) 2019-04-18 17:13:19 -07:00
Cory Rivera
880e3e10da Update Python install messages to differentiate between download completion and install completion. (#5114) 2019-04-18 16:52:50 -07:00
Anthony Dresser
f33b95ee82 Use vscode clipboard rather than clipboardy (#5117)
* remove clipboardy and use vscode's clipboard apis instead

* update lock
2019-04-18 16:34:24 -07:00
Anthony Dresser
4a71eb9b90 use vscode open external rather than opener (#5115) 2019-04-18 16:34:12 -07:00
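
These two adjacent commits replace third-party modules (clipboardy, opener) with built-in VS Code APIs. Roughly, the replacements look like this; the copied text and URL are just examples:

```ts
import * as vscode from 'vscode';

async function copyAndOpen(): Promise<void> {
	// Clipboard access without clipboardy:
	await vscode.env.clipboard.writeText('SELECT 1');
	const copied = await vscode.env.clipboard.readText();
	console.log(copied);

	// Open a URL in the default browser without opener:
	await vscode.env.openExternal(vscode.Uri.parse('https://github.com/microsoft/azuredatastudio'));
}
```
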
Chris LaFreniere
3372a5ad4b Fix hygiene issues in textCell.css (#5116) 2019-04-18 15:58:27 -07:00
Zbyněk Sailer
8326f05f66 LOC CHECKIN | Microsoft/azuredatastudio master | 20190418 (#5103) 2019-04-18 14:59:25 -07:00
Chris LaFreniere
7ce921d449 Improve table markdown css (#5093) 2019-04-18 14:57:26 -07:00
Chris LaFreniere
31f7364f08 Only show nb loading spinner when no cells shown (#5095) 2019-04-18 14:57:11 -07:00
Anthony Dresser
5119d28b9d fix svg references (#5109) 2019-04-18 12:29:18 -07:00
Yurong He
a2a5fe3bee Added more logging for python installation and remove dup test in the other suite (#5106) 2019-04-18 11:24:12 -07:00
Karl Burtram
6222d8c977 Bump agent extension to 0.38.0 (#5104) 2019-04-18 09:48:47 -07:00
Karl Burtram
d3699a261a Update readme for April release (#5063) 2019-04-18 09:25:01 -07:00
Anthony Dresser
9c0e56d640 Layering of everything else but query (#5085)
* layer profiler and edit data

* relayering everything but query

* fix css import

* readd qp

* fix script src

* fix hygiene
2019-04-18 01:28:43 -07:00
Anthony Dresser
ddd89fc52a Renable Strict TSLint (#5018)
* removes more builder references

* remove builder from profiler

* formatting

* fix profiler dialog

* remove builder from oauth dialog

* remove the rest of builder references

* formatting

* add more strict null checks to base

* enable strict tslint rules

* fix formatting

* fix compile error

* fix the rest of the hygiene issues and add pipeline step

* fix pipeline files
2019-04-18 00:34:53 -07:00
Alan Ren
b852f032d3 Merge from vscode 3a6dcb42008d509900b3a3b2d695564eeb4dbdac (#5098) 2019-04-17 23:38:44 -07:00
Aditya Bist
1fec26c6b3 dont dispose serverTreeView every time hidden (#5092) 2019-04-17 20:12:44 -07:00
kisantia
d3483afaed Schema Compare extension (#4974)
* extension now working

* fix diff editor title disappearing and remove border from source and target name boxes

* redoing a bunch of stuff that disappeared after rebasing

* add images and add to extensions.ts

* moving a few changes to the right place after rebase

* formatting

* update toolbar svgs

* addressing comments

* add return types

* Adding PR comments

* Adding light and dark theme icons

* Fixing the diff editor title for dark theme
2019-04-17 19:14:22 -07:00
Chris LaFreniere
910e4815fa Enable Python Integration Tests For Notebooks (#5090)
* Enable python notebook tests

* change run_python3_test to be string

* Add explicit pyspark check
2019-04-17 16:46:10 -07:00
Karl Burtram
ef118e3351 Update extension recommendation list (#5086) 2019-04-17 16:26:41 -07:00
Chris LaFreniere
2beedb10d4 Add run cells as a command (#5080) 2019-04-17 14:26:31 -07:00
Charles Gagnon
41bf10d989 Fix for build break caused by missing SVG (#5078)
* Fix for build break caused by missing SVG

These files were moved in #5029

* Add a few more missing SVGs
2019-04-17 11:22:42 -07:00
Chris LaFreniere
ac3b6aef27 ADS Welcome Page (#5043)
* ADS Welcome Page

* sample notebook section

* Opens in browser :(

* Remove sample notebooks section

* fix open file in windows

* Change az_data_welcome_page to be under sql/

* fix tslint issue

* Scope table css down
2019-04-17 10:50:43 -07:00
Anthony Dresser
8956b591f7 Merge from vscode 05fc61ffb1aee9fd19173c32113daed079f9b7bd (#5074)
* Merge from vscode 05fc61ffb1aee9fd19173c32113daed079f9b7bd

* fix tests
2019-04-16 22:11:30 -07:00
Anthony Dresser
2f8519cb6b Layer Object Explorer; query plan; task history (#5030)
* relayer query plan, task history, object-explorer

* formatting
2019-04-16 21:12:34 -07:00
Aditya Bist
15a19c044d remove agent from old apis (#5064) 2019-04-16 17:13:02 -07:00
Alan Ren
ec47ff7479 Alanren/fixsmoketest (#5019)
* fix the smoke test

* update readme

* fix the selector for server name input

* add new property to server profile engineType
2019-04-16 16:43:11 -07:00
Aditya Bist
82f707ee89 Fixed agent filter in dark theme (#5048)
* fixed agent filter in dark theme

* added high contrast theme as well
2019-04-16 15:30:16 -07:00
Alan Ren
dfcab8db6a fix the selectbox issue for chart (#5052) 2019-04-16 13:32:52 -07:00
Anthony Dresser
5c10127758 Layer grid code (#5029)
* layer grid

* errors; edit data still not showing up

* fix edit data

* fix tab spaces
2019-04-16 13:30:15 -07:00
Aditya Bist
b376f36733 fix job action context (#5053) 2019-04-16 13:07:22 -07:00
Aditya Bist
96c0f62cf5 Removed duplicate connections code (#5045)
* removed duplicate connections code

* removed old comment

* removed unused code
2019-04-16 11:54:07 -07:00
Zbyněk Sailer
a96f996b59 LOC CHECKIN | Microsoft/azuredatastudio master | 20190411 (#4993) 2019-04-16 10:36:41 -07:00
Anthony Dresser
b75d0b6cb5 Fix bootstrapping around dashboard (#5040) 2019-04-15 23:49:40 -07:00
Anthony Dresser
a5bc65fbfb Merge from vscode 8b5ebbb1b8f6b2127bbbd551ac10cc080482d5b4 (#5041) 2019-04-15 20:37:22 -07:00
Gene Lee
dcdbc95ae7 Fixed bug: CheckboxTreeNode label overflows, and node icon disappears (#5022) 2019-04-15 13:44:49 -07:00
Anthony Dresser
72e7e5e025 relayer connection code; formatting (#5020) 2019-04-15 01:05:23 -07:00
Anthony Dresser
57242a2e13 Merge from vscode 3d67364fbfcf676d93be64f949e9b33e7f1b969e (#5028) 2019-04-14 22:29:14 -07:00
Anthony Dresser
6dbf757385 Remove some vscode differences (#5010)
* remove some vscode differences

* add dates to todo comments
2019-04-12 21:55:07 -07:00
Karl Burtram
c5a32d8373 Show user email address in account picker (#5015)
* Show user email address in account picker

* Fix build break and remove Azure account from sqlops namespace
2019-04-12 17:23:48 -07:00
Charles Gagnon
34288435ec Fix Data Explorer context menu items visibility (#4996)
* Fix Data Explorer context menu items visibility

The when clause was making the menu items show up for all nodes in the Data Explorer - even ones where they didn't make sense, such as the Databases folders. This change makes them appear only for the Database and Server nodes (which is how the OE tree is set up)
2019-04-12 17:09:53 -07:00
Yurong He
177b48c3f2 Revert #4955 to unblock integration test (#5016) 2019-04-12 14:08:38 -07:00
Anthony Dresser
642f5d4405 replace nulls with undefined (#5014) 2019-04-12 13:08:32 -07:00
Anthony Dresser
92b1c59e48 Strict null check on full "base" code (#4973)
* removes more builder references

* remove builder from profiler

* formatting

* fix profiler dialog

* remove builder from oauth dialog

* remove the rest of builder references

* formatting

* add more strict null checks to base
2019-04-12 12:59:58 -07:00
Anthony Dresser
cb1682542b remove sql/services (#4991) 2019-04-12 12:48:06 -07:00
Anthony Dresser
9e56187c16 Remove sql/common (#4990)
* remove sql/common/

* formatting

* fix cyclic dependency
2019-04-12 12:47:48 -07:00
Yurong He
51851efda5 Use notebookUtils.getErrorMessage to get the correct msg instead of showing {0} (#5012) 2019-04-12 11:50:44 -07:00
Charles Gagnon
11e4b743e0 Fix wrong release notes being loaded (#5008) 2019-04-11 16:56:55 -07:00
Kevin Cunnane
3349151d4c Touchbar icon support in notebooks (#4998)
* Touchbar icon support in notebooks
- updated shortcut keys to only work if notebook is active
- Added icons
- Now have 1 "add cell" icon that prompts for code/text.
This is useful as there wasn't an icon to differentiate
2019-04-11 16:43:27 -07:00
Kevin Cunnane
c8f6937166 Fix #4500 Untitled notebook reopen doesn't show dirty (#5005)
* Fix 2 notebook issues
- Do not create notebook model twice on start
- Do not cause disposed warnings due to markdown cell deserialization

* Fix notebook dirty on open issue
Before model is resolved we weren't getting dirty events.
Solution is to use backing text model until it's ready.
Must hook to the dirty event & notify to get the dot to appear
2019-04-11 16:39:54 -07:00
Maddy
ad36c1df3d add wrap to the <pre> tag (#5002)
* add wrap to the <pre> tag

* removed styles for browser support
2019-04-11 15:32:52 -07:00
Anthony Dresser
a5b8924e2d restore line height to account list renderer (#4999) 2019-04-11 15:10:38 -07:00
Anthony Dresser
bc898cc2c2 revert data explorer id to connections (#5003) 2019-04-11 15:10:26 -07:00
kisantia
1247b6e8eb bump SQL Tools to 1.5.0-alpha.85 to get invalid dacpac version fix (#5001) 2019-04-11 13:53:45 -07:00
udeeshagautam
2111c3de1a Fix for 4104 : Multiple consecutive spaces in query results cells are condensed into one (#4983)
* preserving spaces in query results - all beginning, trailing and middle spaces will be shown as is

* removing the change through formatting and replacing it with a CSS change; the formatting approach was leaving a special character behind while removing nbsp
2019-04-11 13:45:10 -07:00
Aditya Bist
bcea1b66be Added some usage details (#4711)
* added some details

* remove unused import

* added new metrics, removed churn

* merged master and code review comments

* code review comments

* normalized days to calendar days/weeks/months

* cleaned up code

* changed comment to start required check for PR

* fix failing test

* fix test

* removed null assignment

* fix null test script
2019-04-11 12:07:20 -07:00
Yurong He
4f8d14ed3e Fix #3479 ctrl+a select active cell output or preview markdown (#4981)
* Enable ctrl+a to select the output or markdown content when the cell is active

* Moved toggleUserSelect into ngOnChanges

* Resolve PR comments
2019-04-11 11:36:42 -07:00
Chris LaFreniere
442adfbbc3 Add New Notebook from Server Dashboard (#4971) 2019-04-11 10:15:45 -07:00
Aditya Bist
fe12233954 Azure extension changes (#4987)
* removed search box

* removed commented code
2019-04-11 09:44:43 -07:00
Anthony Dresser
c725f6f572 fix html structure of add account dialog (#4988) 2019-04-10 17:41:18 -07:00
Cory Rivera
1870d83081 Add additional error handling to Python installation for Notebooks (#4891)
* Also enabled integration tests for python installation.
2019-04-10 17:09:28 -07:00
Anthony Dresser
8315dacda4 Merge from vscode 31e03b8ffbb218a87e3941f2b63a249f061fe0e4 (#4986) 2019-04-10 16:29:23 -07:00
kisantia
18c54f41bd remove unnecessary message that connection is needed (#4982) 2019-04-10 16:22:18 -07:00
Kevin Cunnane
293f9c22c4 Touchbar support for Run Cell, Run Query, Add Cell (#4972)
* Basic touchbar support

* Touchbar support for Run Cell, Run Query, Add Cell
- Add top 3 notebook commands
- Add top query command

Actions only appear on having active editor of expected type.
In order to make Notebook work as expected, added tabindex to support focusing
and hide outline to ensure it doesn't get weird blue outline on click
Note: does not have icons yet, which would be nice.
However can add in subsequent PR once this comes from UX.
2019-04-10 14:04:36 -07:00
Anthony Dresser
a74510544f Remove rest of builder (#4963)
* removes more builder references

* remove builder from profiler

* formatting

* fix profiler dialog

* remove builder from oauth dialog

* remove the rest of builder references

* formatting
2019-04-10 13:23:33 -07:00
Aditya Bist
9b053c50c2 Azure - icon change (#4967)
* changed icon and string

* changed add to servers icon
2019-04-10 13:20:43 -07:00
Chris LaFreniere
88712f46bf Fix for relative markdown image paths (#4889)
* Fix for relative markdown image paths

* PR comments
2019-04-10 11:35:06 -07:00
Chris LaFreniere
d6df20b0e8 Notebooks: Potential Fix for "Notebook Provider does not Exist" Error (#4848)
* Fallback to SQL

* Fix providers not found issue

* await whenInstalledExtensionsRegistered

* PR comments
2019-04-10 11:34:46 -07:00
Chris LaFreniere
5dc37f7557 improve assert message (#4968) 2019-04-10 10:14:44 -07:00
Gene Lee
445d306586 Fixed Broken Notebook Preview (#4895) 2019-04-09 18:47:24 -07:00
Chris LaFreniere
d332ae1132 Fix to ensure that we rewrite spark ui links correctly (#4962) 2019-04-09 18:23:21 -07:00
Kevin Cunnane
37f45b10a3 Mitigate (but not fully fix) Run Cell from disconnected notebook (#4960)
This is a partial fix that lays groundwork for full "Prompt to connect" if a kernel needs a connection.
I am waiting on Yurong's refactoring of connection handling before doing any of the prompt work.

- Adds kernel metadata about whether a connection is required.
- For Jupyter, only Spark kernels are listed as requiring a connection
- If this is true and there's no active connection, will show notification and not call execute

In the future, this path will still be used if user is prompted to connect and cancels out.
The future change will be to inject a "connect" handler from notebook.component into the cell callback and use it to set the connection context
2019-04-09 17:45:05 -07:00
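
A bare-bones sketch of the gating logic described above; the names are hypothetical, and the real check lives in the notebook cell execution path:

```ts
// Hypothetical sketch: skip execution and notify when a kernel needs a connection and none is active.
interface KernelMetadata { displayName: string; requiresConnection: boolean; }

async function runCell(
	kernel: KernelMetadata,
	hasActiveConnection: boolean,
	execute: () => Promise<void>,
	showNotification: (message: string) => void
): Promise<void> {
	if (kernel.requiresConnection && !hasActiveConnection) {
		showNotification(`The ${kernel.displayName} kernel requires a connection. Attach to a server and run again.`);
		return; // future work: prompt to connect here instead of bailing out
	}
	await execute();
}
```
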
kisantia
d9b6ec0654 Add title for DiffEditor and fix SplitViewContainer (#4961)
* add option to have diff editor title

* fix component being undefined and splitter not showing
2019-04-09 16:41:32 -07:00
kisantia
f98428aea5 Add reverse color option to text diff editor (#4826)
* Add reverse color option to text diff editor
2019-04-09 16:26:41 -07:00
Gene Lee
b3be1d79cd Add support for new endpoint key string 'gateway' (#4954) 2019-04-09 15:19:02 -07:00
Yurong He
ea8f885f05 Call getCurrentGlobalConnection like New Query to get the active connection context (#4955) 2019-04-09 15:15:36 -07:00
Kevin Cunnane
2de47c2a50 Fix #4930 Text cells are referred to as text and markdown in commands (#4956) 2019-04-09 14:20:29 -07:00
Kevin Cunnane
30b8e105f9 Fix #4893 New Notebook Can Open Existing Notebook (#4959)
Add back check for textDocuments with same name, should've been there anyhow

On rehydration, files show as text docs before clicking since they only get
changed by the customInputConverter code path.
We should look at this long term - ideally we'd update notebookDocuments
with correct values on initial start. #4958 opened to track this.
2019-04-09 14:11:10 -07:00
Kevin Cunnane
2a44fab8ba Fix issue where mac launch often failed to attach (#4957)
- Timeout now matches the windows timeout value
2019-04-09 14:10:56 -07:00
Chris LaFreniere
6a06a99e46 Change pfs.rimraf call in insightsutils test (#4921) 2019-04-09 13:05:36 -07:00
Anthony Dresser
3670dfbebd changes strings for data explorer (#4946) 2019-04-09 12:39:15 -07:00
kisantia
daf929ecc7 add splitViewContainer and diffEditor to ModelBuilder interface (#4950) 2019-04-09 11:45:21 -07:00
Anthony Dresser
f96a17c930 Remove unused code (#4873)
* removes unused features

* remove more unused code; formatting

* lock changes

* fix run issue
2019-04-09 02:00:51 -07:00
Anthony Dresser
2fb06e7f4f Move notebooks to workbench (#4888)
* move notebooks under workbench

* fix style imports
2019-04-09 01:59:55 -07:00
Anthony Dresser
4ece9b0085 remove dispose of button listeners (#4914) 2019-04-09 01:08:56 -07:00
Anthony Dresser
a4bd31e96a Removes Builder references from modal (#4869)
* remove builder from modal

* add more DOM methods
2019-04-09 00:28:52 -07:00
Anthony Dresser
8bdcc3267a Code Layering dashboard (#4883)
* move dashboard to workbench

* revert xlf file changes

* 💄

* 💄

* add back removed functions
2019-04-09 00:26:57 -07:00
Takahito Yamatoya
9e9164c4ee fix the format (#4899) 2019-04-08 19:56:44 -07:00
Alan Ren
5dc6a39652 Update readme.md (#4907) 2019-04-08 16:25:53 -07:00
kisantia
ada0966832 Add toolbar separator (#4890)
* Add option to add toolbar separator after toolbar component
2019-04-08 15:27:18 -07:00
Alan Ren
e6faef27ab Alanren/integration setup (#4871)
* automate the setup and use akv to store values

* update readme.md

* get rid of the save to file part

* update readme

* add more messages

* fix the error

* fix some errors

* fix the readme
2019-04-08 15:11:38 -07:00
Anthony Dresser
acc27d0829 Code Layering Accounts (#4882)
* code layering for accounts

* formatting

* formatting

* formatting
2019-04-08 14:45:30 -07:00
Anthony Dresser
ab54f7bb45 remove builder references from some componnets (#4868) 2019-04-08 14:24:03 -07:00
Yurong He
88161cc37d Fixed cancel connectionDialog from attach to shows dup "select connection" (#4865) 2019-04-08 14:06:46 -07:00
Anthony Dresser
0975e6834e remove builder from taskswidget (#4866) 2019-04-08 13:28:23 -07:00
Anthony Dresser
22ec1d5f0a Strict Null Checks, Builder Removal, Formatting (#4849)
* remove builder; more null checks

* null checks

* formatting

* wip

* fix dropdown themeing

* formatting

* formatting

* fix tests

* update what files are checked

* add code to help refresh bad nodes
2019-04-08 13:27:41 -07:00
udeeshagautam
01784dd186 Adding Diff view and Split view container as Model View Editor Components (#4831)
* intial code for diff view inside model view

* Adding basic Split View Container depending on Flex Layout

* Enabled resizing between top and bottom view

* cleaning up some of the sqlops references

* Adding height as per CR comment
2019-04-08 11:11:14 -07:00
Karl Burtram
02cf91c158 Remove broken VS Code test from merge (#4887) 2019-04-05 15:06:11 -07:00
Karl Burtram
0532346f4f Merge from vscode 591842cc4b71958c81947b254924a215fe3edcbd (#4886) 2019-04-05 14:14:26 -07:00
Chris LaFreniere
657adafb7d always serialize execution count (#4864) 2019-04-05 11:34:10 -07:00
Anthony Dresser
818c0789ea Reduce packaged size of extensions (#4300)
* reduce output of notebooks

* reduce size of azurecore

* update mssql extensions to be webpacked

* formatting

* remove commented code

* fix packaged config

* fix mssql packing
2019-04-05 10:48:00 -07:00
Karl Burtram
cb5bcf2248 Merge from vscode 2b0b9136329c181a9e381463a1f7dc3a2d105a34 (#4880) 2019-04-05 10:09:18 -07:00
Anthony Dresser
9bd7e30d18 revert vscode cahnges (#4879) 2019-04-05 09:15:38 -07:00
Aditya Bist
572010ded1 fixed azure extension issues (#4859) 2019-04-04 15:32:04 -07:00
David Shiflet
1f22326e78 Reuse existing saved connection that matches args (#4839)
* Reuse existing saved connection that matches args

* search subgroups for matches
2019-04-04 11:11:18 -07:00
Yurong He
504d5c91bc Fixed #4800 need to use ConnectionProfile in order to get the correct… (#4812)
* Fixed #4800 need to use ConnectionProfile in order to get the correct connection

* Go back to creating the connection in run cell, to avoid failing to run the cell or closing a connection used by others.
2019-04-03 19:00:56 -07:00
Chris LaFreniere
a34692b6f2 Fix for dispose method of undefined (#4843) 2019-04-03 18:08:42 -07:00
Karl Burtram
73b5d23210 Update bug template labels (#4840) 2019-04-03 16:47:55 -07:00
Anthony Dresser
e31de8b137 fixed more null references (#4841) 2019-04-03 16:30:58 -07:00
Anthony Dresser
cef5bbb2be Strict null pass on some base ui files (#4832)
* more strict null checks in base browser code

* revert changes to radiobutton

* fix some more minor things, enable strict null check in pipelines

* formatting

* fix compile errors

* make null undefined

* more null to undefined
2019-04-03 16:18:33 -07:00
Anthony Dresser
80a8e1a4da More builder removal (#4810)
* remove more builder references

* formatting
2019-04-03 16:18:03 -07:00
Anthony Dresser
fcb8fe50fe Advanced Description On Bottom (#4836)
* move description to the bottom

* formatting
2019-04-03 15:49:53 -07:00
Karl Burtram
5235c8aad6 Turn-off classifier bot while doing label refactoring (#4834) 2019-04-03 13:59:48 -07:00
Karl Burtram
5b67525211 Bump SQL Tools to 1.5.0-alpha.84 (#4830) 2019-04-03 11:43:43 -07:00
Anthony Dresser
fb697729c0 Data Explorer Icons (#4806)
* inital icon support

* add necessary classes for icons

* initialize icon to blank string
2019-04-03 11:10:48 -07:00
Chris LaFreniere
76fe0fef49 Fix broken test merge conflicts (#4823) 2019-04-03 10:44:01 -07:00
Karl Burtram
6295d03801 Update XLF files (#4811) 2019-04-02 17:35:18 -07:00
Chris LaFreniere
07166fb3cd Run All Cells Notebook Implementation (#4713)
* runAllCells API

* add comment

* more run cells fixes

* Add integration test

* Add multiple cell SQL notebook test

* Comment out python tests as they fail in the lab

* remove unused imports

* PR comments

* Remove localize

* Return true instead of promise.resolve(true)
2019-04-02 16:47:00 -07:00
Yurong He
219dfe66d0 Make connection for new Notebook (#4770)
* Make connection for new Notebook

* Resolve merge issue

* Use connectionUri to track the connections that need to be disconnected

* Removed debugging log
2019-04-02 15:52:01 -07:00
Yurong He
22c62fb524 Added setup and teardown for test; add variable to control run python… (#4782)
* Added setup and teardown for test; add variable to control run python/pyspark tests; remove dup code in clearAlloutput tests

* Resolve PR comments
2019-04-02 15:16:54 -07:00
Karl Burtram
f8706abebe Merge from vscode b8c2e7108b3cae7aa2782112da654bedd8bb3a52 (#4808) 2019-04-02 14:35:06 -07:00
Alan Ren
e83a6f9c2e profile page and summary page (#4769)
* add cluster name to page

* implement profile page -1

* fix compilation error due to new method

* profile page 0328

* summary page

* make divcontainer accessible

* handle disposable

* add support for "coming soon" cards
2019-04-02 13:52:39 -07:00
Anthony Dresser
63485c8c78 Remove builder references from options dialog (#4774)
* remove more builder references; remove $ from declarations

* fix jquery references

* formatting

* fixing backup

* fix backup box
2019-04-02 13:49:50 -07:00
Karl Burtram
72ef024678 Check for null ref in query statusbar timer (#4804) 2019-04-02 11:56:45 -07:00
Chris LaFreniere
414c736655 Allow output area to be selectable again (#4714) 2019-04-01 23:09:00 -07:00
Yurong He
fdbfbb9238 Change attach to show user name. Use connectionProfile.title to displ… (#4794)
* Change attach to show user name. Use connectionProfile.title to display the same info as OE server node

* Fixed some potential profile leak

* Removed unused import

* Resolve PR comments
2019-04-01 21:18:49 -07:00
Alan Ren
a766e5d334 fix error for download extension (#4793) 2019-04-01 16:44:15 -07:00
Yurong He
2faf01eb9d Add MouseWheelSupport , AdditionalKeyBindings and AutoColumnSize plugins to sql notebook grid (#4790) 2019-04-01 12:44:40 -07:00
Yurong He
a4c2463b2f Add azure and standalone instance config to environment variables (#4745)
* change to fs to write file

* change random range to 1 to 100

* Added more env variables check as well

* Add azure and standalone env variables

* Add Azure instance to unblock mac pipeline testing

* Fixed some merge issue and improve delete file

* Added standalone test only run on windows
2019-03-29 20:14:44 -07:00
Cory Rivera
ddbd8033f9 Remove an outdated showErrorMessage assert from Jupyter Server Manager unit tests. (#4777) 2019-03-29 15:18:01 -07:00
Yurong He
6da66cf367 change to fs to write file (#4736)
* change to fs to write file

* Added more env variables check as well

* Resolve PR comments

* Merge master

* Add file name exist check and delete it after test is done
2019-03-29 15:04:11 -07:00
Cory Rivera
c7bc37d010 Add quotes around python paths to guard against spaces. (#4775) 2019-03-29 14:55:08 -07:00
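
Quoting matters because paths like C:\Program Files\Python get split on the space by the shell. A minimal illustration of the two usual remedies (the python path is just an example): quote the path when going through exec, or avoid the shell entirely with execFile.

```ts
import { exec, execFile } from 'child_process';

const pythonPath = 'C:\\Program Files\\Python37\\python.exe'; // illustrative path containing a space

// Option 1: keep exec (shell involved) but quote the executable path.
exec(`"${pythonPath}" --version`, (error, stdout) => {
	if (error) { console.error(error); return; }
	console.log(stdout.trim());
});

// Option 2: skip the shell so no quoting is needed at all.
execFile(pythonPath, ['--version'], (error, stdout) => {
	if (error) { console.error(error); return; }
	console.log(stdout.trim());
});
```
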
Charles Gagnon
e0ec3c5035 Fix workspaceRoot macro for insights (#4686)
* Fix workspaceRoot macro for insights

The workspaceRoot macro wasn't working correctly for finding the queryFile. There were a couple of issues :

1. The path separators were hardcoded as / which wasn't xplat-compatible
2. They required that the first section of the path be one of the folders in the workspace - e.g. if the workspace contained a folder named foo you'd have to specify ${workspaceRoot}\foo\myfile.sql. This is inconsistent with the folder logic, which just appends the path after ${workspaceRoot} to the folder that's currently open

I changed the logic to just append the relative part of the path to every folder currently open in the workspace and choose the first one that it found that contained the file we were told to look for - which follows the convention the folder logic uses. If the file doesn't exist it'll just fall back to using the path without the macro (which is likely to not resolve and thus will display an error, but there's nothing we can do at that point anyways)

* Switch to using VS Code resolver (support for more than just workspaceRoot) and move resolution code into helper method so it can be used by the multiple places it's called. Added tests for the methods.

* Add test for invalid param

* Change resolveQueryFilePath to be a standalone exported function. Change it to throw if the file can't be resolved/found so callers can display the error correctly. Added more tests to cover new scenarios. Switch to using pfs instead of fs for file existence checks.

* Add extra param to InsightsDialogController construction in test

* Fix formatting and test errors.

* Change to suiteSetup and suiteTeardown so the setup/teardown is only run once instead of once per test - we don't need unique files and this stops a race condition error with deleting the test folder.

* spaces -> tabs
2019-03-29 14:53:49 -07:00
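
A compact sketch of the resolution strategy described in the first bullet: try the relative part of the path against every open workspace folder and take the first hit, otherwise fall back to the raw path. The function name is hypothetical, and the shipped code also goes through the VS Code variable resolver.

```ts
import * as fs from 'fs';
import * as path from 'path';
import * as vscode from 'vscode';

// Hypothetical sketch of ${workspaceRoot} resolution for insight query files.
function resolveWorkspaceRootPath(configuredPath: string): string {
	const macro = '${workspaceRoot}';
	if (!configuredPath.startsWith(macro)) {
		return configuredPath;
	}
	const relativePart = configuredPath.slice(macro.length).replace(/^[\\/]+/, '');
	for (const folder of vscode.workspace.workspaceFolders ?? []) {
		// path.join normalizes separators, so the config value works on every platform.
		const candidate = path.join(folder.uri.fsPath, relativePart);
		if (fs.existsSync(candidate)) {
			return candidate; // first folder that actually contains the file wins
		}
	}
	// No folder contained the file: fall back to the unresolved path so the caller can surface an error.
	return configuredPath;
}
```
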
Aditya Bist
d4f287298f Agent - committer work (#4758)
* fix delete job

* added the ability to change and retrieve jobowner

* fixed UX for delete step

* improved operator actions

* fixed operators and proxies

* added errors for failures
2019-03-29 13:46:52 -07:00
Aditya Bist
e70d5838a8 Agent - stop job (#4410)
* stop job behavior similar to ssms

* removed agent from sqlops

* fix couple UX issues

* let sqlops remain unchanged
2019-03-29 13:42:34 -07:00
Aditya Bist
b04ca0fdbd Agent - refresh refactor (#4773)
* refactored refresh views

* removed refresh from jobs view
2019-03-29 13:42:02 -07:00
Anthony Dresser
a064da642d Merge from vscode f5044f0910e4aa7e7e06cb509781f3d56e729959 (#4759) 2019-03-29 10:54:38 -07:00
Charles Gagnon
37ce37979a Fix broken notebook test (#4766)
* Fix broken test

The test would always fail if it was doing the right thing - since it was asserting that there WEREN'T any cells with outputs (the opposite of what it should have been checking).

* Update error message to be clearer

* Fix spelling error
2019-03-29 10:25:52 -07:00
Karl Burtram
cb3cbd0d78 Revert Messages Collapse behavior (#4757)
* Revert "change sizing behavior to allow the messages to fulling collapse down (set results to have no max height) (#4313)"

This reverts commit 7de294a58e.

* Revert "Grid scrolling bugs (#4396)"

This reverts commit ace6012c1c.
2019-03-28 18:09:58 -07:00
Raj
1415aa1c03 #4586: Clear Results feature in Notebooks (#4705)
* #4586: Clear all outputs feature

* text change

* Adding extensible method with integration tests

* Misc change

* Misc change

* Adding more logging

* Change to test

* Adding outputs condition in integration tests
2019-03-28 15:20:28 -07:00
Karl Burtram
7eb17f6abc Add Query Editor null checks (#4753) 2019-03-28 15:18:44 -07:00
Charles Gagnon
fdb471d506 Windows extension fixes/improvements (#4740)
* Windows extension fixes/improvements

Remove error message when activating extension on non-win platforms (instead just don't do anything). Add hooks to kill child processes on exit if they haven't ended already. Formatting fixes.

* Remove shelljs

* Remove error messages from telemetry until I can follow up on whether we're allowed to send these up. Fix some issues with the exec callback - the child_process exec function returns an ExecException object instead of the code directly like shelljs did, so check for that correctly.
2019-03-28 14:58:18 -07:00
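
The shelljs-to-child_process gotcha called out in the last bullet: exec's callback hands back an ExecException (or null) rather than a numeric exit code, so the code has to be read off the error object. Roughly (the command here is only an example):

```ts
import { exec, ExecException } from 'child_process';

// shelljs.exec returned the exit code directly; child_process.exec reports failure
// through an ExecException in the callback, so the exit code lives on error.code.
exec('sqlcmd -?', (error: ExecException | null, stdout: string, stderr: string) => {
	if (error) {
		console.error(`command failed with exit code ${error.code ?? 'unknown'}: ${stderr}`);
		return;
	}
	console.log(stdout);
});
```
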
Anthony Dresser
e6785ffe95 Merge from vscode de81ccf04849309f843db21130c806a5783678f7 (#4738) 2019-03-28 13:06:16 -07:00
Karl Burtram
cc2951265e Add query execution plan extensibility APIs (#4072)
* WIP 1

* WIP 2

* Fix typos

* Iterate on API a bit

* Query Tab WIP

* More dynamic query tab impl

* Fix merge breaks

* Update interfaces

* Update to single event handler for query events

* Remove query plan extension

* Add generated JS file
2019-03-28 10:59:02 -07:00
Chris LaFreniere
ee413f3b24 Ensure only one cell shows as running at a time (#4715) 2019-03-27 19:04:07 -07:00
Karl Burtram
fc664a850d Fix query plan styles to correct rendering (#4739) 2019-03-27 18:00:33 -07:00
Raj
55efe76b2e #3585: Canceled message when changing kernel (#4673)
* #3585: Canceled message when changing kernel

* Indentation

* Cleanup
2019-03-27 15:43:27 -07:00
Raj
dd8922ce4d #4511: Streamline untitled notebook count (#4727) 2019-03-27 15:29:45 -07:00
Yurong He
102b48c302 Change standalone integration test to use sql login (#4728) 2019-03-27 15:09:45 -07:00
David Shiflet
a360bebd9d Add status messages during command line processing (#4725)
* add status bar messages

* missed semicolon
2019-03-27 15:44:57 -04:00
Cory Rivera
37ab493b78 Use getErrorMessage to get exception messages in python installation code. (#4730) 2019-03-27 12:14:42 -07:00
Anthony Dresser
46b7afe558 Merge from vscode 3bd60b2ba753e7fe39b42f99184bc6c5881d3551 (#4712) 2019-03-27 11:36:01 -07:00
Chris LaFreniere
eac3420583 Minimum grid height set when grid returns 0 rows (#4716) 2019-03-27 10:57:48 -07:00
Chris LaFreniere
e1e9c08242 Disable Python 3 notebook integration test due to timeout (#4726)
* Disable Python 3 notebook integration test due to timeout

* Remove unused imports
2019-03-27 10:18:42 -07:00
Chris LaFreniere
5ac6cf3b74 Ensure SQL is the first kernel in the kernels dropdown (#4692)
* Ensure SQL is the first kernel shown in the dropdown

* cleanup to prevent sql from registering twice
2019-03-26 19:39:28 -07:00
Yurong He
d6a58136da Add log to help debug failure in the lab (#4706) 2019-03-26 11:55:14 -07:00
Alan Ren
819b7b93d1 render xml properly in the grid (#4674)
* render xml properly in the grid

* fix html injection

* fix tslint error

* add comments and fix an issue

* fix comment
2019-03-26 11:48:55 -07:00
Anthony Dresser
bceeda1cfd Remove dev flags (#4707)
* remove more dev flags

* formatting
2019-03-26 11:44:59 -07:00
Anthony Dresser
0d8ef9583b Merge from vscode 966b87dd4013be1a9c06e2b8334522ec61905cc2 (#4696) 2019-03-26 11:43:38 -07:00
Yurong He
b1393ae615 Disable pySpark3 notebook test to unblock insider build (#4695)
* Disable pySpark3 notebook test to unblock insider build

* Removed unused import
2019-03-25 17:36:49 -07:00
Alan Ren
464109313b fix context menu test due to new SSMSMin feature (#4693) 2019-03-25 16:49:12 -07:00
Raj
83c8baf8e3 #4418: Notebook file icon doesn't show for new notebook (#4676)
* #4418: Notebook icon doesn't load for untitled

* Indentation

* Indentation
2019-03-25 16:05:04 -07:00
Chris LaFreniere
1bac929ab3 Fix for SQL Kernel only showing up (#4691)
* Ensure that notebook providers that are registered "early" are shown in kernels dropdown

* cleanup
2019-03-25 15:24:57 -07:00
Charles Gagnon
b27417da41 Add ADS Windows support extension with LaunchSsmsDialog command (#4248)
* ADS Windows support extension with LaunchSsmsDialog command

* Update readme

* Fix spacing

* Update download with new file location and name

* Update SsmsMin package with bits from latest RC build and addressed some comments.

* Update extension name. Add Context menu extension for launching server properties dialog. Remove params interface from public API

* Rename folder and update README

* Correct README title

* Fix a few issues and clean up some stuff.

* Update to azdata namespace

* Refactor to use async/await and add some more telemetry

* Add .bat for running extension tests (currently only Notebook) and set up launch.json with 2 new launch configs for running & debugging extension tests.

* Rename files to make it clear these aren't the integration tests

* Update launch.config too

* Fix spacing and missed file name update

* Fix some bugs in buildSsmsMinCommandArgs and add unit tests
2019-03-25 14:19:11 -07:00
Charles Gagnon
ef1f72f69b Remove popular extensions view (#4679)
We don't track installs for our extensions so the "Popular Extensions" category is just a duplicate of the full list. Removing it to reduce clutter until we can add that functionality in. This also removes the command associated with the view.
2019-03-25 13:21:48 -07:00
Raj
784fd57410 confirmSave using UntitledEditorInput (#4611) 2019-03-25 10:25:28 -07:00
Charles Gagnon
1dd0afcf80 Add ADS Startup Telemetry events back in (#4675)
They were removed with a VS Code merge. Changed to using extension contribution to log the telemetry to reduce the amount of VS code edits.
2019-03-23 11:02:34 -07:00
udeeshagautam
ddce7731b9 Feature/viewlet cmsapis (#4312)
* first set of changes to experiment the registration of cms related apis

* Adding cms service entry to workbench

* Adding basic functionality for add remove reg servers and group

* Returning relative path as part of RegServerResult as string

* Removing the cms apis from core. Having mssql extension expose them for cms extension

* Propogating the backend name changes to apis

* Fixing some missing sqlops references

* Adding a sqltools service version with CMS apis available
2019-03-22 17:24:45 -07:00
Alan Ren
d00c3780a6 fix for issue 4596 (#4670) 2019-03-22 14:11:26 -07:00
Anthony Dresser
4a87a24235 Merge from vscode 011858832762aaff245b2336fb1c38166e7a10fb (#4663) 2019-03-22 13:07:54 -07:00
Anthony Dresser
f5c9174c2f Revert "Connection Store Refactor (#4632)" (#4671)
This reverts commit 756f77063a.
2019-03-22 11:30:20 -07:00
Alan Ren
8d5f676039 a few deploy cluster wizard changes (#4644)
* update icon for target types

* Revert "update icon for target types"

This reverts commit 79bd7674f2c09602430a0b10829f7b0d3234eb98.

* update target type icons

* update eula and privacy policy links

* existing cluster page

* adjust the loading indicator position
2019-03-22 11:29:51 -07:00
Yurong He
71db7e10b6 Add notebook integration tests (#4652)
* Add notebook integration tests
2019-03-22 10:39:44 -07:00
Anthony Dresser
756f77063a Connection Store Refactor (#4632)
* various clean ups

* formatting

* remove linting

* formatting

* IConfigurationService is even better

* messing with connection config tests

* update tests

* formatting

* formatting

* remove unused code

* add more tests

* working through tests

* formatting

* more factoring of connection store and increase code coverage

* formatting

* fix tests
2019-03-22 01:07:32 -07:00
Matt Irvine
5f637036bc Fix query tab color regression (#4660) 2019-03-21 16:22:03 -07:00
Anthony Dresser
24b5f41065 add data explorer default (#4653) 2019-03-21 16:10:54 -07:00
Chris LaFreniere
b00352570b Fix the "Failed to show notebook document" errors (#4649) 2019-03-21 16:06:17 -07:00
Matt Irvine
0b44b7d384 Fix issue reporter 'Visual Studio Code' text (#4658) 2019-03-21 15:38:25 -07:00
Chris LaFreniere
82da64d66d Handle future done in SQL for multiple batches (#4654) 2019-03-21 13:47:01 -07:00
Matt Irvine
ee5a76bb0c Go back to LICENSE.txt for Windows installers (#4655) 2019-03-21 13:28:51 -07:00
Anthony Dresser
b65ee5b42e Merge from vscode fc10e26ea50f82cdd84e9141491357922e6f5fba (#4639) 2019-03-21 10:58:16 -07:00
Anthony Dresser
8298db7d13 Ignore the users title bar settings (#4625)
* ignore the users title bar settings

* ignore in more places
2019-03-20 19:39:43 -07:00
Chris LaFreniere
5b0e86b179 Have notebook grids respond to theme change events (#4624) 2019-03-20 18:47:57 -07:00
Matt Irvine
ed2641ea02 Update how we create services (#4638) 2019-03-20 17:08:15 -07:00
Matt Irvine
66939636dc Bring over code-url-handler.desktop (#4629) 2019-03-20 14:56:50 -07:00
Anthony Dresser
2c331d929a remove duplicate launch task (#4631) 2019-03-20 12:34:27 -07:00
Anthony Dresser
4472764f3a extensions tslint cleanup/Connection config refactor (#4370)
* various clean ups

* formatting

* remove linting

* formatting

* IConfigurationService is even better

* messing with connection config tests

* update tests

* formatting

* formatting

* remove unused code

* add more tests
2019-03-20 11:59:07 -07:00
Karl Burtram
6cb7153bdd VBump ADS to 1.6.0 2019-03-20 11:41:41 -07:00
Anthony Dresser
5142f69655 Cleanup dependencies (#4546)
* Merge VS Code 1.31.1

* Fix missed merge conflict

* Fix license in new files

* Remove extra extension files

* Fix compile error

* Fix TSLint errors

* Fix integration tests

* Fixed saved and recent connections list

* Fix tests

* move dependencies and delete unused ones

* fix test compile
2019-03-20 11:30:20 -07:00
Anthony Dresser
dfe23f7bfe Remove some CARBON Edits (#4571)
* remove some unnecessary sql carbon edits to vs source and add correct fixes where necessary

* revert bad change
2019-03-20 11:30:06 -07:00
Raj
6873353cd4 #4618: Notebook JSON has extra } after save (#4627) 2019-03-20 11:12:02 -07:00
Matt Irvine
025c97673f Reapply extension management edits that were removed in 1.31 merge (#4606) 2019-03-20 10:40:09 -07:00
Anthony Dresser
c814b92557 VSCode merge (#4610)
* Merge from vscode e388c734f30757875976c7e326d6cfeee77710de

* fix yarn locks

* remove small issue
2019-03-20 10:39:09 -07:00
Anthony Dresser
87765e8673 Vscode merge (#4582)
* Merge from vscode 37cb23d3dd4f9433d56d4ba5ea3203580719a0bd

* fix issues with merges

* bump node version in azpipe

* replace license headers

* remove duplicate launch task

* fix build errors

* fix build errors

* fix tslint issues

* working through package and linux build issues

* more work

* wip

* fix packaged builds

* working through linux build errors

* wip

* wip

* wip

* fix mac and linux file limits

* iterate linux pipeline

* disable editor typing

* revert series to parallel

* remove optimize vscode from linux

* fix linting issues

* revert testing change

* add workaround for new node

* readd packaging for extensions

* fix issue with angular not resolving decorator dependencies
2019-03-19 17:44:35 -07:00
Raj
833d197412 #3973: Persist scroll position when tab notebooks (#4531)
* #3973: Persist scroll position when tab notebooks

* Remove getter and setter
2019-03-19 14:01:24 -07:00
Maddy
5e72cd12d1 Maddy/prompt password when different (#4537)
* Prompt for password once when the sql instance password doesn't work for hdfs. If the user provides the correct password, connect and continue, else show Unauthorized  node.

* Removed the hardcoded bad password

* Added check for empty folder scenarios

* Added ErrorStatusCode as property of TreeNode. Checking for the error code instead of the error string to avoid localization issues

* type fixed

* implemented hasExpansionError
2019-03-19 13:46:55 -07:00
Charles Gagnon
330fb6dff5 Remove tooltip from editable dropdown (#4542)
* Fix our custom dropdown control to update the tooltip text correctly when a new value is selected. Previous behavior was to always keep the initial text (<Default> for example). Now we'll update it as appropriate (and default back to Placeholder text when we're clearing the value)

* Spaces -> tabs

* Remove extra ;

* Remove tooltips from the text box part of the dropdown

* Remove tooltips from dropdown arrows

* Revert "Remove tooltips from dropdown arrows"

This reverts commit 31a0748aaea42d5009eb9752bd075ce49a6716f5.
2019-03-19 13:46:36 -07:00
Raj
6d7d485a38 #4331: Set notebook dirty when change 'trusted' (#4569) 2019-03-18 10:56:04 -07:00
Karl Burtram
0160901060 Update readme\changelog for March (#4575) 2019-03-18 10:23:28 -07:00
Chris LaFreniere
9313140c59 Fix install packages not always showing on startup (#4566) 2019-03-18 10:18:03 -07:00
Raj
25b1d4b673 #4565: Open notebook from Dashboard - can't close dirty notebook (#4568)
* #4565:"Don't save" doesn't close editor -Dashboard

* Misc change
2019-03-17 08:03:43 -07:00
Chris LaFreniere
b9d0602f55 Only show filename for notebook titlebar when opening from Open Notebook (#4556) 2019-03-15 17:45:06 -07:00
Charles Gagnon
ec0a3bbc95 Fix recent connections list to use <default> DB if no DB is specified by the user when a connection is made (#4564) 2019-03-15 17:28:56 -07:00
Matt Irvine
50f63a2f72 Fix extension bugs from VS Code merge (#4563) 2019-03-15 17:22:56 -07:00
Raj
cb47cb7dbf #4365: Make notebook dirty when changing kernel (#4547) 2019-03-15 17:02:00 -07:00
Cory Rivera
c9ac49c758 Check error in webhdfs.sendRequest before trying to check response code. (#4561) 2019-03-15 16:52:08 -07:00
Karl Burtram
c4e8aba1c9 Fix missing Azure account name (#4555)
* Fix Azure account picker row height

* Remove unneeded styles
2019-03-15 15:37:00 -07:00
Karl Burtram
ba6b8b1f69 Fix Azure account picker row height 2019-03-15 15:29:25 -07:00
Anthony Dresser
09af2fc2cb Add needs repro closer bot (#4517)
* change message of needs repro and enable

* update needs repro to needs info

* update to needs more info
2019-03-15 14:53:15 -07:00
Alan Ren
5caf0b02f0 make connectiondialog react to provider event (#4544)
* make connectiondialog react to provider event

* fix unit test error

* code review comments
2019-03-15 14:47:23 -07:00
Matt Irvine
86bac90001 Merge VS Code 1.31.1 (#4283) 2019-03-15 13:09:45 -07:00
Charles Gagnon
7d31575149 Fix dropdowns flickering every other time they're opened. The hide message code was being invoked which called hideContextView (the actual dropdown part) even if no message was ever displayed. Now we'll delay setting the message to null and only call hideContextView if we actually had a message to display. (#4528)
Also fixed a small issue where messages that didn't have a container would throw an error when trying to call removeClass (since this.element is pulled from the container and thus was undefined).

Tested that the flicker is gone and that messages still show up correctly
2019-03-15 11:19:14 -07:00
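A rough sketch of the guard described in the dropdown-flicker commit above: only tear down the context view when a message was actually displayed. All member names here (`_message`, `hideContextView`) are illustrative, not the real control's API.

```ts
// Hypothetical dropdown message handling, sketching the flicker fix above.
class DropdownMessagesSketch {
	private _message: string | undefined;

	showMessage(message: string): void {
		this._message = message;
		// ...render the message inside the context view...
	}

	hideMessage(): void {
		// Only hide the context view (the dropdown itself) if a message was
		// actually shown; otherwise the dropdown flickers on every other open.
		if (this._message !== undefined) {
			this._message = undefined;
			this.hideContextView();
		}
	}

	private hideContextView(): void {
		// ...delegate to the context view service...
	}
}
```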
Charles Gagnon
7223b28829 Add config for running extension tests (#4495)
* Add .bat for running extension tests (currently only Notebook) and set up launch.json with 2 new launch configs for running & debugging extension tests.

* Rename files to make it clear these aren't the integration tests

* Update launch.config too

* Fix spacing and missed file name update
2019-03-15 09:12:04 -07:00
Anthony Dresser
4014c1d0ab Small strict null checking pass on a few files (#4293)
* fix some null checking

* fix various null strict checks

* move location fo sql files in json

* fix compile and more unused properties

* formatting

* small formatting changes

* readd types

* add comments for angular components

* formatting

* remove any decl
2019-03-14 18:18:32 -07:00
Anthony Dresser
0bf0e795ca More data explorer actions (#4307)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code

* add node context on data explorer

* formatting

* add new Query action

* fix various errors with how the context menus work

* add manage and new query

* add refresh command

* formatting
2019-03-14 17:19:37 -07:00
Kevin Cunnane
0bc3716f74 FIX #4513 Notebook stuck at changing kernel (#4518)
* FIX #4513 Notebook stuck at changing kernel
- Intra-provider kernel change didn't happen because we only tried changing kernel on new session creation.
- Inverted the logic (e.g. did the right thing) and renamed the method so it's clearer what we're doing & what the boolean value should be
- Manually tested all the known scenarios
2019-03-14 15:49:14 -07:00
Alan Ren
ca23ea0f69 use the new dataprotocol client (#4515) 2019-03-14 15:48:49 -07:00
Kevin Cunnane
efaa2c0e3f Fix #4505 Notebooks: New Notebook will not work if existing untitled notebooks are rehydrated (#4506)
* Fix #4505 Notebooks: New Notebook will not work if existing untitled notebooks are rehydrated
* Also fixes #4508
* Unify behavior across New Notebook entry points
- Use Notebook-{n} as the standard in both entry points
- Use SQL as default provider in both
- Ensure both check for other names and only use free number
2019-03-14 14:27:08 -07:00
udeeshagautam
d91f4d5748 Fix for : 4471 Backup/Restore shows in Context Menu for Azure DB (#4498)
* Remove Backup/Restore from Cloud DBs' context menu

* checking for null

* Cleaning up the check
2019-03-14 14:06:11 -07:00
Kevin Cunnane
6f1a03587a Fix #4029 Ensure changeKernels always resolves, even in error states (#4488)
* Fix #4029 Ensure changeKernels always resolves, even in error states
This is necessary to unblock reverting the kernel on canceling  Python install
- startSession now correctly sets up kernel information, since a kernel is loaded there.
- Remove call to change kernel on session initialize. This isn't needed due to refactor
- Handle kernel change failure by attempting to fall back to old kernel
- ExtensionHost $startNewSession now ensures errors are sent across the wire.
- Update AttachTo and Kernel dropdowns so they handle kernel being available. This is needed since other changes mean the session is likely ready before these get going

* Fix to handle python cancel and load existing scenarios
- Made changes to handle failure flow when Python dialog is canceled
- Made changes to handle initial load fail. Kernel and Attach To dropdowns show No Kernel / None and you can choose a kernel
- Added error wrapping in ext host so that string errors make it across and aren't lost.
2019-03-14 13:07:08 -07:00
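A minimal sketch of the "always resolves" contract described in the commit above: a failed kernel change reports the error and falls back to the previous kernel instead of leaving the promise hanging. The function shape is illustrative, not the notebook model's actual signature.

```ts
// Illustrative: changeKernel never rejects; on failure it falls back to the old kernel.
interface Kernel { name: string; }

async function changeKernel(
	start: (name: string) => Promise<Kernel>,
	current: Kernel,
	requestedName: string,
	onError: (e: unknown) => void
): Promise<Kernel> {
	try {
		return await start(requestedName);
	} catch (error) {
		onError(error);
		// Fall back so callers (e.g. the cancel-Python-install flow) always get
		// a resolved promise and a usable kernel.
		return current;
	}
}
```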
Matt Irvine
7973f0f178 Disable custom title bar popup on Linux (#4478) 2019-03-14 09:28:19 -07:00
Chris LaFreniere
11f0ca371b Show running state in cell toolbar immediately when cell is run (#4484) 2019-03-13 20:04:46 -07:00
Kevin Cunnane
0131746919 Fix #4452 Notebook is reloaded with wrong kernel (#4480)
* Fix #4452 Notebook is reloaded with wrong kernel
- Await all extension registration before getting providers and providerId
To do this, we need to await way up in the NotebookInput, and promise the model that we'll have values eventually
2019-03-13 20:03:01 -07:00
Chris LaFreniere
5774b1f69a remove ugly border around text cells (#4481) 2019-03-13 19:24:39 -07:00
Cory Rivera
34d36c1de1 Prompt for Python installation after choosing a Jupyter kernel in notebook (#4453) 2019-03-13 18:44:54 -07:00
Chris LaFreniere
cca84e6455 Fix disappearing notebook table (#4466) 2019-03-13 18:32:31 -07:00
Matt Irvine
82ce1ace28 Put back code to refresh tree when capabilities provider gets new capabilities (#4475) 2019-03-13 17:41:29 -07:00
Anthony Dresser
e59d5a766f Enable similarity and locker (#4435)
* turn on similarity

* turn on locker
2019-03-13 16:51:01 -07:00
Anthony Dresser
98a8103f5a fix spacing in yml (#4436) 2019-03-13 16:49:26 -07:00
Alan Ren
81a8593eb6 bump up the extension version for new release (#4456)
* bump up the extension version for new release

* change url for dacpac ext
2019-03-13 15:46:03 -07:00
Anthony Dresser
b6584c9ddf Change shutdown listeners (#4282)
* change shutdown to use proper notification

* change to use storage service

* remove unused imports

* fix test

* change shutdown methods to private

* remove unusde imports

* fix tests

* formatting
2019-03-13 15:15:51 -07:00
Raj
08d4cc9690 #4441:Remove notebook editor - don't save selected (#4449) 2019-03-13 14:25:45 -07:00
kisantia
0565162fde remove broken link from dacpac extension readme (#4448) 2019-03-13 14:03:32 -07:00
Charles Gagnon
31614247cf #3127 #478 Fix 2 Edit Data grid issues (#4429)
* 2 fixes :

1. Fix it so that edits made to a table with a single column will correctly create a new row when done editing. This is done by caching whether the cell is dirty in the onCellEditEnd callback, and then when onCellSelect is called to switch cells we only return early if the cell isn't dirty. Before we were returning early and thus not going into any of the code that creates the new row (submitCurrentCellChange)

2. Fix it so we don't throw an error when deleting the NULL (new) row. As noted in the code this should really be handled by not displaying the context menu to begin with for that row but that's a bit more involved of a fix so for now I'm at least making it not display an error to the user - it'll just do nothing.

Manual testing for now - I tried to add tests but the core logic and the UI layer are too intertwined so this will likely need to be a full end-to-end UI test. I'll continue working on adding some for this dialog but for now I'd like to at least submit these fixes since they're pretty painful to deal with.

* Undo changes to addRow

* Clean up resetCurrentCell and add comment to setCurrentCell

* Use reset method instead of directly initializing currentCell

* Add issue # to comment
2019-03-13 13:52:35 -07:00
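A simplified sketch of the first fix in the commit above: cache whether the cell is dirty in onCellEditEnd and only short-circuit onCellSelect when nothing was edited, so submitCurrentCellChange still runs for single-column tables. The class and types are illustrative, not the real edit-data code.

```ts
// Illustrative dirty-cell bookkeeping for the edit-data grid fix above.
interface CellPosition { row: number; column: number; }

class EditDataGridSketch {
	private currentCell: CellPosition = { row: 0, column: 0 };
	private currentCellIsDirty = false;

	onCellEditEnd(isDirty: boolean): void {
		// Cache whether the edit actually changed the cell.
		this.currentCellIsDirty = isDirty;
	}

	onCellSelect(next: CellPosition): void {
		// Only return early when nothing was edited; a dirty cell must go through
		// submitCurrentCellChange() so the new row gets created.
		if (!this.currentCellIsDirty) {
			this.currentCell = next;
			return;
		}
		this.submitCurrentCellChange();
		this.currentCell = next;
		this.currentCellIsDirty = false;
	}

	private submitCurrentCellChange(): void {
		// ...commit the pending edit to the edit session...
	}
}
```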
Karl Burtram
e9390dcd48 Update "Preview features" notification text (#4444) 2019-03-13 13:50:42 -07:00
Gene Lee
c9b3e2b156 Added 'serverMajorVersion' key to context which is missed currently and caused failure for loading 'Create External Table' menu (#4423) 2019-03-13 13:32:04 -07:00
Anthony Dresser
ace6012c1c Grid scrolling bugs (#4396)
* better maintenance of heights in scrollable splitview

* formatting

* fix issue around views resizing messing with render logic

* remove commented code
2019-03-13 11:51:47 -07:00
Chris LaFreniere
17901fbf3d Fix Issue when OE connection Disconnects on Notebook Close (#4425)
* Fix OE connection closing when notebook closes

* handle connections created through Add new connection
2019-03-13 11:32:12 -07:00
udeeshagautam
2c7b5578e7 Bug/3731 results scroll (#4408)
* Bug 3731: Minor issues saving scroll position in results

Fixing the second part, i.e. the scroll position of the results view while switching tabs. The resize call was not respecting the old scrollbar state. Making an explicit call to set the scroll position to ensure it is set at the end.
2019-03-13 11:03:41 -07:00
Chris LaFreniere
b2b7d18802 Change Kernels Dropdown to show Switching instead of Loading On Kernel Change (#4426)
Change Kernels Dropdown to show Changing instead of Loading On Kernel Change
2019-03-13 09:59:19 -07:00
Geoff Young
965da0535a Fix sqlDropColumn description (#4422) 2019-03-13 01:06:16 -07:00
Raj
ebd187ec06 #4363: Reopen notebook editors when ADS launched (#4424)
* #4363: Reopen notebook editors when ADS launched

* Code review changes
2019-03-12 23:02:37 -07:00
Kevin Cunnane
b495fb7a37 Add notebook file icon support (#4419)
- Added to both built-in themes. Long term would be good to contribute this back to Seti so it works without a carbon edit tag.
- Issue #4418 tracks remaining problem where for new notebooks it's not set initially. This is blocked by Raj's work so will update once that lands.
2019-03-12 21:23:43 -07:00
David Shiflet
58e5125cde Add scripted object name to query editor tab display (#4403)
* Add scripted object name to query editor tab display

* Update src/sql/parts/query/common/queryInput.ts

Co-Authored-By: shueybubbles <shueybubbles@hotmail.com>
2019-03-12 22:07:11 -04:00
Anthony Dresser
dadfbd2ddb add check to remove errors (#4412) 2019-03-12 15:32:41 -07:00
Charles Gagnon
9fdeec6128 #3224 Fix extra connections populating MRU List (#4368)
* Modify where we add active connections so that we can use the saveTheConnection option to decide whether to add a connection to the MRU. This was necessary because the old location was called from onConnectionComplete which is sent by the SqlToolsService - and that doesn't get any UI related information like the options. The new location is still only called after the connection completes and will be added only if the connection succeeds.

Added new test and updated existing tests to handle new logic (plus a bit of async-refactoring).

* Fix couple spacing issues

* Add logic back in to short-circuit if we already have the connection in the active connections list.

* Fix spaces -> tabs
2019-03-12 14:05:50 -07:00
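A short sketch of the MRU logic described in the commit above: add a connection only after it has completed successfully, and only when the saveTheConnection option was set, with the existing short-circuit for connections already in the active list. Class and method names are illustrative.

```ts
// Illustrative MRU tracking for the fix above.
interface ConnectionProfile { id: string; saveTheConnection?: boolean; }

class RecentConnectionsSketch {
	private readonly mru: ConnectionProfile[] = [];
	private readonly active = new Set<string>();

	// Called only after the connection has completed successfully, so failed
	// attempts never populate the MRU.
	onConnectionSucceeded(profile: ConnectionProfile): void {
		if (this.active.has(profile.id)) {
			return; // already tracked as an active connection
		}
		this.active.add(profile.id);
		// Respect the UI option: only remember the connection if the user asked to.
		if (profile.saveTheConnection) {
			this.mru.unshift(profile);
		}
	}
}
```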
Matt Irvine
e783aeab66 Use max column width when auto-sizing columns (#4394) 2019-03-12 14:03:37 -07:00
Karl Burtram
53a94cc7bb Revert SQL Tools Service to 1.5.0-alpha.71 (#4413) 2019-03-12 13:38:44 -07:00
Anthony Dresser
839a9b6cb8 move event to better understand when things are happening (#4393) 2019-03-12 13:19:08 -07:00
Anthony Dresser
2557d77ae3 readd row height; add font styles to message panel as well (#4388) 2019-03-12 13:18:25 -07:00
Raj
6a7df2f1ae Adding back save api (#4407)
* #4339: Kernel change event occurs after model load

* #4347: Code cleanup - Notebooks Save

* Remove save method from sqlops

* Adding save method to api's

* Adding save method to ext host

* Misc change
2019-03-12 13:07:10 -07:00
Maddy
2db83b3892 Added localizie for the warning string (#4411) 2019-03-12 13:06:16 -07:00
Yurong He
08c7cc0918 Fixed #4384 add await on disconnect (#4389)
* Fixed #4384 add await on disconnect

* Resolve PR comment
2019-03-12 13:06:02 -07:00
Kevin Cunnane
d555dcb6d7 Fix minor error in snippet (#4398)
- Found a snippet missing a `,`
2019-03-12 12:14:24 -07:00
Kevin Cunnane
7226f25c67 Fix #4356 New Notebook from connection doesn't connect (#4364)
* Fix #4356 New Notebook from connection doesn't connect
Fix new notebook error by passing profile instead of ID.
- I could've just sent the ID over, but this fix sets the stage for disconnected connections to work (since we have enough info to properly connect).
- There's a bug in NotebookModel blocking the disconnected connection part working, but Yurong's in progress fixes will unblock this. Hence checking in as-is and working to properly unblock once that's in.

* Support connection profile in commandline service
- Added new context API for things that want to work on commandline and object explorer
- Refactored commandlineservice slightly to be async & have a simpler execution flow (far fewer if/else statements)

* Fix unit tests
- Fixed 2 issues raised by tests (shouldn't do new query if no profile passed, shouldn't error on new query failing)
- Updated unit tests to pass as expected given changes to the APIs.
2019-03-12 12:14:08 -07:00
Maddy
77a3be6fd7 pulling max bytes of data through the webhdfs api (#4314)
* pulling max bytes of data through the webhdfs api

* Added warning message for the Users to inform the data truncation.

* Updated the warning message on the notification flyer.
2019-03-12 10:38:34 -07:00
Raj
2397df7f22 #4339: Kernel change event occurs after model load (#4366) 2019-03-12 08:21:00 -07:00
Yurong He
118d2c7273 Fix #4047 Redesign notebook model to handle single client session (#4371)
* Start single client session based on the default kernel or saved kernel in NB.

* Added kernel displayName to standardKernel.
Modified name to align with Jupyter Kernel.name.
So we can show the displayName during startup and use the name to start the session.

* Change session.OnSessionReady event in KernelDropDown

* Added model.KernelChanged for switching kernel in the same provider

* Fixed session.Ready sequence

* Fixed merge issues

* Solve merged issue

* Fixed wrong kernel name in saved NB

* Added new event in Model to notify kernel change.
Toolbar depends on ModelReady to load

* Change attachTo to wait for ModelReady like KernelDropDown

* sanitizeSavedKernelInfo to fix invalid kernel and display_name. For example: PySpark1111 and PySpark 1111


* Added _contextsChangingEmitter to change loadContext msg when changing kernel

* Resolve PR comments
2019-03-11 17:59:13 -07:00
David Shiflet
b44d2b1bb3 Re-enabled command line service tests (#4387) 2019-03-11 19:31:11 -04:00
Chris LaFreniere
9867d88067 check for changeRef not destroyed before detecting changes (#4385) 2019-03-11 14:23:37 -07:00
Chris LaFreniere
037c49e2c6 Warning for table max rows displayed (#4357)
* Warning for top rows

* change single to double quotes when calling localize method
2019-03-08 19:01:21 -08:00
Chris LaFreniere
1e989060f9 Rewrite Spark UI link when using unified connection (#4362)
* Rewrite Spark UI link when using unified connection

* Add more robust error checking
2019-03-08 17:34:47 -08:00
Raj
4d6271c161 'Confirm save' implementation while closing untitled/existing notebooks (#4349)
* #4326: 'Confirm save' while closing both notebook

* Adding comment
2019-03-08 13:28:15 -08:00
Kevin Cunnane
c3900f6984 Notebooks: fix AttachTo showed only Localhost (#4354)
The Attach To was showing only localhost for SQL, since we overrode the standard kernels from SQL with the ones from Jupyter.
Fix is to save all standard kernels.
Also, added dispose handling for some events I found during debugging and removed unused imports
2019-03-08 13:27:14 -08:00
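A minimal sketch of the "save all standard kernels" fix in the commit above: merge each provider's standard kernels into one registry instead of letting the Jupyter registration overwrite the SQL ones. The registry shape is an assumption for illustration only.

```ts
// Illustrative kernel registry that merges instead of overwriting.
interface StandardKernel { name: string; connectionProviderIds: string[]; }

class KernelRegistrySketch {
	private readonly kernels = new Map<string, StandardKernel>();

	addStandardKernels(incoming: StandardKernel[]): void {
		for (const kernel of incoming) {
			// Keep previously registered kernels (e.g. SQL) rather than replacing
			// the whole list when Jupyter registers its kernels later.
			if (!this.kernels.has(kernel.name)) {
				this.kernels.set(kernel.name, kernel);
			}
		}
	}

	getStandardKernels(): StandardKernel[] {
		return Array.from(this.kernels.values());
	}
}
```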
Gene Lee
aa1a036f66 Fixed bug: tree in extension does not show icon (#4348) 2019-03-08 12:34:13 -08:00
Karl Burtram
26274e6c5d Fix connection dialog Saved Connections refresh timing (#4346) 2019-03-08 11:45:11 -08:00
kisantia
bcfbe5a284 fix flatfile and dacfx wizard not defaulting to selected connection when launched from command palette (#4344) 2019-03-08 09:19:03 -08:00
Charles Gagnon
496243fbc7 Fix extra spacing in the file search view (#4343)
Recent changes in the VS Code layout for that viewlet caused the min-height property to affect the display (the property was set on that class before but the layout prevented it from actually being displayed as it was).
I removed the style for the messages class completely since I couldn't find a place where we actually used it ourselves - and anyway, having styling on such a common name as messages isn't really good practice since it applies to everything in ADS.
2019-03-08 07:32:04 -08:00
Raj
036ffe595a #3920: Notebooks file save/save all/cache - for existing files (#4286)
* #3920: Notebooks file save

* Missed in merge

* #4290: Untitled save and native dirty implementation

* Misc changes

* Content Manager, notebooks extension and commented failed unit tests

* Removing modelLoaded event
2019-03-07 18:07:20 -08:00
Maddy
2a903e9f03 handle non ascii characters hdfs filename (#4340)
* encoding the URL so that special characters don't make the request a bad request.
2019-03-07 16:21:03 -08:00
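A tiny sketch of the encoding fix above: percent-encode each path segment so non-ASCII file names don't produce a malformed WebHDFS request. The helper and the example host/port are hypothetical.

```ts
// Hypothetical helper: build a WebHDFS URL with each path segment encoded.
function buildWebHdfsUrl(host: string, port: number, hdfsPath: string, op: string): string {
	const encodedPath = hdfsPath
		.split('/')
		.map(segment => encodeURIComponent(segment))
		.join('/');
	return `http://${host}:${port}/webhdfs/v1${encodedPath}?op=${op}`;
}

// e.g. buildWebHdfsUrl('myhost', 30070, '/data/résumé.csv', 'OPEN')
```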
Gene Lee
36e5bbb752 added 'fireOnTextChange' field to azdata.proposed.d.ts (#4341) 2019-03-07 16:03:50 -08:00
Anthony Dresser
96a976d826 add new release yml (#4333) 2019-03-07 15:33:13 -08:00
Kevin Cunnane
ff514a0568 Remove Analyze in Notebook from command palette (#4330) 2019-03-07 14:26:42 -08:00
Kevin Cunnane
a4d99b78d5 cluster deploy extension: Add localization support and fix " to ' strings (#4332)
* Add localization support and fix " to ' strings

* Fix ${ usage
2019-03-07 14:26:31 -08:00
Gene Lee
029c69ecd3 Fixed issue: input change on dropdownbox not reflected to 'dropdownbox.… (#4316) 2019-03-07 13:35:10 -08:00
Alan Ren
9e1f04e476 Enforce vscode and ads version check when installing extensions (#4267)
* engine check when install extension

* gallery install/update and vsix install

* FIX COMMENTS

* Fix the detail not loading issue when version is invalid

* add more comments and adress PR comments

* add install telemetry for install from vsix scenario

* correct the name of the version property for telemetry
2019-03-07 13:04:50 -08:00
Aditya Bist
c8bde41451 Disable edit step until all steps are loaded (#4327)
* disable edit step until all steps are loaded

* job check
2019-03-07 12:58:58 -08:00
Maddy
e16c01623d Added the new hdfs icon for the web HDFS folder. (#4317)
* Added the new hdfs icon for the web HDFS folder.

* overriding the getNodeInfo() in the ConnectionNode
2019-03-07 00:57:12 -08:00
Anthony Dresser
7de294a58e change sizing behavior to allow the messages to fully collapse down (set results to have no max height) (#4313) 2019-03-06 22:44:49 -08:00
Alan Ren
060343f096 fix a couple of build issues due to a merge issue (#4324)
* fix the build error on linux and mac

* one more fix
2019-03-06 21:53:09 -08:00
Alan Ren
addba0d007 remove the modelviewdialog namespace from azdata (#4301) 2019-03-06 21:32:05 -08:00
Alan Ren
68418f2c8f update target environment type page based on latest design (#4311) 2019-03-06 21:21:47 -08:00
Karl Burtram
428dd17d54 Set dashboard DB to master only for MSSQL provider (#4321) 2019-03-06 19:40:46 -08:00
Matt Irvine
7344b41f47 Fix bug where git extension fails in packaged builds (#4318) 2019-03-06 18:56:04 -08:00
kisantia
5f003b0dd7 Move DacFx wizard into separate extension (#4115)
* Moved dacfx wizard into separate extension

* updating to use azdata

* one more azdata change

* bump import extension version

* renaming extension to dacpac
2019-03-06 17:45:30 -08:00
Kevin Cunnane
b8f454b8ac Add request dependency to correct package.json (#4306)
- This was needed in mssql extension, not the main package.json
2019-03-06 14:33:07 -08:00
Anthony Dresser
b45e03a45a enable classifier (#4296) 2019-03-05 23:49:58 -08:00
Anup N. Kamath
0a268f35bc Fix backup (#4274)
* check to see whether options is available

* removing options state and using _modelOptions

* removed additional space

* making options getter private as not intended to expose
2019-03-05 19:17:42 -08:00
Chris LaFreniere
e3709533b4 Add New Notebook to File Menu (#4287) 2019-03-05 17:42:35 -08:00
Gene Lee
acc8d5f7b2 Added request dependency in mssql extension (#4297) 2019-03-05 17:33:28 -08:00
Anthony Dresser
da06a96630 Add Data Explorer Context and apply to disconnect (#4265)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code

* add node context on data explorer

* formatting

* fix various errors with how the context menus work
2019-03-05 17:09:00 -08:00
Yurong He
7eaf8cfd2f Add check OE node tests (#4273)
* Added verify OE child nodes tests for standalone and BDC instance

* msg changes

* Added standalone OE node test

* Resolved PR comments

* Change env name

* Added scripts to set env for integration test
2019-03-05 14:42:05 -08:00
Cory Rivera
5248c8f78d Convert caught error to string in notebook onLoad error message. (#4276) 2019-03-04 16:49:23 -08:00
Gene Lee
f4365dbd3a Added WebHDFS rewritten to provide correct Error object and localized error messages (#4223) 2019-03-04 15:23:50 -08:00
Karl Burtram
2309b16bd4 Remove watch script and use 'yarn watch' instead (#4277) 2019-03-04 13:47:31 -08:00
Karl Burtram
c65af61a68 Bump ADS to 1.5.1 2019-03-04 13:01:37 -08:00
Yurong He
a48e9bc64c Fixed #4206 and #4207 open and close notebook quickly issue #4240 (#4255)
* Fixed  #4206 and #4207

* Check if it makes my PR run tests

* Add isReady to JupyterSessionManager
2019-03-04 12:46:27 -08:00
Karl Burtram
b4984d7f2d Update 'sqlops' to 'azdata' in Notebook Manager (#4275) 2019-03-04 12:45:34 -08:00
Yurong He
1017d62f0d Move sql related code to sqlNotebook folder (#4254)
* Move sql related code to sqlNotebook folder

* Resolve PR comments: rename folder to sql.

* Fixed the import path after rename folder
2019-03-04 09:45:32 -08:00
Yurong He
ebc208cacd Fixed #4181 and #4167 change kernel issue between SQL and Sparks #4238 (#4256)
* Fixed #4181 and #4167

The problems are:
- When changing kernel, setProviderIdForKernel switches the kernel first. So when we switch kernels across providers, for example from PySpark3 to SQL, the kernel is already set to SQL in ClientSession.changeKernel and we lose the oldKernel info.
  The fix is to cache the old session in the model before the switch and use it to get the correct context.
- The SQL kernel could make multiple connections from "Add new connection". We didn't track them, so those connections didn't close properly when the notebook was closed.
  The fix is to save the connections made from "Add new connection" in the model, and close them and activeConnection when the notebook is closed.

Problem not yet solved in this PR:
- Jupyter isn't shut down when switching from a Spark kernel to SQL.
2019-03-01 22:11:45 -08:00
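A rough sketch of the two fixes in the commit above: cache the outgoing session before the provider switch so the old-kernel context isn't lost, and track connections created via "Add new connection" so they can be closed with the notebook. Types and names are illustrative, not the real notebook model.

```ts
// Illustrative notebook-model bookkeeping for the kernel-switch and connection fixes above.
interface Session { kernelName: string; shutdown(): Promise<void>; }
interface Connection { disconnect(): Promise<void>; }

class NotebookModelSketch {
	private session: Session | undefined;
	private previousSession: Session | undefined;   // cached before a kernel switch
	private extraConnections: Connection[] = [];    // connections from "Add new connection"

	beginKernelChange(newSession: Session): void {
		// Cache the outgoing session so the old-kernel context is still available
		// after the provider has already switched to the new kernel.
		this.previousSession = this.session;
		this.session = newSession;
	}

	getPreviousKernelName(): string | undefined {
		return this.previousSession ? this.previousSession.kernelName : undefined;
	}

	trackAddedConnection(connection: Connection): void {
		this.extraConnections.push(connection);
	}

	async dispose(): Promise<void> {
		// Close every connection the notebook created, not just the active one.
		await Promise.all(this.extraConnections.map(c => c.disconnect()));
		if (this.session) {
			await this.session.shutdown();
		}
	}
}
```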
Anthony Dresser
0236c8e7f8 Data Explorer Disconnect and Error Handling (#4243)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code
2019-03-01 17:47:28 -08:00
Cory Rivera
db8a92f5c2 Update LabeledMenuItemActionItem to match new vscode behavior. (#4264) 2019-03-01 17:37:12 -08:00
Charles Gagnon
c1e5408492 Change vscode folder name to azuredatastudio 2019-03-01 14:32:42 -08:00
Karl Burtram
84890eb1b4 Update product references from 'sqlops' to 'azdata' (#4259)
* Update extensions to use azdata

* Switch core code to use azdata
2019-03-01 13:59:37 -08:00
Alan Ren
220685a522 fix the undefined error when uninstalling extension (#4258) 2019-03-01 13:58:00 -08:00
Karl Burtram
8ebf5dbcb4 Add azdata.d.ts for new extensibility APIs (#4247)
* Add azdata.d.ts for new extensibility APIs

* Update azdata typing files for connection API proposal

* Add implementation for azdata module

* Fix build break in agent
2019-03-01 11:58:32 -08:00
Alan Ren
dad807d62d select cluster page and status update for tool when installing (#4251) 2019-03-01 11:12:57 -08:00
Karl Burtram
8e52ffa30e Fix copywrite headers in notebook extension (#4253) 2019-03-01 10:34:26 -08:00
Raj
18970ff0b9 #4225: Markdown content disappears when edit (#4249) 2019-02-28 19:18:40 -08:00
Cory Rivera
630698459b Update output channel name for Jupyter Notebooks. (#4246) 2019-02-28 16:45:56 -08:00
Karl Burtram
f8e854a087 Turn-on auto-size columns by default (#4241) 2019-02-28 15:24:38 -08:00
Karl Burtram
3b2274b0aa Update Azure resource explorer section title (#4237) 2019-02-28 15:22:40 -08:00
Ronald Quan
0d1ebce1a1 Ron/bdc script (#4221)
* WIP adding scripting support.

* Adding deploy command along with additional env vars needed.

* Adding script generation that sets envars, kube context, and mssqlctl

* Adding test email for docker email envar until we update UI.

* Adding cluster platform detection and disabling generate script after first click.

* Fix spacing and adding comment.
2019-02-28 14:26:50 -08:00
Alan Ren
70d86ce9a2 bump the version of import extension for a new package (#4244) 2019-02-28 13:41:01 -08:00
Cory Rivera
291f591af3 Use upper case PATH for jupyter environment variables. (#4222) 2019-02-27 14:21:34 -08:00
Cory Rivera
5a48c52a95 Delete duplicate path variables when setting up Jupyter environment config. Also added additional error message info on jupyter start. (#4212) 2019-02-27 13:30:27 -08:00
Raj
5625ef956d mkdir under notebook extension folder (#4214) 2019-02-27 13:29:01 -08:00
PaulRandal
969733ab77 Added VDI_CLIENT_OTHER to the list of ignored waits (#4197)
* Update waits_paul_randal.sql

Added the PREEMPTIVE_OS_FLUSHFILEBUFFERS wait to the list to ignore

* Update waits_detail_paul_randal.sql

Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list.

* Update waits_paul_randal.sql

* Update waits_detail_paul_randal.sql

* Update package.json

* Update README.md

* Update CHANGELOG.md
2019-02-27 09:07:41 -08:00
Cory Rivera
12a1ac1a7d Remove leftover merge tags from windows integration test script. (#4213) 2019-02-26 18:02:13 -08:00
Chris LaFreniere
4da322a03f Stop cell content from moving around on hover (#4202) 2019-02-26 17:01:20 -08:00
Anthony Dresser
d5754c00e2 Add bot configs to enable bot features (#4209)
* add bot configs to enable bot features

* update classifier label strings

* initial check-in should have 'perform' set to false to double-ensure things are working properly
2019-02-26 16:23:48 -08:00
Anthony Dresser
2e9b5f3a2b Fixes azure sql expansion (#4185)
* fixes azure sql expansion

* remove unneeded changes

* remove unused code
2019-02-26 15:29:02 -08:00
Alan Ren
b11a8e9c0c add placeholder for container username/password input box (#4210)
* add placeholder for container username/password input box

* spacing
2019-02-26 15:19:36 -08:00
Karl Burtram
e37533afbb Add null check to extensionIdentifier to show reload notification (#4208) 2019-02-26 14:18:05 -08:00
Karl Burtram
4a476673ac Update SQL Tools Service to 1.5.0-alpha.73
Update SQL Tools Service to 1.5.0-alpha.73 to pickup the managed batch parser assembly.
2019-02-26 13:35:50 -08:00
Raj
fe5386cc08 display_name undefined error in javascript (#4187) 2019-02-26 13:10:40 -08:00
Alan Ren
8bfb1a9d39 add restore default values button for ports and container settings (#4195)
* add restore default values button for ports and container settings

* change some resource strings
2019-02-26 12:48:35 -08:00
Kevin Cunnane
109aafcbc0 Update vscode-nls in notebook and samples, plus fix samples compilation (#4203)
- Update vscode-nls to 4.0.0 in notebook extension. Should be a fix for the Insiders build failure due to the localization package failing
- Updated samples to remove vscode-nls if not needed, or update it if still needed
- Updated samples using `npm audit fix`
- Fixed compile errors in all the samples
2019-02-26 12:45:45 -08:00
Karl Burtram
e289c06d8d Add no-op debug extensibility APIs (follow-up) (#4192)
* Add no-op debug extensibility APIs

* Remove unneeded SQL EDIT tags
2019-02-26 08:51:55 -08:00
Karl Burtram
78c1c318c5 Add no-op debug extensibility APIs (#4188) 2019-02-26 08:47:10 -08:00
Matt Irvine
07983d75ea Use correct new line character when copying query results (#4170) 2019-02-25 16:39:57 -08:00
Ronald Quan
d0a4a4242d Feature/mssql-big-data-cluster (#4107)
* Adding kubernetes installer.

* Adding variety of kubectl support and integrating into the kubeconfig target cluster page.

* Addressing PR comments, refactored utility file locations and added missing license headers.
2019-02-25 15:09:22 -08:00
Alan Ren
a71be2b193 Alanren/bdc (#4161)
* wip

* target cluster type page

* finish target cluster type page

* remove commented line
2019-02-25 14:04:12 -08:00
Anthony Dresser
779ca13d48 remove builder references from panel (#4149) 2019-02-25 12:43:00 -08:00
Anthony Dresser
f3b0a50db7 remove builder from button (#4146) 2019-02-25 12:42:33 -08:00
Matt Irvine
c831596e02 Fix Windows issues with menu, packaging, and test script (#4144) 2019-02-25 11:06:31 -08:00
Kevin Cunnane
2ae369fbdb Notebook fixes: Fix #4129, fix #4116, Fix #3913, fix empty results error (#4150)
- Fixes #4129 Overlapping command help windows in notebook
  - Do not show parameter hints for inactive cells, to avoid them hanging around when no longer selected
- Fixes #4116 Notebooks: Intellisense Doesn't Work using Add New Connection
  - Move connect/disconnect logic to 1 place (code component) instead of 2
  - Handle the case where you connect after choosing active cell. We now hook to the event and update connection
- Fix issues in sql session manager where a result outputs 0 rows. This was failing to show the empty resultset contents, which is a regression vs. the query editor. It also put an unhandled error on the debug console
- Fix #3913 Notebook: words selected in other cells should be unselected on cell change

Note: after fix, now looks as follows. Need to do follow up to get correct grid min height

![image](https://user-images.githubusercontent.com/10819925/53280226-9e629580-36cc-11e9-8f40-59cd913caeee.png)
2019-02-25 10:52:07 -08:00
Aditya Bist
f2c9d968a4 fix Object explorer tests (#4135) 2019-02-25 09:51:43 -08:00
Raj
c3f02980a0 Fix #3734 - Codecell content disappers when tabbing editors (#4153) 2019-02-24 19:31:42 -08:00
Kevin Cunnane
7d5ce7b5d7 Fix #4145 Possible for loading icon to appear with rendered widget (#4147)
- For cached insights, it was going down the checkStorage path which wasn't covered
2019-02-22 18:06:00 -08:00
Cory Rivera
8bb71eeb51 Fix lingering bugs from notebook code merge. (#4143)
* Add request to notebook dependencies.
* Use offline python package for windows installation.
2019-02-22 17:31:24 -08:00
Kevin Cunnane
5a88598811 Fix #3778 intellisense is delayed (#4134)
- Jupyter completion item support was awaiting info before responding. Fix is to check if this is even a notebook cell first, then only await stuff if that's true
2019-02-22 16:22:40 -08:00
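A minimal sketch of the intellisense fix above: decide cheaply whether the document is a notebook cell before awaiting any kernel state, so ordinary editors return immediately. All helper names and the cell check are assumptions for illustration, not the extension's real API.

```ts
import * as vscode from 'vscode';

// Illustrative completion flow: bail out early for non-notebook documents.
async function provideCompletionItems(
	document: vscode.TextDocument,
	_position: vscode.Position
): Promise<vscode.CompletionItem[]> {
	if (!isNotebookCell(document)) {
		// Non-notebook editors get an instant answer instead of waiting on kernel info.
		return [];
	}
	const kernelReady = await waitForKernel(document);
	return kernelReady ? [new vscode.CompletionItem('example')] : [];
}

function isNotebookCell(document: vscode.TextDocument): boolean {
	// Placeholder heuristic for the sketch only.
	return document.uri.scheme === 'notebook-cell';
}

async function waitForKernel(_document: vscode.TextDocument): Promise<boolean> {
	return true; // stub standing in for the real kernel lookup
}
```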
Raj
da3fbd386d #4132 fix for displayError (#4133) 2019-02-22 16:00:59 -08:00
Chris LaFreniere
1e915aad20 Make "Double-click to edit" in empty md cell a bit nicer looking (#4102) 2019-02-22 16:00:35 -08:00
Kevin Cunnane
889d5e5b28 Add loading spinner for insight widgets while they're in a loading state (#4136)
- This was very confusing without a status indicator: is it complete or in progress?

Please let me know if you see issues with this e.g. was there a reason we didn't do this up to now?

Initial state:

![image](https://user-images.githubusercontent.com/10819925/52883300-4e5d5f00-311f-11e9-988c-f7a70dc0a899.png)

On loading / error:
![image](https://user-images.githubusercontent.com/10819925/52883203-faeb1100-311e-11e9-9757-13a5fd4104c6.png)
2019-02-22 16:00:06 -08:00
Matt Irvine
51cf2df2f8 Reapply changes to publish.ts script (#4138) 2019-02-22 15:52:10 -08:00
Matt Irvine
e690285d9d Undo accidental merge commits (#4137) 2019-02-22 14:52:43 -08:00
Raj Musuku
81d6423f24 Merge branch 'master' of https://github.com/Microsoft/azuredatastudio 2019-02-22 12:58:25 -08:00
Alan Ren
046dee7389 add link area support for text component (#4103)
* add link area support for text component

* add comment for localizable resources

* address comments
2019-02-22 12:53:09 -08:00
Raj Musuku
a33a24dddf Merge branch 'master' of https://github.com/Microsoft/azuredatastudio 2019-02-22 08:45:17 -08:00
Anthony Dresser
636bdbd12c copycat bot yml (#4125) 2019-02-21 22:51:13 -08:00
Raj Musuku
0ab3492afd Merge from master 2019-02-21 18:07:18 -08:00
Raj Musuku
666ae11639 Merge from master 2019-02-21 17:56:04 -08:00
Matt Irvine
826856c390 Merge VS Code 1.30.1 (#4092) 2019-02-21 17:17:23 -08:00
Chris LaFreniere
a764a481f3 Ensure that we preserve rest of PATH when starting Jupyter (#4109) 2019-02-21 11:54:08 -10:00
Anthony Dresser
e345090015 Data Explorer Sourcing (#4033)
* added initial data explorer viewlet

* added dataexplorer contribution point

* removed test view

* remove unused imports

* inital data source, needs work

* add shim for ext host

* formatting

* making the necessary changes to use OE for tree view; need to look at TreeUpdateUtils.connectAndCreateOeSession

* formatting

* shimming oe more

* update to add correct context

* working cross provider; need to fix connection

* connection works but it adds the connection to the oe for some reason

* formatting

* add custom connection dialog code path

* hashing between trees

* added flag and tests

* add id maps to handle multiple nodepaths

* add necessary carbon edit parts

* keep current behavior in prod

* fix tests

* address comments

* update comments to be more informative

* finish merge

* update comments

* fix whitespace
2019-02-21 13:47:59 -08:00
Aditya Bist
cc97198fe4 Agent improvements and fixes (#4077)
* fixed new/edit job break when no start step given

* bump agent version

* agent version bump
2019-02-20 15:56:54 -08:00
Aditya Bist
5a146e34fa fixed scrolling for connection viewlet (#4073)
* fixed scrolling for connection viewlet

* removed horizontal scrolling
2019-02-20 13:51:56 -08:00
Cory Rivera
70838c3e24 Move SQL 2019 extension's notebook code into Azure Data Studio (#4090) 2019-02-20 10:55:49 -08:00
Chris LaFreniere
2dd71cbe26 Change SQL kernel to check queryManagementService instead of hardcoding (#4098)
* Change SQL kernel to check queryManagementService instead of hardcoding

* addressing PR comments
2019-02-19 15:55:11 -10:00
Chris LaFreniere
32c013a72c Add total execution time message for SQL notebooks (#4093)
* Display total execution time for sql notebooks

* remove handlebatchstart since it was unnecessary
2019-02-19 15:43:16 -10:00
Chris LaFreniere
db6e1ae558 Merge branch 'master' of https://github.com/microsoft/azuredatastudio 2019-02-19 17:38:10 -08:00
Chris LaFreniere
4117da6e93 undo remove sql kernel setting 2019-02-19 17:38:00 -08:00
Chris LaFreniere
02b1ba03ed remove sql kernel setting 2019-02-19 17:35:33 -08:00
Kevin Cunnane
1f501f4553 Improve cell language detection and add support for language magics (#4081)
* Move to using notebook language by default, with override in cell
* Update cell language on kernel change
* Tweak language logic so that it prefers code mirror mode, then falls back since this was failing some notebooks
* Add new package.json contribution to define language magics. These result in cell language changing. Language is cleared out on removing the language magic
* Added support for executing Python, R and Java in the SQL Kernel to prove this out. It converts to the sp_execute_external_script format

TODO in future PR:

* Need to hook up completion item support for magics (issue #4078)
* Should add indicator at the bottom of a cell when an alternate language has been detected (issue #4079)
* On executing Python, R or Java, should add some output showing the generated code (issue #4080)
2019-02-19 17:05:56 -08:00
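The commit above mentions that Python, R, and Java cells are converted to the sp_execute_external_script format when run in the SQL kernel. Below is an illustrative sketch of what such a conversion could look like; the helper, the magic parsing, and the exact SQL formatting are assumptions, not the real implementation.

```ts
// Illustrative: wrap a cell starting with a language magic (e.g. %%python)
// in sp_execute_external_script so a SQL endpoint can execute it.
const MAGIC_LANGUAGES: { [magic: string]: string } = { python: 'Python', r: 'R', java: 'Java' };

function toExternalScript(cellSource: string): string {
	const match = cellSource.match(/^%%(python|r|java)\s*\n([\s\S]*)$/i);
	if (!match) {
		return cellSource; // plain T-SQL cell, run as-is
	}
	const language = MAGIC_LANGUAGES[match[1].toLowerCase()];
	const script = match[2].replace(/'/g, "''"); // escape quotes for the N'' literal
	return `EXEC sp_execute_external_script @language = N'${language}', @script = N'${script}';`;
}

// e.g. toExternalScript('%%python\nprint(1 + 1)')
```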
Chris LaFreniere
0205d0afb5 Change default max table rows returned in notebook to 5000, make it user configurable (#4084) 2019-02-19 15:01:47 -10:00
Anthony Dresser
ccf9bf4613 change rendering in panel to fix event handelrs (#4082) 2019-02-19 12:48:07 -08:00
Anthony Dresser
d4704e39ac Another code layering (#4037)
* working on formatting

* fixed basic lint errors; starting moving things to their appropriate location

* formatting

* update tslint to match the version of vscode we have

* remove unused code

* work in progress fixing layering

* formatting

* moved connection management service to platform

* formatting

* add missing file

* moving more servies

* formatting

* moving more services

* formatting

* wip

* moving more services

* formatting

* move css file

* add missing svgs

* moved the rest of services

* formatting

* changing around some references

* formatting

* revert tslint

* revert some changes that break things

* formatting

* fix tests

* fix tests

* fix tests

* fix tests

* fix compile issue
2019-02-19 12:11:54 -08:00
Chris LaFreniere
4a82abc19b Notebooks: Greatly Reduce Time to Generate HTML Table String (#4086)
* Greatly reduce time to generate html table string

* change outer tag to table instead of html

* address PR feedback for more descriptive variable name
2019-02-19 09:03:58 -10:00
Chris LaFreniere
3ae32ab0a0 Fix the Attempt to use a destroyed view Errors (#4087) 2019-02-19 08:28:55 -10:00
Alan Ren
1cc6a108a7 Deprecate the modelviewdialog namespace (#4075)
* Deprecate the modelviewdialog namespace

* use @deprecated tag
2019-02-19 09:43:37 -08:00
Chris LaFreniere
d4ffe53dbd Add database name to attach to (if not connected to master) (#4076) 2019-02-15 16:50:14 -10:00
Yurong He
f43b3508e9 Added SQL notebook IntelliSense (#4064)
* Added SQL notebook intelliSense

* Resolved PR comments.

* catch disconnect error
2019-02-15 15:53:06 -08:00
Chris LaFreniere
ea0c326d3d Allow code coverage command to succeed again (#4054) 2019-02-15 12:07:52 -10:00
Kevin Cunnane
4843480fbf Update server reports extension version, fix its build breaks, and reduce its size to 86Kb (#4062)
* Updated version
* On building, found it had a build break due to unused imports. Turns out none of the code is used (it's just a package.json + SQL file extension), so removed it all
* Removed all unnecessary node module imports, reducing size from 5.4MB to 86KB. We should probably do this for all extensions
2019-02-15 10:33:22 -08:00
Chris LaFreniere
930e14e258 Add bottom margin to notebook table, fix python highlighting (#4055) 2019-02-15 08:32:27 -10:00
David Shiflet
87bbb41fb6 window reuse for connections (#4049)
* window reuse for connections

* space after colon

* use undefined instead of null
2019-02-15 09:44:06 -05:00
Kevin Cunnane
e767f68f89 Move New Notebook to the connection node in MSSQL server OE connections (#4053) 2019-02-14 15:47:36 -10:00
Raj
b4d304c21e 'Attach to' with Spark kernel resets to sql connection on cancelling connection dialog (#4024)
* Sql connection resets to Select Connection on cancelling dialog

* Hiding error message when cancelling the connection dialog
2019-02-14 16:39:23 -08:00
Chris LaFreniere
db1f412dae show errors and messages in output (#4031) 2019-02-13 14:54:51 -10:00
PaulRandal
a73f5e83ec Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list in waits script (#4030)
* Update waits_paul_randal.sql

Added the PREEMPTIVE_OS_FLUSHFILEBUFFERS wait to the list to ignore

* Update waits_detail_paul_randal.sql

Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list.
2019-02-13 13:21:17 -08:00
Karl Burtram
49b2e98a8f Update readme and changelog for Feb release (#4025) 2019-02-13 09:33:40 -08:00
Alan Ren
b3a16fd0ce Feature/bdc create (#4012)
* initial checkin

* rename

* wizard pages

* target cluster radio button group

* resource strings

* existing cluster picker

* revert changes to unwanted file

* revert unwanted changes-2

* update cluster icon

* settings page

* fix group container

* hyperlink component

* address review comments

* comments part 2
2019-02-12 22:13:30 -08:00
Karl Burtram
dd6735ec04 Bump Azure Data Studio to 1.5.0 2019-02-12 21:10:55 -08:00
Chris LaFreniere
9ebf1436d2 Ensure we always switch to a kernel that exists in the session manager (#4015)
* Ensure we always switch to a kernel that exists in the session manager

* PR feedback, create new helper method
2019-02-12 12:46:58 -10:00
Kevin Cunnane
0aa71b5237 Fix kernel name check bug, double-event hooking, and other Notebook issues (#4016)
* Fix error where kernel name was compared to itself. This doesn't break anything right now since we happen to have special handling of Python3, and for other kernels they share the same set of supported providers (which is what the check is used for). However it needs fixing for next release.
* Fix console error due to queryTextEditor trying to access model before it's ready
* Fix issues where notebook model hooked to session events multiple times
* Removed calls that weren't needed such as loadActiveContexts (if undefined, does nothing) and passing connection to initialize method
2019-02-12 14:26:22 -08:00
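A tiny sketch of the self-comparison bug described in the commit above: comparing a kernel name against itself is always true, so the provider lookup silently matched the wrong entry. The lookup shape is illustrative.

```ts
// Illustrative provider lookup; the buggy version compared k.name === k.name (always true).
function findProvidersForKernel(
	kernels: { name: string; providers: string[] }[],
	requestedName: string
): string[] {
	const kernel = kernels.find(k => k.name === requestedName);
	return kernel ? kernel.providers : [];
}
```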
Alan Ren
7ef2d52efd add admin pack to recommended ext list (#4019) 2019-02-12 13:00:31 -08:00
Karl Burtram
a5c6dfe62b Bump agent and import extension versions (#4018) 2019-02-12 12:53:52 -08:00
Alan Ren
4a606e0cb2 Alanren/admin pack (#4014)
* admin pack

* formatting and typo fixes

* remove changelog.md
2019-02-12 12:04:19 -08:00
Chris LaFreniere
27370c655d fix left table border to be dotted, no longer have table border on div (#4010) 2019-02-12 08:49:44 -10:00
Alan Ren
887f4e8985 Alanren/fix4001 (#4011)
* fix for 4001

* Revert "fix for 4001"

This reverts commit 91fed44f063acb44b844a206a13e4074150d8118.

* fix for issue 4001
2019-02-11 21:51:55 -08:00
Matt Irvine
c91c4b01f9 Fix bug minimizing a maximized result grid (#4007) 2019-02-11 21:51:06 -08:00
Yurong He
5f198dba08 Added hard coded pySpark3 kernel for analyze notebook. Good to know and fix it in time. (#4009) 2019-02-11 17:21:08 -08:00
Kevin Cunnane
67f9a7f5e4 Fix issues due to missing notebook values (specs and cells) (#4008)
- Fix #3844
    - Fix #3955
    - Specs can be null on early load of Jupyter kernels
    - Cells were missing in some reference test .ipynb files. We should be resilient to malformed files if possible.
2019-02-11 17:17:56 -08:00
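A sketch of the resilience described in the commit above: when a notebook file is empty or missing cells/kernelspec, fall back to sane defaults instead of throwing. The interface shapes and the SQL default kernelspec are simplifications assumed for illustration.

```ts
// Illustrative defensive parsing for malformed or empty .ipynb content.
interface NotebookContents {
	cells: { cell_type: string; source: string | string[] }[];
	metadata: { kernelspec?: { name: string; display_name: string } };
}

function parseNotebook(json: string): NotebookContents {
	let raw: any = {};
	try {
		raw = JSON.parse(json || '{}');
	} catch {
		// empty or invalid file: treat it as a new, empty notebook
	}
	return {
		cells: Array.isArray(raw.cells) ? raw.cells : [],
		metadata: {
			kernelspec: raw.metadata && raw.metadata.kernelspec
				? raw.metadata.kernelspec
				: { name: 'SQL', display_name: 'SQL' } // assumed default for the sketch
		}
	};
}
```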
Yurong He
62404721ed Fixed #3954 pass connection info to new notebook flow (#4000)
* Fixed #3954 
The problem is: connectionProfileId is not passed into New Notebook flow.
The fix is: plumbing connectionProfileId via NotebookInput.

* Resolved PR comments
2019-02-11 15:28:05 -08:00
Kevin Cunnane
6d37329e74 Fix #3989 notebook execution count should start empty (#4004)
- Fixed issues where we missed using actual execution count / starting from empty on this.
2019-02-11 15:18:54 -08:00
Karl Burtram
0c316d3225 Bump Azure Data Studio to 1.4.5 2019-02-11 09:17:37 -08:00
Kevin Cunnane
131644477d Beginning of fix for notebook perf (#3984)
Fixes #3843. Now includes full fix which limits length and ensures a scrollbar is available

- Set max size for editor. 4000px gets us 200-250 lines before needing a scrollbar. 
- Adds layout updating which should also ensure accurate line highlighting to the right of the editor. What's happening is the initial size is slightly off, so we need to lay out a 2nd time (i.e. layout once, let flex figure things out, then layout a 2nd time). This isn't optimal as there's a minor perf hit, but it isn't noticeable overall.

To consider in future PRs:
- Add user configurable setting for max length?
- Handle case where we scroll to bottom but scrollbar is at the top. 
- Consider how intellisense will work on this. We may need to split into a window around the current code when sending to the kernel as it's quite likely that doing a 12K line intellisense request will be too big.
2019-02-09 13:44:53 -08:00
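A minimal sketch of the sizing approach in the commit above: cap the cell editor's height so very long cells get a scrollbar, and lay out a second time after flex has settled. The 4000px cap comes from the commit; the editor shape and the setTimeout-based second pass are assumptions for illustration.

```ts
// Illustrative: clamp editor height and schedule a second layout pass.
const MAX_EDITOR_HEIGHT_PX = 4000;

interface Layoutable { layout(size: { width: number; height: number }): void; }

function layoutCellEditor(editor: Layoutable, contentHeight: number, width: number): void {
	const height = Math.min(contentHeight, MAX_EDITOR_HEIGHT_PX);
	editor.layout({ width, height });
	// The first pass can be slightly off while flex resolves, so lay out once
	// more after the DOM settles.
	setTimeout(() => editor.layout({ width, height }), 0);
}
```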
Kevin Cunnane
b964dd0895 Fix #3985 Hide cell toolbar for markdown cells (#3987)
* Fix #3985 Hide cell toolbar for markdown cells
* Note that I'm still hiding the overall toolbar section per UX feedback
* Also now hiding line numbers per UX feedback..
2019-02-08 16:38:28 -08:00
Raj
7dd32ed44b Notebook server shutdown error (#3976)
* fix #3959 - Notebook shutdown error

* Removing the unit test case to stopserver from clientSession
2019-02-08 13:51:00 -08:00
Chris LaFreniere
0b6aedfc93 Address notebook margin and border CSS issues (#3977) 2019-02-08 10:01:09 -10:00
Chris LaFreniere
b692088c94 Stop map column names for notebook grid, instead use field with unique values (#3975) 2019-02-08 09:59:36 -10:00
Karl Burtram
160ab8d0ae Bump html_query_plan to 2.6 (#3982) 2019-02-08 11:47:33 -08:00
Kevin Cunnane
a599cb436a Run upgrade on npm and yarn packages to update lodash to 4.1.7.11 (#3983)
- This is a recommended update, let me know if you have concerns
- Updated all samples and Azure Data Studio specific extensions with lodash dependency
2019-02-08 11:44:08 -08:00
Yurong He
294aa81298 Added serverVersion to contextProvider key, so Sql server preview das… (#3981)
* Added serverVersion to contextProvider key, so the SQL Server preview dashboard tab can be filtered by it.

* User major version instead of serverVersion
2019-02-08 11:17:46 -08:00
Kevin Cunnane
ddc4b3dd6e Support execution count in run button and align correctly (#3979)
Fixes #3931 
- Align run button correctly so it's centered in new cell
- Refactor to support multi-state button.
  - Hidden state is set to show execution count
  - Stopped state shows run button
  - Running state shows stop button
  - Error state (will) show error button. This isn't fully handled right now
- Add execution count to model and to SqlKernel, verify serialization, loading, update matches other notebook viewers

**Notes on implementation**:
I think this is a decent solution for a) showing execution count as text, and b) perfectly centering the run button.
 
The below solution shows count correctly up to 999 runs (that’s clicking 999 times in a single session); the icon lines up just about right with [ ], and for other numbers it is pretty close but probably not exactly right. I wish I could solve this to work better, but trying to change text float to center etc. really isn’t working.
 
**Screenshots**:
![image](https://user-images.githubusercontent.com/10819925/52466366-e8794200-2b36-11e9-9a50-86893e75d5af.png)

With running cell:
![image](https://user-images.githubusercontent.com/10819925/52466378-f333d700-2b36-11e9-9e6c-3cee098790fd.png)
2019-02-08 11:05:03 -08:00
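A minimal sketch of the multi-state button described in the bullets above; the enum and class are illustrative stand-ins, not the shipped implementation:
```ts
// A sketch of the four-state run button described above.
enum RunButtonState {
    Hidden,    // shows the execution count, e.g. [3]
    Stopped,   // shows the run button
    Running,   // shows the stop button
    Error      // (eventually) shows an error button
}

class RunCellButtonModel {
    private _state = RunButtonState.Stopped;
    private _executionCount: number | undefined;

    onCellRunStarted(): void {
        this._state = RunButtonState.Running;
    }

    onCellRunCompleted(executionCount: number | undefined, failed: boolean): void {
        this._executionCount = executionCount;
        this._state = failed ? RunButtonState.Error : RunButtonState.Hidden;
    }

    get label(): string {
        switch (this._state) {
            case RunButtonState.Hidden:
                // Execution count rendered as text, e.g. [3]; empty before any run.
                return `[${this._executionCount === undefined ? ' ' : this._executionCount}]`;
            case RunButtonState.Running:
                return 'Stop';
            case RunButtonState.Error:
                return 'Error';
            default:
                return 'Run';
        }
    }
}
```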
Matt Irvine
4c5bf3ad2b Add insiders build links to readme (#3980) 2019-02-08 09:42:46 -08:00
Aditya Bist
88c33214c6 Dataexplorer viewlet (#3967)
* added initial data explorer viewlet

* added dataexplorer contribution point

* removed test view

* remove unused imports

* added flag and tests

* CR comments

* added icon for data explorer
2019-02-07 17:18:05 -08:00
kisantia
393be65aa6 Add Deploy Plan page to DacFx wizard (#3911)
* upgrade plan is piped through and returns the xml plan

* Added review deploy plan page

* checkbox validation now working and columns formatted

* formatting and cleaning up code

* refactored populateTable()

* addressing comments

* addressing comments

* updating tooltips

* add padding to table cells to align with headers

* fix problems when going back and forth between pages and changing config options

* bump sqltoolsservice version to 71

* fix localization
2019-02-07 16:39:22 -08:00
Chris LaFreniere
9ce9a1598f Improve notebook editor height calculations (#3966)
* Improve notebook editor height calculations

* PR comments, hook up to onDidChangeConfiguration
2019-02-07 14:09:09 -10:00
Yurong He
d9079fe18e Fixed #3888. Unclear why it worked in the previous extension but not in mssql after porting; it looks like utils.getErrorMessage is needed to get the message. (#3969) 2019-02-07 11:47:53 -08:00
Yurong He
3cde070d3b Added submit spark job to data service context menu (#3968) 2019-02-07 11:46:59 -08:00
Karl Burtram
b2a5f65a77 Add query action bar spacing for XML button (#3923) 2019-02-07 11:33:23 -08:00
Kevin Cunnane
69dff5a2cb Fix #3937 Create new notebook (Mac) and receive TypeError (#3965)
- Handles empty file scenario, with fixes along the way for missing metadata (bonus win)
- In non-empty file still shows error and kernel stuck in loading state. #3964 opened to track this issue and fix later
2019-02-07 10:35:19 -08:00
Kevin Cunnane
40e0d5cfbf Fix toggle more actions staying visible, and clickable issues (#3949)
- Fixed so it's now invisible instead of empty when not selected.
 - This fixes clickability and issue where it stayed visible in 1 fix
- Also fixed cell output action which used active cell instead of context cell.
2019-02-07 09:35:58 -08:00
Raj
5a0100757f Attach To is set to 'Localhost' upon cancelling the connection dialog (#3941)
* #3924: Attach To sets 'Localhost' upon cancelling the connection dialog

* Indentation
2019-02-06 16:05:42 -08:00
Kevin Cunnane
f9fe88898d Fix #3928 'Clear output' in ... for markdown cells (#3935)
- Add filtering support for actions, and use for the Clear Output action
2019-02-06 15:49:20 -08:00
Anthony Dresser
a2d6955f79 check for undefined on query info (#3933) 2019-02-06 15:17:45 -08:00
Yurong He
8fa247145e As PM suggested, moved it to Data Services node. (#3930) 2019-02-06 14:22:37 -08:00
Yurong He
04bb65dcf7 Fix Linux EMFILE: too many open files problem. (#3912)
* Fix Linux EMFILE: too many open files problem.
Separated mssql from packageTask like azurecore.

* Make mssql depend on azurecore

* Minor fixes
2019-02-06 14:09:19 -08:00
Karl Burtram
e4884c7835 Bump SQL Tools Service to 1.5.0-alpha.70 2019-02-06 12:06:23 -08:00
Gene Lee
8b9ce3e8de Spark features with dashboard are enabled (#3883)
* Spark features are enabled

* Fixed as PR comments

* minor change

* PR comments fixed

* minor fix

* change constant name to avoid conflicts with sqlopsextension

* sqlContext to context

* Changed tab name to SQL Server Big Data Cluster

* Added isCluster to ContextProvider to control displaying the big data cluster dashboard tab
Ported New/Open Notebook code to the mssql extension and enabled it in the dashboard

* Fixed tslint
2019-02-06 11:54:25 -08:00
kisantia
327a5f5fae Add tooltip for table column headers and align header and cell (#3909)
* Adding tooltip and lining up header and cell

* moving padding to separate css class
2019-02-06 11:46:24 -08:00
gbritton1
50b971477b Removed reference to object explorer (#3463)
Removed reference to object explorer since ADS does not have one
2019-02-06 10:43:46 -08:00
Anthony Dresser
07c7eea2df reverse data array on repopulation (#3907) 2019-02-06 10:41:55 -08:00
Raj
42135d3e53 #3897: Unified connection integration - sql connection improvements (#3910)
* #3897: Unified connection integration - sql connection improvements

* variable name change

* Misc changes

* Misc change
2019-02-06 10:33:21 -08:00
Kevin Cunnane
d74e5e6457 Fix regression where border line between editor and output was lost (#3915) 2019-02-05 21:21:59 -08:00
Kevin Cunnane
a2c7377134 Improve notebook colors and UX (#3914)
This was reviewed / worked on with Smitha and will be signed off on by PM via mail.
1 thing left (make run button look better when not selected) will be done in a separate review.

Changes
- Add top/bottom padding to editor so it's not cramped
- Added an (on by default) setting `notebook.overrideEditorTheming`. This controls whether new colors etc. are used for notebook editors or if users should see vanilla UI like in standard editor. Settings under this flag are:
  - When unselected, editor has same color as toolbar. On selection it goes back to regular editor view so colors work "right"
  - In standard light/dark themes we now use a filled in background color instead of border box.
2019-02-05 17:51:42 -08:00
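For illustration, reading the setting named above might look like the sketch below; the configuration service interface is a simplified VS Code-style stand-in, not the real one:
```ts
// Illustrative only: gate the notebook-specific colors on the setting named
// above, using a simplified VS Code-style configuration service interface.
interface IConfigurationService {
    getValue<T>(key: string): T;
}

function shouldOverrideEditorTheming(config: IConfigurationService): boolean {
    // On by default per the commit description; users can opt back into the
    // vanilla editor look by turning the setting off.
    return config.getValue<boolean>('notebook.overrideEditorTheming') !== false;
}
```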
Raj
0e54393d5a Unified connection in notebooks (#3898)
* yarn files

* #3897: Integrate unified connection with notebooks

* ConnectionProfile serialization in unified connection

* #3898 Handle connection validation from extension

* Removing unused namespaces

* Remove constant

* Show a detailed error message on changing context

* Indentation
2019-02-05 11:54:29 -08:00
Yurong He
8cf8cefc92 Removed dup OE contribution. The conflict wasn't detected during checkin. (#3906) 2019-02-05 11:36:33 -08:00
Kevin Cunnane
098c40e9ac Use document-style for Notebooks (#3902)
* Added hover support, adding box shadow and light outline on hovering and the "more actions" button showing on hover
* Added box shadow for dark themes (hooray!)
* Remove border from everything but the code cell unless a cell is selected or hovered over. This ensures this looks like a document
* Fix high contrast theming issues.
2019-02-05 11:28:07 -08:00
Yurong He
80c1c4c6c8 Mssql extension exposes OE getNode API for Sql-2019vNext extension (#3901)
* Mssql extension exposes OE getNode API for Sql-2019Vnext extension

* Resolved PR comments
2019-02-04 19:55:32 -08:00
Yurong He
ef8afab7e8 Added error node to OE tree (#3889)
* Add error node to OE tree

* Add globalerror_red.svg for error node.

* Fixed wrong import resolved automatically

* Resolve PR comments
2019-02-04 19:18:58 -08:00
Yurong He
84e0e08aec Ported Analyze notebook code from SqlOpsStudio and made it work. (#3899)
* Ported Analyze notebook code from SqlOpsStudio and made it work.
If config.notebook.sqlKernelEnabled is true, use the SQL provider;
use the Jupyter provider if Python is installed, otherwise use the built-in kernel.

* The Analyze in Notebook kernel can only be Python or "No Kernel", so remove the SQL kernel.
2019-02-04 15:41:01 -08:00
Kevin Cunnane
2fce771214 Run and Add Cell keybinding support (#3896)
- As part of this, fixed bug in the insertCell API where it didn't add to the end / failed if no cells existed
2019-02-04 14:02:15 -08:00
Yurong He
15929e8cf2 Add new notebook to OE server context menu (#3892) 2019-02-04 11:25:33 -08:00
Kevin Cunnane
f1c8ec141a Make run cell button float so it's always visible (#3895)
- Make the toolbar sticky and remove overflow:hidden which blocked this working.
2019-02-04 09:32:38 -08:00
kisantia
a62393e0ed Add width and css options for TableColumn (#3893) 2019-02-02 19:27:35 -08:00
Yurong He
a6defd9b62 Fixed build issue: ERROR: D:/a/1/s/src/sql/workbench/services/notebook/common/sqlSessionManager.ts[21, 1]: Duplicate imports for 'sqlops'. (#3894) 2019-02-01 14:39:09 -08:00
Kevin Cunnane
374212beaa Fix bug where results were added to all cells, and support multiple resultsets (#3890)
- SQLKernel is the only place to listen for batch and query complete messages now
- It routes to the one and only future (since there can only be one at a time)
- It handles query cancellation and not-connected issues correctly
2019-02-01 13:53:10 -08:00
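A simplified sketch of the single-future routing described above; the interfaces and class are stand-ins for illustration, not the real sqlops/azdata types:
```ts
// Only one cell can execute at a time, so there is exactly one future to route to.
interface IFuture {
    handleMessage(msg: { type: 'batchComplete' | 'queryComplete' | 'error'; payload?: unknown }): void;
}

class SqlKernelMessageRouter {
    private _activeFuture: IFuture | undefined;

    beginExecution(future: IFuture): void {
        this._activeFuture = future;
    }

    onBatchComplete(payload: unknown): void {
        if (this._activeFuture) {
            this._activeFuture.handleMessage({ type: 'batchComplete', payload });
        }
    }

    onQueryComplete(payload: unknown): void {
        if (this._activeFuture) {
            this._activeFuture.handleMessage({ type: 'queryComplete', payload });
            this._activeFuture = undefined; // execution finished
        }
    }

    // Covers both user cancellation and not-connected errors.
    onCancelOrDisconnect(reason: string): void {
        if (this._activeFuture) {
            this._activeFuture.handleMessage({ type: 'error', payload: reason });
            this._activeFuture = undefined;
        }
    }
}
```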
Kevin Cunnane
5132e62045 Fix #3734 Notebook cells are shown empty some times even when there is content (#3878)
- Editor layout gets called sometimes when other events happen (and Notebook isn't visible)
- Add in a layout call on re-setting input so the cell is updated. This fixes the problem by laying out once the UI is visible again.

Note: long term, should really be destroying the UI (while preserving the model), then restoring it including scroll selection etc. and hooking back up to the model. That is... much more work, but something we'll need long term to avoid issues where we have many Notebooks open at once. Not in scope for this PR
2019-02-01 10:11:45 -08:00
Kevin Cunnane
9504ede1f3 Fix #3875 Notebook stuck Loading Kernels if SQL flag disabled and Jupyter not installed (#3876) 2019-02-01 10:10:21 -08:00
Kevin Cunnane
afb6e6b5ba Fix some cell UI issues (toolbar background color, unselected cells) (#3881)
- Toolbar background is now differentiated from the editor
- For unselected cells there's no longer a line selection in the cell. This makes it clearer what the active cell is (and cleans the UI up)
2019-02-01 08:03:23 -08:00
Yurong He
60b2b92803 Fixed #3873 by updating the version of vscode-nls (#3879)
Added dependencies needed for prompts.
2019-01-31 17:57:04 -08:00
Aditya Bist
6113311fda preserve whitespace in messages (#3821) 2019-01-31 15:23:10 -08:00
Yurong He
ecac6201d0 Rename nodeType name in order to have file context menu in both mssql and SqlOpsStudio (#3862)
* Added data service context menu: file related operations. All new files are ported from SqlOpsStudio. Will remove this functionality from SqlOpsStudio.

* Used the existing constant hadoopKnoxEndpointName

* Rename the nodeType from hdfs to bdc so we can have the file context menu in both mssql and SqlOpsStudio. Still need to add "Create External Table from CSV" support for the bdc nodeType

* Rename bdc to mssqlcluster
2019-01-31 13:34:59 -08:00
Chris LaFreniere
90d8c37f91 Fix a not implemented issue when we were not sanitizing kernel display name (#3869)
Fixing an issue where we got a 501 Not Implemented because kernel display name sanitization was not occurring with the _defaultKernel case.

In addition, changed a method name to make it more clear, and removed an erroneous error that would occur every time you opened a notebook without any existing connections. I'm just removing this, as it adds no value.
2019-01-31 09:50:27 -08:00
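A sketch of the kind of sanitization implied here; the helper is hypothetical, and the constraint it encodes is Jupyter's rule that kernelspec names may only contain ASCII letters, digits and the separators '-', '.' and '_':
```ts
// Sketch only; the helper is hypothetical.
function sanitizeKernelName(displayName: string): string {
    return displayName.trim().replace(/[^a-zA-Z0-9._-]/g, '_');
}

// The fix amounts to running the default kernel through the same path as an
// explicitly selected kernel, e.g.:
const defaultKernelDisplayName = 'Python 3';
const requestedKernelName = sanitizeKernelName(defaultKernelDisplayName); // 'Python_3'
```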
Chris LaFreniere
c43085beab Fix weird exception when no connection is present for SQL kernel, Limit Max Rows to 2000 (#3870)
Fixes #3856. Matches the Jupyter behavior that we have, where we don't show any message when a connection is required. We no longer will throw a bizarre exception about getOptionsKey being undefined.

Also sets max rows returned to 2000.
2019-01-31 09:38:59 -08:00
Kevin Cunnane
d9c383b2ef Remove notebook.enabled feature flag (#3866)
* Remove notebook.enabled feature flag

* Fix build error with package.json typos
2019-01-31 09:34:50 -08:00
Matt Irvine
100938b0e5 Add bytes dependency to mssql (#3867) 2019-01-30 17:29:41 -08:00
Chris LaFreniere
83a6ee0a22 Change feature flag for SQL kernel to be user preference (#3838)
* Change feature flag for SQL kernel to be user preference

* fix test that was broken

* Tweak package.nls.json to show "(Preview)"
2019-01-30 17:29:08 -08:00
Chris LaFreniere
0dab7f02ed Notebooks: Grid Support (#3832)
* First grid support in notebooks

* still trying to get nteract ipynb to display grid correctly

* works opening with existing 'application/vnd.dataresource+json' table

* fixing merge issue due to core folder structure changing a bit

* PR feedback, fix for XSS
2019-01-30 16:56:14 -08:00
Chris LaFreniere
0e6f2eb1cd Only show placeholder when notebook isn't loading (#3863) 2019-01-30 16:22:24 -08:00
Chris LaFreniere
9a371f8998 Fix for Select Connection not showing up in Attach To (#3860) 2019-01-30 16:22:01 -08:00
David Shiflet
8a7bbd1795 Pass connectionid to registered commands from command line (#3861)
* pass connectionid to registered commands from commandline

* remove blank lines

* fix commandline unit test
2019-01-30 18:18:25 -05:00
Kevin Cunnane
d1fef24723 Support Notebook integration testing by adding APIs & fixing others (#3848)
- Added `runCell` API. Updated runCell button to listen to events on the model so it'll reflect run cell when called from other sources
- Plumbed through kernelspec info to the extension side so when changed, it's updated
- Fixed bug in ConnectionProfile where it didn't copy from options but instead overrode with empty wrapper functions

Here's the rough test code (it's in the sql-vnext extension and will be out in a separate PR)
```ts

    it('Should connect to local notebook server with result 2', async function() {
        this.timeout(60000);
        let pythonNotebook = Object.assign({}, expectedNotebookContent, { metadata: { kernelspec: { name: "python3", display_name: "Python 3" }}});
        let uri = writeNotebookToFile(pythonNotebook);
        await ensureJupyterInstalled();

        let notebook = await sqlops.nb.showNotebookDocument(uri);
        should(notebook.document.cells).have.length(1);
        let ran = await notebook.runCell(notebook.document.cells[0]);
        should(ran).be.true('Notebook runCell failed');
        let cellOutputs = notebook.document.cells[0].contents.outputs;
        should(cellOutputs).have.length(1);
        let result = (<sqlops.nb.IExecuteResult>cellOutputs[0]).data['text/plain'];
        should(result).equal('2');

        try {
            // TODO support closing the editor. Right now this prompts and there's no override for this. Need to fix in core
            // Close the editor using the recommended vscode API
            //await vscode.commands.executeCommand('workbench.action.closeActiveEditor');
        }
        catch (e) {}
    });

    it('Should connect to remote spark server with result 2', async function() {
        this.timeout(240000);
        let uri = writeNotebookToFile(expectedNotebookContent);
        await ensureJupyterInstalled();

        // Given a connection to a server exists
        let connectionId = await connectToSparkIntegrationServer();

        // When I open a Spark notebook and run the cell
        let notebook = await sqlops.nb.showNotebookDocument(uri, {
            connectionId: connectionId
        });
        should(notebook.document.cells).have.length(1);
        let ran = await notebook.runCell(notebook.document.cells[0]);
        should(ran).be.true('Notebook runCell failed');

        // Then I expect to get the output result of 1+1, executed remotely against the Spark endpoint
        let cellOutputs = notebook.document.cells[0].contents.outputs;
        should(cellOutputs).have.length(4);
        let sparkResult = (<sqlops.nb.IStreamResult>cellOutputs[3]).text;
        should(sparkResult).equal('2');

        try {
            // TODO support closing the editor. Right now this prompts and there's no override for this. Need to fix in core
            // Close the editor using the recommended vscode API
            //await vscode.commands.executeCommand('workbench.action.closeActiveEditor');
        }
        catch (e) {}
    });
});
```
2019-01-30 14:24:14 -08:00
Yurong He
3ddc5e7846 Added Unified connection support (#3785)
* Added Unified connection support

* Use generic way to do expandNode.
Cleaned up the ported code and removed unreferenced code; it can be added back as needed later.
Resolved PR comments.

* Minor fixes and removed the timer for all expanders for now. If any provider can't respond, the tree node will spin and wait. We may improve this later.

* Change handSessionClose to not be thenable.
Added a node to OE to show an error message instead of rejecting, so we can show partial expanded results if we get any.
Resolve PR comments

* Minor fixes of PR comments
2019-01-29 14:37:14 -08:00
Yurong He
b439ea45ec Bump to 69 to fix #3839 doesn't have model.sys.assemblies (#3842) 2019-01-29 10:09:45 -08:00
Karl Burtram
5680785f86 Bump Azure Data Studio to 1.4.4 2019-01-28 15:58:14 -08:00
Chris LaFreniere
e8eb7bec1b Fix notebook selection issues including from placeholder (#3836) 2019-01-28 14:02:23 -08:00
kisantia
565b7404f9 Add generate script option to DacFx wizard (#3789)
* Add generate script option to deploy scenario

* add action to summary page and fixed page adding/removing so that summary page will have the correct step number

* updating contract based on change in sqltoolsservice

* added enums to make index checks more clear

* cleaned up onPageChanged()

* bump sqltoolsservice version to 68
2019-01-28 10:48:36 -08:00
Chris LaFreniere
9cffe4d476 Allow for "when" clause filtering for Notebook Toolbar Given ProviderId Changes (#3712)
* Integrate first SQL Notebooks Bits into Master (#3679)

* First crack tsql notebook (no output rendered yet)

* getting messages back

* intellisense working first cell, no connection errors

* sql notebook cell output functioning

* Latest SQL notebook changes

* Undo change to launch.json

* Plumbing providers through

* Kernels shown from multiple providers, can switch between them. No mementos yet

* Ensure we have a feature flag for SQL notebooks, ensure existing functionality still works

* Fix tslint duplicate imports issue

* Addressing PR comments

* second round of PR feedback to cleanup notebook service manager code

* merge latest from master

* Enable notebook toolbar actions to disable themselves on provider change

* Undo changes to taskbar/actionbar

* very minor change due to latest merge
2019-01-26 11:01:40 -08:00
Chris LaFreniere
43be88a37c SQL Kernel Improvements/Removing Spark Code from Core/Attach to Changes (#3790)
* Scenarios work besides loading saved kernel

* Fix compilation issue

* Save and load functional

* Fix loading kernels issue when SQL kernel is not enabled

* Fix language mapping to not be hardcoded any longer

* Remove unnecessary comment

* PR Comments vol. 1

* Code cleanup, use ConnectionProfile instead of IConnectionProfile when accessing serverName

* PR changes vol. 2

* One final comment for PR

* Fix linting issue
2019-01-25 18:54:04 -08:00
Anthony Dresser
ea67859de7 Initial Code Layering (#3788)
* working on formatting

* fixed basic lint errors; started moving things to their appropriate location

* formatting

* update tslint to match the version of vscode we have

* remove unused code

* work in progress fixing layering

* formatting

* moved connection management service to platform

* formatting

* add missing file

* moving more services

* formatting

* moving more services

* formatting

* wip

* moving more services

* formatting

* revert back tslint rules

* move css file

* add missing svgs
2019-01-25 14:52:35 -08:00
Aditya Bist
c8986464ec fixed arrows disappearing after tab change (#3829) 2019-01-25 13:29:39 -08:00
Aditya Bist
7804f94d8b Copy all messages when selecting all (#3818)
* copy all messages when selecting all

* added functionality for keyboard shortcuts

* fixed bug when select all then selection made

* made output similar to debug console
2019-01-25 12:10:00 -08:00
Alan Ren
bfa77aebfc add clear filter icon and update filter icon from Smitha (#3828) 2019-01-25 11:34:58 -08:00
Yurong He
487fb02313 Bump sqltoolservice version to 67 for unified connection support (#3827) 2019-01-25 11:24:26 -08:00
Aditya Bist
ef64038107 Added horizontal scrolling for explorer (#3819)
* added horizontal scrolling for explorer

* made horizontal scrolling auto
2019-01-24 13:31:19 -08:00
udeeshagautam
5d336accbc adding hover text for dashboard search grid items (#3816)
Fix for issue: Search widget in Manage dashboard truncates long names with no hover text to show full name (Ref issue: #3075)
2019-01-24 13:17:37 -08:00
Karl Burtram
99047b2866 Remove Ctrl-Alt keyboard shortcuts (#3810) 2019-01-23 17:23:59 -08:00
Aditya Bist
f611cf3b5a Improve Agent performance (#3804)
* pause and resume job history retrieval when opening dialogs

* review comments

* removed boolean for tab change
2019-01-23 14:25:54 -08:00
Karl Burtram
4ad059605c Update Azure account picker styles based on splitview change (#3791) 2019-01-23 10:50:41 -08:00
Karl Burtram
dc2ff97dd8 Bump Azure Data Studio to 1.4.3 2019-01-22 16:54:51 -08:00
Karl Burtram
2b5265c103 Fix infinite callbacks in Azure Resource Explorer (#3780) 2019-01-22 15:22:45 -08:00
Aditya Bist
2e98fde053 fixed resizing in agent because of slickgrid change (#3786) 2019-01-22 14:38:20 -08:00
Anthony Dresser
d5176e0eb7 remove updating row number column size (#3756) 2019-01-22 14:37:59 -08:00
Anthony Dresser
eb0b2a847b change stating to handle magnify state (#3746)
* change stating to handle magnify state

* fix magnify during state setup
2019-01-22 14:37:33 -08:00
Aditya Bist
cff5482f69 Show Azure Data Studio instead of azuredatastudio when updating (#3787)
* show Azure Data Studio instead of azuredatastudio when updating

* added sql carbon tag
2019-01-22 10:58:40 -08:00
Alan Ren
afc37973d0 Update readme.md 2019-01-22 10:53:26 -08:00
Alan Ren
3eada6c6ab Create readme.md 2019-01-22 10:21:18 -08:00
Aditya Bist
7c39268fe5 Agent - bug fixes and mini features (#3637)
* fixed scrollbar in jobs

* show steps tree when job history is opened

* cleaned and added edit job to job history

* scrollbars on step details

* steps scrolling done

* fixed styling

* fixed keyboard selection, navigation and UI

* fixed tabbing accessibility

* added refresh action to job history

* fixed focus on move step

* added remove schedule button

* fixed various bugs

* added errors for all actions

* review comments
2019-01-22 10:01:13 -08:00
Alan Ren
eb67b299de Alanren/integration test (#3657)
* add an extension for integration tests

* setup ads before running test

* test setup

* test cases

* bash script

* shorter temp folder name

* code cleanup

* add commented out original code

* fix test error

* test result path

* rename results file

* change file path

* report smoke test results

* test stabilization

* test stabilization and configurable test servers

* fix smoke test error

* connection provider

* simplify the integration test script

* add comment

* fix tslint error

* address PR comments

* add temp log to check whether the environment variable is already set

* remove temp log

* move api definition to testapi typing file

* exclude integration tests extension

* address comments
2019-01-18 17:00:30 -08:00
Alan Ren
3e7a09c1e3 Alanren/profiler filter (#3760)
* profiler filter

* add test cases

* perf improvement with bulk insert

* update dependency version and address comments
2019-01-18 16:25:18 -08:00
Karl Burtram
637dc9b9b2 Bump Azure Data Studio to 1.4.2 2019-01-18 09:48:44 -08:00
Karl Burtram
1de16d4715 Reset query messages for each execution (#3772) 2019-01-17 17:53:05 -08:00
Kevin Cunnane
49090d774d Null ref occurred when doing some UI interactions before the notebook model was set (#3769) 2019-01-17 14:41:56 -08:00
Karl Burtram
9a695b5cdd Reenable results stream by default (#3752) 2019-01-17 10:28:48 -08:00
Raj
e0339b50c0 #3753: User settings configuration - python installation path (#3754)
* #3753: User settings configuration - python installation path

* Text change

* #3753: Text change

* Message change
2019-01-16 16:29:06 -08:00
Karl Burtram
d0c584672f Fix Top Operations tab title (#3751) 2019-01-15 16:06:51 -08:00
Anthony Dresser
27816acaeb Remove custom splitview (#3467)
* working on options dialog

* working through options dialog

* trying to work through modifying options dialog

* working on converting scrollablesplitview

* fixed options working through profiler

* fix profiler

* fix account dialog

* trying to fix problems with splitpanel

* fix insights dialog

* moving through

* fix last list, need to verify looks and functionality

* fix look of account dialog

* formatting

* formatting

* working through scrollable bugs

* working on problem with view size

* fix margin issues

* fix styler for dialogs

* add panel styles to insights

* create instantiation issues

* fix test

* fix test

* remove unused code

* formatting

* working through insight dialog issues

* fix table updating

* remove console logs
2019-01-15 15:00:34 -08:00
Mustafa Sadedil
4de3cc8a09 Completed: Missing feature request: Save as XML (#3729)
* Save as XML feature added to grid

* Unrelated code removed
2019-01-15 14:36:42 -08:00
Karl Burtram
5c16ceb2fa Bump SQL Tools Service to pick up https://github.com/Microsoft/sqltoolsservice/pull/763 (#3748) 2019-01-15 14:12:20 -08:00
Raj
9db3f73413 Notebook Doesn't Prompt for Save even when isDirty #3568 (#3656)
This is a temp fix until native save is implemented.
2019-01-15 11:41:05 -08:00
Chris LaFreniere
e0ceddce09 Notebooks: Add Placeholder Cell, Fix Link Styling (#3728)
* Placeholder cell to add new real cells

* Fix links in notebooks to show correct color, rely on angular ngif for placeholder

* Fix failing test where one cell was expected by default

* Remove unnecessary TODO
2019-01-14 17:29:06 -08:00
Chris LaFreniere
6dc4096299 Editor focus based on activeCell, create text in edit, scroll to active (#3725) 2019-01-14 16:39:36 -08:00
Chris LaFreniere
1fa03b5c74 Ensure we always get all providers (#3724) 2019-01-14 16:38:57 -08:00
Kevin Cunnane
f8f57a93c3 Fix #3736 Notebook: cannot connect to SQL big data cluster due to empty config.json file (#3738)
- Writing the config file in the core for now, will look to move to the extension in Feb release
2019-01-14 14:14:32 -08:00
Karl Burtram
960fe63312 Bump Data Protocol client to 0.2.11 (#3739) 2019-01-14 14:04:35 -08:00
Karl Burtram
7545b94128 Turn off "something went wrong" message (#3606) 2019-01-11 17:39:13 -08:00
Karl Burtram
1263a27c1c Fix date in change log to 2019 (#3726) 2019-01-11 16:55:18 -08:00
Anthony Dresser
e1c084d365 fix html formatting in grid (#3722) 2019-01-11 16:24:50 -08:00
Karl Burtram
7465ec0bbd Add connection dialog icon dark theme and HC styles (#3721) 2019-01-11 13:38:43 -08:00
Chris LaFreniere
17ed57836f Fix focus issue when opening notebooks (#3711) 2019-01-11 11:37:49 -08:00
Chris LaFreniere
d0acb51fd7 Fix contentManager undefined when builtin manager used (#3710)
* Fix for contentManager undefined for builtin manager

* Clean up code some more
2019-01-11 10:36:36 -08:00
Anthony Dresser
71c1ed6c49 Add state for column sizing (#3683)
* add state for column sizing

* work properly with auto size columns
2019-01-11 10:25:57 -08:00
AlexFsmn
bfb68254a4 Added context menu for DBs in explorer view to backup & restore db. (#2277)
* Added context menu for DBs in explorer view to backup & restore db.
Fixed bug where progress bar didn't complete on backup/restore menu click
#2084

* Fix merge conflicts
2019-01-11 10:00:16 -08:00
Anthony Dresser
18f7662209 Duplicate Result sets (#3620)
* remove debouncing and echoing to fix rendering bug

* fix access of internal member

* fix issue with using splice rather than slice

* fix compile issues
2019-01-10 13:44:14 -08:00
Karl Burtram
a0d84f383c Generate temp files as not dirty (#3698)
* Generate temp files as not dirty

* Remove whitespace
2019-01-10 12:51:41 -08:00
Karl Burtram
1f447ae681 Add Idera extension to recommendation list (#3709) 2019-01-10 11:54:39 -08:00
Kevin Cunnane
8bd6691331 Added v3 Notebook format support (#3697)
* Added v3 format support
2019-01-09 17:00:56 -08:00
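For illustration, the core of an nbformat v3 to v4 upgrade is flattening `worksheets` into a top-level cell list and mapping code-cell `input` to `source`; the sketch below assumes simplified shapes and is not the actual change:
```ts
// Illustrative assumption of the shapes involved, not the actual change.
interface V3CodeCell { cell_type: 'code'; input: string | string[]; outputs?: unknown[]; }
interface V3TextCell { cell_type: 'markdown' | 'raw' | 'heading'; source: string | string[]; }
type V3Cell = V3CodeCell | V3TextCell;

interface V3Notebook { nbformat: 3; worksheets: Array<{ cells: V3Cell[] }>; }
interface V4Cell { cell_type: string; source: string | string[]; outputs?: unknown[]; }

// In nbformat v3, cells live under `worksheets` and code cells use `input`
// instead of `source`; flattening and renaming covers the essential upgrade
// (headings and output shapes need more handling in a real converter).
function convertV3ToV4Cells(notebook: V3Notebook): V4Cell[] {
    const cells = notebook.worksheets.reduce<V3Cell[]>((all, ws) => all.concat(ws.cells), []);
    return cells.map(cell =>
        cell.cell_type === 'code'
            ? { cell_type: 'code', source: cell.input, outputs: cell.outputs || [] }
            : { cell_type: cell.cell_type, source: cell.source }
    );
}
```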
Chris LaFreniere
42afcf9322 Integrate first SQL Notebooks Bits into Master (#3679)
* First crack tsql notebook (no output rendered yet)

* getting messages back

* intellisense working first cell, no connection errors

* sql notebook cell output functioning

* Latest SQL notebook changes

* Undo change to launch.json

* Plumbing providers through

* Kernels shown from multiple providers, can switch between them. No mementos yet

* Ensure we have a feature flag for SQL notebooks, ensure existing functionality still works

* Fix tslint duplicate imports issue

* Addressing PR comments

* second round of PR feedback to cleanup notebook service manager code

* merge latest from master
2019-01-09 14:58:57 -08:00
David Shiflet
3d3694bb8d Add --command command line argument (#3690) 2019-01-09 17:36:01 -05:00
Anthony Dresser
589b913960 Readd Top Operations (#3628)
* working on top operations

* added top operations, changed default sorter to handle number strings better
2019-01-09 13:52:38 -08:00
kisantia
7ba4f42494 Moving onValidityChanged listener to showPage() so that it gets added to pages that are added to the wizard after the initial start up (#3691) 2019-01-09 13:19:24 -08:00
Chris LaFreniere
c96118d2b5 Fix activeCell nullref issue (#3689) 2019-01-09 11:45:05 -08:00
Karl Burtram
0285d8cd38 Update readme for January release (#3595)
* Update readme for December release

* Fix spelling

* Update release date to 12/13

* added release note items and fixed a small misspelling

* Update release date to Dec 18

* Update release date
2019-01-09 10:12:14 -08:00
Matt Irvine
ee87604a4d Save grid selection/vertical scroll when switching tabs (#3682) 2019-01-08 15:51:57 -08:00
Kevin Cunnane
2235ebaf20 Fix #3680 Notebooks: outputs with string arrays rendered incorrectly (#3681)
* Refactor JSON and format files to model and fix tabs -> spaces issues

This is in prep for some work to reuse these code files inside the model,
so pushing as its own PR to keep the next piece of work clean.

* Fix #3680 Notebooks: outputs with string arrays rendered incorrectly
- Add support for processing v4 format files loaded from disk
- Prep support for v3 notebooks by adding placeholder code for that

* Fix failing tests and add specific one for this bug

* Remove references to v5
2019-01-08 15:24:16 -08:00
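The underlying detail, sketched minimally below with an illustrative helper name: nbformat allows multiline text (cell `source`, stream output `text`, and similar fields) to be stored either as a single string or as an array of line strings, so a reader has to join the array form before rendering:
```ts
// Helper name is illustrative; the behavior follows the nbformat convention
// that multiline text is stored either as one string or as an array of line
// strings (each usually ending in '\n').
type MultilineString = string | string[];

function toSingleString(value: MultilineString | undefined): string {
    if (value === undefined) {
        return '';
    }
    return Array.isArray(value) ? value.join('') : value;
}

// e.g. toSingleString(['Hello\n', 'World']) === 'Hello\nWorld'
```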
Anthony Dresser
954d0d954f Auto Column Sizing (#2778)
* add auto column sizing

* add break for performance

* update with new library
2019-01-08 13:05:53 -08:00
Kevin Cunnane
e31747d087 Refactor JSON and format files to model and fix tabs -> spaces issues (#3675)
This is in prep for some work to reuse these code files inside the model,
so pushing as its own PR to keep the next piece of work clean.
2019-01-07 14:02:12 -08:00
Anthony Dresser
fc581253a4 Fix gap with result streaming (#3629)
* handle updating item sizing when being updated

* change back scrolling delay

* remove unused code
2019-01-07 12:58:00 -08:00
Chris LaFreniere
47c4609f23 Ensure we call Dispose() on NotebookModel when notebook component is destroyed (#3667) 2019-01-04 12:00:42 -08:00
Chris LaFreniere
2d52bc2a49 Notebooks: Fix Selection/Focus when New Cells Added (#3649)
* Improvements to Active Cell

* Fix minor spacing issue

* fix editor focus order

* Fix for add cell above/below

* cleanup logic to have activeCell logic all reside in notebook model
2019-01-02 15:20:05 -08:00
kisantia
5367101330 Fix database not getting set correctly in DacFx wizard deploy scenario (#3641)
* fix db not getting set correctly for deploy scenario if coming from import page

* removed space so that comments go directly after //
2018-12-19 18:45:40 -05:00
Matt Irvine
db145b4999 Run TSLint in Azure Pipelines (#3639) 2018-12-19 12:17:23 -08:00
Matt Irvine
7f950ddb80 Update edit data for result set streaming changes (#3634) 2018-12-18 11:07:26 -08:00
Vincent Feng
50e2251e74 Feature/extensible azure resource explorer (#3504)
Extensible Azure Resource Explorer
2018-12-18 15:44:08 +08:00
Karl Burtram
33d5455b6f Update Azure Data Studio to 1.4.1 2018-12-15 14:47:47 -08:00
khoiph1
ac018500cd Loc Update (#3548) 2018-12-13 09:53:22 -08:00
Aditya Bist
3e661db283 removed potentially PII (#3619) 2018-12-12 13:55:22 -08:00
Anthony Dresser
18fb78b3ec Account for different situations for stream setting (#3615)
* add cases for different situation

* default streaming setting false
2018-12-12 12:03:59 -08:00
Matt Irvine
58ff13d399 Fix some TSLint issues (#3605) 2018-12-11 16:27:32 -08:00
17230 changed files with 641895 additions and 485331 deletions

View File

@@ -1,4 +1,4 @@
# EditorConfig is awesome: http://EditorConfig.org
# EditorConfig is awesome: https://EditorConfig.org
# top-most EditorConfig file
root = true
@@ -6,7 +6,6 @@ root = true
# Tab indentation
[*]
indent_style = tab
indent_size = 4
trim_trailing_whitespace = true
# The indent size used in the `package.json` file cannot be changed

View File

@@ -1,19 +0,0 @@
{
"env": {
"node": true,
"es6": true
},
"rules": {
"no-console": 0,
"no-cond-assign": 0,
"no-unused-vars": 1,
"no-extra-semi": "warn",
"semi": "warn"
},
"extends": "eslint:recommended",
"parserOptions": {
"ecmaFeatures": {
"experimentalObjectRestSpread": true
}
}
}

20
.eslintrc.json Normal file
View File

@@ -0,0 +1,20 @@
{
"root": true,
"env": {
"node": true,
"es6": true
},
"rules": {
"no-console": 0,
"no-cond-assign": 0,
"no-unused-vars": 1,
"no-extra-semi": "warn",
"semi": "warn"
},
"extends": "eslint:recommended",
"parserOptions": {
"ecmaFeatures": {
"experimentalObjectRestSpread": true
}
}
}

3
.gitattributes vendored
View File

@@ -6,4 +6,5 @@ ThirdPartyNotices.txt eol=crlf
*.bat eol=crlf
*.cmd eol=crlf
*.ps1 eol=lf
*.sh eol=lf
*.sh eol=lf
*.rtf -text

View File

@@ -2,7 +2,7 @@
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
labels: Bug
assignees: ''
---

View File

@@ -2,7 +2,7 @@
name: Feature request
about: Suggest an idea for this project
title: ''
labels: feature request
labels: Enhancement
assignees: ''
---

49
.github/classifier.yml vendored Normal file
View File

@@ -0,0 +1,49 @@
{
perform: false,
alwaysRequireAssignee: false,
labelsRequiringAssignee: [],
autoAssignees: {
accessibility: [],
acquisition: [],
agent: [],
azure: [],
backup: [],
bcdr: [],
'chart viewer': [],
connection: [],
dacfx: [],
dashboard: [],
'data explorer': [],
documentation: [],
'edit data': [],
export: [],
extensibility: [],
extensionManager: [],
globalization: [],
grid: [],
import: [],
insights: [],
intellisense: [],
localization: [],
'managed instance': [],
notebooks: [],
'object explorer': [],
performance: [],
profiler: [],
'query editor': [],
'query execution': [],
reliability: [],
restore: [],
scripting: [],
'server group': [],
settings: [],
setup: [],
shell: [],
showplan: [],
snippet: [],
sql2019Preview: [],
sqldw: [],
supportability: [],
ux: []
}
}

12
.github/commands.yml vendored Normal file
View File

@@ -0,0 +1,12 @@
{
perform: false,
commands: [
{
type: 'label',
name: 'duplicate',
allowTriggerByBot: true,
action: 'close',
comment: "Thanks for creating this issue! We figured it's covering the same as another one we already have. Thus, we closed this one as a duplicate. You can search for existing issues [here](https://aka.ms/vscodeissuesearch). See also our [issue reporting](https://aka.ms/vscodeissuereporting) guidelines.\n\nHappy Coding!"
}
]
}

5
.github/copycat.yml vendored Normal file
View File

@@ -0,0 +1,5 @@
{
perform: true,
target_owner: 'anthonydresser',
target_repo: 'testissues'
}

6
.github/locker.yml vendored Normal file
View File

@@ -0,0 +1,6 @@
{
daysAfterClose: 45,
daysSinceLastUpdate: 3,
ignoredLabels: [],
perform: true
}

6
.github/needs_more_info.yml vendored Normal file
View File

@@ -0,0 +1,6 @@
{
daysUntilClose: 7,
needsMoreInfoLabel: 'needs more info',
perform: true,
closeComment: "This issue has been closed automatically because it needs more information and has not had recent activity in the last 7 days. If you have more info to help resolve the issue, leave a comment"
}

6
.github/new_release.yml vendored Normal file
View File

@@ -0,0 +1,6 @@
{
newReleaseLabel: 'new-release',
newReleaseColor: '006b75',
daysAfterRelease: 5,
perform: true
}

5
.github/similarity.yml vendored Normal file
View File

@@ -0,0 +1,5 @@
{
perform: true,
whenCreatedByTeam: true,
comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}"
}

9
.gitignore vendored
View File

@@ -1,8 +1,10 @@
.DS_Store
.cache
npm-debug.log
Thumbs.db
node_modules/
.build/
extensions/**/dist/
out/
out-build/
out-editor/
@@ -13,8 +15,11 @@ out-editor-min/
out-monaco-editor-core/
out-vscode/
out-vscode-min/
build/node_modules
out-vscode-reh/
out-vscode-reh-min/
out-vscode-reh-pkg/
**/node_modules
coverage/
test_data/
test-results/
yarn-error.log
yarn-error.log

2
.nvmrc
View File

@@ -1 +1 @@
8.9.2
10

23
.vscode/cglicenses.schema.json vendored Normal file
View File

@@ -0,0 +1,23 @@
{
"type": "array",
"items": {
"type": "object",
"required": [
"name",
"licenseDetail"
],
"properties": {
"name": {
"type": "string",
"description": "The name of the dependency"
},
"licenseDetail": {
"type": "array",
"description": "The complete license text of the dependency",
"items": {
"type": "string"
}
}
}
}
}

142
.vscode/cgmanifest.schema.json vendored Normal file
View File

@@ -0,0 +1,142 @@
{
"type": "object",
"properties": {
"registrations": {
"type": "array",
"items": {
"type": "object",
"properties": {
"component": {
"oneOf": [
{
"type": "object",
"required": [
"type",
"git"
],
"properties": {
"type": {
"type": "string",
"enum": [
"git"
]
},
"git": {
"type": "object",
"required": [
"name",
"repositoryUrl",
"commitHash"
],
"properties": {
"name": {
"type": "string"
},
"repositoryUrl": {
"type": "string"
},
"commitHash": {
"type": "string"
}
}
}
}
},
{
"type": "object",
"required": [
"type",
"npm"
],
"properties": {
"type": {
"type": "string",
"enum": [
"npm"
]
},
"npm": {
"type": "object",
"required": [
"name",
"version"
],
"properties": {
"name": {
"type": "string"
},
"version": {
"type": "string"
}
}
}
}
},
{
"type": "object",
"required": [
"type",
"other"
],
"properties": {
"type": {
"type": "string",
"enum": [
"other"
]
},
"other": {
"type": "object",
"required": [
"name",
"downloadUrl",
"version"
],
"properties": {
"name": {
"type": "string"
},
"downloadUrl": {
"type": "string"
},
"version": {
"type": "string"
}
}
}
}
}
]
},
"repositoryUrl": {
"type": "string",
"description": "The git url of the component"
},
"version": {
"type": "string",
"description": "The version of the component"
},
"license": {
"type": "string",
"description": "The name of the license"
},
"developmentDependency": {
"type": "boolean",
"description": "This component is inlined in the vscode repo and **is not shipped**."
},
"isOnlyProductionDependency": {
"type": "boolean",
"description": "This component is shipped and **is not inlined in the vscode repo**."
},
"licenseDetail": {
"type": "array",
"items": {
"type": "string"
},
"description": "The license text"
}
}
}
}
}
}

View File

@@ -2,7 +2,7 @@
// See https://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"eg2.tslint",
"ms-vscode.vscode-typescript-tslint-plugin",
"dbaeumer.vscode-eslint",
"msjsdiag.debugger-for-chrome"
]

149
.vscode/launch.json vendored
View File

@@ -9,14 +9,12 @@
"stopOnEntry": true,
"args": [
"hygiene"
],
"cwd": "${workspaceFolder}"
]
},
{
"type": "node",
"request": "attach",
"name": "Attach to Extension Host",
"protocol": "inspector",
"port": 5870,
"restart": true,
"outFiles": [
@@ -24,19 +22,15 @@
]
},
{
"type": "node",
"type": "chrome",
"request": "attach",
"name": "Attach to Shared Process",
"protocol": "inspector",
"port": 5871,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
"port": 9222,
"urlFilter": "*"
},
{
"type": "node",
"request": "attach",
"protocol": "inspector",
"name": "Attach to Search Process",
"port": 5876,
"outFiles": [
@@ -47,7 +41,6 @@
"type": "node",
"request": "attach",
"name": "Attach to CLI Process",
"protocol": "inspector",
"port": 5874,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -57,7 +50,6 @@
"type": "node",
"request": "attach",
"name": "Attach to Main Process",
"protocol": "inspector",
"port": 5875,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -73,6 +65,45 @@
"type": "chrome",
"request": "launch",
"name": "Launch azuredatastudio",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
"timeout": 20000
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
"timeout": 20000
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
"timeout": 20000
},
"env": {
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null
},
"breakOnLoad": false,
"urlFilter": "*workbench.html*",
"runtimeArgs": [
"--inspect=5875",
"--no-cached-data"
],
"webRoot": "${workspaceFolder}"
},
{
"type": "node",
"request": "launch",
"name": "Launch ADS (Main Process)",
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
"runtimeArgs": [
"--no-cached-data"
],
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "chrome",
"request": "launch",
"name": "Launch azuredatastudio with new notebook command",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
},
@@ -84,7 +115,8 @@
},
"urlFilter": "*index.html*",
"runtimeArgs": [
"--inspect=5875"
"--inspect=5875",
"--command=notebook.command.new"
],
"skipFiles": [
"**/winjs*.js"
@@ -92,34 +124,6 @@
"webRoot": "${workspaceFolder}",
"timeout": 45000
},
{
"type": "node",
"request": "launch",
"name": "Unit Tests",
"protocol": "inspector",
"program": "${workspaceFolder}/node_modules/mocha/bin/_mocha",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"stopOnEntry": false,
"outputCapture": "std",
"args": [
"--delay",
"--timeout",
"2000"
],
"cwd": "${workspaceFolder}",
"env": {
"ELECTRON_RUN_AS_NODE": "true"
},
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"name": "Launch Built-in Extension",
"type": "extensionHost",
@@ -128,9 +132,70 @@
"args": [
"--extensionDevelopmentPath=${workspaceRoot}/extensions/debug-auto-launch"
]
}
},
{
"type": "node",
"request": "launch",
"name": "Launch Smoke Test",
"program": "${workspaceFolder}/test/smoke/test/index.js",
"cwd": "${workspaceFolder}/test/smoke",
"env": {
"BUILD_ARTIFACTSTAGINGDIRECTORY": "${workspaceFolder}"
}
},
{
"type": "node",
"request": "launch",
"name": "Run Unit Tests",
"program": "${workspaceFolder}/test/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "chrome",
"request": "launch",
"name": "Run Extension Unit Tests",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"webRoot": "${workspaceFolder}",
"timeout": 45000
},
],
"compounds": [
{
"name": "Debug Unit Tests",
"configurations": [
"Attach to azuredatastudio",
"Run Unit Tests"
]
},
{
"name": "Debug Extension Unit Tests",
"configurations": [
"Attach to Extension Host",
"Run Extension Unit Tests"
]
},
{
"name": "Debug azuredatastudio Main and Renderer",
"configurations": [

26
.vscode/settings.json vendored
View File

@@ -11,7 +11,7 @@
}
},
"files.associations": {
"OSSREADME.json": "jsonc"
"cglicenses.json": "jsonc"
},
"search.exclude": {
"**/node_modules": true,
@@ -22,9 +22,9 @@
"out-vscode/**": true,
"i18n/**": true,
"extensions/**/out/**": true,
"test/smoke/out/**": true
"test/smoke/out/**": true,
"src/vs/base/test/node/uri.test.data.txt": true
},
"tslint.enable": true,
"lcov.path": [
"./.build/coverage/lcov.info",
"./.build/coverage-single/lcov.info"
@@ -43,6 +43,20 @@
"git.ignoreLimitWarning": true,
"emmet.excludeLanguages": [],
"typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.quoteStyle": "single"
}
"typescript.preferences.quoteStyle": "single",
"json.schemas": [
{
"fileMatch": [
"cgmanifest.json"
],
"url": "./.vscode/cgmanifest.schema.json"
},
{
"fileMatch": [
"cglicenses.json"
],
"url": "./.vscode/cglicenses.schema.json"
}
],
"git.ignoreLimitWarning": true
}

40
.vscode/shared.code-snippets vendored Normal file
View File

@@ -0,0 +1,40 @@
{
// Each snippet is defined under a snippet name and has a scope, prefix, body and
// description. The scope defines in which languages the snippet is applicable. The prefix is what is
// used to trigger the snippet and the body will be expanded and inserted. Possible variables are:
// $1, $2 for tab stops, $0 for the final cursor position, and ${1:label}, ${2:another} for placeholders.
// Placeholders with the same ids are connected.
// Example:
"MSFT Copyright Header": {
"scope": "javascript,typescript,css",
"prefix": [
"header",
"stub",
"copyright"
],
"body": [
"/*---------------------------------------------------------------------------------------------",
" * Copyright (c) Microsoft Corporation. All rights reserved.",
" * Licensed under the Source EULA. See License.txt in the project root for license information.",
" *--------------------------------------------------------------------------------------------*/",
"",
"$0"
],
"description": "Insert Copyright Statement"
},
"TS -> Inject Service": {
"scope": "typescript",
"description": "Constructor Injection Pattern",
"prefix": "@inject",
"body": "@$1 private readonly _$2: ${1},$0"
},
"TS -> Event & Emitter": {
"scope": "typescript",
"prefix": "emitter",
"description": "Add emitter and event properties",
"body": [
"private readonly _onDid$1 = new Emitter<$2>();",
"readonly onDid$1: Event<$2> = this._onDid$1.event;"
],
}
}

16
.vscode/tasks.json vendored
View File

@@ -28,6 +28,20 @@
}
}
},
{
"type": "npm",
"script": "strict-initialization-watch",
"label": "TS - Strict Initialization",
"isBackground": true,
"presentation": {
"reveal": "never"
},
"problemMatcher": {
"base": "$tsc-watch",
"owner": "typescript-strict-initialization",
"applyTo": "allDocuments"
}
},
{
"type": "gulp",
"task": "tslint",
@@ -69,4 +83,4 @@
"problemMatcher": []
}
]
}
}

View File

@@ -1,3 +1,3 @@
disturl "https://atom.io/download/electron"
target "2.0.9"
target "3.1.8"
runtime "electron"

View File

@@ -1,5 +1,114 @@
# Change Log
## Version 1.8.0
* Release date: June 6, 2019
* Release status: General Availability
## What's new in this version
* Initial release of the Database Admin Tool Extensions for Windows *Preview* extension
* Initial release of the Central Management Servers extension
* **Schema Compare**
* Added Exclude/Include Options
* Generate Script opens script after being generated
* Removed double scroll bars
* Formatting and layout improvements
* Complete changes can be found [here](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22June+2019+Release%22+label%3A%22Area%3A+Schema+Compare%22+is%3Aclosed)
* Messages panel moved into results panel - when users ran SQL queries, results and messages were in stacked panels. Now they are in separate tabs in a single panel similar to SSMS.
* **Notebook**
* Users can now choose to use their own Python 3 or Anaconda installs in notebooks
* Multiple Stability + fit/finish fixes
* View the full list of improvements and fixes [here](https://github.com/microsoft/azuredatastudio/issues?q=is%3Aissue+milestone%3A%22June+2019+Release%22+is%3Aclosed+label%3A%22Area%3A+Notebooks%22)
* Visual Studio Code May Release Merge 1.34 - the latest improvements can be found [here](https://code.visualstudio.com/updates/v1_34)
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/milestone/32?closed=1).
## Version 1.7.0
* Release date: May 8, 2019
* Release status: General Availability
## What's new in this version
* Announcing Schema Compare *Preview* extension
* Tasks Panel UX improvement
* Announcing new Welcome page
* Resolved [bugs and issues](https://github.com/microsoft/azuredatastudio/milestone/31?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues.
## Version 1.6.0
* Release date: April 18, 2019
* Release status: General Availability
## What's new in this version
* Align with latest VS Code editor platform (currently 1.33.1)
* Resolved [bugs and issues](https://github.com/Microsoft/azuredatastudio/milestone/26?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* yamatoya for `fix the format (#4899)`
## Version 1.5.1
* Release date: March 18, 2019
* Release status: General Availability
## What's new in this version
* Announcing T-SQL Notebooks
* Announcing PostgreSQL extension
* Announcing SQL Server Dacpac extension
* Resolved [bugs and issues](https://github.com/Microsoft/azuredatastudio/milestone/25?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* GeoffYoung for `Fix sqlDropColumn description #4422`
## Version 1.4.5
* Release date: February 13, 2019
* Release status: General Availability
## What's new in this version
* Added **Admin pack for SQL Server** extension pack to make it easier to install SQL Server admin-related extensions. This includes:
* [SQL Server Agent](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-agent-extension?view=sql-server-2017)
* [SQL Server Profiler](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-profiler-extension?view=sql-server-2017)
* [SQL Server Import](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-import-extension?view=sql-server-2017)
* Added filtering extended event support in Profiler extension
* Added Save as XML feature that can save T-SQL results as XML
* Added Data-Tier Application Wizard improvements
* Added Generate script button
* Added view to give warnings of possible data loss during deployment
* Updates to the [SQL Server 2019 Preview extension](https://docs.microsoft.com/sql/azure-data-studio/sql-server-2019-extension?view=sql-server-ver15)
* Results streaming enabled by default for long running queries
* Resolved [bugs and issues](https://github.com/Microsoft/azuredatastudio/milestone/23?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* AlexFsmn for `Added context menu for DBs in explorer view to backup & restore db. #2277`
* sadedil for `Missing feature request: Save as XML #3729`
* gbritton1 for `Removed reference to object explorer #3463`
## Version 1.3.8
* Release date: January 9, 2019
* Release status: General Availability
## What's new in this version
* #13 Feature Request: Azure Active Directory Authentication
* #1040 Stream initial query results as they become available
* #3298 Can't add an Azure account.
* #2387 Support Per-User Installer
* SQL Server Import updates for DACPAC\BACPAC
* SQL Server Profiler UI and UX improvements
* Updates to [SQL Server 2019 extension](https://docs.microsoft.com/sql/azure-data-studio/sql-server-2019-extension?view=sql-server-ver15)
* **sp_executesql to SQL** and **New Database** extensions
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* Tarig0 for `Add Routine_Type to CreateStoredProc fixes #3257 (#3286)`
* oltruong for `typo fix #3025'`
* Thomas-S-B for `Removed unnecessary IErrorDetectionStrategy #749`
* Thomas-S-B for `Simplified code #750`
## Version 1.2.4
* Release date: November 6, 2018
* Release status: General Availability

View File

@@ -18,11 +18,15 @@ File a single issue per problem and feature request.
* Do not enumerate multiple bugs or feature requests in the same issue.
* Do not add your issue as a comment to an existing issue unless it's for the identical input. Many issues look similar, but have different causes.
The more information you can provide, the more likely someone will be successful reproducing the issue and finding a fix.
The more information you can provide, the more likely someone will be successful at reproducing the issue and finding a fix.
The built-in tool for reporting an issue, which you can access by using `Report Issue` in Azure Data Studio's Help menu, can help streamline this process by automatically providing the version of Azure Data Studio, all your installed extensions, and your system info.
Please include the following with each issue.
* Version of Azure Data Studio (formerly SQL Operations Studio).
* Version of Azure Data Studio (formerly SQL Operations Studio)
* Your operating system
> **Tip:** You can easily create an issue using `Report Issues` from Azure Data Studio Help menu.

File diff suppressed because it is too large

View File

@@ -5,24 +5,30 @@
Azure Data Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux.
**Download the latest Azure Data Studio release**
## **Download the latest Azure Data Studio release**
Platform | Link
-- | --
Windows Setup Installer | https://go.microsoft.com/fwlink/?linkid=2038320
Windows ZIP | https://go.microsoft.com/fwlink/?linkid=2038323
macOS ZIP | https://go.microsoft.com/fwlink/?linkid=2038327
Linux TAR.GZ | https://go.microsoft.com/fwlink/?linkid=2038332
Linux RPM | https://go.microsoft.com/fwlink/?linkid=2038401
Linux DEB | https://go.microsoft.com/fwlink/?linkid=2038405
Windows User Installer | https://go.microsoft.com/fwlink/?linkid=2094100
Windows System Installer | https://go.microsoft.com/fwlink/?linkid=2094200
Windows ZIP | https://go.microsoft.com/fwlink/?linkid=2094201
macOS ZIP | https://go.microsoft.com/fwlink/?linkid=2094202
Linux TAR.GZ | https://go.microsoft.com/fwlink/?linkid=2094101
Linux RPM | https://go.microsoft.com/fwlink/?linkid=2094102
Linux DEB | https://go.microsoft.com/fwlink/?linkid=2094203
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
Try out the latest insiders build from `master` at https://github.com/Microsoft/azuredatastudio/releases.
## Try out the latest insiders build from `master`:
- [Windows User Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-user/insider)
- [Windows System Installer - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64/insider)
- [Windows ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/win32-x64-archive/insider)
- [macOS ZIP - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/darwin/insider)
- [Linux TAR.GZ - **Insiders build**](https://azuredatastudio-update.azurewebsites.net/latest/linux-x64/insider)
See the [change log](https://github.com/Microsoft/azuredatastudio/blob/master/CHANGELOG.md) for additional details of what's in this release.
**Feature Highlights**
## **Feature Highlights**
- Cross-Platform DB management for Windows, macOS and Linux with simple XCopy deployment
- SQL Server Connection Management with Connection Dialog, Server Groups, Azure Integration and Registered Servers
@@ -62,6 +68,16 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
## Contributions and "Thank You"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* Stevoni for `Corrected Keyboard Shortcut Execution Issue #5480`
* yamatoya for `fix the format #4899`
* GeoffYoung for `Fix sqlDropColumn description #4422`
* AlexFsmn for `Added context menu for DBs in explorer view to backup & restore db. #2277`
* sadedil for `Missing feature request: Save as XML #3729`
* gbritton1 for `Removed reference to object explorer #3463`
* Tarig0 for `Add Routine_Type to CreateStoredProc fixes #3257 (#3286)`
* oltruong for `typo fix #3025`
* Thomas-S-B for `Removed unnecessary IErrorDetectionStrategy #749`
* Thomas-S-B for `Simplified code #750`
* rdaniels6813 for `Add query plan theme support #3031`
* Ruturaj123 for `Fixed some typos and grammatical errors #3027`
* PromoFaux for `Use emoji shortcodes in CONTRIBUTING.md instead of <20> #3009`


@@ -17,10 +17,12 @@ expressly granted herein, whether by implication, estoppel or otherwise.
chokidar: https://github.com/paulmillr/chokidar
comment-json: https://github.com/kaelzhang/node-comment-json
core-js: https://github.com/zloirock/core-js
decompress: https://github.com/kevva/decompress
emmet: https://github.com/emmetio/emmet
error-ex: https://github.com/Qix-/node-error-ex
escape-string-regexp: https://github.com/sindresorhus/escape-string-regexp
fast-plist: https://github.com/Microsoft/node-fast-plist
figures: https://github.com/sindresorhus/figures
find-remove: https://www.npmjs.com/package/find-remove
fs-extra: https://github.com/jprichardson/node-fs-extra
gc-signals: https://github.com/Microsoft/node-gc-signals
@@ -41,22 +43,27 @@ expressly granted herein, whether by implication, estoppel or otherwise.
native-keymap: https://github.com/Microsoft/node-native-keymap
native-watchdog: https://github.com/Microsoft/node-native-watchdog
ng2-charts: https://github.com/valor-software/ng2-charts
node-fetch: https://github.com/bitinn/node-fetch
node-pty: https://github.com/Tyriar/node-pty
nsfw: https://github.com/Axosoft/nsfw
pretty-data: https://github.com/vkiryukhin/pretty-data
primeng: https://github.com/primefaces/primeng
process-nextick-args: https://github.com/calvinmetcalf/process-nextick-args
pty.js: https://github.com/chjj/pty.js
reflect-metadata: https://github.com/rbuckton/reflect-metadata
request: https://github.com/request/request
rxjs: https://github.com/ReactiveX/RxJS
semver: https://github.com/npm/node-semver
slickgrid: https://github.com/6pac/SlickGrid
sqltoolsservice: https://github.com/Microsoft/sqltoolsservice
svg.js: https://github.com/svgdotjs/svg.js
systemjs: https://github.com/systemjs/systemjs
temp-write: https://github.com/sindresorhus/temp-write
underscore: https://github.com/jashkenas/underscore
v8-profiler: https://github.com/node-inspector/v8-profiler
vscode: https://github.com/microsoft/vscode
vscode-debugprotocol: https://github.com/Microsoft/vscode-debugadapter-node
vscode-languageclient: https://github.com/Microsoft/vscode-languageserver-node
vscode-nls: https://github.com/Microsoft/vscode-nls
vscode-ripgrep: https://github.com/roblourens/vscode-ripgrep
vscode-textmate: https://github.com/Microsoft/vscode-textmate
winreg: https://github.com/fresc81/node-winreg
@@ -64,10 +71,9 @@ expressly granted herein, whether by implication, estoppel or otherwise.
yauzl: https://github.com/thejoshwolfe/yauzl
zone.js: https://www.npmjs.com/package/zone
Microsoft PROSE SDK: https://microsoft.github.io/prose
%% angular NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License
Copyright (c) 2014-2017 Google, Inc. http://angular.io
@@ -293,6 +299,20 @@ THE SOFTWARE.
=========================================
END OF core-js NOTICES AND INFORMATION
%% decompress NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Kevin Mårtensson <kevinmartensson@gmail.com> (github.com/kevva)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF decompress NOTICES AND INFORMATION
%% emmet NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
@@ -322,32 +342,6 @@ END OF emmet NOTICES AND INFORMATION
=========================================
The MIT License (MIT)
Copyright (c) 2015 JD Ballard
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
=========================================
END OF error-ex NOTICES AND INFORMATION
%% escape-string-regexp NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy
@@ -394,6 +388,20 @@ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEAL
=========================================
END OF fast-plist NOTICES AND INFORMATION
%% figures NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF figures NOTICES AND INFORMATION
%% fs-extra NOTICES AND INFORMATION BEGIN HERE
=========================================
(The MIT License)
@@ -1335,6 +1343,32 @@ SOFTWARE.
=========================================
END OF ng2-charts NOTICES AND INFORMATION
%% node-fetch NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) 2016 David Frank
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
=========================================
END OF node-fetch NOTICES AND INFORMATION
%% node-pty NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
@@ -1385,16 +1419,6 @@ SOFTWARE.
=========================================
END OF nsfw NOTICES AND INFORMATION
%% pretty-data NOTICES AND INFORMATION BEGIN HERE
=========================================
License: Dual licensed under the MIT and GPL licenses:
http://www.opensource.org/licenses/mit-license.php
http://www.gnu.org/licenses/gpl.html
=========================================
END OF pretty-data NOTICES AND INFORMATION
%% primeng NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
@@ -1409,6 +1433,30 @@ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLI
=========================================
END OF primeng NOTICES AND INFORMATION
%% process-nextick-args NOTICES AND INFORMATION BEGIN HERE
=========================================
# Copyright (c) 2015 Calvin Metcalf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.**
=========================================
END OF process-nextick-args NOTICES AND INFORMATION
%% pty.js NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
@@ -1493,6 +1541,66 @@ END OF TERMS AND CONDITIONS
=========================================
END OF reflect-metadata NOTICES AND INFORMATION
%% request NOTICES AND INFORMATION BEGIN HERE
=========================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
You must give any other recipients of the Work or Derivative Works a copy of this License; and
You must cause any modified files to carry prominent notices stating that You changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
=========================================
END OF request NOTICES AND INFORMATION
%% rxjs NOTICES AND INFORMATION BEGIN HERE
=========================================
Apache License
@@ -1818,6 +1926,20 @@ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEAL
=========================================
END OF systemjs NOTICES AND INFORMATION
%% temp-write NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF temp-write NOTICES AND INFORMATION
%% underscore NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2009-2017 Jeremy Ashkenas, DocumentCloud and Investigative
@@ -1920,6 +2042,50 @@ OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWA
=========================================
END OF vscode-debugprotocol NOTICES AND INFORMATION
%% vscode-languageclient NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) Microsoft Corporation
All rights reserved.
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF vscode-languageclient NOTICES AND INFORMATION
%% vscode-nls NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) Microsoft Corporation
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF vscode-nls NOTICES AND INFORMATION
%% vscode-ripgrep NOTICES AND INFORMATION BEGIN HERE
=========================================
vscode-ripgrep
@@ -2079,3 +2245,187 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
=========================================
END OF zone.js NOTICES AND INFORMATION
%% Microsoft.ProgramSynthesis.Common NOTICES AND INFORMATION BEGIN HERE
=========================================
NOTICES AND INFORMATION
Do Not Translate or Localize
This software incorporates material from third parties. Microsoft makes certain
open source code available at http://3rdpartysource.microsoft.com, or you may
send a check or money order for US $5.00, including the product name, the open
source component name, and version number, to:
Source Code Compliance Team
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052
USA
Notwithstanding any other terms, you may reverse engineer this software to the
extent required to debug changes to any libraries licensed under the GNU Lesser
General Public License.
-------------------------------START OF THIRD-PARTY NOTICES-------------------------------------------
===================================CoreFx (BEGIN)
The MIT License (MIT)
Copyright (c) .NET Foundation and Contributors
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================CoreFx (END)
===================================CoreFxLab (BEGIN)
The MIT License (MIT)
Copyright (c) Microsoft Corporation
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================CoreFxLab (END)
===================================Reactive Extensions (BEGIN)
Copyright (c) .NET Foundation and Contributors
All Rights Reserved
Licensed under the Apache License, Version 2.0 (the "License"); you
may not use this file except in compliance with the License. You may
obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied. See the License for the specific language governing permissions
and limitations under the License.
List of contributors to the Rx libraries
Rx and Ix.NET:
Wes Dyer
Jeffrey van Gogh
Matthew Podwysocki
Bart De Smet
Danny van Velzen
Erik Meijer
Brian Beckman
Aaron Lahman
Georgi Chkodrov
Arthur Watson
Gert Drapers
Mark Shields
Eric Rozell
Rx.js and Ix.js:
Matthew Podwysocki
Jeffrey van Gogh
Bart De Smet
Brian Beckman
Wes Dyer
Erik Meijer
Tx:
Georgi Chkodrov
Bart De Smet
Aaron Lahman
Erik Meijer
Brian Grunkemeyer
Beysim Sezgin
Tiho Tarnavski
Collin Meek
Sajay Anthony
Karen Albrecht
John Allen
Zach Kramer
Rx++ and Ix++:
Aaron Lahman
===================================Reactive Extensions (END)
-------------------------------END OF THIRD-PARTY NOTICES-------------------------------------------
=========================================
END OF Microsoft.ProgramSynthesis.Common NOTICES AND INFORMATION
%% Microsoft.ProgramSynthesis.Detection NOTICES AND INFORMATION BEGIN HERE
=========================================
NOTICES AND INFORMATION
Do Not Translate or Localize
This software incorporates material from third parties. Microsoft makes certain
open source code available at http://3rdpartysource.microsoft.com, or you may
send a check or money order for US $5.00, including the product name, the open
source component name, and version number, to:
Source Code Compliance Team
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052
USA
Notwithstanding any other terms, you may reverse engineer this software to the
extent required to debug changes to any libraries licensed under the GNU Lesser
General Public License.
-------------------------------START OF THIRD-PARTY NOTICES-------------------------------------------
The MIT License (MIT)
Copyright (c) 2014 ExcelDataReader
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================ExcelDataReader (END)
-------------------------------END OF THIRD-PARTY NOTICES-------------------------------------------
=========================================
END OF Microsoft.ProgramSynthesis.Detection NOTICES AND INFORMATION


@@ -1,38 +1,66 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: '8.x'
versionSpec: '10.15.1'
displayName: 'Install Node.js'
- script: |
git submodule update --init --recursive
nvm install 8.9.1
nvm use 8.9.1
npm i -g yarn
displayName: 'preinstall'
- script: |
export CXX="g++-4.9" CC="gcc-4.9" DISPLAY=:99.0
sh -e /etc/init.d/xvfb start
sleep 3
export CXX="g++-4.9" CC="gcc-4.9" DISPLAY=:10
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
# sh -e /etc/init.d/xvfb start
# sleep 3
displayName: 'Linux preinstall'
condition: eq(variables['Agent.OS'], 'Linux')
- script: |
yarn
displayName: 'Install'
- script: |
node_modules/.bin/gulp electron --silent
node_modules/.bin/gulp compile --silent --max_old_space_size=4096
node_modules/.bin/gulp optimize-vscode --silent --max_old_space_size=4096
displayName: 'Scripts'
yarn gulp electron-x64
displayName: Download Electron
- script: |
./scripts/test.sh --reporter mocha-junit-reporter
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn tslint
displayName: 'Run TSLint'
- script: |
yarn strict-null-check
displayName: 'Run Strict Null Check'
- script: |
yarn compile
displayName: 'Compile'
- script: |
DISPLAY=:10 ./scripts/test.sh --reporter mocha-junit-reporter
displayName: 'Tests'
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
- script: |
DISPLAY=:10 ./scripts/test.sh --reporter mocha-junit-reporter --coverage
displayName: 'Tests'
condition: and(succeeded(), ne(variables['Agent.OS'], 'Linux'))
- task: PublishTestResults@2
inputs:
testResultsFiles: '**/test-results.xml'
condition: succeededOrFailed()
- task: PublishCodeCoverageResults@1
inputs:
codeCoverageTool: 'cobertura'
summaryFileLocation: $(System.DefaultWorkingDirectory)/.build/coverage/cobertura-coverage.xml
reportDirectory: $(System.DefaultWorkingDirectory)/.build/coverage/lcov-reports
condition: ne(variables['Agent.OS'], 'Linux')


@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: '8.9'
versionSpec: '10.15.1'
displayName: 'Install Node.js'
- script: |
@@ -9,18 +9,36 @@ steps:
displayName: 'Yarn Install'
- script: |
.\node_modules\.bin\gulp electron
yarn gulp electron-x64
displayName: 'Electron'
- script: |
npm run compile
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn tslint
displayName: 'Run TSLint'
- script: |
yarn strict-null-check
displayName: 'Run Strict Null Check'
- script: |
yarn compile
displayName: 'Compile'
- script: |
.\scripts\test.bat --reporter mocha-junit-reporter
.\scripts\test.bat --reporter mocha-junit-reporter --coverage
displayName: 'Test'
- task: PublishTestResults@2
inputs:
testResultsFiles: 'test-results.xml'
condition: succeededOrFailed()
- task: PublishCodeCoverageResults@1
inputs:
codeCoverageTool: 'cobertura'
summaryFileLocation: $(System.DefaultWorkingDirectory)\.build\coverage\cobertura-coverage.xml
reportDirectory: $(System.DefaultWorkingDirectory)\.build\coverage\lcov-report

build/.nativeignore Normal file

@@ -0,0 +1,126 @@
# cleanup rules for native node modules, .gitignore style
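# The rules below drop sources, build intermediates and tests; the '!' entries re-include the prebuilt .node/.dll/.exe binaries (and a few required scripts) that must ship.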
fsevents/binding.gyp
fsevents/fsevents.cc
fsevents/build/**
fsevents/src/**
fsevents/test/**
!fsevents/**/*.node
vscode-sqlite3/binding.gyp
vscode-sqlite3/benchmark/**
vscode-sqlite3/cloudformation/**
vscode-sqlite3/deps/**
vscode-sqlite3/test/**
vscode-sqlite3/build/**
vscode-sqlite3/src/**
!vscode-sqlite3/build/Release/*.node
oniguruma/binding.gyp
oniguruma/build/**
oniguruma/src/**
oniguruma/deps/**
!oniguruma/build/Release/*.node
!oniguruma/src/*.js
windows-mutex/binding.gyp
windows-mutex/build/**
windows-mutex/src/**
!windows-mutex/**/*.node
native-keymap/binding.gyp
native-keymap/build/**
native-keymap/src/**
native-keymap/deps/**
!native-keymap/build/Release/*.node
native-is-elevated/binding.gyp
native-is-elevated/build/**
native-is-elevated/src/**
native-is-elevated/deps/**
!native-is-elevated/build/Release/*.node
native-watchdog/binding.gyp
native-watchdog/build/**
native-watchdog/src/**
!native-watchdog/build/Release/*.node
spdlog/binding.gyp
spdlog/build/**
spdlog/deps/**
spdlog/src/**
spdlog/test/**
!spdlog/build/Release/*.node
jschardet/dist/**
windows-foreground-love/binding.gyp
windows-foreground-love/build/**
windows-foreground-love/src/**
!windows-foreground-love/**/*.node
windows-process-tree/binding.gyp
windows-process-tree/build/**
windows-process-tree/src/**
!windows-process-tree/**/*.node
gc-signals/binding.gyp
gc-signals/build/**
gc-signals/src/**
gc-signals/deps/**
!gc-signals/build/Release/*.node
!gc-signals/src/index.js
keytar/binding.gyp
keytar/build/**
keytar/src/**
keytar/script/**
keytar/node_modules/**
!keytar/**/*.node
node-pty/binding.gyp
node-pty/build/**
node-pty/src/**
node-pty/tools/**
!node-pty/build/Release/*.exe
!node-pty/build/Release/*.dll
!node-pty/build/Release/*.node
chart.js/node_modules/**
emmet/node_modules/**
pty.js/build/**
!pty.js/build/Release/**
jquery-ui/external/**
jquery-ui/demos/**
core-js/**/**
slickgrid/node_modules/**
slickgrid/examples/**
vscode-nsfw/binding.gyp
vscode-nsfw/build/**
vscode-nsfw/src/**
vscode-nsfw/openpa/**
vscode-nsfw/includes/**
!vscode-nsfw/build/Release/*.node
!vscode-nsfw/**/*.a
vsda/binding.gyp
vsda/README.md
vsda/build/**
vsda/*.bat
vsda/*.sh
vsda/*.cpp
vsda/*.h
!vsda/build/Release/vsda.node
vscode-windows-ca-certs/**/*
!vscode-windows-ca-certs/package.json
!vscode-windows-ca-certs/**/*.node
node-addon-api/**/*


@@ -0,0 +1,20 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as cp from 'child_process';
import * as path from 'path';
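// Add a package with yarn without touching the lockfile, both at the repository root and inside the remote/ folder.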
function yarnInstall(packageName: string): void {
cp.execSync(`yarn add --no-lockfile ${packageName}`);
cp.execSync(`yarn add --no-lockfile ${packageName}`, { cwd: path.join( process.cwd(), 'remote') });
}
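// product.json may list distro dependencies (package URLs); install each of them.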
const product = require('../../../product.json');
const dependencies = product.dependencies || {} as { [name: string]: string; };
Object.keys(dependencies).forEach(name => {
const url = dependencies[name];
yarnInstall(url);
});


@@ -6,7 +6,6 @@
'use strict';
import * as fs from 'fs';
import { execSync } from 'child_process';
import { Readable } from 'stream';
import * as crypto from 'crypto';
import * as azure from 'azure-storage';
@@ -44,7 +43,7 @@ function createDefaultConfig(quality: string): Config {
}
function getConfig(quality: string): Promise<Config> {
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
@@ -66,7 +65,7 @@ interface Asset {
platform: string;
type: string;
url: string;
mooncakeUrl: string;
mooncakeUrl?: string;
hash: string;
sha256hash: string;
size: number;
@@ -74,7 +73,7 @@ interface Asset {
}
function createOrUpdate(commit: string, quality: string, platform: string, type: string, release: NewDocument, asset: Asset, isUpdate: boolean): Promise<void> {
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/' + quality;
const updateQuery = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
@@ -128,7 +127,7 @@ async function assertContainer(blobService: azure.BlobService, quality: string):
await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean> {
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
const existsResult = await new Promise<azure.BlobService.BlobResult>((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
@@ -151,11 +150,15 @@ interface PublishOptions {
async function publish(commit: string, quality: string, platform: string, type: string, name: string, version: string, _isUpdate: string, file: string, opts: PublishOptions): Promise<void> {
const isUpdate = _isUpdate === 'true';
const queuedBy = process.env['BUILD_QUEUEDBY'];
const sourceBranch = process.env['BUILD_SOURCEBRANCH'];
const isReleased = quality === 'insider'
&& /^master$|^refs\/heads\/master$/.test(sourceBranch)
&& /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy);
const queuedBy = process.env['BUILD_QUEUEDBY']!;
const sourceBranch = process.env['BUILD_SOURCEBRANCH']!;
const isReleased = (
// Insiders: nightly build from master
(quality === 'insider' && /^master$|^refs\/heads\/master$/.test(sourceBranch) && /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)) ||
// Exploration: any build from electron-4.0.x branch
(quality === 'exploration' && /^electron-4.0.x$|^refs\/heads\/electron-4.0.x$/.test(sourceBranch))
);
console.log('Publishing...');
console.log('Quality:', quality);
@@ -180,62 +183,23 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + name;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// {{SQL CARBON EDIT}}
await assertContainer(blobService, quality);
const blobExists = await doesAssetExist(blobService, quality, blobName);
const promises = [];
if (!blobExists) {
promises.push(uploadBlob(blobService, quality, blobName, file));
}
// {{SQL CARBON EDIT}}
if (process.env['MOONCAKE_STORAGE_ACCESS_KEY']) {
const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// mooncake is fussy and far away, this is needed!
mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
await Promise.all([
assertContainer(blobService, quality),
assertContainer(mooncakeBlobService, quality)
]);
const [blobExists, moooncakeBlobExists] = await Promise.all([
doesAssetExist(blobService, quality, blobName),
doesAssetExist(mooncakeBlobService, quality, blobName)
]);
const promises = [];
if (!blobExists) {
promises.push(uploadBlob(blobService, quality, blobName, file));
}
if (!moooncakeBlobExists) {
promises.push(uploadBlob(mooncakeBlobService, quality, blobName, file));
}
} else {
console.log('Skipping Mooncake publishing.');
}
if (promises.length === 0) {
if (blobExists) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
}
console.log('Uploading blobs to Azure storage...');
await Promise.all(promises);
await uploadBlob(blobService, quality, blobName, file);
console.log('Blobs successfully uploaded.');
@@ -247,8 +211,6 @@ async function publish(commit: string, quality: string, platform: string, type:
platform: platform,
type: type,
url: `${process.env['AZURE_CDN_URL']}/${quality}/${blobName}`,
// {{SQL CARBON EDIT}}
mooncakeUrl: process.env['MOONCAKE_CDN_URL'] ? `${process.env['MOONCAKE_CDN_URL']}/${quality}/${blobName}` : undefined,
hash: sha1hash,
sha256hash,
size
@@ -268,7 +230,7 @@ async function publish(commit: string, quality: string, platform: string, type:
isReleased: config.frozen ? false : isReleased,
sourceBranch,
queuedBy,
assets: [],
assets: [] as Array<Asset>,
updates: {} as any
};
@@ -284,15 +246,23 @@ async function publish(commit: string, quality: string, platform: string, type:
}
function main(): void {
if (process.env['VSCODE_BUILD_SKIP_PUBLISH']) {
console.warn('Skipping publish due to VSCODE_BUILD_SKIP_PUBLISH');
return;
}
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const opts = minimist<PublishOptions>(process.argv.slice(2), {
boolean: ['upload-only']
});
// {{SQL CARBON EDIT}}
let [quality, platform, type, name, version, _isUpdate, file, commit] = opts._;
if (!commit) {
commit = execSync('git rev-parse HEAD', { encoding: 'utf8' }).trim();
}
const [quality, platform, type, name, version, _isUpdate, file] = opts._;
publish(commit, quality, platform, type, name, version, _isUpdate, file, opts).catch(err => {
console.error(err);


@@ -97,7 +97,7 @@ function updateVersion(accessor: IVersionAccessor, symbolsPath: string) {
function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Promise<T> {
return new Promise<T>((resolve, reject) => {
request(options, (error, response, body) => {
request(options, (error, _response, body) => {
if (error) {
reject(error);
} else {
@@ -107,17 +107,17 @@ function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Pro
});
}
function downloadAsset(repository, assetName: string, targetPath: string, electronVersion: string) {
function downloadAsset(repository: any, assetName: string, targetPath: string, electronVersion: string) {
return new Promise((resolve, reject) => {
repository.getReleases({ tag_name: `v${electronVersion}` }, (err, releases) => {
repository.getReleases({ tag_name: `v${electronVersion}` }, (err: any, releases: any) => {
if (err) {
reject(err);
} else {
const asset = releases[0].assets.filter(asset => asset.name === assetName)[0];
const asset = releases[0].assets.filter((asset: any) => asset.name === assetName)[0];
if (!asset) {
reject(new Error(`Asset with name ${assetName} not found`));
} else {
repository.downloadAsset(asset, (err, reader) => {
repository.downloadAsset(asset, (err: any, reader: any) => {
if (err) {
reject(err);
} else {
@@ -156,7 +156,7 @@ async function ensureVersionAndSymbols(options: IOptions) {
const symbolsName = symbolsZipName(options.platform, options.versions.electron, options.versions.insiders);
const symbolsPath = await tmpFile('symbols.zip');
console.log(`HockeyApp: downloading symbols ${symbolsName} for electron ${options.versions.electron} (${options.platform}) into ${symbolsPath}`);
await downloadAsset(new github({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
await downloadAsset(new (github as any)({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
// Create version
console.log(`HockeyApp: creating new version ${options.versions.code} (${options.platform})`);


@@ -0,0 +1,176 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as url from 'url';
import * as azure from 'azure-storage';
import * as mime from 'mime';
import { DocumentClient, RetrievedDocument } from 'documentdb';
function log(...args: any[]) {
console.log(...[`[${new Date().toISOString()}]`, ...args]);
}
function error(...args: any[]) {
console.error(...[`[${new Date().toISOString()}]`, ...args]);
}
if (process.argv.length < 3) {
error('Usage: node sync-mooncake.js <quality>');
process.exit(-1);
}
interface Build extends RetrievedDocument {
assets: Asset[];
}
interface Asset {
platform: string;
type: string;
url: string;
mooncakeUrl: string;
hash: string;
sha256hash: string;
size: number;
supportsFastUpdate?: boolean;
}
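// Replace the asset entry for this platform/type on the build document, retrying up to 5 times on 409 (conflict) responses.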
function updateBuild(commit: string, quality: string, platform: string, type: string, asset: Asset): Promise<void> {
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/' + quality;
const updateQuery = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
parameters: [{ name: '@id', value: commit }]
};
let updateTries = 0;
function _update(): Promise<void> {
updateTries++;
return new Promise<void>((c, e) => {
client.queryDocuments(collection, updateQuery).toArray((err, results) => {
if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); }
const release = results[0];
release.assets = [
...release.assets.filter((a: any) => !(a.platform === platform && a.type === type)),
asset
];
client.replaceDocument(release._self, release, err => {
if (err && err.code === 409 && updateTries < 5) { return c(_update()); }
if (err) { return e(err); }
log('Build successfully updated.');
c();
});
});
});
}
return _update();
}
async function sync(commit: string, quality: string): Promise<void> {
log(`Synchronizing Mooncake assets for ${quality}, ${commit}...`);
const cosmosdb = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = `dbs/builds/colls/${quality}`;
const query = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
parameters: [{ name: '@id', value: commit }]
};
const build = await new Promise<Build>((c, e) => {
cosmosdb.queryDocuments(collection, query).toArray((err, results) => {
if (err) { return e(err); }
if (results.length !== 1) { return e(new Error('No documents')); }
c(results[0] as Build);
});
});
log(`Found build for ${commit}, with ${build.assets.length} assets`);
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY']!, `${storageAccount}.blob.core.chinacloudapi.cn`)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// mooncake is fussy and far away, this is needed!
blobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
mooncakeBlobService.defaultClientRequestTimeoutInMs = 10 * 60 * 1000;
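// Mirror every asset that is not yet in Mooncake: stream the blob from the primary storage account to the China endpoint, then record its mooncakeUrl on the build.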
for (const asset of build.assets) {
try {
const blobPath = url.parse(asset.url).path;
if (!blobPath) {
throw new Error(`Failed to parse URL: ${asset.url}`);
}
const blobName = blobPath.replace(/^\/\w+\//, '');
log(`Found ${blobName}`);
if (asset.mooncakeUrl) {
log(` Already in Mooncake ✔️`);
continue;
}
const readStream = blobService.createReadStream(quality, blobName, undefined!);
const blobOptions: azure.BlobService.CreateBlockBlobRequestOptions = {
contentSettings: {
contentType: mime.lookup(blobPath),
cacheControl: 'max-age=31536000, public'
}
};
const writeStream = mooncakeBlobService.createWriteStreamToBlockBlob(quality, blobName, blobOptions, undefined);
log(` Uploading to Mooncake...`);
await new Promise((c, e) => readStream.pipe(writeStream).on('finish', c).on('error', e));
log(` Updating build in DB...`);
asset.mooncakeUrl = `${process.env['MOONCAKE_CDN_URL']}${blobPath}`;
await updateBuild(commit, quality, asset.platform, asset.type, asset);
log(` Done ✔️`);
} catch (err) {
error(err);
}
}
log(`All done ✔️`);
}
function main(): void {
if (process.env['VSCODE_BUILD_SKIP_PUBLISH']) {
error('Skipping publish due to VSCODE_BUILD_SKIP_PUBLISH');
return;
}
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
error('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const quality = process.argv[2];
sync(commit, quality).catch(err => {
error(err);
process.exit(1);
});
}
main();


@@ -0,0 +1,5 @@
#!/usr/bin/env bash
set -e
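# Build the minified macOS client and the Remote Extension Host, then upload source maps.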
yarn gulp vscode-darwin-min
yarn gulp vscode-reh-darwin-min
yarn gulp upload-vscode-sourcemaps


@@ -0,0 +1,50 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- script: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- script: |
yarn gulp electron-x64
displayName: Download Electron
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- script: |
./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests
- script: |
./scripts/test-integration.sh --tfs "Integration Tests"
displayName: Run Integration Tests
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()


@@ -0,0 +1,96 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
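# Set up credentials for the private distro repo, merge the distro commit pinned in package.json, then install dependencies and run the hygiene and compile checks.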
- script: |
set -e
cat << EOF > ~/.netrc
machine monacotools.visualstudio.com
password $(devops-pat)
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
yarn
yarn gulp mixin
yarn gulp hygiene
yarn monaco-compile-check
node build/azure-pipelines/common/installDistro.js
node build/lib/builtInExtensions.js
displayName: Prepare build
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
./build/azure-pipelines/darwin/build.sh
displayName: Build
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
# APP_NAME="`ls $(agent.builddirectory)/VSCode-darwin | head -n 1`"
# yarn smoketest -- --build "$(agent.builddirectory)/VSCode-darwin/$APP_NAME"
displayName: Run unit tests
- script: |
set -e
./scripts/test-integration.sh --build --tfs "Integration Tests"
displayName: Run integration tests
- script: |
set -e
pushd ../VSCode-darwin && zip -r -X -y ../VSCode-darwin.zip * && popd
displayName: Archive build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)'
Pattern: 'VSCode-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY="$(ticino-storage-key)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
./build/azure-pipelines/darwin/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -0,0 +1,36 @@
#!/usr/bin/env bash
set -e
# remove pkg from archive
zip -d ../VSCode-darwin.zip "*.pkg"
# publish the build
PACKAGEJSON=`ls ../VSCode-darwin/*.app/Contents/Resources/app/package.json`
VERSION=`node -p "require(\"$PACKAGEJSON\").version"`
node build/azure-pipelines/common/publish.js \
"$VSCODE_QUALITY" \
darwin \
archive \
"VSCode-darwin-$VSCODE_QUALITY.zip" \
$VERSION \
true \
../VSCode-darwin.zip
# package Remote Extension Host
pushd .. && mv vscode-reh-darwin vscode-server-darwin && zip -Xry vscode-server-darwin.zip vscode-server-darwin && popd
# publish Remote Extension Host
node build/azure-pipelines/common/publish.js \
"$VSCODE_QUALITY" \
server-darwin \
archive-unsigned \
"vscode-server-darwin.zip" \
$VERSION \
true \
../vscode-server-darwin.zip
# publish hockeyapp symbols
node build/azure-pipelines/common/symbols.js "$VSCODE_MIXIN_PASSWORD" "$VSCODE_HOCKEYAPP_TOKEN" "$VSCODE_ARCH" "$VSCODE_HOCKEYAPP_ID_MACOS"
# upload configuration
yarn gulp upload-vscode-configuration

View File

@@ -0,0 +1,36 @@
trigger:
branches:
include: ['master', 'release/*']
pr:
branches:
include: ['master', 'release/*']
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
cat << EOF > ~/.netrc
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git remote add distro "https://github.com/$VSCODE_MIXIN_REPO.git"
git fetch distro
git push distro origin/master:refs/heads/master
git merge $(node -p "require('./package.json').distro")
displayName: Sync & Merge Distro

View File

@@ -0,0 +1 @@
pat

View File

@@ -0,0 +1,7 @@
#!/usr/bin/env bash
set -e
yarn gulp "vscode-linux-$VSCODE_ARCH-min"
if [[ "$VSCODE_ARCH" != "ia32" ]]; then
yarn gulp vscode-reh-linux-$VSCODE_ARCH-min
fi

View File

@@ -0,0 +1,55 @@
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- script: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- script: |
yarn gulp electron-x64
displayName: Download Electron
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- script: |
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests
- task: PublishTestResults@2
displayName: Publish Test Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

View File

@@ -0,0 +1,40 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const documentdb_1 = require("documentdb");
function createDefaultConfig(quality) {
return {
id: quality,
frozen: false
};
}
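// Query the builds/config collection for the given quality; if no config document exists, resolve with the default (unfrozen) config.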
function getConfig(quality) {
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
parameters: [
{ name: '@quality', value: quality }
]
};
return new Promise((c, e) => {
client.queryDocuments(collection, query).toArray((err, results) => {
if (err && err.code !== 409) {
return e(err);
}
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
});
});
}
getConfig(process.argv[2])
.then(config => {
console.log(config.frozen);
process.exit(0);
})
.catch(err => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,79 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
export npm_config_arch="$(VSCODE_ARCH)"
if [[ "$(VSCODE_ARCH)" == "ia32" ]]; then
export PKG_CONFIG_PATH="/usr/lib/i386-linux-gnu/pkgconfig"
fi
cat << EOF > ~/.netrc
machine monacotools.visualstudio.com
password $(devops-pat)
machine github.com
login vscode
password $(github-distro-mixin-password)
EOF
git config user.email "vscode@microsoft.com"
git config user.name "VSCode"
git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git"
git fetch distro
git merge $(node -p "require('./package.json').distro")
CHILD_CONCURRENCY=1 yarn
yarn gulp mixin
yarn gulp hygiene
yarn monaco-compile-check
node build/azure-pipelines/common/installDistro.js
node build/lib/builtInExtensions.js
displayName: Prepare build
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
./build/azure-pipelines/linux/build.sh
displayName: Build
- script: |
set -e
yarn gulp "electron-$(VSCODE_ARCH)"
# xvfb seems to be crashing often, let's make sure it's always up
service xvfb start
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
# yarn smoketest -- --build "$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)"
displayName: Run unit tests
- script: |
set -e
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)" \
VSCODE_HOCKEYAPP_TOKEN="$(vscode-hockeyapp-token)" \
./build/azure-pipelines/linux/publish.sh
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true
- task: PublishPipelineArtifact@0
displayName: 'Publish Pipeline Artifact'
inputs:
artifactName: snap-$(VSCODE_ARCH)
targetPath: .build/linux/snap-tarball

View File

@@ -0,0 +1,64 @@
#!/usr/bin/env bash
set -e
REPO="$(pwd)"
ROOT="$REPO/.."
# Publish tarball
PLATFORM_LINUX="linux-$VSCODE_ARCH"
[[ "$VSCODE_ARCH" == "ia32" ]] && DEB_ARCH="i386" || DEB_ARCH="amd64"
[[ "$VSCODE_ARCH" == "ia32" ]] && RPM_ARCH="i386" || RPM_ARCH="x86_64"
BUILDNAME="VSCode-$PLATFORM_LINUX"
BUILD="$ROOT/$BUILDNAME"
BUILD_VERSION="$(date +%s)"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
PACKAGEJSON="$BUILD/resources/app/package.json"
VERSION=$(node -p "require(\"$PACKAGEJSON\").version")
rm -rf $ROOT/code-*.tar.*
(cd $ROOT && tar -czf $TARBALL_PATH $BUILDNAME)
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_LINUX" archive-unsigned "$TARBALL_FILENAME" "$VERSION" true "$TARBALL_PATH"
# Publish Remote Extension Host
if [[ "$VSCODE_ARCH" != "ia32" ]]; then
LEGACY_SERVER_BUILD_NAME="vscode-reh-$PLATFORM_LINUX"
SERVER_BUILD_NAME="vscode-server-$PLATFORM_LINUX"
SERVER_TARBALL_FILENAME="vscode-server-$PLATFORM_LINUX.tar.gz"
SERVER_TARBALL_PATH="$ROOT/$SERVER_TARBALL_FILENAME"
rm -rf $ROOT/vscode-server-*.tar.*
(cd $ROOT && mv $LEGACY_SERVER_BUILD_NAME $SERVER_BUILD_NAME && tar -czf $SERVER_TARBALL_PATH $SERVER_BUILD_NAME)
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "server-$PLATFORM_LINUX" archive-unsigned "$SERVER_TARBALL_FILENAME" "$VERSION" true "$SERVER_TARBALL_PATH"
fi
# Publish hockeyapp symbols
node build/azure-pipelines/common/symbols.js "$VSCODE_MIXIN_PASSWORD" "$VSCODE_HOCKEYAPP_TOKEN" "$VSCODE_ARCH" "$VSCODE_HOCKEYAPP_ID_LINUX64"
# Publish DEB
yarn gulp "vscode-linux-$VSCODE_ARCH-build-deb"
PLATFORM_DEB="linux-deb-$VSCODE_ARCH"
[[ "$VSCODE_ARCH" == "ia32" ]] && DEB_ARCH="i386" || DEB_ARCH="amd64"
DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_DEB" package "$DEB_FILENAME" "$VERSION" true "$DEB_PATH"
# Publish RPM
yarn gulp "vscode-linux-$VSCODE_ARCH-build-rpm"
PLATFORM_RPM="linux-rpm-$VSCODE_ARCH"
[[ "$VSCODE_ARCH" == "ia32" ]] && RPM_ARCH="i386" || RPM_ARCH="x86_64"
RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_RPM" package "$RPM_FILENAME" "$VERSION" true "$RPM_PATH"
# Publish Snap
yarn gulp "vscode-linux-$VSCODE_ARCH-prepare-snap"
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$VSCODE_ARCH.tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)

View File

@@ -0,0 +1,55 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- task: DownloadPipelineArtifact@0
displayName: 'Download Pipeline Artifact'
inputs:
artifactName: snap-$(VSCODE_ARCH)
targetPath: .build/linux/snap-tarball
- script: |
set -e
# Get snapcraft version
snapcraft --version
# Make sure we get latest packages
sudo apt-get update
sudo apt-get upgrade -y
# Define variables
REPO="$(pwd)"
ARCH="$(VSCODE_ARCH)"
SNAP_ROOT="$REPO/.build/linux/snap/$ARCH"
# Install build dependencies
(cd build && yarn)
# Unpack snap tarball artifact, in order to preserve file perms
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$ARCH.tar.gz"
(cd .build/linux && tar -xzf $SNAP_TARBALL_PATH)
# Create snap package
BUILD_VERSION="$(date +%s)"
SNAP_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.snap"
PACKAGEJSON="$(ls $SNAP_ROOT/code*/usr/share/code*/resources/app/package.json)"
VERSION=$(node -p "require(\"$PACKAGEJSON\").version")
SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
(cd $SNAP_ROOT/code-* && sudo snapcraft snap --output "$SNAP_PATH")
# Publish snap package
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "linux-snap-$ARCH" package "$SNAP_FILENAME" "$VERSION" true "$SNAP_PATH"

View File

@@ -0,0 +1,81 @@
resources:
containers:
- container: vscode-x64
endpoint: VSCodeHub
image: vscodehub.azurecr.io/vscode-linux-build-agent:x64
- container: vscode-ia32
endpoint: VSCodeHub
image: vscodehub.azurecr.io/vscode-linux-build-agent:ia32
- container: snapcraft
image: snapcore/snapcraft
jobs:
- job: Windows
condition: eq(variables['VSCODE_BUILD_WIN32'], 'true')
pool:
vmImage: VS2017-Win2016
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: Windows32
condition: eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true')
pool:
vmImage: VS2017-Win2016
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- job: Linux
condition: eq(variables['VSCODE_BUILD_LINUX'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: x64
container: vscode-x64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnap
condition: eq(variables['VSCODE_BUILD_LINUX'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: x64
container: snapcraft
dependsOn: Linux
steps:
- template: linux/snap-build-linux.yml
- job: Linux32
condition: eq(variables['VSCODE_BUILD_LINUX_32BIT'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: ia32
container: vscode-ia32
steps:
- template: linux/product-build-linux.yml
- job: macOS
condition: eq(variables['VSCODE_BUILD_MACOS'], 'true')
pool:
vmImage: macOS 10.13
steps:
- template: darwin/product-build-darwin.yml
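# Mooncake sync waits on all platform jobs but runs regardless of their outcome (condition: true).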
- job: Mooncake
pool:
vmImage: 'Ubuntu-16.04'
condition: true
dependsOn:
- Windows
- Windows32
- Linux
- LinuxSnap
- Linux32
- macOS
steps:
- template: sync-mooncake.yml

View File

@@ -0,0 +1,2 @@
node_modules/
*.js

View File

@@ -0,0 +1,36 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const cp = require("child_process");
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
if (!isValidTag(tag)) {
throw Error(`Invalid tag ${tag}`);
}
}
catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
function isValidTag(t) {
if (t.split('.').length !== 3) {
return false;
}
const [major, minor, bug] = t.split('.');
// Only release for tags like 1.34.0
if (bug !== '0') {
return false;
}
if (isNaN(parseInt(major, 10)) || isNaN(parseInt(minor, 10))) {
return false;
}
return true;
}

View File

@@ -0,0 +1,43 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as cp from 'child_process';
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
if (!isValidTag(tag)) {
throw Error(`Invalid tag ${tag}`);
}
} catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
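// A releasable tag has exactly three numeric segments and a zero patch version (e.g. 1.34.0).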
function isValidTag(t: string) {
if (t.split('.').length !== 3) {
return false;
}
const [major, minor, bug] = t.split('.');
// Only release for tags like 1.34.0
if (bug !== '0') {
return false;
}
if (isNaN(parseInt(major, 10)) || isNaN(parseInt(minor, 10))) {
return false;
}
return true;
}

View File

@@ -0,0 +1,67 @@
# Publish @types/vscode for each release
trigger:
branches:
include: ['refs/tags/*']
pr: none
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- bash: |
# Install build dependencies
(cd build && yarn)
node build/azure-pipelines/publish-types/check-version.js
displayName: Check version
- bash: |
git config --global user.email "vscode@microsoft.com"
git config --global user.name "VSCode"
git clone https://$(GITHUB_TOKEN)@github.com/DefinitelyTyped/DefinitelyTyped.git --depth=1
node build/azure-pipelines/publish-types/update-types.js
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
cd DefinitelyTyped
git diff --color | cat
git add -A
git status
git checkout -b "vscode-types-$TAG_VERSION"
git commit -m "VS Code $TAG_VERSION Extension API"
git push origin "vscode-types-$TAG_VERSION"
displayName: Push update to DefinitelyTyped
- bash: |
TAG_VERSION=$(git describe --tags `git rev-list --tags --max-count=1`)
CHANNEL="G1C14HJ2F"
MESSAGE="DefinitelyTyped/DefinitelyTyped#vscode-types-$TAG_VERSION created. Endgame master, please open this link, examine changes and create a PR:"
LINK="https://github.com/DefinitelyTyped/DefinitelyTyped/compare/vscode-types-$TAG_VERSION?quick_pull=1&body=Updating%20VS%20Code%20Extension%20API.%20See%20https%3A%2F%2Fgithub.com%2Fmicrosoft%2Fvscode%2Fissues%2F70175%20for%20details."
MESSAGE2="[@octref, @jrieken, @kmaetzel, @egamma]. Please review and merge PR to publish @types/vscode."
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE"'"}' \
https://slack.com/api/chat.postMessage
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$LINK"'"}' \
https://slack.com/api/chat.postMessage
curl -X POST -H "Authorization: Bearer $(SLACK_TOKEN)" \
-H 'Content-type: application/json; charset=utf-8' \
--data '{"channel":"'"$CHANNEL"'", "link_names": true, "text":"'"$MESSAGE2"'"}' \
https://slack.com/api/chat.postMessage
displayName: Send message on Slack

View File

@@ -0,0 +1,62 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fs = require("fs");
const cp = require("child_process");
const path = require("path");
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
const dtsUri = `https://raw.githubusercontent.com/microsoft/vscode/${tag}/src/vs/vscode.d.ts`;
const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/vscode/index.d.ts');
cp.execSync(`curl ${dtsUri} --output ${outPath}`);
updateDTSFile(outPath, tag);
console.log(`Done updating vscode.d.ts at ${outPath}`);
}
catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
function updateDTSFile(outPath, tag) {
const oldContent = fs.readFileSync(outPath, 'utf-8');
const newContent = getNewFileContent(oldContent, tag);
fs.writeFileSync(outPath, newContent);
}
function getNewFileContent(content, tag) {
const oldheader = [
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA. See License.txt in the project root for license information.`,
` *--------------------------------------------------------------------------------------------*/`
].join('\n');
return getNewFileHeader(tag) + content.slice(oldheader.length);
}
function getNewFileHeader(tag) {
const [major, minor] = tag.split('.');
const shorttag = `${major}.${minor}`;
const header = [
`// Type definitions for Visual Studio Code ${shorttag}`,
`// Project: https://github.com/microsoft/vscode`,
`// Definitions by: Visual Studio Code Team, Microsoft <https://github.com/Microsoft>`,
`// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped`,
``,
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA.`,
` * See https://github.com/Microsoft/vscode/blob/master/LICENSE.txt for license information.`,
` *--------------------------------------------------------------------------------------------*/`,
``,
`/**`,
` * Type Definition for Visual Studio Code ${shorttag} Extension API`,
` * See https://code.visualstudio.com/api for more information`,
` */`
].join('\n');
return header;
}

View File

@@ -0,0 +1,73 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fs from 'fs';
import * as cp from 'child_process';
import * as path from 'path';
let tag = '';
try {
tag = cp
.execSync('git describe --tags `git rev-list --tags --max-count=1`')
.toString()
.trim();
const dtsUri = `https://raw.githubusercontent.com/microsoft/vscode/${tag}/src/vs/vscode.d.ts`;
const outPath = path.resolve(process.cwd(), 'DefinitelyTyped/types/vscode/index.d.ts');
cp.execSync(`curl ${dtsUri} --output ${outPath}`);
updateDTSFile(outPath, tag);
console.log(`Done updating vscode.d.ts at ${outPath}`);
} catch (err) {
console.error(err);
console.error('Failed to update types');
process.exit(1);
}
function updateDTSFile(outPath: string, tag: string) {
const oldContent = fs.readFileSync(outPath, 'utf-8');
const newContent = getNewFileContent(oldContent, tag);
fs.writeFileSync(outPath, newContent);
}
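// Replace the Source EULA header at the top of vscode.d.ts with the DefinitelyTyped-style header, leaving the rest of the file unchanged.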
function getNewFileContent(content: string, tag: string) {
const oldheader = [
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA. See License.txt in the project root for license information.`,
` *--------------------------------------------------------------------------------------------*/`
].join('\n');
return getNewFileHeader(tag) + content.slice(oldheader.length);
}
function getNewFileHeader(tag: string) {
const [major, minor] = tag.split('.');
const shorttag = `${major}.${minor}`;
const header = [
`// Type definitions for Visual Studio Code ${shorttag}`,
`// Project: https://github.com/microsoft/vscode`,
`// Definitions by: Visual Studio Code Team, Microsoft <https://github.com/Microsoft>`,
`// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped`,
``,
`/*---------------------------------------------------------------------------------------------`,
` * Copyright (c) Microsoft Corporation. All rights reserved.`,
` * Licensed under the Source EULA.`,
` * See https://github.com/Microsoft/vscode/blob/master/LICENSE.txt for license information.`,
` *--------------------------------------------------------------------------------------------*/`,
``,
`/**`,
` * Type Definition for Visual Studio Code ${shorttag} Extension API`,
` * See https://code.visualstudio.com/api for more information`,
` */`
].join('\n');
return header;
}

View File

@@ -0,0 +1,24 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- script: |
set -e
(cd build ; yarn)
AZURE_DOCUMENTDB_MASTERKEY="$(builds-docdb-key-readwrite)" \
AZURE_STORAGE_ACCESS_KEY_2="$(vscode-storage-key)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(vscode-mooncake-storage-key)" \
node build/azure-pipelines/common/sync-mooncake.js "$VSCODE_QUALITY"

View File

@@ -0,0 +1,5 @@
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-min" }
exec { yarn gulp "vscode-reh-win32-$env:VSCODE_ARCH-min" }
exec { yarn gulp "vscode-win32-$env:VSCODE_ARCH-inno-updater" }

View File

@@ -0,0 +1,54 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- powershell: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- powershell: |
yarn gulp electron
displayName: Download Electron
- powershell: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- powershell: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- powershell: |
yarn compile
displayName: Compile Sources
- powershell: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- powershell: |
.\scripts\test.bat --tfs "Unit Tests"
displayName: Run Unit Tests
- powershell: |
.\scripts\test-integration.bat --tfs "Integration Tests"
displayName: Run Integration Tests
- task: PublishTestResults@2
displayName: Publish Test Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

View File

@@ -0,0 +1,149 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- task: AzureKeyVault@1
displayName: 'Azure Key Vault: Get Secrets'
inputs:
azureSubscription: 'vscode-builds-subscription'
KeyVaultName: vscode
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine monacotools.visualstudio.com`npassword $(devops-pat)`nmachine github.com`nlogin vscode`npassword $(github-distro-mixin-password)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
exec { git config user.email "vscode@microsoft.com" }
exec { git config user.name "VSCode" }
exec { git remote add distro "https://github.com/$(VSCODE_MIXIN_REPO).git" }
exec { git fetch distro }
exec { git merge $(node -p "require('./package.json').distro") }
exec { yarn }
exec { yarn gulp mixin }
exec { yarn gulp hygiene }
exec { yarn monaco-compile-check }
exec { node build/azure-pipelines/common/installDistro.js }
exec { node build/lib/builtInExtensions.js }
displayName: Prepare build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(github-distro-mixin-password)"
.\build\azure-pipelines\win32\build.ps1
displayName: Build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp "electron-$(VSCODE_ARCH)" }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
# yarn smoketest -- --build "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
displayName: Run unit tests
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { yarn gulp "electron-$(VSCODE_ARCH)" }
exec { .\scripts\test-integration.bat --build --tfs "Integration Tests" }
displayName: Run integration tests
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH),$(agent.builddirectory)/vscode-reh-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(esrp-auth-certificate) -AuthCertificateKey $(esrp-auth-certificate-key)
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(vscode-storage-key)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(builds-docdb-key-readwrite)"
$env:VSCODE_HOCKEYAPP_TOKEN = "$(vscode-hockeyapp-token)"
.\build\azure-pipelines\win32\publish.ps1
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -0,0 +1,37 @@
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$Arch = "$env:VSCODE_ARCH"
exec { yarn gulp "vscode-win32-$Arch-archive" "vscode-win32-$Arch-system-setup" "vscode-win32-$Arch-user-setup" --sign }
$Repo = "$(pwd)"
$Root = "$Repo\.."
$SystemExe = "$Repo\.build\win32-$Arch\system-setup\VSCodeSetup.exe"
$UserExe = "$Repo\.build\win32-$Arch\user-setup\VSCodeSetup.exe"
$Zip = "$Repo\.build\win32-$Arch\archive\VSCode-win32-$Arch.zip"
$LegacyServer = "$Root\vscode-reh-win32-$Arch"
$ServerName = "vscode-server-win32-$Arch"
$Server = "$Root\$ServerName"
$ServerZip = "$Repo\.build\vscode-server-win32-$Arch.zip"
$Build = "$Root\VSCode-win32-$Arch"
# Create server archive
exec { Rename-Item -Path $LegacyServer -NewName $ServerName }
exec { .\node_modules\7zip\7zip-lite\7z.exe a -tzip $ServerZip $Server -r }
# get version
$PackageJson = Get-Content -Raw -Path "$Build\resources\app\package.json" | ConvertFrom-Json
$Version = $PackageJson.version
$Quality = "$env:VSCODE_QUALITY"
$AssetPlatform = if ("$Arch" -eq "ia32") { "win32" } else { "win32-x64" }
exec { node build/azure-pipelines/common/publish.js $Quality "$AssetPlatform-archive" archive "VSCode-win32-$Arch-$Version.zip" $Version true $Zip }
exec { node build/azure-pipelines/common/publish.js $Quality "$AssetPlatform" setup "VSCodeSetup-$Arch-$Version.exe" $Version true $SystemExe }
exec { node build/azure-pipelines/common/publish.js $Quality "$AssetPlatform-user" setup "VSCodeUserSetup-$Arch-$Version.exe" $Version true $UserExe }
exec { node build/azure-pipelines/common/publish.js $Quality "server-$AssetPlatform" archive "vscode-server-win32-$Arch.zip" $Version true $ServerZip }
# publish hockeyapp symbols
$hockeyAppId = if ("$Arch" -eq "ia32") { "$env:VSCODE_HOCKEYAPP_ID_WIN32" } else { "$env:VSCODE_HOCKEYAPP_ID_WIN64" }
exec { node build/azure-pipelines/common/symbols.js "$env:VSCODE_MIXIN_PASSWORD" "$env:VSCODE_HOCKEYAPP_TOKEN" "$Arch" $hockeyAppId }

View File

@@ -0,0 +1,70 @@
function Create-TmpJson($Obj) {
$FileName = [System.IO.Path]::GetTempFileName()
ConvertTo-Json -Depth 100 $Obj | Out-File -Encoding UTF8 $FileName
return $FileName
}
$Auth = Create-TmpJson @{
Version = "1.0.0"
AuthenticationType = "AAD_CERT"
ClientId = $env:ESRPClientId
AuthCert = @{
SubjectName = $env:ESRPAuthCertificateSubjectName
StoreLocation = "LocalMachine"
StoreName = "My"
}
RequestSigningCert = @{
SubjectName = $env:ESRPCertificateSubjectName
StoreLocation = "LocalMachine"
StoreName = "My"
}
}
$Policy = Create-TmpJson @{
Version = "1.0.0"
}
$Input = Create-TmpJson @{
Version = "1.0.0"
SignBatches = @(
@{
SourceLocationType = "UNC"
SignRequestFiles = @(
@{
SourceLocation = $args[0]
}
)
SigningInfo = @{
Operations = @(
@{
KeyCode = "CP-230012"
OperationCode = "SigntoolSign"
Parameters = @{
OpusName = "VS Code"
OpusInfo = "https://code.visualstudio.com/"
Append = "/as"
FileDigest = "/fd `"SHA256`""
PageHash = "/NPH"
TimeStamp = "/tr `"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer`" /td sha256"
}
ToolName = "sign"
ToolVersion = "1.0"
},
@{
KeyCode = "CP-230012"
OperationCode = "SigntoolVerify"
Parameters = @{
VerifyAll = "/all"
}
ToolName = "sign"
ToolVersion = "1.0"
}
)
}
}
)
}
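# Invoke ESRPClient with the generated auth, policy and input manifests to sign the requested file.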
$Output = [System.IO.Path]::GetTempFileName()
$ScriptPath = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
& "$ScriptPath\ESRPClient\packages\EsrpClient.1.0.27\tools\ESRPClient.exe" Sign -a $Auth -p $Policy -i $Input -o $Output

View File

@@ -1,12 +1,2 @@
[
{
"name": "ms-vscode.node-debug",
"version": "1.26.7",
"repo": "https://github.com/Microsoft/vscode-node-debug"
},
{
"name": "ms-vscode.node-debug2",
"version": "1.26.8",
"repo": "https://github.com/Microsoft/vscode-node-debug2"
}
]

View File

@@ -43,7 +43,7 @@ function asYarnDependency(prefix, tree) {
}
function getYarnProductionDependencies(cwd) {
const raw = cp.execSync('yarn list --json', { cwd, encoding: 'utf8', env: { ...process.env, NODE_ENV: 'production' }, stdio: [null, null, 'ignore'] });
const raw = cp.execSync('yarn list --json', { cwd, encoding: 'utf8', env: { ...process.env, NODE_ENV: 'production' }, stdio: [null, null, 'inherit'] });
const match = /^{"type":"tree".*$/m.exec(raw);
if (!match || match.length !== 1) {

View File

@@ -0,0 +1,91 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const https = require("https");
const fs = require("fs");
const path = require("path");
const cp = require("child_process");
function ensureDir(filepath) {
if (!fs.existsSync(filepath)) {
ensureDir(path.dirname(filepath));
fs.mkdirSync(filepath);
}
}
function download(options, destination) {
ensureDir(path.dirname(destination));
return new Promise((c, e) => {
const fd = fs.openSync(destination, 'w');
const req = https.get(options, (res) => {
res.on('data', (chunk) => {
fs.writeSync(fd, chunk);
});
res.on('end', () => {
fs.closeSync(fd);
c();
});
});
req.on('error', (reqErr) => {
console.error(`request to ${options.host}${options.path} failed.`);
console.error(reqErr);
e(reqErr);
});
});
}
const MARKER_ARGUMENT = `_download_fork_`;
function base64encode(str) {
return Buffer.from(str, 'utf8').toString('base64');
}
function base64decode(str) {
return Buffer.from(str, 'base64').toString('utf8');
}
function downloadInExternalProcess(options) {
const url = `https://${options.requestOptions.host}${options.requestOptions.path}`;
console.log(`Downloading ${url}...`);
return new Promise((c, e) => {
const child = cp.fork(__filename, [MARKER_ARGUMENT, base64encode(JSON.stringify(options))], {
stdio: ['pipe', 'pipe', 'pipe', 'ipc']
});
let stderr = [];
child.stderr.on('data', (chunk) => {
stderr.push(typeof chunk === 'string' ? Buffer.from(chunk) : chunk);
});
child.on('exit', (code) => {
if (code === 0) {
// normal termination
console.log(`Finished downloading ${url}.`);
c();
}
else {
// abnormal termination
console.error(Buffer.concat(stderr).toString());
e(new Error(`Download of ${url} failed.`));
}
});
});
}
exports.downloadInExternalProcess = downloadInExternalProcess;
function _downloadInExternalProcess() {
let options;
try {
options = JSON.parse(base64decode(process.argv[3]));
}
catch (err) {
console.error(`Cannot read arguments`);
console.error(err);
process.exit(-1);
return;
}
download(options.requestOptions, options.destinationPath).then(() => {
process.exit(0);
}, (err) => {
console.error(err);
process.exit(-2);
});
}
if (process.argv.length >= 4 && process.argv[2] === MARKER_ARGUMENT) {
// running as forked download script
_downloadInExternalProcess();
}

build/download/download.ts
View File

@@ -0,0 +1,111 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as https from 'https';
import * as fs from 'fs';
import * as path from 'path';
import * as cp from 'child_process';
function ensureDir(filepath: string) {
if (!fs.existsSync(filepath)) {
ensureDir(path.dirname(filepath));
fs.mkdirSync(filepath);
}
}
function download(options: https.RequestOptions, destination: string): Promise<void> {
ensureDir(path.dirname(destination));
return new Promise<void>((c, e) => {
const fd = fs.openSync(destination, 'w');
const req = https.get(options, (res) => {
res.on('data', (chunk) => {
fs.writeSync(fd, chunk);
});
res.on('end', () => {
fs.closeSync(fd);
c();
});
});
req.on('error', (reqErr) => {
console.error(`request to ${options.host}${options.path} failed.`);
console.error(reqErr);
e(reqErr);
});
});
}
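// Marker argument that tells a forked copy of this script to perform the actual download in the child process.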
const MARKER_ARGUMENT = `_download_fork_`;
function base64encode(str: string): string {
return Buffer.from(str, 'utf8').toString('base64');
}
function base64decode(str: string): string {
return Buffer.from(str, 'base64').toString('utf8');
}
export interface IDownloadRequestOptions {
host: string;
path: string;
}
export interface IDownloadOptions {
requestOptions: IDownloadRequestOptions;
destinationPath: string;
}
export function downloadInExternalProcess(options: IDownloadOptions): Promise<void> {
const url = `https://${options.requestOptions.host}${options.requestOptions.path}`;
console.log(`Downloading ${url}...`);
return new Promise<void>((c, e) => {
const child = cp.fork(
__filename,
[MARKER_ARGUMENT, base64encode(JSON.stringify(options))],
{
stdio: ['pipe', 'pipe', 'pipe', 'ipc']
}
);
let stderr: Buffer[] = [];
child.stderr.on('data', (chunk) => {
stderr.push(typeof chunk === 'string' ? Buffer.from(chunk) : chunk);
});
child.on('exit', (code) => {
if (code === 0) {
// normal termination
console.log(`Finished downloading ${url}.`);
c();
} else {
// abnormal termination
console.error(Buffer.concat(stderr).toString());
e(new Error(`Download of ${url} failed.`));
}
});
});
}
function _downloadInExternalProcess() {
let options: IDownloadOptions;
try {
options = JSON.parse(base64decode(process.argv[3]));
} catch (err) {
console.error(`Cannot read arguments`);
console.error(err);
process.exit(-1);
return;
}
download(options.requestOptions, options.destinationPath).then(() => {
process.exit(0);
}, (err) => {
console.error(err);
process.exit(-2);
});
}
if (process.argv.length >= 4 && process.argv[2] === MARKER_ARGUMENT) {
// running as forked download script
_downloadInExternalProcess();
}

build/gulpfile.compile.js
View File

@@ -0,0 +1,18 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
const util = require('./lib/util');
const task = require('./lib/task');
const compilation = require('./lib/compilation');
const { compileExtensionsBuildTask } = require('./gulpfile.extensions');
// Full compile, including nls and inline sources in sourcemaps, for build
const compileClientBuildTask = task.define('compile-client-build', task.series(util.rimraf('out-build'), compilation.compileTask('src', 'out-build', true)));
// All Build
const compileBuildTask = task.define('compile-build', task.parallel(compileClientBuildTask, compileExtensionsBuildTask));
exports.compileBuildTask = compileBuildTask;

View File

@@ -6,6 +6,7 @@
const gulp = require('gulp');
const path = require('path');
const util = require('./lib/util');
const task = require('./lib/task');
const common = require('./lib/optimize');
const es = require('event-stream');
const File = require('vinyl');
@@ -28,7 +29,7 @@ var editorEntryPoints = [
name: 'vs/editor/editor.main',
include: [],
exclude: ['vs/css', 'vs/nls'],
prepend: ['out-build/vs/css.js', 'out-build/vs/nls.js'],
prepend: ['out-editor-build/vs/css.js', 'out-editor-build/vs/nls.js'],
},
{
name: 'vs/base/common/worker/simpleWorker',
@@ -48,9 +49,6 @@ var editorResources = [
'!**/test/**'
];
var editorOtherSources = [
];
var BUNDLED_FILE_HEADER = [
'/*!-----------------------------------------------------------',
' * Copyright (c) Microsoft Corporation. All rights reserved.',
@@ -63,8 +61,7 @@ var BUNDLED_FILE_HEADER = [
const languages = i18n.defaultLanguages.concat([]); // i18n.defaultLanguages.concat(process.env.VSCODE_QUALITY !== 'stable' ? i18n.extraLanguages : []);
gulp.task('clean-editor-src', util.rimraf('out-editor-src'));
gulp.task('extract-editor-src', ['clean-editor-src'], function () {
const extractEditorSrcTask = task.define('extract-editor-src', () => {
console.log(`If the build fails, consider tweaking shakeLevel below to a lower value.`);
const apiusages = monacoapi.execute().usageContent;
const extrausages = fs.readFileSync(path.join(root, 'build', 'monaco', 'monaco.usage.recipe')).toString();
@@ -79,35 +76,39 @@ gulp.task('extract-editor-src', ['clean-editor-src'], function () {
apiusages,
extrausages
],
typings: [
'typings/lib.ie11_safe_es6.d.ts',
'typings/thenable.d.ts',
'typings/es6-promise.d.ts',
'typings/require-monaco.d.ts',
"typings/lib.es2018.promise.d.ts",
'vs/monaco.d.ts'
],
libs: [
`lib.d.ts`,
`lib.es2015.collection.d.ts`
`lib.es5.d.ts`,
`lib.dom.d.ts`,
`lib.webworker.importscripts.d.ts`
],
redirects: {
'vs/base/browser/ui/octiconLabel/octiconLabel': 'vs/base/browser/ui/octiconLabel/octiconLabel.mock',
},
compilerOptions: {
module: 2, // ModuleKind.AMD
},
shakeLevel: 2, // 0-Files, 1-InnerFile, 2-ClassMembers
importIgnorePattern: /^vs\/css!/,
importIgnorePattern: /(^vs\/css!)|(promise-polyfill\/polyfill)/,
destRoot: path.join(root, 'out-editor-src')
});
});
// Full compile, including nls and inline sources in sourcemaps, for build
gulp.task('clean-editor-build', util.rimraf('out-editor-build'));
gulp.task('compile-editor-build', ['clean-editor-build', 'extract-editor-src'], compilation.compileTask('out-editor-src', 'out-editor-build', true));
const compileEditorAMDTask = task.define('compile-editor-amd', compilation.compileTask('out-editor-src', 'out-editor-build', true));
gulp.task('clean-optimized-editor', util.rimraf('out-editor'));
gulp.task('optimize-editor', ['clean-optimized-editor', 'compile-editor-build'], common.optimizeTask({
const optimizeEditorAMDTask = task.define('optimize-editor-amd', common.optimizeTask({
src: 'out-editor-build',
entryPoints: editorEntryPoints,
otherSources: editorOtherSources,
resources: editorResources,
loaderConfig: {
paths: {
'vs': 'out-editor-build/vs',
'vs/css': 'out-editor-build/vs/css.build',
'vs/nls': 'out-editor-build/vs/nls.build',
'vscode': 'empty:'
}
},
@@ -118,29 +119,45 @@ gulp.task('optimize-editor', ['clean-optimized-editor', 'compile-editor-build'],
languages: languages
}));
gulp.task('clean-minified-editor', util.rimraf('out-editor-min'));
gulp.task('minify-editor', ['clean-minified-editor', 'optimize-editor'], common.minifyTask('out-editor'));
const minifyEditorAMDTask = task.define('minify-editor-amd', common.minifyTask('out-editor'));
gulp.task('clean-editor-esm', util.rimraf('out-editor-esm'));
gulp.task('extract-editor-esm', ['clean-editor-esm', 'clean-editor-distro'], function () {
standalone.createESMSourcesAndResources({
entryPoints: [
'vs/editor/editor.main',
'vs/editor/editor.worker'
],
outFolder: './out-editor-esm/src',
const createESMSourcesAndResourcesTask = task.define('extract-editor-esm', () => {
standalone.createESMSourcesAndResources2({
srcFolder: './out-editor-src',
outFolder: './out-editor-esm',
outResourcesFolder: './out-monaco-editor-core/esm',
redirects: {
'vs/base/browser/ui/octiconLabel/octiconLabel': 'vs/base/browser/ui/octiconLabel/octiconLabel.mock',
'vs/nls': 'vs/nls.mock',
ignores: [
'inlineEntryPoint:0.ts',
'inlineEntryPoint:1.ts',
'vs/loader.js',
'vs/nls.ts',
'vs/nls.build.js',
'vs/nls.d.ts',
'vs/css.js',
'vs/css.build.js',
'vs/css.d.ts',
'vs/base/worker/workerMain.ts',
],
renames: {
'vs/nls.mock.ts': 'vs/nls.ts'
}
});
});
gulp.task('compile-editor-esm', ['extract-editor-esm', 'clean-editor-distro'], function () {
const result = cp.spawnSync(`node`, [`../node_modules/.bin/tsc`], {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
const compileEditorESMTask = task.define('compile-editor-esm', () => {
if (process.platform === 'win32') {
const result = cp.spawnSync(`..\\node_modules\\.bin\\tsc.cmd`, {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
console.log(result.stderr.toString());
} else {
const result = cp.spawnSync(`node`, [`../node_modules/.bin/tsc`], {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
console.log(result.stderr.toString());
}
});
function toExternalDTS(contents) {
@@ -178,8 +195,16 @@ function toExternalDTS(contents) {
return lines.join('\n');
}
gulp.task('clean-editor-distro', util.rimraf('out-monaco-editor-core'));
gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify-editor', 'optimize-editor'], function () {
function filterStream(testFunc) {
return es.through(function (data) {
if (!testFunc(data.relative)) {
return;
}
this.emit('data', data);
});
}
const finalEditorResourcesTask = task.define('final-editor-resources', () => {
return es.merge(
// other assets
es.merge(
@@ -194,7 +219,7 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
this.emit('data', new File({
path: data.path.replace(/monaco\.d\.ts/, 'editor.api.d.ts'),
base: data.base,
contents: new Buffer(toExternalDTS(data.contents.toString()))
contents: Buffer.from(toExternalDTS(data.contents.toString()))
}));
}))
.pipe(gulp.dest('out-monaco-editor-core/esm/vs/editor')),
@@ -209,6 +234,14 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
}))
.pipe(gulp.dest('out-monaco-editor-core')),
// version.txt
gulp.src('build/monaco/version.txt')
.pipe(es.through(function (data) {
data.contents = Buffer.from(`monaco-editor-core: https://github.com/Microsoft/vscode/tree/${sha1}`);
this.emit('data', data);
}))
.pipe(gulp.dest('out-monaco-editor-core')),
// README.md
gulp.src('build/monaco/README-npm.md')
.pipe(es.through(function (data) {
@@ -242,7 +275,7 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
var strContents = data.contents.toString();
var newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
strContents = strContents.replace(/\/\/\# sourceMappingURL=[^ ]+$/, newStr);
strContents = strContents.replace(/\/\/# sourceMappingURL=[^ ]+$/, newStr);
data.contents = Buffer.from(strContents);
this.emit('data', data);
@@ -258,59 +291,31 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
);
});
gulp.task('analyze-editor-distro', function () {
// @ts-ignore
var bundleInfo = require('../out-editor/bundleInfo.json');
var graph = bundleInfo.graph;
var bundles = bundleInfo.bundles;
var inverseGraph = {};
Object.keys(graph).forEach(function (module) {
var dependencies = graph[module];
dependencies.forEach(function (dep) {
inverseGraph[dep] = inverseGraph[dep] || [];
inverseGraph[dep].push(module);
});
});
var detailed = {};
Object.keys(bundles).forEach(function (entryPoint) {
var included = bundles[entryPoint];
var includedMap = {};
included.forEach(function (included) {
includedMap[included] = true;
});
var explanation = [];
included.map(function (included) {
if (included.indexOf('!') >= 0) {
return;
}
var reason = (inverseGraph[included] || []).filter(function (mod) {
return !!includedMap[mod];
});
explanation.push({
module: included,
reason: reason
});
});
detailed[entryPoint] = explanation;
});
console.log(JSON.stringify(detailed, null, '\t'));
});
function filterStream(testFunc) {
return es.through(function (data) {
if (!testFunc(data.relative)) {
return;
}
this.emit('data', data);
});
}
gulp.task('editor-distro',
task.series(
task.parallel(
util.rimraf('out-editor-src'),
util.rimraf('out-editor-build'),
util.rimraf('out-editor-esm'),
util.rimraf('out-monaco-editor-core'),
util.rimraf('out-editor'),
util.rimraf('out-editor-min')
),
extractEditorSrcTask,
task.parallel(
task.series(
compileEditorAMDTask,
optimizeEditorAMDTask,
minifyEditorAMDTask
),
task.series(
createESMSourcesAndResourcesTask,
compileEditorESMTask
)
),
finalEditorResourcesTask
)
);
//#region monaco type checking
@@ -330,6 +335,7 @@ function createTscCompileTask(watch) {
let errors = [];
let reporter = createReporter();
let report;
// eslint-disable-next-line no-control-regex
let magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
child.stdout.on('data', data => {
@@ -363,7 +369,10 @@ function createTscCompileTask(watch) {
};
}
gulp.task('monaco-typecheck-watch', createTscCompileTask(true));
gulp.task('monaco-typecheck', createTscCompileTask(false));
const monacoTypecheckWatchTask = task.define('monaco-typecheck-watch', createTscCompileTask(true));
exports.monacoTypecheckWatchTask = monacoTypecheckWatchTask;
const monacoTypecheckTask = task.define('monaco-typecheck', createTscCompileTask(false));
exports.monacoTypecheckTask = monacoTypecheckTask;
//#endregion

View File

@@ -11,8 +11,8 @@ const path = require('path');
const tsb = require('gulp-tsb');
const es = require('event-stream');
const filter = require('gulp-filter');
const rimraf = require('rimraf');
const util = require('./lib/util');
const task = require('./lib/task');
const watcher = require('./lib/watch');
const createReporter = require('./lib/reporter').createReporter;
const glob = require('glob');
@@ -21,6 +21,7 @@ const nlsDev = require('vscode-nls-dev');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
const plumber = require('gulp-plumber');
const _ = require('underscore');
const extensionsPath = path.join(path.dirname(__dirname), 'extensions');
@@ -35,22 +36,13 @@ const tasks = compilations.map(function (tsconfigFile) {
const absolutePath = path.join(extensionsPath, tsconfigFile);
const relativeDirname = path.dirname(tsconfigFile);
const tsOptions = require(absolutePath).compilerOptions;
const tsconfig = require(absolutePath);
const tsOptions = _.assign({}, tsconfig.extends ? require(path.join(extensionsPath, relativeDirname, tsconfig.extends)).compilerOptions : {}, tsconfig.compilerOptions);
tsOptions.verbose = false;
tsOptions.sourceMap = true;
const name = relativeDirname.replace(/\//g, '-');
// Tasks
const clean = 'clean-extension:' + name;
const compile = 'compile-extension:' + name;
const watch = 'watch-extension:' + name;
// Build Tasks
const cleanBuild = 'clean-extension-build:' + name;
const compileBuild = 'compile-extension-build:' + name;
const watchBuild = 'watch-extension-build:' + name;
const root = path.join('extensions', relativeDirname);
const srcBase = path.join(root, 'src');
const src = path.join(srcBase, '**');
@@ -109,18 +101,18 @@ const tasks = compilations.map(function (tsconfigFile) {
const srcOpts = { cwd: path.dirname(__dirname), base: srcBase };
gulp.task(clean, cb => rimraf(out, cb));
const cleanTask = task.define(`clean-extension-${name}`, util.rimraf(out));
gulp.task(compile, [clean], () => {
const compileTask = task.define(`compile-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false, true);
const input = gulp.src(src, srcOpts);
return input
.pipe(pipeline())
.pipe(gulp.dest(out));
});
}));
gulp.task(watch, [clean], () => {
const watchTask = task.define(`watch-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false);
const input = gulp.src(src, srcOpts);
const watchInput = watcher(src, srcOpts);
@@ -128,43 +120,35 @@ const tasks = compilations.map(function (tsconfigFile) {
return watchInput
.pipe(util.incremental(pipeline, input))
.pipe(gulp.dest(out));
});
}));
gulp.task(cleanBuild, cb => rimraf(out, cb));
gulp.task(compileBuild, [clean], () => {
const compileBuildTask = task.define(`compile-build-extension-${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(true, true);
const input = gulp.src(src, srcOpts);
return input
.pipe(pipeline())
.pipe(gulp.dest(out));
});
}));
gulp.task(watchBuild, [clean], () => {
const pipeline = createPipeline(true);
const input = gulp.src(src, srcOpts);
const watchInput = watcher(src, srcOpts);
return watchInput
.pipe(util.incremental(() => pipeline(), input))
.pipe(gulp.dest(out));
});
// Tasks
gulp.task(compileTask);
gulp.task(watchTask);
return {
clean: clean,
compile: compile,
watch: watch,
cleanBuild: cleanBuild,
compileBuild: compileBuild,
watchBuild: watchBuild
compileTask: compileTask,
watchTask: watchTask,
compileBuildTask: compileBuildTask
};
});
gulp.task('clean-extensions', tasks.map(t => t.clean));
gulp.task('compile-extensions', tasks.map(t => t.compile));
gulp.task('watch-extensions', tasks.map(t => t.watch));
const compileExtensionsTask = task.define('compile-extensions', task.parallel(...tasks.map(t => t.compileTask)));
gulp.task(compileExtensionsTask);
exports.compileExtensionsTask = compileExtensionsTask;
gulp.task('clean-extensions-build', tasks.map(t => t.cleanBuild));
gulp.task('compile-extensions-build', tasks.map(t => t.compileBuild));
gulp.task('watch-extensions-build', tasks.map(t => t.watchBuild));
const watchExtensionsTask = task.define('watch-extensions', task.parallel(...tasks.map(t => t.watchTask)));
gulp.task(watchExtensionsTask);
exports.watchExtensionsTask = watchExtensionsTask;
const compileExtensionsBuildTask = task.define('compile-extensions-build', task.parallel(...tasks.map(t => t.compileBuildTask)));
exports.compileExtensionsBuildTask = compileExtensionsBuildTask;

View File

@@ -42,12 +42,15 @@ const indentationFilter = [
// except specific files
'!ThirdPartyNotices.txt',
'!LICENSE.txt',
'!LICENSE.{txt,rtf}',
'!LICENSES.chromium.html',
'!**/LICENSE',
'!src/vs/nls.js',
'!src/vs/nls.build.js',
'!src/vs/css.js',
'!src/vs/css.build.js',
'!src/vs/loader.js',
'!src/vs/base/common/marked/marked.js',
'!src/vs/base/common/winjs.base.js',
'!src/vs/base/node/terminateProcess.sh',
'!src/vs/base/node/cpuUsage.sh',
'!test/assert.js',
@@ -78,13 +81,23 @@ const indentationFilter = [
'!src/vs/*/**/*.d.ts',
'!src/typings/**/*.d.ts',
'!extensions/**/*.d.ts',
'!**/*.{svg,exe,png,bmp,scpt,bat,cmd,cur,ttf,woff,eot,md,ps1,template,yaml,yml,d.ts.recipe}',
'!build/{lib,tslintRules}/**/*.js',
'!**/*.{svg,exe,png,bmp,scpt,bat,cmd,cur,ttf,woff,eot,md,ps1,template,yaml,yml,d.ts.recipe,ico,icns}',
'!build/{lib,tslintRules,download}/**/*.js',
'!build/**/*.sh',
'!build/tfs/**/*.js',
'!build/tfs/**/*.config',
'!build/azure-pipelines/**/*.js',
'!build/azure-pipelines/**/*.config',
'!**/Dockerfile',
'!extensions/markdown-language-features/media/*.js'
'!**/Dockerfile.*',
'!**/*.Dockerfile',
'!**/*.dockerfile',
'!extensions/markdown-language-features/media/*.js',
// {{SQL CARBON EDIT}}
'!**/*.{xlf,docx,sql,vsix,bacpac}',
'!extensions/mssql/sqltoolsservice/**',
'!extensions/import/flatfileimportservice/**',
'!extensions/admin-tool-ext-win/ssmsmin/**',
'!extensions/resource-deployment/notebooks/**',
'!extensions/mssql/notebooks/**'
];
const copyrightFilter = [
@@ -96,6 +109,8 @@ const copyrightFilter = [
'!**/*.md',
'!**/*.bat',
'!**/*.cmd',
'!**/*.ico',
'!**/*.icns',
'!**/*.xml',
'!**/*.sh',
'!**/*.txt',
@@ -103,13 +118,47 @@ const copyrightFilter = [
'!**/*.opts',
'!**/*.disabled',
'!**/*.code-workspace',
'!**/promise-polyfill/polyfill.js',
'!build/**/*.init',
'!resources/linux/snap/snapcraft.yaml',
'!resources/linux/snap/electron-launch',
'!resources/win32/bin/code.js',
'!resources/completions/**',
'!extensions/markdown-language-features/media/highlight.css',
'!extensions/html-language-features/server/src/modes/typescript/*',
'!extensions/*/server/bin/*'
'!extensions/*/server/bin/*',
// {{SQL CARBON EDIT}}
'!extensions/notebook/src/intellisense/text.ts',
'!extensions/mssql/src/objectExplorerNodeProvider/webhdfs.ts',
'!src/sql/workbench/parts/notebook/outputs/tableRenderers.ts',
'!src/sql/workbench/parts/notebook/outputs/common/url.ts',
'!src/sql/workbench/parts/notebook/outputs/common/renderMimeInterfaces.ts',
'!src/sql/workbench/parts/notebook/outputs/common/outputProcessor.ts',
'!src/sql/workbench/parts/notebook/outputs/common/mimemodel.ts',
'!src/sql/workbench/parts/notebook/cellViews/media/*.css',
'!src/sql/base/browser/ui/table/plugins/rowSelectionModel.plugin.ts',
'!src/sql/base/browser/ui/table/plugins/rowDetailView.ts',
'!src/sql/base/browser/ui/table/plugins/headerFilter.plugin.ts',
'!src/sql/base/browser/ui/table/plugins/checkboxSelectColumn.plugin.ts',
'!src/sql/base/browser/ui/table/plugins/cellSelectionModel.plugin.ts',
'!src/sql/base/browser/ui/table/plugins/autoSizeColumns.plugin.ts',
'!src/sql/workbench/parts/notebook/outputs/sanitizer.ts',
'!src/sql/workbench/parts/notebook/outputs/renderers.ts',
'!src/sql/workbench/parts/notebook/outputs/registry.ts',
'!src/sql/workbench/parts/notebook/outputs/factories.ts',
'!src/sql/workbench/parts/notebook/models/nbformat.ts',
'!extensions/markdown-language-features/media/tomorrow.css',
'!src/sql/workbench/electron-browser/modelComponents/media/highlight.css',
'!src/sql/parts/modelComponents/highlight.css',
'!extensions/mssql/sqltoolsservice/**',
'!extensions/import/flatfileimportservice/**',
'!extensions/notebook/src/prompts/**',
'!extensions/mssql/src/prompts/**',
'!extensions/notebook/resources/jupyter_config/**',
'!**/*.gif',
'!**/*.xlf',
'!**/*.dacpac',
'!**/*.bacpac'
];
const eslintFilter = [
@@ -120,7 +169,6 @@ const eslintFilter = [
'!src/vs/nls.js',
'!src/vs/css.build.js',
'!src/vs/nls.build.js',
'!src/**/winjs.base.js',
'!src/**/marked.js',
'!**/test/**'
];
@@ -139,6 +187,11 @@ const tslintFilter = [
'!extensions/html-language-features/server/lib/jquery.d.ts'
];
// {{SQL CARBON EDIT}}
const useStrictFilter = [
'src/**'
];
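
For context, these glob arrays are consumed by gulp-filter further down in the hygiene() pipeline: positive globs include files, '!' globs exclude them. A small sketch, assuming nothing beyond the calls already used in this file:

const filter = require('gulp-filter');
const vfs = require('vinyl-fs');
const es = require('event-stream');

vfs.src(['src/**', 'extensions/**'], { base: '.', allowEmpty: true })
	.pipe(filter(['src/**']))            // same shape as useStrictFilter above
	.pipe(es.through(function (file) {   // downstream checks now only see src/ files
		console.log('checking', file.relative);
		this.emit('data', file);
	}));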
// {{SQL CARBON EDIT}}
const copyrightHeaderLines = [
'/*---------------------------------------------------------------------------------------------',
@@ -156,8 +209,7 @@ gulp.task('eslint', () => {
});
gulp.task('tslint', () => {
// {{SQL CARBON EDIT}}
const options = { emitError: false };
const options = { emitError: true };
return vfs.src(all, { base: '.', follow: true, allowEmpty: true })
.pipe(filter(tslintFilter))
@@ -190,8 +242,8 @@ function hygiene(some) {
});
const copyrights = es.through(function (file) {
const lines = file.__lines;
for (let i = 0; i < copyrightHeaderLines.length; i++) {
if (lines[i] !== copyrightHeaderLines[i]) {
console.error(file.relative + ': Missing or bad copyright statement');
@@ -203,6 +255,23 @@ function hygiene(some) {
this.emit('data', file);
});
// {{SQL CARBON EDIT}}
// Check for unnecessary 'use strict' lines. These are automatically added by the alwaysStrict compiler option so don't need to be added manually
const useStrict = es.through(function (file) {
const lines = file.__lines;
// Only take the first 10 lines to reduce false positives- the compiler will throw an error if it's not the first non-comment line in a file
// (10 is used to account for copyright and extraneous newlines)
lines.slice(0, 10).forEach((line, i) => {
if (/\s*'use\s*strict\s*'/.test(line)) {
console.error(file.relative + '(' + (i + 1) + ',1): Unnecessary \'use strict\' - this is already added by the compiler');
errorCount++;
}
});
this.emit('data', file);
});
// {{SQL CARBON EDIT}} END
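// (Editorial sketch, not part of the file.) To make the check above concrete, a tiny self-contained illustration of what it flags; the file name and lines are hypothetical:
const sampleLines = [
	'/*---------------------------------------------------------------------------------------------',
	' *  Copyright (c) ...',
	' *--------------------------------------------------------------------------------------------*/',
	"'use strict';",
	'export const answer = 42;'
];
sampleLines.slice(0, 10).forEach((line, i) => {
	if (/\s*'use\s*strict\s*'/.test(line)) {
		// matches line 4 -> sample.ts(4,1): Unnecessary 'use strict'
		console.error(`sample.ts(${i + 1},1): Unnecessary 'use strict' - already added by the compiler`);
	}
});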
const formatting = es.map(function (file, cb) {
tsfmt.processString(file.path, file.contents.toString('utf8'), {
verify: false,
@@ -223,7 +292,7 @@ function hygiene(some) {
let formatted = result.dest.replace(/\r\n/gm, '\n');
if (original !== formatted) {
console.error('File not formatted:', file.relative);
console.error("File not formatted. Run the 'Format Document' command to fix it:", file.relative);
errorCount++;
}
cb(null, file);
@@ -255,27 +324,52 @@ function hygiene(some) {
.pipe(filter(f => !f.stat.isDirectory()))
.pipe(filter(indentationFilter))
.pipe(indentation)
.pipe(filter(copyrightFilter));
// {{SQL CARBON EDIT}}
// .pipe(copyrights);
.pipe(filter(copyrightFilter))
.pipe(copyrights);
const typescript = result
.pipe(filter(tslintFilter))
.pipe(formatting)
.pipe(tsl);
.pipe(tsl)
// {{SQL CARBON EDIT}}
.pipe(filter(useStrictFilter))
.pipe(useStrict);
const javascript = result
.pipe(filter(eslintFilter))
.pipe(gulpeslint('src/.eslintrc'))
.pipe(gulpeslint.formatEach('compact'));
// {{SQL CARBON EDIT}}
// .pipe(gulpeslint.failAfterError());
.pipe(gulpeslint.formatEach('compact'))
.pipe(gulpeslint.failAfterError());
let count = 0;
return es.merge(typescript, javascript)
.pipe(es.through(function (data) {
// {{SQL CARBON EDIT}}
this.emit('end');
count++;
if (process.env['TRAVIS'] && count % 10 === 0) {
process.stdout.write('.');
}
this.emit('data', data);
}, function () {
process.stdout.write('\n');
const tslintResult = tsLinter.getResult();
if (tslintResult.failures.length > 0) {
for (const failure of tslintResult.failures) {
const name = failure.getFileName();
const position = failure.getStartPosition();
const line = position.getLineAndCharacter().line;
const character = position.getLineAndCharacter().character;
console.error(`${name}:${line + 1}:${character + 1}:${failure.getFailure()}`);
}
errorCount += tslintResult.failures.length;
}
if (errorCount > 0) {
this.emit('error', 'Hygiene failed with ' + errorCount + ' errors. Check \'build/gulpfile.hygiene.js\'.');
} else {
this.emit('end');
}
}));
}
@@ -293,7 +387,7 @@ function createGitIndexVinyls(paths) {
return e(err);
}
cp.exec(`git show :${relativePath}`, { maxBuffer: 2000 * 1024, encoding: 'buffer' }, (err, out) => {
cp.exec(`git show ":${relativePath}"`, { maxBuffer: 2000 * 1024, encoding: 'buffer' }, (err, out) => {
if (err) {
return e(err);
}
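
The quoting change above matters for staged paths that contain spaces, which would otherwise be word-split by the shell. A minimal check (the file name is hypothetical):

const cp = require('child_process');
const relativePath = 'docs/release notes.md';   // hypothetical staged file
cp.exec(`git show ":${relativePath}"`, { maxBuffer: 2000 * 1024, encoding: 'buffer' }, (err, out) => {
	if (err) { return console.error(err.message); }
	console.log(`read ${out.length} bytes from the index`);
});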


@@ -6,22 +6,13 @@
'use strict';
const gulp = require('gulp');
const json = require('gulp-json-editor');
const buffer = require('gulp-buffer');
const filter = require('gulp-filter');
const es = require('event-stream');
const util = require('./lib/util');
const remote = require('gulp-remote-src');
const zip = require('gulp-vinyl-zip');
const assign = require('object-assign');
// {{SQL CARBON EDIT}}
const jeditor = require('gulp-json-editor');
const pkg = require('../package.json');
const product = require('../product.json');
gulp.task('mixin', function () {
// {{SQL CARBON EDIT}}
const updateUrl = process.env['SQLOPS_UPDATEURL'];
if (!updateUrl) {
console.log('Missing SQLOPS_UPDATEURL, skipping mixin');
@@ -35,19 +26,53 @@ gulp.task('mixin', function () {
return;
}
// {{SQL CARBON EDIT}}
let serviceUrl = 'https://sqlopsextensions.blob.core.windows.net/marketplace/v1/extensionsGallery.json';
if (quality === 'insider') {
serviceUrl = `https://sqlopsextensions.blob.core.windows.net/marketplace/v1/extensionsGallery-${quality}.json`;
}
// {{SQL CARBON EDIT}} - apply ADS insiders values if needed
let newValues = {
"nameShort": product.nameShort,
"nameLong": product.nameLong,
"applicationName": product.applicationName,
"dataFolderName": product.dataFolderName,
"win32MutexName": product.win32MutexName,
"win32DirName": product.win32DirName,
"win32NameVersion": product.win32NameVersion,
"win32RegValueName": product.win32RegValueName,
"win32AppId": product.win32AppId,
"win32x64AppId": product.win32x64AppId,
"win32UserAppId": product.win32UserAppId,
"win32x64UserAppId": product.win32x64UserAppId,
"win32AppUserModelId": product.win32AppUserModelId,
"win32ShellNameShort": product.win32ShellNameShort,
"darwinBundleIdentifier": product.darwinBundleIdentifier,
"updateUrl": updateUrl,
"quality": quality,
"extensionsGallery": {
"serviceUrl": serviceUrl
"serviceUrl": 'https://sqlopsextensions.blob.core.windows.net/marketplace/v1/extensionsGallery.json'
}
};
if (quality === 'insider') {
let dashSuffix = '-insiders';
let dotSuffix = '.insiders';
let displaySuffix = ' - Insiders';
newValues.extensionsGallery.serviceUrl = `https://sqlopsextensions.blob.core.windows.net/marketplace/v1/extensionsGallery-${quality}.json`;
newValues.nameShort += dashSuffix;
newValues.nameLong += displaySuffix;
newValues.applicationName += dashSuffix;
newValues.dataFolderName += dashSuffix;
newValues.win32MutexName += dashSuffix;
newValues.win32DirName += displaySuffix;
newValues.win32NameVersion += displaySuffix;
newValues.win32RegValueName += dashSuffix;
newValues.win32AppId = "{{9F0801B2-DEE3-4272-A2C6-FBDF25BAAF0F}";
newValues.win32x64AppId = "{{6748A5FD-29EB-4BA6-B3C6-E7B981B8D6B0}";
newValues.win32UserAppId = "{{0F8CD1ED-483C-40EB-8AD2-8ED784651AA1}";
newValues.win32x64UserAppId += dashSuffix;
newValues.win32AppUserModelId += dotSuffix;
newValues.win32ShellNameShort += displaySuffix;
newValues.darwinBundleIdentifier += dotSuffix;
}
return gulp.src('./product.json')
.pipe(jeditor(newValues))
.pipe(gulp.dest('.'));
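
Stripped of the insiders branching, the mixin boils down to a deep merge into product.json via gulp-json-editor. A reduced sketch; the values shown are placeholders, not the shipped ones:

'use strict';
const gulp = require('gulp');
const jeditor = require('gulp-json-editor');

const newValues = {
	quality: 'insider',                              // placeholder
	updateUrl: 'https://example.invalid/update',     // placeholder; the real value comes from SQLOPS_UPDATEURL
	extensionsGallery: {
		serviceUrl: 'https://sqlopsextensions.blob.core.windows.net/marketplace/v1/extensionsGallery-insider.json'
	}
};

gulp.task('mixin-sketch', () =>
	gulp.src('./product.json')
		.pipe(jeditor(newValues))    // merges newValues into the parsed JSON
		.pipe(gulp.dest('.')));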

build/gulpfile.reh.js Normal file

@@ -0,0 +1,16 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
const gulp = require('gulp');
const noop = () => { return Promise.resolve(); };
gulp.task('vscode-reh-win32-ia32-min', noop);
gulp.task('vscode-reh-win32-x64-min', noop);
gulp.task('vscode-reh-darwin-min', noop);
gulp.task('vscode-reh-linux-x64-min', noop);
gulp.task('vscode-reh-linux-arm-min', noop);


@@ -28,7 +28,6 @@ const formatFiles = (some) => {
console.info('ran formatting on file ' + file.path + ' result: ' + result.message);
if (result.error) {
console.error(result.message);
errorCount++;
}
cb(null, file);
@@ -40,7 +39,7 @@ const formatFiles = (some) => {
.pipe(filter(f => !f.stat.isDirectory()))
.pipe(formatting);
}
};
const formatStagedFiles = () => {
const cp = require('child_process');
@@ -81,4 +80,4 @@ const formatStagedFiles = () => {
process.exit(1);
});
});
}
};


@@ -1,15 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
const gulp = require('gulp');
const mocha = require('gulp-mocha');
gulp.task('test', function () {
return gulp.src('test/all.js')
.pipe(mocha({ ui: 'tdd', delay: true }))
.once('end', function () { process.exit(); });
});


@@ -20,6 +20,7 @@ const filter = require('gulp-filter');
const json = require('gulp-json-editor');
const _ = require('underscore');
const util = require('./lib/util');
const task = require('./lib/task');
const ext = require('./lib/extensions');
const buildfile = require('../src/buildfile');
const common = require('./lib/optimize');
@@ -32,17 +33,17 @@ const i18n = require('./lib/i18n');
// {{SQL CARBON EDIT}}
const serviceDownloader = require('service-downloader').ServiceDownloadProvider;
const platformInfo = require('service-downloader/out/platform').PlatformInformation;
const glob = require('glob');
// {{SQL CARBON EDIT}} - End
const deps = require('./dependencies');
const getElectronVersion = require('./lib/electron').getElectronVersion;
const createAsar = require('./lib/asar').createAsar;
const { compileBuildTask } = require('./gulpfile.compile');
const productionDependencies = deps.getProductionDependencies(path.dirname(__dirname));
// @ts-ignore
// {{SQL CARBON EDIT}}
var del = require('del');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
const baseModules = Object.keys(process.binding('natives')).filter(n => !/^_|\//.test(n));
// {{SQL CARBON EDIT}}
const nodeModules = [
@@ -51,33 +52,12 @@ const nodeModules = [
'rxjs/Observable',
'rxjs/Subject',
'rxjs/Observer',
'ng2-charts/ng2-charts']
'ng2-charts']
.concat(Object.keys(product.dependencies || {}))
.concat(_.uniq(productionDependencies.map(d => d.name)))
.concat(baseModules);
// Build
const builtInExtensions = require('./builtInExtensions.json');
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
];
// {{SQL CARBON EDIT}}
const vsce = require('vsce');
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// the extension will be excluded from SQLOps package and will have separate vsix packages
'agent',
'import',
'profiler'
];
var azureExtensions = [ 'azurecore'];
const vscodeEntryPoints = _.flatten([
buildfile.entrypoint('vs/workbench/workbench.main'),
buildfile.base,
@@ -90,22 +70,27 @@ const vscodeResources = [
'out-build/cli.js',
'out-build/driver.js',
'out-build/bootstrap.js',
'out-build/bootstrap-fork.js',
'out-build/bootstrap-amd.js',
'out-build/bootstrap-window.js',
'out-build/paths.js',
'out-build/vs/**/*.{svg,png,cur,html}',
'!out-build/vs/code/browser/**/*.html',
'out-build/vs/base/common/performance.js',
'out-build/vs/base/node/{stdForkStart.js,terminateProcess.sh,cpuUsage.sh}',
'out-build/vs/base/node/languagePacks.js',
'out-build/vs/base/node/{stdForkStart.js,terminateProcess.sh,cpuUsage.sh,ps.sh}',
'out-build/vs/base/browser/ui/octiconLabel/octicons/**',
'out-build/vs/workbench/browser/media/*-theme.css',
'out-build/vs/workbench/electron-browser/bootstrap/**',
'out-build/vs/workbench/parts/debug/**/*.json',
'out-build/vs/workbench/parts/execution/**/*.scpt',
'out-build/vs/workbench/parts/webview/electron-browser/webview-pre.js',
'out-build/vs/workbench/contrib/debug/**/*.json',
'out-build/vs/workbench/contrib/externalTerminal/**/*.scpt',
'out-build/vs/workbench/contrib/webview/browser/pre/*.js',
'out-build/vs/workbench/contrib/webview/electron-browser/pre/*.js',
'out-build/vs/**/markdown.css',
'out-build/vs/workbench/parts/tasks/**/*.json',
'out-build/vs/workbench/parts/welcome/walkThrough/**/*.md',
'out-build/vs/workbench/contrib/tasks/**/*.json',
'out-build/vs/workbench/contrib/welcome/walkThrough/**/*.md',
'out-build/vs/workbench/services/files/**/*.exe',
'out-build/vs/workbench/services/files/**/*.md',
'out-build/vs/code/electron-browser/workbench/**',
'out-build/vs/code/electron-browser/sharedProcess/sharedProcess.js',
'out-build/vs/code/electron-browser/issue/issueReporter.js',
'out-build/vs/code/electron-browser/processExplorer/processExplorer.js',
@@ -117,19 +102,17 @@ const vscodeResources = [
'out-build/sql/parts/admin/**/*.html',
'out-build/sql/parts/connection/connectionDialog/media/*.{gif,png,svg}',
'out-build/sql/parts/common/dblist/**/*.html',
'out-build/sql/parts/dashboard/**/*.html',
'out-build/sql/workbench/parts/dashboard/**/*.html',
'out-build/sql/parts/disasterRecovery/**/*.html',
'out-build/sql/parts/common/modal/media/**',
'out-build/sql/parts/grid/load/lib/**',
'out-build/sql/parts/grid/load/loadJquery.js',
'out-build/sql/parts/grid/media/**',
'out-build/sql/parts/grid/views/**/*.html',
'out-build/sql/workbench/parts/grid/media/**',
'out-build/sql/workbench/parts/grid/views/**/*.html',
'out-build/sql/parts/tasks/**/*.html',
'out-build/sql/parts/taskHistory/viewlet/media/**',
'out-build/sql/parts/jobManagement/common/media/*.svg',
'out-build/sql/media/objectTypes/*.svg',
'out-build/sql/media/icons/*.svg',
'out-build/sql/parts/notebook/media/**/*.svg',
'out-build/sql/workbench/parts/notebook/media/**/*.svg',
'!**/test/**'
];
@@ -139,65 +122,84 @@ const BUNDLED_FILE_HEADER = [
' *--------------------------------------------------------*/'
].join('\n');
gulp.task('clean-optimized-vscode', util.rimraf('out-vscode'));
gulp.task('optimize-vscode', ['clean-optimized-vscode', 'compile-build', 'compile-extensions-build'], common.optimizeTask({
src: 'out-build',
entryPoints: vscodeEntryPoints,
otherSources: [],
resources: vscodeResources,
loaderConfig: common.loaderConfig(nodeModules),
header: BUNDLED_FILE_HEADER,
out: 'out-vscode',
bundleInfo: undefined
}));
const optimizeVSCodeTask = task.define('optimize-vscode', task.series(
task.parallel(
util.rimraf('out-vscode'),
compileBuildTask
),
common.optimizeTask({
src: 'out-build',
entryPoints: vscodeEntryPoints,
resources: vscodeResources,
loaderConfig: common.loaderConfig(nodeModules),
header: BUNDLED_FILE_HEADER,
out: 'out-vscode',
bundleInfo: undefined
})
));
gulp.task('optimize-index-js', ['optimize-vscode'], () => {
const fullpath = path.join(process.cwd(), 'out-vscode/vs/workbench/electron-browser/bootstrap/index.js');
const contents = fs.readFileSync(fullpath).toString();
const newContents = contents.replace('[/*BUILD->INSERT_NODE_MODULES*/]', JSON.stringify(nodeModules));
fs.writeFileSync(fullpath, newContents);
});
const optimizeIndexJSTask = task.define('optimize-index-js', task.series(
optimizeVSCodeTask,
() => {
const fullpath = path.join(process.cwd(), 'out-vscode/bootstrap-window.js');
const contents = fs.readFileSync(fullpath).toString();
const newContents = contents.replace('[/*BUILD->INSERT_NODE_MODULES*/]', JSON.stringify(nodeModules));
fs.writeFileSync(fullpath, newContents);
}
));
const baseUrl = `https://ticino.blob.core.windows.net/sourcemaps/${commit}/core`;
gulp.task('clean-minified-vscode', util.rimraf('out-vscode-min'));
gulp.task('minify-vscode', ['clean-minified-vscode', 'optimize-index-js'], common.minifyTask('out-vscode', baseUrl));
const sourceMappingURLBase = `https://ticino.blob.core.windows.net/sourcemaps/${commit}`;
const minifyVSCodeTask = task.define('minify-vscode', task.series(
task.parallel(
util.rimraf('out-vscode-min'),
optimizeIndexJSTask
),
common.minifyTask('out-vscode', `${sourceMappingURLBase}/core`)
));
// Package
// @ts-ignore JSON checking: darwinCredits is optional
const darwinCreditsTemplate = product.darwinCredits && _.template(fs.readFileSync(path.join(root, product.darwinCredits), 'utf8'));
function darwinBundleDocumentType(extensions, icon) {
return {
name: product.nameLong + ' document',
role: 'Editor',
ostypes: ["TEXT", "utxt", "TUTX", "****"],
extensions: extensions,
iconFile: icon
};
}
const config = {
version: getElectronVersion(),
productAppName: product.nameLong,
companyName: 'Microsoft Corporation',
copyright: 'Copyright (C) 2018 Microsoft. All rights reserved',
copyright: 'Copyright (C) 2019 Microsoft. All rights reserved',
darwinIcon: 'resources/darwin/code.icns',
darwinBundleIdentifier: product.darwinBundleIdentifier,
darwinApplicationCategoryType: 'public.app-category.developer-tools',
darwinHelpBookFolder: 'VS Code HelpBook',
darwinHelpBookName: 'VS Code HelpBook',
darwinBundleDocumentTypes: [{
name: product.nameLong + ' document',
role: 'Editor',
ostypes: ["TEXT", "utxt", "TUTX", "****"],
// {{SQL CARBON EDIT}}
extensions: ["csv", "json", "sqlplan", "sql", "xml"],
iconFile: 'resources/darwin/code_file.icns'
}],
darwinBundleDocumentTypes: [
// {{SQL CARBON EDIT}} - Remove most document types and replace with ours
darwinBundleDocumentType(["csv", "json", "sqlplan", "sql", "xml"], 'resources/darwin/code_file.icns'),
],
darwinBundleURLTypes: [{
role: 'Viewer',
name: product.nameLong,
urlSchemes: [product.urlProtocol]
}],
darwinCredits: darwinCreditsTemplate ? Buffer.from(darwinCreditsTemplate({ commit: commit, date: new Date().toISOString() })) : void 0,
darwinForceDarkModeSupport: true,
darwinCredits: darwinCreditsTemplate ? Buffer.from(darwinCreditsTemplate({ commit: commit, date: new Date().toISOString() })) : undefined,
linuxExecutableName: product.applicationName,
winIcon: 'resources/win32/code.ico',
token: process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || void 0,
token: process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || undefined,
// @ts-ignore JSON checking: electronRepository is optional
repo: product.electronRepository || void 0
repo: product.electronRepository || undefined
};
function getElectron(arch) {
@@ -210,18 +212,18 @@ function getElectron(arch) {
});
return gulp.src('package.json')
.pipe(json({ name: product.nameShort }))
.pipe(electron(electronOpts))
.pipe(filter(['**', '!**/app/package.json']))
.pipe(vfs.dest('.build/electron'));
};
}
gulp.task('clean-electron', util.rimraf('.build/electron'));
gulp.task('electron', ['clean-electron'], getElectron(process.arch));
gulp.task('electron-ia32', ['clean-electron'], getElectron('ia32'));
gulp.task('electron-x64', ['clean-electron'], getElectron('x64'));
gulp.task(task.define('electron', task.series(util.rimraf('.build/electron'), getElectron(process.arch))));
gulp.task(task.define('electron-ia32', task.series(util.rimraf('.build/electron'), getElectron('ia32'))));
gulp.task(task.define('electron-x64', task.series(util.rimraf('.build/electron'), getElectron('x64'))));
gulp.task(task.define('electron-arm', task.series(util.rimraf('.build/electron'), getElectron('armv7l'))));
gulp.task(task.define('electron-arm64', task.series(util.rimraf('.build/electron'), getElectron('arm64'))));
/**
* Compute checksums for some files.
@@ -257,116 +259,36 @@ function computeChecksum(filename) {
return hash;
}
function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
const packagePath = path.join(path.dirname(root), element.name + '.vsix');
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
// {{SQL CARBON EDIT}}
function packageAzureCoreTask(platform, arch) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', 'azurecore');
} else {
destination = path.join(destination, 'resources', 'app', 'extensions', 'azurecore');
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => azureExtensions.indexOf(name) > -1);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return ext.fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util.skipDirectories())
.pipe(util.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
function packageTask(platform, arch, opts) {
function packageTask(platform, arch, sourceFolderName, destinationFolderName, opts) {
opts = opts || {};
// {{SQL CARBON EDIT}}
const destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
const destination = path.join(path.dirname(root), destinationFolderName);
platform = platform || process.platform;
return () => {
const out = opts.minified ? 'out-vscode-min' : 'out-vscode';
const out = sourceFolderName;
const checksums = computeChecksums(out, [
'vs/workbench/workbench.main.js',
'vs/workbench/workbench.main.css',
'vs/workbench/electron-browser/bootstrap/index.html',
'vs/workbench/electron-browser/bootstrap/index.js',
'vs/workbench/electron-browser/bootstrap/preload.js'
'vs/code/electron-browser/workbench/workbench.html',
'vs/code/electron-browser/workbench/workbench.js'
]);
const src = gulp.src(out + '/**', { base: '.' })
.pipe(rename(function (path) { path.dirname = path.dirname.replace(new RegExp('^' + out), 'out'); }));
const root = path.resolve(path.join(__dirname, '..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1)
.filter(({ name }) => azureExtensions.indexOf(name) === -1);
packageBuiltInExtensions();
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return ext.fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
}));
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map(d => path.relative(root, d.path)).map(d => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util.cleanNodeModule('account-provider-azure', ['node_modules/date-utils/doc/**', 'node_modules/adal_node/node_modules/**'], undefined))
.pipe(util.cleanNodeModule('typescript', ['**/**'], undefined));
const sources = es.merge(src, localExtensions, localExtensionDependencies)
.pipe(rename(function (path) { path.dirname = path.dirname.replace(new RegExp('^' + out), 'out'); }))
.pipe(util.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
const root = path.resolve(path.join(__dirname, '..'));
// {{SQL CARBON EDIT}}
ext.packageBuiltInExtensions();
const sources = es.merge(src, ext.packageExtensionsStream({
sourceMappingURLBase: sourceMappingURLBase
}));
let version = packageJson.version;
// @ts-ignore JSON checking: quality is optional
const quality = product.quality;
@@ -377,8 +299,15 @@ function packageTask(platform, arch, opts) {
// {{SQL CARBON EDIT}}
const name = (platform === 'darwin') ? 'Azure Data Studio' : product.nameShort;
const packageJsonUpdates = { name, version };
// for linux url handling
if (platform === 'linux') {
packageJsonUpdates.desktopName = `${product.applicationName}-url-handler.desktop`;
}
const packageJsonStream = gulp.src(['package.json'], { base: '.' })
.pipe(json({ name, version }));
.pipe(json(packageJsonUpdates));
const date = new Date().toISOString();
const productJsonUpdate = { commit, date, checksums };
@@ -390,14 +319,13 @@ function packageTask(platform, arch, opts) {
const productJsonStream = gulp.src(['product.json'], { base: '.' })
.pipe(json(productJsonUpdate));
const license = gulp.src(['LICENSES.chromium.html', 'LICENSE.txt', 'ThirdPartyNotices.txt', 'licenses/**'], { base: '.' });
const watermark = gulp.src(['resources/letterpress.svg', 'resources/letterpress-dark.svg', 'resources/letterpress-hc.svg'], { base: '.' });
const license = gulp.src(['LICENSES.chromium.html', product.licenseFileName, 'ThirdPartyNotices.txt', 'licenses/**'], { base: '.', allowEmpty: true });
// TODO the API should be copied to `out` during compile, not here
const api = gulp.src('src/vs/vscode.d.ts').pipe(rename('out/vs/vscode.d.ts'));
// {{SQL CARBON EDIT}}
const dataApi = gulp.src('src/vs/data.d.ts').pipe(rename('out/sql/data.d.ts'));
// {{SQL CARBON EDIT}}
const dataApi = gulp.src('src/sql/azdata.d.ts').pipe(rename('out/sql/azdata.d.ts'));
const sqlopsAPI = gulp.src('src/sql/sqlops.d.ts').pipe(rename('out/sql/sqlops.d.ts'));
const depsSrc = [
..._.flatten(productionDependencies.map(d => path.relative(root, d.path)).map(d => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
@@ -407,29 +335,7 @@ function packageTask(platform, arch, opts) {
const deps = gulp.src(depsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util.cleanNodeModule('fsevents', ['binding.gyp', 'fsevents.cc', 'build/**', 'src/**', 'test/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('oniguruma', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node', 'src/*.js']))
.pipe(util.cleanNodeModule('windows-mutex', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-keymap', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-is-elevated', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-watchdog', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('spdlog', ['binding.gyp', 'build/**', 'deps/**', 'src/**', 'test/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('jschardet', ['dist/**']))
.pipe(util.cleanNodeModule('windows-foreground-love', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('windows-process-tree', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('gc-signals', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node', 'src/index.js']))
.pipe(util.cleanNodeModule('keytar', ['binding.gyp', 'build/**', 'src/**', 'script/**', 'node_modules/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('node-pty', ['binding.gyp', 'build/**', 'src/**', 'tools/**'], ['build/Release/*.exe', 'build/Release/*.dll', 'build/Release/*.node']))
// {{SQL CARBON EDIT}}
.pipe(util.cleanNodeModule('chart.js', ['node_modules/**'], undefined))
.pipe(util.cleanNodeModule('emmet', ['node_modules/**'], undefined))
.pipe(util.cleanNodeModule('pty.js', ['build/**'], ['build/Release/**']))
.pipe(util.cleanNodeModule('jquery-ui', ['external/**', 'demos/**'], undefined))
.pipe(util.cleanNodeModule('core-js', ['**/**'], undefined))
.pipe(util.cleanNodeModule('slickgrid', ['node_modules/**', 'examples/**'], undefined))
.pipe(util.cleanNodeModule('nsfw', ['binding.gyp', 'build/**', 'src/**', 'openpa/**', 'includes/**'], ['**/*.node', '**/*.a']))
.pipe(util.cleanNodeModule('vscode-nsfw', ['binding.gyp', 'build/**', 'src/**', 'openpa/**', 'includes/**'], ['**/*.node', '**/*.a']))
.pipe(util.cleanNodeModule('vsda', ['binding.gyp', 'README.md', 'build/**', '*.bat', '*.sh', '*.cpp', '*.h'], ['build/Release/vsda.node']))
.pipe(util.cleanNodeModules(path.join(__dirname, '.nativeignore')))
.pipe(createAsar(path.join(process.cwd(), 'node_modules'), ['**/*.node', '**/vscode-ripgrep/bin/*', '**/node-pty/build/Release/*'], 'app/node_modules.asar'));
// {{SQL CARBON EDIT}}
@@ -439,24 +345,33 @@ function packageTask(platform, arch, opts) {
'node_modules/slickgrid/**/*.*',
'node_modules/underscore/**/*.*',
'node_modules/zone.js/**/*.*',
'node_modules/chart.js/**/*.*'
'node_modules/chart.js/**/*.*',
'node_modules/chartjs-color/**/*.*',
'node_modules/chartjs-color-string/**/*.*',
'node_modules/color-convert/**/*.*',
'node_modules/color-name/**/*.*',
'node_modules/moment/**/*.*'
], { base: '.', dot: true });
let all = es.merge(
packageJsonStream,
productJsonStream,
license,
watermark,
api,
// {{SQL CARBON EDIT}}
copiedModules,
dataApi,
sqlopsAPI,
sources,
deps
);
if (platform === 'win32') {
all = es.merge(all, gulp.src(['resources/win32/code_file.ico', 'resources/win32/code_70x70.png', 'resources/win32/code_150x150.png'], { base: '.' }));
all = es.merge(all, gulp.src([
// {{SQL CARBON EDIT}} remove unused icons
'resources/win32/code_70x70.png',
'resources/win32/code_150x150.png'
], { base: '.' }));
} else if (platform === 'linux') {
all = es.merge(all, gulp.src('resources/linux/code.png', { base: '.' }));
} else if (platform === 'darwin') {
@@ -472,8 +387,10 @@ function packageTask(platform, arch, opts) {
.pipe(electron(_.extend({}, config, { platform, arch, ffmpegChromium: true })))
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
// result = es.merge(result, gulp.src('resources/completions/**', { base: '.' }));
if (platform === 'win32') {
result = es.merge(result, gulp.src('resources/win32/bin/code.js', { base: 'resources/win32' }));
result = es.merge(result, gulp.src('resources/win32/bin/code.js', { base: 'resources/win32', allowEmpty: true }));
result = es.merge(result, gulp.src('resources/win32/bin/code.cmd', { base: 'resources/win32' })
.pipe(replace('@@NAME@@', product.nameShort))
@@ -481,47 +398,66 @@ function packageTask(platform, arch, opts) {
result = es.merge(result, gulp.src('resources/win32/bin/code.sh', { base: 'resources/win32' })
.pipe(replace('@@NAME@@', product.nameShort))
.pipe(replace('@@PRODNAME@@', product.nameLong))
.pipe(replace('@@VERSION@@', version))
.pipe(replace('@@COMMIT@@', commit))
.pipe(replace('@@APPNAME@@', product.applicationName))
.pipe(replace('@@QUALITY@@', quality))
.pipe(rename(function (f) { f.basename = product.applicationName; f.extname = ''; })));
result = es.merge(result, gulp.src('resources/win32/VisualElementsManifest.xml', { base: 'resources/win32' })
.pipe(rename(product.nameShort + '.VisualElementsManifest.xml')));
} else if (platform === 'linux') {
result = es.merge(result, gulp.src('resources/linux/bin/code.sh', { base: '.' })
.pipe(replace('@@PRODNAME@@', product.nameLong))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('bin/' + product.applicationName)));
}
// submit all stats that have been collected
// during the build phase
if (opts.stats) {
result.on('end', () => {
const { submitAllStats } = require('./lib/stats');
submitAllStats(product, commit).then(() => console.log('Submitted bundle stats!'));
});
}
return result.pipe(vfs.dest(destination));
};
}
const buildRoot = path.dirname(root);
// {{SQL CARBON EDIT}}
gulp.task('vscode-win32-x64-azurecore', ['optimize-vscode'], packageAzureCoreTask('win32', 'x64'));
gulp.task('vscode-darwin-azurecore', ['optimize-vscode'], packageAzureCoreTask('darwin'));
gulp.task('vscode-linux-x64-azurecore', ['optimize-vscode'], packageAzureCoreTask('linux', 'x64'));
const BUILD_TARGETS = [
{ platform: 'win32', arch: 'ia32' },
{ platform: 'win32', arch: 'x64' },
{ platform: 'darwin', arch: null, opts: { stats: true } },
{ platform: 'linux', arch: 'ia32' },
{ platform: 'linux', arch: 'x64' },
{ platform: 'linux', arch: 'arm' },
{ platform: 'linux', arch: 'arm64' },
];
BUILD_TARGETS.forEach(buildTarget => {
const dashed = (str) => (str ? `-${str}` : ``);
const platform = buildTarget.platform;
const arch = buildTarget.arch;
const opts = buildTarget.opts;
gulp.task('clean-vscode-win32-ia32', util.rimraf(path.join(buildRoot, 'azuredatastudio-win32-ia32')));
gulp.task('clean-vscode-win32-x64', util.rimraf(path.join(buildRoot, 'azuredatastudio-win32-x64')));
gulp.task('clean-vscode-darwin', util.rimraf(path.join(buildRoot, 'azuredatastudio-darwin')));
gulp.task('clean-vscode-linux-ia32', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-ia32')));
gulp.task('clean-vscode-linux-x64', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-x64')));
gulp.task('clean-vscode-linux-arm', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-arm')));
['', 'min'].forEach(minified => {
const sourceFolderName = `out-vscode${dashed(minified)}`;
const destinationFolderName = `azuredatastudio${dashed(platform)}${dashed(arch)}`;
gulp.task('vscode-win32-ia32', ['optimize-vscode', 'clean-vscode-win32-ia32'], packageTask('win32', 'ia32'));
gulp.task('vscode-win32-x64', ['vscode-win32-x64-azurecore', 'optimize-vscode', 'clean-vscode-win32-x64'], packageTask('win32', 'x64'));
gulp.task('vscode-darwin', ['vscode-darwin-azurecore', 'optimize-vscode', 'clean-vscode-darwin'], packageTask('darwin'));
gulp.task('vscode-linux-ia32', ['optimize-vscode', 'clean-vscode-linux-ia32'], packageTask('linux', 'ia32'));
gulp.task('vscode-linux-x64', ['vscode-linux-x64-azurecore', 'optimize-vscode', 'clean-vscode-linux-x64'], packageTask('linux', 'x64'));
gulp.task('vscode-linux-arm', ['optimize-vscode', 'clean-vscode-linux-arm'], packageTask('linux', 'arm'));
gulp.task('vscode-win32-ia32-min', ['minify-vscode', 'clean-vscode-win32-ia32'], packageTask('win32', 'ia32', { minified: true }));
gulp.task('vscode-win32-x64-min', ['minify-vscode', 'clean-vscode-win32-x64'], packageTask('win32', 'x64', { minified: true }));
gulp.task('vscode-darwin-min', ['minify-vscode', 'clean-vscode-darwin'], packageTask('darwin', null, { minified: true }));
gulp.task('vscode-linux-ia32-min', ['minify-vscode', 'clean-vscode-linux-ia32'], packageTask('linux', 'ia32', { minified: true }));
gulp.task('vscode-linux-x64-min', ['minify-vscode', 'clean-vscode-linux-x64'], packageTask('linux', 'x64', { minified: true }));
gulp.task('vscode-linux-arm-min', ['minify-vscode', 'clean-vscode-linux-arm'], packageTask('linux', 'arm', { minified: true }));
const vscodeTask = task.define(`vscode${dashed(platform)}${dashed(arch)}${dashed(minified)}`, task.series(
task.parallel(
minified ? minifyVSCodeTask : optimizeVSCodeTask,
util.rimraf(path.join(buildRoot, destinationFolderName))
),
packageTask(platform, arch, sourceFolderName, destinationFolderName, opts)
));
gulp.task(vscodeTask);
});
});
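
The nested loops above generate one packaging task per platform/arch/minified combination; a quick sketch of the resulting names (only illustrative targets listed):

const dashed = (str) => (str ? `-${str}` : ``);
[{ platform: 'win32', arch: 'x64' }, { platform: 'darwin', arch: null }, { platform: 'linux', arch: 'x64' }]
	.forEach(({ platform, arch }) => {
		['', 'min'].forEach(minified => {
			// e.g. vscode-win32-x64, vscode-win32-x64-min, vscode-darwin, vscode-darwin-min, ...
			console.log(`vscode${dashed(platform)}${dashed(arch)}${dashed(minified)}`);
		});
	});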
// Transifex Localizations
@@ -544,68 +480,82 @@ const apiHostname = process.env.TRANSIFEX_API_URL;
const apiName = process.env.TRANSIFEX_API_NAME;
const apiToken = process.env.TRANSIFEX_API_TOKEN;
gulp.task('vscode-translations-push', ['optimize-vscode'], function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
gulp.task(task.define(
'vscode-translations-push',
task.series(
optimizeVSCodeTask,
function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(i18n.pushXlfFiles(apiHostname, apiName, apiToken));
});
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(i18n.pushXlfFiles(apiHostname, apiName, apiToken));
}
)
));
gulp.task('vscode-translations-push-test', ['optimize-vscode'], function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
gulp.task(task.define(
'vscode-translations-export',
task.series(
optimizeVSCodeTask,
function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
// {{SQL CARBON EDIT}}
// disable since function makes calls to VS Code Transifex API
// ).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(vfs.dest('../vscode-transifex-input'));
});
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(vfs.dest('../vscode-translations-export'));
}
)
));
gulp.task('vscode-translations-pull', function () {
[...i18n.defaultLanguages, ...i18n.extraLanguages].forEach(language => {
i18n.pullCoreAndExtensionsXlfFiles(apiHostname, apiName, apiToken, language).pipe(vfs.dest(`../vscode-localization/${language.id}/build`));
return es.merge([...i18n.defaultLanguages, ...i18n.extraLanguages].map(language => {
let includeDefault = !!innoSetupConfig[language.id].defaultInfo;
i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-localization/${language.id}/setup`));
});
return i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-translations-import/${language.id}/setup`));
}));
});
gulp.task('vscode-translations-import', function () {
[...i18n.defaultLanguages, ...i18n.extraLanguages].forEach(language => {
gulp.src(`../vscode-localization/${language.id}/build/*/*.xlf`)
.pipe(i18n.prepareI18nFiles())
.pipe(vfs.dest(`./i18n/${language.folderName}`));
// {{SQL CARBON EDIT}}
// gulp.src(`../vscode-localization/${language.id}/setup/*/*.xlf`)
// .pipe(i18n.prepareIslFiles(language, innoSetupConfig[language.id]))
// .pipe(vfs.dest(`./build/win32/i18n`));
// {{SQL CARBON EDIT}} - Replace function body with our own
return new Promise(function(resolve) {
[...i18n.defaultLanguages, ...i18n.extraLanguages].forEach(language => {
let languageId = language.translationId ? language.translationId : language.id;
gulp.src(`resources/xlf/${languageId}/**/*.xlf`)
.pipe(i18n.prepareI18nFiles())
.pipe(vfs.dest(`./i18n/${language.folderName}`));
resolve();
});
});
// {{SQL CARBON EDIT}} - End
});
// Sourcemaps
gulp.task('upload-vscode-sourcemaps', ['minify-vscode'], () => {
gulp.task('upload-vscode-sourcemaps', () => {
const vs = gulp.src('out-vscode-min/**/*.map', { base: 'out-vscode-min' })
.pipe(es.mapSync(f => {
f.path = `${f.base}/core/${f.relative}`;
return f;
}));
const extensions = gulp.src('extensions/**/out/**/*.map', { base: '.' });
const extensionsOut = gulp.src('extensions/**/out/**/*.map', { base: '.' });
const extensionsDist = gulp.src('extensions/**/dist/**/*.map', { base: '.' });
return es.merge(vs, extensions)
return es.merge(vs, extensionsOut, extensionsDist)
.pipe(es.through(function (data) {
// debug
console.log('Uploading Sourcemap', data.relative);
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
@@ -614,57 +564,8 @@ gulp.task('upload-vscode-sourcemaps', ['minify-vscode'], () => {
}));
});
const allConfigDetailsPath = path.join(os.tmpdir(), 'configuration.json');
gulp.task('upload-vscode-configuration', ['generate-vscode-configuration'], () => {
if (!shouldSetupSettingsSearch()) {
const branch = process.env.BUILD_SOURCEBRANCH;
console.log(`Only runs on master and release branches, not ${branch}`);
return;
}
if (!fs.existsSync(allConfigDetailsPath)) {
throw new Error(`configuration file at ${allConfigDetailsPath} does not exist`);
}
const settingsSearchBuildId = getSettingsSearchBuildId(packageJson);
if (!settingsSearchBuildId) {
throw new Error('Failed to compute build number');
}
return gulp.src(allConfigDetailsPath)
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'configuration',
prefix: `${settingsSearchBuildId}/${commit}/`
}));
});
function shouldSetupSettingsSearch() {
const branch = process.env.BUILD_SOURCEBRANCH;
return branch && (/\/master$/.test(branch) || branch.indexOf('/release/') >= 0);
}
function getSettingsSearchBuildId(packageJson) {
try {
const branch = process.env.BUILD_SOURCEBRANCH;
const branchId = branch.indexOf('/release/') >= 0 ? 0 :
/\/master$/.test(branch) ? 1 :
2; // Some unexpected branch
const out = cp.execSync(`git rev-list HEAD --count`);
const count = parseInt(out.toString());
// <version number><commit count><branchId (avoid unlikely conflicts)>
// 1.25.1, 1,234,567 commits, master = 1250112345671
return util.versionStringToNumber(packageJson.version) * 1e8 + count * 10 + branchId;
} catch (e) {
throw new Error('Could not determine build number: ' + e.toString());
}
}
// This task is only run for the MacOS build
gulp.task('generate-vscode-configuration', () => {
const generateVSCodeConfigurationTask = task.define('generate-vscode-configuration', () => {
return new Promise((resolve, reject) => {
const buildDir = process.env['AGENT_BUILDDIRECTORY'];
if (!buildDir) {
@@ -701,6 +602,61 @@ gulp.task('generate-vscode-configuration', () => {
});
});
const allConfigDetailsPath = path.join(os.tmpdir(), 'configuration.json');
gulp.task(task.define(
'upload-vscode-configuration',
task.series(
generateVSCodeConfigurationTask,
() => {
if (!shouldSetupSettingsSearch()) {
const branch = process.env.BUILD_SOURCEBRANCH;
console.log(`Only runs on master and release branches, not ${branch}`);
return;
}
if (!fs.existsSync(allConfigDetailsPath)) {
throw new Error(`configuration file at ${allConfigDetailsPath} does not exist`);
}
const settingsSearchBuildId = getSettingsSearchBuildId(packageJson);
if (!settingsSearchBuildId) {
throw new Error('Failed to compute build number');
}
return gulp.src(allConfigDetailsPath)
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'configuration',
prefix: `${settingsSearchBuildId}/${commit}/`
}));
}
)
));
function shouldSetupSettingsSearch() {
const branch = process.env.BUILD_SOURCEBRANCH;
return branch && (/\/master$/.test(branch) || branch.indexOf('/release/') >= 0);
}
function getSettingsSearchBuildId(packageJson) {
try {
const branch = process.env.BUILD_SOURCEBRANCH;
const branchId = branch.indexOf('/release/') >= 0 ? 0 :
/\/master$/.test(branch) ? 1 :
2; // Some unexpected branch
const out = cp.execSync(`git rev-list HEAD --count`);
const count = parseInt(out.toString());
// <version number><commit count><branchId (avoid unlikely conflicts)>
// 1.25.1, 1,234,567 commits, master = 1250112345671
return util.versionStringToNumber(packageJson.version) * 1e8 + count * 10 + branchId;
} catch (e) {
throw new Error('Could not determine build number: ' + e.toString());
}
}
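
A worked check of the build-number scheme above, assuming util.versionStringToNumber('1.25.1') yields 12501 (an assumption, but one consistent with the example in the comment):

const versionNumber = 12501;   // from '1.25.1' (assumed encoding)
const commitCount = 1234567;   // git rev-list HEAD --count
const branchId = 1;            // master
console.log(versionNumber * 1e8 + commitCount * 10 + branchId);   // 1250112345671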
// {{SQL CARBON EDIT}}
// Install service locally before building carbon
@@ -723,6 +679,28 @@ function installService() {
}
gulp.task('install-sqltoolsservice', () => {
return installService();
});
function installSsmsMin() {
const config = require('../extensions/admin-tool-ext-win/src/config.json');
return platformInfo.getCurrent().then(p => {
const runtime = p.runtimeId;
// fix path since it won't be correct
config.installDirectory = path.join(__dirname, '..', 'extensions', 'admin-tool-ext-win', config.installDirectory);
var installer = new serviceDownloader(config);
const serviceInstallFolder = installer.getInstallDirectory(runtime);
const serviceCleanupFolder = path.join(serviceInstallFolder, '..');
console.log('Cleaning up the install folder: ' + serviceCleanupFolder);
return del(serviceCleanupFolder + '/*').then(() => {
console.log('Installing the service. Install folder: ' + serviceInstallFolder);
return installer.installService(runtime);
}, delError => {
console.log('failed to delete the install folder error: ' + delError);
});
});
}
gulp.task('install-ssmsmin', () => {
return installSsmsMin();
});
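
Both install tasks follow the same download-and-install flow; a condensed sketch using only the service-downloader calls already visible in this file (the config path is illustrative):

const serviceDownloader = require('service-downloader').ServiceDownloadProvider;
const platformInfo = require('service-downloader/out/platform').PlatformInformation;

function installFromConfig(configPath) {
	const config = require(configPath);
	return platformInfo.getCurrent().then(p => {
		const installer = new serviceDownloader(config);
		console.log('Install folder: ' + installer.getInstallDirectory(p.runtimeId));
		return installer.installService(p.runtimeId);
	});
}

// e.g. installFromConfig('../extensions/admin-tool-ext-win/src/config.json');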


@@ -12,29 +12,39 @@ const shell = require('gulp-shell');
const es = require('event-stream');
const vfs = require('vinyl-fs');
const util = require('./lib/util');
const task = require('./lib/task');
const packageJson = require('../package.json');
const product = require('../product.json');
const rpmDependencies = require('../resources/linux/rpm/dependencies.json');
const path = require('path');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
const linuxPackageRevision = Math.floor(new Date().getTime() / 1000);
function getDebPackageArch(arch) {
return { x64: 'amd64', ia32: 'i386', arm: 'armhf' }[arch];
return { x64: 'amd64', ia32: 'i386', arm: 'armhf', arm64: "arm64" }[arch];
}
function prepareDebPackage(arch) {
// {{SQL CARBON EDIT}}
const binaryDir = '../azuredatastudio-linux-' + arch;
const debArch = getDebPackageArch(arch);
const destination = '.build/linux/deb/' + debArch + '/' + product.applicationName + '-' + debArch;
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename('usr/share/applications/' + product.applicationName + '.desktop'));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename('usr/share/applications/' + product.applicationName + '-url-handler.desktop'));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', product.applicationName))
.pipe(rename('usr/share/applications/' + product.applicationName + '.desktop'));
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const appdata = gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
@@ -43,7 +53,13 @@ function prepareDebPackage(arch) {
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('usr/share/pixmaps/' + product.applicationName + '.png'));
.pipe(rename('usr/share/pixmaps/' + product.linuxIconName + '.png'));
// const bash_completion = gulp.src('resources/completions/bash/code')
// .pipe(rename('usr/share/bash-completion/completions/code'));
// const zsh_completion = gulp.src('resources/completions/zsh/_code')
// .pipe(rename('usr/share/zsh/vendor-completions/_code'));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'usr/share/' + product.applicationName + '/' + p.dirname; }));
@@ -79,7 +95,7 @@ function prepareDebPackage(arch) {
.pipe(replace('@@UPDATEURL@@', product.updateUrl || '@@UPDATEURL@@'))
.pipe(rename('DEBIAN/postinst'));
const all = es.merge(control, postinst, postrm, prerm, desktop, appdata, icon, code);
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, icon, /* bash_completion, zsh_completion, */ code);
return all.pipe(vfs.dest(destination));
};
@@ -99,7 +115,7 @@ function getRpmBuildPath(rpmArch) {
}
function getRpmPackageArch(arch) {
return { x64: 'x86_64', ia32: 'i386', arm: 'armhf' }[arch];
return { x64: 'x86_64', ia32: 'i386', arm: 'armhf', arm64: "arm64" }[arch];
}
function prepareRpmPackage(arch) {
@@ -109,11 +125,17 @@ function prepareRpmPackage(arch) {
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '.desktop'));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '-url-handler.desktop'));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', product.applicationName))
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '.desktop'));
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const appdata = gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
@@ -122,7 +144,13 @@ function prepareRpmPackage(arch) {
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('BUILD/usr/share/pixmaps/' + product.applicationName + '.png'));
.pipe(rename('BUILD/usr/share/pixmaps/' + product.linuxIconName + '.png'));
// const bash_completion = gulp.src('resources/completions/bash/code')
// .pipe(rename('BUILD/usr/share/bash-completion/completions/code'));
// const zsh_completion = gulp.src('resources/completions/zsh/_code')
// .pipe(rename('BUILD/usr/share/zsh/site-functions/_code'));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'BUILD/usr/share/' + product.applicationName + '/' + p.dirname; }));
@@ -130,6 +158,7 @@ function prepareRpmPackage(arch) {
const spec = gulp.src('resources/linux/rpm/code.spec.template', { base: '.' })
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@VERSION@@', packageJson.version))
.pipe(replace('@@RELEASE@@', linuxPackageRevision))
.pipe(replace('@@ARCHITECTURE@@', rpmArch))
@@ -144,7 +173,7 @@ function prepareRpmPackage(arch) {
const specIcon = gulp.src('resources/linux/rpm/code.xpm', { base: '.' })
.pipe(rename('SOURCES/' + product.applicationName + '.xpm'));
const all = es.merge(code, desktop, appdata, icon, spec, specIcon);
const all = es.merge(code, desktops, appdata, icon, /* bash_completion, zsh_completion, */ spec, specIcon);
return all.pipe(vfs.dest(getRpmBuildPath(rpmArch)));
};
@@ -162,37 +191,45 @@ function buildRpmPackage(arch) {
'cp "' + rpmOut + '/$(ls ' + rpmOut + ')" ' + destination + '/'
]);
}
function getSnapBuildPath(arch) {
return `.build/linux/snap/${arch}/${product.applicationName}-${arch}`;
}
function prepareSnapPackage(arch) {
const binaryDir = '../VSCode-linux-' + arch;
// {{SQL CARBON EDIT}}
const binaryDir = '../azuredatastudio-linux-' + arch;
const destination = getSnapBuildPath(arch);
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename(`usr/share/applications/${product.applicationName}.desktop`));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename(`usr/share/applications/${product.applicationName}-url-handler.desktop`));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', `/usr/share/pixmaps/${product.applicationName}.png`))
.pipe(rename(`usr/share/applications/${product.applicationName}.desktop`));
.pipe(replace('@@ICON@@', `/usr/share/pixmaps/${product.linuxIconName}.png`))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename(`usr/share/pixmaps/${product.applicationName}.png`));
.pipe(rename(`usr/share/pixmaps/${product.linuxIconName}.png`));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'usr/share/' + product.applicationName + '/' + p.dirname; }));
.pipe(rename(function (p) { p.dirname = `usr/share/${product.applicationName}/${p.dirname}`; }));
const snapcraft = gulp.src('resources/linux/snap/snapcraft.yaml', { base: '.' })
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@VERSION@@', packageJson.version))
.pipe(replace('@@VERSION@@', commit.substr(0, 8)))
.pipe(rename('snap/snapcraft.yaml'));
const electronLaunch = gulp.src('resources/linux/snap/electron-launch', { base: '.' })
.pipe(rename('electron-launch'));
const all = es.merge(desktop, icon, code, snapcraft, electronLaunch);
const all = es.merge(desktops, icon, code, snapcraft, electronLaunch);
return all.pipe(vfs.dest(destination));
};
@@ -200,117 +237,39 @@ function prepareSnapPackage(arch) {
function buildSnapPackage(arch) {
const snapBuildPath = getSnapBuildPath(arch);
const snapFilename = `${product.applicationName}-${packageJson.version}-${linuxPackageRevision}-${arch}.snap`;
return shell.task([
`chmod +x ${snapBuildPath}/electron-launch`,
`cd ${snapBuildPath} && snapcraft snap --output ../${snapFilename}`
]);
return shell.task(`cd ${snapBuildPath} && snapcraft build`);
}
function getFlatpakArch(arch) {
return { x64: 'x86_64', ia32: 'i386', arm: 'arm' }[arch];
}
const BUILD_TARGETS = [
{ arch: 'ia32' },
{ arch: 'x64' },
{ arch: 'arm' },
{ arch: 'arm64' },
];
function prepareFlatpak(arch) {
// {{SQL CARBON EDIT}}
const binaryDir = '../azuredatastudio-linux-' + arch;
const flatpakArch = getFlatpakArch(arch);
const destination = '.build/linux/flatpak/' + flatpakArch;
BUILD_TARGETS.forEach((buildTarget) => {
const arch = buildTarget.arch;
return function () {
// This is not imported in the global scope to avoid requiring ImageMagick
// (or GraphicsMagick) when not building Flatpak bundles.
const imgResize = require('gulp-image-resize');
const all = [16, 24, 32, 48, 64, 128, 192, 256, 512].map(function (size) {
return gulp.src('resources/linux/code.png', { base: '.' })
.pipe(imgResize({ width: size, height: size, format: "png", noProfile: true }))
.pipe(rename('share/icons/hicolor/' + size + 'x' + size + '/apps/' + flatpakManifest.appId + '.png'));
});
all.push(gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(replace('Exec=/usr/share/@@NAME@@/@@NAME@@', 'Exec=' + product.applicationName))
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('share/applications/' + flatpakManifest.appId + '.desktop')));
all.push(gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', flatpakManifest.appId))
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('share/appdata/' + flatpakManifest.appId + '.appdata.xml')));
all.push(gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) {
p.dirname = 'share/' + product.applicationName + '/' + p.dirname;
})));
return es.merge(all).pipe(vfs.dest(destination));
};
}
function buildFlatpak(arch) {
const flatpakArch = getFlatpakArch(arch);
const manifest = {};
for (var k in flatpakManifest) {
manifest[k] = flatpakManifest[k];
{
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
// gulp.task(prepareDebTask);
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
}
manifest.files = [
['.build/linux/flatpak/' + flatpakArch, '/'],
];
const buildOptions = {
arch: flatpakArch,
subject: product.nameLong + ' ' + packageJson.version + '.' + linuxPackageRevision,
};
// If requested, use the configured path for the OSTree repository.
if (process.env.FLATPAK_REPO) {
buildOptions.repoDir = process.env.FLATPAK_REPO;
} else {
buildOptions.bundlePath = manifest.appId + '-' + flatpakArch + '.flatpak';
{
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
// gulp.task(prepareRpmTask);
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
}
// Setup PGP signing if requested.
if (process.env.GPG_KEY_ID !== undefined) {
buildOptions.gpgSign = process.env.GPG_KEY_ID;
if (process.env.GPG_HOMEDIR) {
buildOptions.gpgHomedir = process.env.GPG_HOME_DIR;
}
{
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
}
return function (cb) {
require('flatpak-bundler').bundle(manifest, buildOptions, cb);
};
}
gulp.task('clean-vscode-linux-ia32-deb', util.rimraf('.build/linux/deb/i386'));
gulp.task('clean-vscode-linux-x64-deb', util.rimraf('.build/linux/deb/amd64'));
gulp.task('clean-vscode-linux-arm-deb', util.rimraf('.build/linux/deb/armhf'));
gulp.task('clean-vscode-linux-ia32-rpm', util.rimraf('.build/linux/rpm/i386'));
gulp.task('clean-vscode-linux-x64-rpm', util.rimraf('.build/linux/rpm/x86_64'));
gulp.task('clean-vscode-linux-arm-rpm', util.rimraf('.build/linux/rpm/armhf'));
gulp.task('clean-vscode-linux-ia32-snap', util.rimraf('.build/linux/snap/x64'));
gulp.task('clean-vscode-linux-x64-snap', util.rimraf('.build/linux/snap/x64'));
gulp.task('clean-vscode-linux-arm-snap', util.rimraf('.build/linux/snap/x64'));
gulp.task('clean-vscode-linux-ia32-flatpak', util.rimraf('.build/linux/flatpak/i386'));
gulp.task('clean-vscode-linux-x64-flatpak', util.rimraf('.build/linux/flatpak/x86_64'));
gulp.task('clean-vscode-linux-arm-flatpak', util.rimraf('.build/linux/flatpak/arm'));
gulp.task('vscode-linux-ia32-prepare-deb', ['clean-vscode-linux-ia32-deb'], prepareDebPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-deb', ['clean-vscode-linux-x64-deb'], prepareDebPackage('x64'));
gulp.task('vscode-linux-arm-prepare-deb', ['clean-vscode-linux-arm-deb'], prepareDebPackage('arm'));
gulp.task('vscode-linux-ia32-build-deb', ['vscode-linux-ia32-prepare-deb'], buildDebPackage('ia32'));
gulp.task('vscode-linux-x64-build-deb', ['vscode-linux-x64-prepare-deb'], buildDebPackage('x64'));
gulp.task('vscode-linux-arm-build-deb', ['vscode-linux-arm-prepare-deb'], buildDebPackage('arm'));
gulp.task('vscode-linux-ia32-prepare-rpm', ['clean-vscode-linux-ia32-rpm'], prepareRpmPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-rpm', ['clean-vscode-linux-x64-rpm'], prepareRpmPackage('x64'));
gulp.task('vscode-linux-arm-prepare-rpm', ['clean-vscode-linux-arm-rpm'], prepareRpmPackage('arm'));
gulp.task('vscode-linux-ia32-build-rpm', ['vscode-linux-ia32-prepare-rpm'], buildRpmPackage('ia32'));
gulp.task('vscode-linux-x64-build-rpm', ['vscode-linux-x64-prepare-rpm'], buildRpmPackage('x64'));
gulp.task('vscode-linux-arm-build-rpm', ['vscode-linux-arm-prepare-rpm'], buildRpmPackage('arm'));
gulp.task('vscode-linux-ia32-prepare-snap', ['clean-vscode-linux-ia32-snap'], prepareSnapPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-snap', ['clean-vscode-linux-x64-snap'], prepareSnapPackage('x64'));
gulp.task('vscode-linux-arm-prepare-snap', ['clean-vscode-linux-arm-snap'], prepareSnapPackage('arm'));
gulp.task('vscode-linux-ia32-build-snap', ['vscode-linux-ia32-prepare-snap'], buildSnapPackage('ia32'));
gulp.task('vscode-linux-x64-build-snap', ['vscode-linux-x64-prepare-snap'], buildSnapPackage('x64'));
gulp.task('vscode-linux-arm-build-snap', ['vscode-linux-arm-prepare-snap'], buildSnapPackage('arm'));
});
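
The Linux packaging hunks above replace the old gulp 3 registrations (gulp.task(name, [dependencies], fn)) with task objects composed through build/lib/task, and fold the per-architecture copy/paste into a BUILD_TARGETS loop. A minimal sketch of that pattern, assuming task.define, task.series and util.rimraf behave as they do in the diff; examplePrepare and exampleBuild are illustrative placeholders, not functions from the real gulpfile:

const gulp = require('gulp');
const task = require('./lib/task');
const util = require('./lib/util');

// Illustrative stand-ins for prepareDebPackage/buildDebPackage and friends.
function examplePrepare(arch) { return () => Promise.resolve(); /* stage files for `arch` */ }
function exampleBuild(arch) { return () => Promise.resolve(); /* run the packaging tool for `arch` */ }

[{ arch: 'ia32' }, { arch: 'x64' }, { arch: 'arm' }, { arch: 'arm64' }].forEach(buildTarget => {
    const arch = buildTarget.arch;
    // Clean the staging folder, stage, then build -- one named task per architecture.
    const prepareTask = task.define(`example-linux-${arch}-prepare`, task.series(util.rimraf(`.build/example/${arch}`), examplePrepare(arch)));
    const buildTask = task.define(`example-linux-${arch}-build`, task.series(prepareTask, exampleBuild(arch)));
    gulp.task(buildTask);
});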

View File

@@ -12,9 +12,11 @@ const assert = require('assert');
const cp = require('child_process');
const _7z = require('7zip')['7z'];
const util = require('./lib/util');
const task = require('./lib/task');
const pkg = require('../package.json');
const product = require('../product.json');
const vfs = require('vinyl-fs');
const rcedit = require('rcedit');
const mkdirp = require('mkdirp');
const repoPath = path.dirname(__dirname);
@@ -25,18 +27,21 @@ const zipPath = arch => path.join(zipDir(arch), `VSCode-win32-${arch}.zip`);
const setupDir = (arch, target) => path.join(repoPath, '.build', `win32-${arch}`, `${target}-setup`);
const issPath = path.join(__dirname, 'win32', 'code.iss');
const innoSetupPath = path.join(path.dirname(path.dirname(require.resolve('innosetup-compiler'))), 'bin', 'ISCC.exe');
const signPS1 = path.join(repoPath, 'build', 'tfs', 'win32', 'sign.ps1');
// const signPS1 = path.join(repoPath, 'build', 'azure-pipelines', 'win32', 'sign.ps1');
function packageInnoSetup(iss, options, cb) {
options = options || {};
const definitions = options.definitions || {};
const debug = process.argv.some(arg => arg === '--debug-inno');
if (debug) {
if (process.argv.some(arg => arg === '--debug-inno')) {
definitions['Debug'] = 'true';
}
if (process.argv.some(arg => arg === '--sign')) {
definitions['Sign'] = 'true';
}
const keys = Object.keys(definitions);
keys.forEach(key => assert(typeof definitions[key] === 'string', `Missing value for '${key}' in Inno Setup package step`));
@@ -103,8 +108,8 @@ function buildWin32Setup(arch, target) {
}
function defineWin32SetupTasks(arch, target) {
gulp.task(`clean-vscode-win32-${arch}-${target}-setup`, util.rimraf(setupDir(arch, target)));
gulp.task(`vscode-win32-${arch}-${target}-setup`, [`clean-vscode-win32-${arch}-${target}-setup`], buildWin32Setup(arch, target));
const cleanTask = util.rimraf(setupDir(arch, target));
gulp.task(task.define(`vscode-win32-${arch}-${target}-setup`, task.series(cleanTask, buildWin32Setup(arch, target))));
}
defineWin32SetupTasks('ia32', 'system');
@@ -122,11 +127,8 @@ function archiveWin32Setup(arch) {
};
}
gulp.task('clean-vscode-win32-ia32-archive', util.rimraf(zipDir('ia32')));
gulp.task('vscode-win32-ia32-archive', ['clean-vscode-win32-ia32-archive'], archiveWin32Setup('ia32'));
gulp.task('clean-vscode-win32-x64-archive', util.rimraf(zipDir('x64')));
gulp.task('vscode-win32-x64-archive', ['clean-vscode-win32-x64-archive'], archiveWin32Setup('x64'));
gulp.task(task.define('vscode-win32-ia32-archive', task.series(util.rimraf(zipDir('ia32')), archiveWin32Setup('ia32'))));
gulp.task(task.define('vscode-win32-x64-archive', task.series(util.rimraf(zipDir('x64')), archiveWin32Setup('x64'))));
function copyInnoUpdater(arch) {
return () => {
@@ -135,5 +137,12 @@ function copyInnoUpdater(arch) {
};
}
gulp.task('vscode-win32-ia32-copy-inno-updater', copyInnoUpdater('ia32'));
gulp.task('vscode-win32-x64-copy-inno-updater', copyInnoUpdater('x64'));
function patchInnoUpdater(arch) {
return cb => {
const icon = path.join(repoPath, 'resources', 'win32', 'code.ico');
rcedit(path.join(buildPath(arch), 'tools', 'inno_updater.exe'), { icon }, cb);
};
}
gulp.task(task.define('vscode-win32-ia32-inno-updater', task.series(copyInnoUpdater('ia32'), patchInnoUpdater('ia32'))));
gulp.task(task.define('vscode-win32-x64-inno-updater', task.series(copyInnoUpdater('x64'), patchInnoUpdater('x64'))));
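
The win32 hunks apply the same task.define/task.series composition and finish the installer helper in two steps: copy inno_updater into the build output, then re-stamp its icon with rcedit. A rough sketch of the icon step only, using the callback-style rcedit API that the diff itself relies on; exePath and iconPath here are illustrative values, not the gulpfile's real buildPath/resources locations:

const path = require('path');
const rcedit = require('rcedit');

// Mirrors patchInnoUpdater(): rewrite the embedded icon resource of an .exe.
function patchIcon(exePath, iconPath, cb) {
    rcedit(exePath, { icon: iconPath }, cb); // rcedit(file, options, callback)
}

patchIcon(path.join('tools', 'inno_updater.exe'), path.join('resources', 'win32', 'code.ico'), err => {
    if (err) {
        console.error('rcedit failed:', err);
    }
});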

build/jsconfig.json Normal file (15 lines added)
View File

@@ -0,0 +1,15 @@
{
"compilerOptions": {
"module": "commonjs",
"target": "es2017",
"jsx": "preserve",
"checkJs": true
},
"include": [
"**/*.js"
],
"exclude": [
"node_modules",
"**/node_modules/*"
]
}
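
The new jsconfig.json opts the plain-JavaScript build scripts into TypeScript's checker via checkJs, without converting them to .ts. A small, hypothetical file of the kind this now covers; the helper and the deliberate mistake are illustrative only:

// example.js -- an imagined helper under build/, so the jsconfig above applies to it.
const path = require('path');

function outDir(arch) {
    return path.join('.build', arch);
}

// With "checkJs": true the editor/tsc reports an arity error on the next line
// ("Expected 1 arguments, but got 2") instead of silently ignoring the extra flag.
const out = outDir('x64', true);
console.log(out);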

View File

@@ -4,33 +4,33 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var es = require("event-stream");
var pickle = require("chromium-pickle-js");
var Filesystem = require("asar/lib/filesystem");
var VinylFile = require("vinyl");
var minimatch = require("minimatch");
const path = require("path");
const es = require("event-stream");
const pickle = require('chromium-pickle-js');
const Filesystem = require('asar/lib/filesystem');
const VinylFile = require("vinyl");
const minimatch = require("minimatch");
function createAsar(folderPath, unpackGlobs, destFilename) {
var shouldUnpackFile = function (file) {
for (var i = 0; i < unpackGlobs.length; i++) {
const shouldUnpackFile = (file) => {
for (let i = 0; i < unpackGlobs.length; i++) {
if (minimatch(file.relative, unpackGlobs[i])) {
return true;
}
}
return false;
};
var filesystem = new Filesystem(folderPath);
var out = [];
const filesystem = new Filesystem(folderPath);
const out = [];
// Keep track of pending inserts
var pendingInserts = 0;
var onFileInserted = function () { pendingInserts--; };
let pendingInserts = 0;
let onFileInserted = () => { pendingInserts--; };
// Do not insert twice the same directory
var seenDir = {};
var insertDirectoryRecursive = function (dir) {
const seenDir = {};
const insertDirectoryRecursive = (dir) => {
if (seenDir[dir]) {
return;
}
var lastSlash = dir.lastIndexOf('/');
let lastSlash = dir.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = dir.lastIndexOf('\\');
}
@@ -40,8 +40,8 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
seenDir[dir] = true;
filesystem.insertDirectory(dir);
};
var insertDirectoryForFile = function (file) {
var lastSlash = file.lastIndexOf('/');
const insertDirectoryForFile = (file) => {
let lastSlash = file.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = file.lastIndexOf('\\');
}
@@ -49,7 +49,7 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
insertDirectoryRecursive(file.substring(0, lastSlash));
}
};
var insertFile = function (relativePath, stat, shouldUnpack) {
const insertFile = (relativePath, stat, shouldUnpack) => {
insertDirectoryForFile(relativePath);
pendingInserts++;
filesystem.insertFile(relativePath, shouldUnpack, { stat: stat }, {}, onFileInserted);
@@ -59,13 +59,13 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
return;
}
if (!file.stat.isFile()) {
throw new Error("unknown item in stream!");
throw new Error(`unknown item in stream!`);
}
var shouldUnpack = shouldUnpackFile(file);
const shouldUnpack = shouldUnpackFile(file);
insertFile(file.relative, { size: file.contents.length, mode: file.stat.mode }, shouldUnpack);
if (shouldUnpack) {
// The file goes outside of xx.asar, in a folder xx.asar.unpacked
var relative = path.relative(folderPath, file.path);
const relative = path.relative(folderPath, file.path);
this.queue(new VinylFile({
cwd: folderPath,
base: folderPath,
@@ -79,34 +79,33 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
out.push(file.contents);
}
}, function () {
var _this = this;
var finish = function () {
let finish = () => {
{
var headerPickle = pickle.createEmpty();
const headerPickle = pickle.createEmpty();
headerPickle.writeString(JSON.stringify(filesystem.header));
var headerBuf = headerPickle.toBuffer();
var sizePickle = pickle.createEmpty();
const headerBuf = headerPickle.toBuffer();
const sizePickle = pickle.createEmpty();
sizePickle.writeUInt32(headerBuf.length);
var sizeBuf = sizePickle.toBuffer();
const sizeBuf = sizePickle.toBuffer();
out.unshift(headerBuf);
out.unshift(sizeBuf);
}
var contents = Buffer.concat(out);
const contents = Buffer.concat(out);
out.length = 0;
_this.queue(new VinylFile({
this.queue(new VinylFile({
cwd: folderPath,
base: folderPath,
path: destFilename,
contents: contents
}));
_this.queue(null);
this.queue(null);
};
// Call finish() only when all file inserts have finished...
if (pendingInserts === 0) {
finish();
}
else {
onFileInserted = function () {
onFileInserted = () => {
pendingInserts--;
if (pendingInserts === 0) {
finish();
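
createAsar above is an event-stream through stream: every vinyl file is registered in asar's Filesystem, files matching unpackGlobs are re-emitted next to the archive (the xx.asar.unpacked folder), and on flush a size pickle plus a JSON header pickle are prepended to the concatenated file contents. A hedged usage sketch, assuming the function is exported from build/lib/asar as in the rest of the file; the source folder, glob, and destination are illustrative:

const gulp = require('gulp');
const vfs = require('vinyl-fs');
const asar = require('./lib/asar');

// Pack everything under `src` into node_modules.asar, but leave native
// *.node binaries unpacked so Electron can load them from disk.
const src = 'out-example';
gulp.src(src + '/**/*', { base: src, dot: true })
    .pipe(asar.createAsar(src, ['**/*.node'], 'node_modules.asar'))
    .pipe(vfs.dest('.build/example'));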

View File

@@ -7,8 +7,8 @@
import * as path from 'path';
import * as es from 'event-stream';
import * as pickle from 'chromium-pickle-js';
import * as Filesystem from 'asar/lib/filesystem';
const pickle = require('chromium-pickle-js');
const Filesystem = require('asar/lib/filesystem');
import * as VinylFile from 'vinyl';
import * as minimatch from 'minimatch';

View File

@@ -14,7 +14,8 @@ const es = require('event-stream');
const rename = require('gulp-rename');
const vfs = require('vinyl-fs');
const ext = require('./extensions');
const util = require('gulp-util');
const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');
const root = path.dirname(path.dirname(__dirname));
const builtInExtensions = require('../builtInExtensions.json');
@@ -43,22 +44,22 @@ function isUpToDate(extension) {
function syncMarketplaceExtension(extension) {
if (isUpToDate(extension)) {
util.log(util.colors.blue('[marketplace]'), `${extension.name}@${extension.version}`, util.colors.green('✔︎'));
fancyLog(ansiColors.blue('[marketplace]'), `${extension.name}@${extension.version}`, ansiColors.green('✔︎'));
return es.readArray([]);
}
rimraf.sync(getExtensionPath(extension));
return ext.fromMarketplace(extension.name, extension.version)
return ext.fromMarketplace(extension.name, extension.version, extension.metadata)
.pipe(rename(p => p.dirname = `${extension.name}/${p.dirname}`))
.pipe(vfs.dest('.build/builtInExtensions'))
.on('end', () => util.log(util.colors.blue('[marketplace]'), extension.name, util.colors.green('✔︎')));
.on('end', () => fancyLog(ansiColors.blue('[marketplace]'), extension.name, ansiColors.green('✔︎')));
}
function syncExtension(extension, controlState) {
switch (controlState) {
case 'disabled':
util.log(util.colors.blue('[disabled]'), util.colors.gray(extension.name));
fancyLog(ansiColors.blue('[disabled]'), ansiColors.gray(extension.name));
return es.readArray([]);
case 'marketplace':
@@ -66,15 +67,15 @@ function syncExtension(extension, controlState) {
default:
if (!fs.existsSync(controlState)) {
util.log(util.colors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but that path does not exist.`));
fancyLog(ansiColors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but that path does not exist.`));
return es.readArray([]);
} else if (!fs.existsSync(path.join(controlState, 'package.json'))) {
util.log(util.colors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but there is no 'package.json' file in that directory.`));
fancyLog(ansiColors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but there is no 'package.json' file in that directory.`));
return es.readArray([]);
}
util.log(util.colors.blue('[local]'), `${extension.name}: ${util.colors.cyan(controlState)}`, util.colors.green('✔︎'));
fancyLog(ansiColors.blue('[local]'), `${extension.name}: ${ansiColors.cyan(controlState)}`, ansiColors.green('✔︎'));
return es.readArray([]);
}
}
@@ -93,8 +94,8 @@ function writeControlFile(control) {
}
function main() {
util.log('Synchronizing built-in extensions...');
util.log(`You can manage built-in extensions with the ${util.colors.cyan('--builtin')} flag`);
fancyLog('Synchronizing built-in extensions...');
fancyLog(`You can manage built-in extensions with the ${ansiColors.cyan('--builtin')} flag`);
const control = readControlFile();
const streams = [];
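
These hunks retire the deprecated gulp-util logger in favour of fancy-log plus ansi-colors, keeping the call shape identical: a coloured tag first, then ordinary message arguments. A minimal sketch; the '[example]' tag and the messages are made up:

const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');

// Same pattern as the rewritten util.log calls above.
fancyLog(ansiColors.blue('[example]'), 'my-extension@1.0.0', ansiColors.green('✔︎'));
fancyLog(ansiColors.red('Error: built-in extension path does not exist.'));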

View File

@@ -4,19 +4,19 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var fs = require("fs");
var path = require("path");
var vm = require("vm");
const fs = require("fs");
const path = require("path");
const vm = require("vm");
/**
* Bundle `entryPoints` given config `config`.
*/
function bundle(entryPoints, config, callback) {
var entryPointsMap = {};
entryPoints.forEach(function (module) {
const entryPointsMap = {};
entryPoints.forEach((module) => {
entryPointsMap[module.name] = module;
});
var allMentionedModulesMap = {};
entryPoints.forEach(function (module) {
const allMentionedModulesMap = {};
entryPoints.forEach((module) => {
allMentionedModulesMap[module.name] = true;
(module.include || []).forEach(function (includedModule) {
allMentionedModulesMap[includedModule] = true;
@@ -25,26 +25,30 @@ function bundle(entryPoints, config, callback) {
allMentionedModulesMap[excludedModule] = true;
});
});
var code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
var r = vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
var loaderModule = { exports: {} };
const code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
const r = vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
const loaderModule = { exports: {} };
r.call({}, require, loaderModule, loaderModule.exports);
var loader = loaderModule.exports;
const loader = loaderModule.exports;
config.isBuild = true;
config.paths = config.paths || {};
config.paths['vs/nls'] = 'out-build/vs/nls.build';
config.paths['vs/css'] = 'out-build/vs/css.build';
if (!config.paths['vs/nls']) {
config.paths['vs/nls'] = 'out-build/vs/nls.build';
}
if (!config.paths['vs/css']) {
config.paths['vs/css'] = 'out-build/vs/css.build';
}
loader.config(config);
loader(['require'], function (localRequire) {
var resolvePath = function (path) {
var r = localRequire.toUrl(path);
loader(['require'], (localRequire) => {
const resolvePath = (path) => {
const r = localRequire.toUrl(path);
if (!/\.js/.test(r)) {
return r + '.js';
}
return r;
};
for (var moduleId in entryPointsMap) {
var entryPoint = entryPointsMap[moduleId];
for (const moduleId in entryPointsMap) {
const entryPoint = entryPointsMap[moduleId];
if (entryPoint.append) {
entryPoint.append = entryPoint.append.map(resolvePath);
}
@@ -53,59 +57,59 @@ function bundle(entryPoints, config, callback) {
}
}
});
loader(Object.keys(allMentionedModulesMap), function () {
var modules = loader.getBuildInfo();
var partialResult = emitEntryPoints(modules, entryPointsMap);
var cssInlinedResources = loader('vs/css').getInlinedResources();
loader(Object.keys(allMentionedModulesMap), () => {
const modules = loader.getBuildInfo();
const partialResult = emitEntryPoints(modules, entryPointsMap);
const cssInlinedResources = loader('vs/css').getInlinedResources();
callback(null, {
files: partialResult.files,
cssInlinedResources: cssInlinedResources,
bundleData: partialResult.bundleData
});
}, function (err) { return callback(err, null); });
}, (err) => callback(err, null));
}
exports.bundle = bundle;
function emitEntryPoints(modules, entryPoints) {
var modulesMap = {};
modules.forEach(function (m) {
const modulesMap = {};
modules.forEach((m) => {
modulesMap[m.id] = m;
});
var modulesGraph = {};
modules.forEach(function (m) {
const modulesGraph = {};
modules.forEach((m) => {
modulesGraph[m.id] = m.dependencies;
});
var sortedModules = topologicalSort(modulesGraph);
var result = [];
var usedPlugins = {};
var bundleData = {
const sortedModules = topologicalSort(modulesGraph);
let result = [];
const usedPlugins = {};
const bundleData = {
graph: modulesGraph,
bundles: {}
};
Object.keys(entryPoints).forEach(function (moduleToBundle) {
var info = entryPoints[moduleToBundle];
var rootNodes = [moduleToBundle].concat(info.include || []);
var allDependencies = visit(rootNodes, modulesGraph);
var excludes = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach(function (excludeRoot) {
var allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach(function (exclude) {
Object.keys(entryPoints).forEach((moduleToBundle) => {
const info = entryPoints[moduleToBundle];
const rootNodes = [moduleToBundle].concat(info.include || []);
const allDependencies = visit(rootNodes, modulesGraph);
const excludes = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach((excludeRoot) => {
const allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach((exclude) => {
delete allDependencies[exclude];
});
});
var includedModules = sortedModules.filter(function (module) {
const includedModules = sortedModules.filter((module) => {
return allDependencies[module];
});
bundleData.bundles[moduleToBundle] = includedModules;
var res = emitEntryPoint(modulesMap, modulesGraph, moduleToBundle, includedModules, info.prepend, info.append, info.dest);
const res = emitEntryPoint(modulesMap, modulesGraph, moduleToBundle, includedModules, info.prepend || [], info.append || [], info.dest);
result = result.concat(res.files);
for (var pluginName in res.usedPlugins) {
for (const pluginName in res.usedPlugins) {
usedPlugins[pluginName] = usedPlugins[pluginName] || res.usedPlugins[pluginName];
}
});
Object.keys(usedPlugins).forEach(function (pluginName) {
var plugin = usedPlugins[pluginName];
Object.keys(usedPlugins).forEach((pluginName) => {
const plugin = usedPlugins[pluginName];
if (typeof plugin.finishBuild === 'function') {
var write = function (filename, contents) {
const write = (filename, contents) => {
result.push({
dest: filename,
sources: [{
@@ -124,16 +128,16 @@ function emitEntryPoints(modules, entryPoints) {
};
}
function extractStrings(destFiles) {
var parseDefineCall = function (moduleMatch, depsMatch) {
var module = moduleMatch.replace(/^"|"$/g, '');
var deps = depsMatch.split(',');
deps = deps.map(function (dep) {
const parseDefineCall = (moduleMatch, depsMatch) => {
const module = moduleMatch.replace(/^"|"$/g, '');
let deps = depsMatch.split(',');
deps = deps.map((dep) => {
dep = dep.trim();
dep = dep.replace(/^"|"$/g, '');
dep = dep.replace(/^'|'$/g, '');
var prefix = null;
var _path = null;
var pieces = dep.split('!');
let prefix = null;
let _path = null;
const pieces = dep.split('!');
if (pieces.length > 1) {
prefix = pieces[0] + '!';
_path = pieces[1];
@@ -143,7 +147,7 @@ function extractStrings(destFiles) {
_path = pieces[0];
}
if (/^\.\//.test(_path) || /^\.\.\//.test(_path)) {
var res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
const res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
return prefix + res;
}
return prefix + _path;
@@ -153,7 +157,7 @@ function extractStrings(destFiles) {
deps: deps
};
};
destFiles.forEach(function (destFile, index) {
destFiles.forEach((destFile) => {
if (!/\.js$/.test(destFile.dest)) {
return;
}
@@ -161,44 +165,44 @@ function extractStrings(destFiles) {
return;
}
// Do one pass to record the usage counts for each module id
var useCounts = {};
destFile.sources.forEach(function (source) {
var matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
const useCounts = {};
destFile.sources.forEach((source) => {
const matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
if (!matches) {
return;
}
var defineCall = parseDefineCall(matches[1], matches[2]);
const defineCall = parseDefineCall(matches[1], matches[2]);
useCounts[defineCall.module] = (useCounts[defineCall.module] || 0) + 1;
defineCall.deps.forEach(function (dep) {
defineCall.deps.forEach((dep) => {
useCounts[dep] = (useCounts[dep] || 0) + 1;
});
});
var sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort(function (a, b) {
const sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort((a, b) => {
return useCounts[b] - useCounts[a];
});
var replacementMap = {};
sortedByUseModules.forEach(function (module, index) {
const replacementMap = {};
sortedByUseModules.forEach((module, index) => {
replacementMap[module] = index;
});
destFile.sources.forEach(function (source) {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, function (_, moduleMatch, depsMatch) {
var defineCall = parseDefineCall(moduleMatch, depsMatch);
return "define(__m[" + replacementMap[defineCall.module] + "/*" + defineCall.module + "*/], __M([" + defineCall.deps.map(function (dep) { return replacementMap[dep] + '/*' + dep + '*/'; }).join(',') + "])";
destFile.sources.forEach((source) => {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, (_, moduleMatch, depsMatch) => {
const defineCall = parseDefineCall(moduleMatch, depsMatch);
return `define(__m[${replacementMap[defineCall.module]}/*${defineCall.module}*/], __M([${defineCall.deps.map(dep => replacementMap[dep] + '/*' + dep + '*/').join(',')}])`;
});
});
destFile.sources.unshift({
path: null,
contents: [
'(function() {',
"var __m = " + JSON.stringify(sortedByUseModules) + ";",
"var __M = function(deps) {",
" var result = [];",
" for (var i = 0, len = deps.length; i < len; i++) {",
" result[i] = __m[deps[i]];",
" }",
" return result;",
"};"
`var __m = ${JSON.stringify(sortedByUseModules)};`,
`var __M = function(deps) {`,
` var result = [];`,
` for (var i = 0, len = deps.length; i < len; i++) {`,
` result[i] = __m[deps[i]];`,
` }`,
` return result;`,
`};`
].join('\n')
});
destFile.sources.push({
@@ -210,7 +214,7 @@ function extractStrings(destFiles) {
}
function removeDuplicateTSBoilerplate(destFiles) {
// Taken from typescript compiler => emitFiles
var BOILERPLATE = [
const BOILERPLATE = [
{ start: /^var __extends/, end: /^}\)\(\);$/ },
{ start: /^var __assign/, end: /^};$/ },
{ start: /^var __decorate/, end: /^};$/ },
@@ -219,14 +223,14 @@ function removeDuplicateTSBoilerplate(destFiles) {
{ start: /^var __awaiter/, end: /^};$/ },
{ start: /^var __generator/, end: /^};$/ },
];
destFiles.forEach(function (destFile) {
var SEEN_BOILERPLATE = [];
destFile.sources.forEach(function (source) {
var lines = source.contents.split(/\r\n|\n|\r/);
var newLines = [];
var IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE;
for (var i = 0; i < lines.length; i++) {
var line = lines[i];
destFiles.forEach((destFile) => {
const SEEN_BOILERPLATE = [];
destFile.sources.forEach((source) => {
const lines = source.contents.split(/\r\n|\n|\r/);
const newLines = [];
let IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (IS_REMOVING_BOILERPLATE) {
newLines.push('');
if (END_BOILERPLATE.test(line)) {
@@ -234,8 +238,8 @@ function removeDuplicateTSBoilerplate(destFiles) {
}
}
else {
for (var j = 0; j < BOILERPLATE.length; j++) {
var boilerplate = BOILERPLATE[j];
for (let j = 0; j < BOILERPLATE.length; j++) {
const boilerplate = BOILERPLATE[j];
if (boilerplate.start.test(line)) {
if (SEEN_BOILERPLATE[j]) {
IS_REMOVING_BOILERPLATE = true;
@@ -263,45 +267,45 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
if (!dest) {
dest = entryPoint + '.js';
}
var mainResult = {
const mainResult = {
sources: [],
dest: dest
}, results = [mainResult];
var usedPlugins = {};
var getLoaderPlugin = function (pluginName) {
const usedPlugins = {};
const getLoaderPlugin = (pluginName) => {
if (!usedPlugins[pluginName]) {
usedPlugins[pluginName] = modulesMap[pluginName].exports;
}
return usedPlugins[pluginName];
};
includedModules.forEach(function (c) {
var bangIndex = c.indexOf('!');
includedModules.forEach((c) => {
const bangIndex = c.indexOf('!');
if (bangIndex >= 0) {
var pluginName = c.substr(0, bangIndex);
var plugin = getLoaderPlugin(pluginName);
const pluginName = c.substr(0, bangIndex);
const plugin = getLoaderPlugin(pluginName);
mainResult.sources.push(emitPlugin(entryPoint, plugin, pluginName, c.substr(bangIndex + 1)));
return;
}
var module = modulesMap[c];
const module = modulesMap[c];
if (module.path === 'empty:') {
return;
}
var contents = readFileAndRemoveBOM(module.path);
const contents = readFileAndRemoveBOM(module.path);
if (module.shim) {
mainResult.sources.push(emitShimmedModule(c, deps[c], module.shim, module.path, contents));
}
else {
mainResult.sources.push(emitNamedModule(c, deps[c], module.defineLocation, module.path, contents));
mainResult.sources.push(emitNamedModule(c, module.defineLocation, module.path, contents));
}
});
Object.keys(usedPlugins).forEach(function (pluginName) {
var plugin = usedPlugins[pluginName];
Object.keys(usedPlugins).forEach((pluginName) => {
const plugin = usedPlugins[pluginName];
if (typeof plugin.writeFile === 'function') {
var req = (function () {
const req = (() => {
throw new Error('no-no!');
});
req.toUrl = function (something) { return something; };
var write = function (filename, contents) {
req.toUrl = something => something;
const write = (filename, contents) => {
results.push({
dest: filename,
sources: [{
@@ -313,15 +317,15 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
plugin.writeFile(pluginName, entryPoint, req, write, {});
}
});
var toIFile = function (path) {
var contents = readFileAndRemoveBOM(path);
const toIFile = (path) => {
const contents = readFileAndRemoveBOM(path);
return {
path: path,
contents: contents
};
};
var toPrepend = (prepend || []).map(toIFile);
var toAppend = (append || []).map(toIFile);
const toPrepend = (prepend || []).map(toIFile);
const toAppend = (append || []).map(toIFile);
mainResult.sources = toPrepend.concat(mainResult.sources).concat(toAppend);
return {
files: results,
@@ -329,8 +333,8 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
};
}
function readFileAndRemoveBOM(path) {
var BOM_CHAR_CODE = 65279;
var contents = fs.readFileSync(path, 'utf8');
const BOM_CHAR_CODE = 65279;
let contents = fs.readFileSync(path, 'utf8');
// Remove BOM
if (contents.charCodeAt(0) === BOM_CHAR_CODE) {
contents = contents.substring(1);
@@ -338,15 +342,15 @@ function readFileAndRemoveBOM(path) {
return contents;
}
function emitPlugin(entryPoint, plugin, pluginName, moduleName) {
var result = '';
let result = '';
if (typeof plugin.write === 'function') {
var write = (function (what) {
const write = ((what) => {
result += what;
});
write.getEntryPoint = function () {
write.getEntryPoint = () => {
return entryPoint;
};
write.asModule = function (moduleId, code) {
write.asModule = (moduleId, code) => {
code = code.replace(/^define\(/, 'define("' + moduleId + '",');
result += code;
};
@@ -357,20 +361,20 @@ function emitPlugin(entryPoint, plugin, pluginName, moduleName) {
contents: result
};
}
function emitNamedModule(moduleId, myDeps, defineCallPosition, path, contents) {
function emitNamedModule(moduleId, defineCallPosition, path, contents) {
// `defineCallPosition` is the position in code: |define()
var defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
const defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
// `parensOffset` is the position in code: define|()
var parensOffset = contents.indexOf('(', defineCallOffset);
var insertStr = '"' + moduleId + '", ';
const parensOffset = contents.indexOf('(', defineCallOffset);
const insertStr = '"' + moduleId + '", ';
return {
path: path,
contents: contents.substr(0, parensOffset + 1) + insertStr + contents.substr(parensOffset + 1)
};
}
function emitShimmedModule(moduleId, myDeps, factory, path, contents) {
var strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
var strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
const strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
const strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
return {
path: path,
contents: contents + '\n;\n' + strDefine
@@ -383,7 +387,8 @@ function positionToOffset(str, desiredLine, desiredCol) {
if (desiredLine === 1) {
return desiredCol - 1;
}
var line = 1, lastNewLineOffset = -1;
let line = 1;
let lastNewLineOffset = -1;
do {
if (desiredLine === line) {
return lastNewLineOffset + 1 + desiredCol - 1;
@@ -397,14 +402,15 @@ function positionToOffset(str, desiredLine, desiredCol) {
* Return a set of reachable nodes in `graph` starting from `rootNodes`
*/
function visit(rootNodes, graph) {
var result = {}, queue = rootNodes;
rootNodes.forEach(function (node) {
const result = {};
const queue = rootNodes;
rootNodes.forEach((node) => {
result[node] = true;
});
while (queue.length > 0) {
var el = queue.shift();
var myEdges = graph[el] || [];
myEdges.forEach(function (toNode) {
const el = queue.shift();
const myEdges = graph[el] || [];
myEdges.forEach((toNode) => {
if (!result[toNode]) {
result[toNode] = true;
queue.push(toNode);
@@ -417,11 +423,11 @@ function visit(rootNodes, graph) {
* Perform a topological sort on `graph`
*/
function topologicalSort(graph) {
var allNodes = {}, outgoingEdgeCount = {}, inverseEdges = {};
Object.keys(graph).forEach(function (fromNode) {
const allNodes = {}, outgoingEdgeCount = {}, inverseEdges = {};
Object.keys(graph).forEach((fromNode) => {
allNodes[fromNode] = true;
outgoingEdgeCount[fromNode] = graph[fromNode].length;
graph[fromNode].forEach(function (toNode) {
graph[fromNode].forEach((toNode) => {
allNodes[toNode] = true;
outgoingEdgeCount[toNode] = outgoingEdgeCount[toNode] || 0;
inverseEdges[toNode] = inverseEdges[toNode] || [];
@@ -429,8 +435,8 @@ function topologicalSort(graph) {
});
});
// https://en.wikipedia.org/wiki/Topological_sorting
var S = [], L = [];
Object.keys(allNodes).forEach(function (node) {
const S = [], L = [];
Object.keys(allNodes).forEach((node) => {
if (outgoingEdgeCount[node] === 0) {
delete outgoingEdgeCount[node];
S.push(node);
@@ -439,10 +445,10 @@ function topologicalSort(graph) {
while (S.length > 0) {
// Ensure the exact same order all the time with the same inputs
S.sort();
var n = S.shift();
const n = S.shift();
L.push(n);
var myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach(function (m) {
const myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach((m) => {
outgoingEdgeCount[m]--;
if (outgoingEdgeCount[m] === 0) {
delete outgoingEdgeCount[m];
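
extractStrings in the listing above compresses module ids: it counts how often each id appears across define() headers, sorts ids by use count, rewrites each define("id", [deps]) header to index into a shared __m table through __M, and unshifts that table onto the bundle. A small before/after illustration with made-up module names; the __M helper is the same one the code injects:

// Before rewriting (illustrative ids):
//     define("vs/a", ["vs/b", "vs/c"], function (b, c) { /* ... */ });
//
// After extractStrings, with the table below prepended to the bundle:
//     define(__m[1/*vs/a*/], __M([0/*vs/b*/,2/*vs/c*/]), function (b, c) { /* ... */ });
var __m = ["vs/b", "vs/a", "vs/c"];
var __M = function (deps) {
    var result = [];
    for (var i = 0, len = deps.length; i < len; i++) {
        result[i] = __m[deps[i]];
    }
    return result;
};
console.log(__M([0, 2])); // -> ["vs/b", "vs/c"]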

View File

@@ -3,9 +3,9 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import fs = require('fs');
import path = require('path');
import vm = require('vm');
import * as fs from 'fs';
import * as path from 'path';
import * as vm from 'vm';
interface IPosition {
line: number;
@@ -46,7 +46,7 @@ export interface IEntryPoint {
name: string;
include?: string[];
exclude?: string[];
prepend: string[];
prepend?: string[];
append?: string[];
dest?: string;
}
@@ -64,7 +64,7 @@ interface INodeSet {
}
export interface IFile {
path: string;
path: string | null;
contents: string;
}
@@ -97,13 +97,13 @@ export interface ILoaderConfig {
/**
* Bundle `entryPoints` given config `config`.
*/
export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callback: (err: any, result: IBundleResult) => void): void {
let entryPointsMap: IEntryPointMap = {};
export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callback: (err: any, result: IBundleResult | null) => void): void {
const entryPointsMap: IEntryPointMap = {};
entryPoints.forEach((module: IEntryPoint) => {
entryPointsMap[module.name] = module;
});
let allMentionedModulesMap: { [modules: string]: boolean; } = {};
const allMentionedModulesMap: { [modules: string]: boolean; } = {};
entryPoints.forEach((module: IEntryPoint) => {
allMentionedModulesMap[module.name] = true;
(module.include || []).forEach(function (includedModule) {
@@ -115,28 +115,32 @@ export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callba
});
var code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
var r: Function = <any>vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
var loaderModule = { exports: {} };
const code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
const r: Function = <any>vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
const loaderModule = { exports: {} };
r.call({}, require, loaderModule, loaderModule.exports);
var loader: any = loaderModule.exports;
const loader: any = loaderModule.exports;
config.isBuild = true;
config.paths = config.paths || {};
config.paths['vs/nls'] = 'out-build/vs/nls.build';
config.paths['vs/css'] = 'out-build/vs/css.build';
if (!config.paths['vs/nls']) {
config.paths['vs/nls'] = 'out-build/vs/nls.build';
}
if (!config.paths['vs/css']) {
config.paths['vs/css'] = 'out-build/vs/css.build';
}
loader.config(config);
loader(['require'], (localRequire) => {
let resolvePath = (path: string) => {
let r = localRequire.toUrl(path);
loader(['require'], (localRequire: any) => {
const resolvePath = (path: string) => {
const r = localRequire.toUrl(path);
if (!/\.js/.test(r)) {
return r + '.js';
}
return r;
};
for (let moduleId in entryPointsMap) {
let entryPoint = entryPointsMap[moduleId];
for (const moduleId in entryPointsMap) {
const entryPoint = entryPointsMap[moduleId];
if (entryPoint.append) {
entryPoint.append = entryPoint.append.map(resolvePath);
}
@@ -147,76 +151,76 @@ export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callba
});
loader(Object.keys(allMentionedModulesMap), () => {
let modules = <IBuildModuleInfo[]>loader.getBuildInfo();
let partialResult = emitEntryPoints(modules, entryPointsMap);
let cssInlinedResources = loader('vs/css').getInlinedResources();
const modules = <IBuildModuleInfo[]>loader.getBuildInfo();
const partialResult = emitEntryPoints(modules, entryPointsMap);
const cssInlinedResources = loader('vs/css').getInlinedResources();
callback(null, {
files: partialResult.files,
cssInlinedResources: cssInlinedResources,
bundleData: partialResult.bundleData
});
}, (err) => callback(err, null));
}, (err: any) => callback(err, null));
}
function emitEntryPoints(modules: IBuildModuleInfo[], entryPoints: IEntryPointMap): IPartialBundleResult {
let modulesMap: IBuildModuleInfoMap = {};
const modulesMap: IBuildModuleInfoMap = {};
modules.forEach((m: IBuildModuleInfo) => {
modulesMap[m.id] = m;
});
let modulesGraph: IGraph = {};
const modulesGraph: IGraph = {};
modules.forEach((m: IBuildModuleInfo) => {
modulesGraph[m.id] = m.dependencies;
});
let sortedModules = topologicalSort(modulesGraph);
const sortedModules = topologicalSort(modulesGraph);
let result: IConcatFile[] = [];
let usedPlugins: IPluginMap = {};
let bundleData: IBundleData = {
const usedPlugins: IPluginMap = {};
const bundleData: IBundleData = {
graph: modulesGraph,
bundles: {}
};
Object.keys(entryPoints).forEach((moduleToBundle: string) => {
let info = entryPoints[moduleToBundle];
let rootNodes = [moduleToBundle].concat(info.include || []);
let allDependencies = visit(rootNodes, modulesGraph);
let excludes: string[] = ['require', 'exports', 'module'].concat(info.exclude || []);
const info = entryPoints[moduleToBundle];
const rootNodes = [moduleToBundle].concat(info.include || []);
const allDependencies = visit(rootNodes, modulesGraph);
const excludes: string[] = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach((excludeRoot: string) => {
let allExcludes = visit([excludeRoot], modulesGraph);
const allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach((exclude: string) => {
delete allDependencies[exclude];
});
});
let includedModules = sortedModules.filter((module: string) => {
const includedModules = sortedModules.filter((module: string) => {
return allDependencies[module];
});
bundleData.bundles[moduleToBundle] = includedModules;
let res = emitEntryPoint(
const res = emitEntryPoint(
modulesMap,
modulesGraph,
moduleToBundle,
includedModules,
info.prepend,
info.append,
info.prepend || [],
info.append || [],
info.dest
);
result = result.concat(res.files);
for (let pluginName in res.usedPlugins) {
for (const pluginName in res.usedPlugins) {
usedPlugins[pluginName] = usedPlugins[pluginName] || res.usedPlugins[pluginName];
}
});
Object.keys(usedPlugins).forEach((pluginName: string) => {
let plugin = usedPlugins[pluginName];
const plugin = usedPlugins[pluginName];
if (typeof plugin.finishBuild === 'function') {
let write = (filename: string, contents: string) => {
const write = (filename: string, contents: string) => {
result.push({
dest: filename,
sources: [{
@@ -237,16 +241,16 @@ function emitEntryPoints(modules: IBuildModuleInfo[], entryPoints: IEntryPointMa
}
function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
let parseDefineCall = (moduleMatch: string, depsMatch: string) => {
let module = moduleMatch.replace(/^"|"$/g, '');
const parseDefineCall = (moduleMatch: string, depsMatch: string) => {
const module = moduleMatch.replace(/^"|"$/g, '');
let deps = depsMatch.split(',');
deps = deps.map((dep) => {
dep = dep.trim();
dep = dep.replace(/^"|"$/g, '');
dep = dep.replace(/^'|'$/g, '');
let prefix: string = null;
let _path: string = null;
let pieces = dep.split('!');
let prefix: string | null = null;
let _path: string | null = null;
const pieces = dep.split('!');
if (pieces.length > 1) {
prefix = pieces[0] + '!';
_path = pieces[1];
@@ -256,7 +260,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
}
if (/^\.\//.test(_path) || /^\.\.\//.test(_path)) {
let res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
const res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
return prefix + res;
}
return prefix + _path;
@@ -267,7 +271,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
};
};
destFiles.forEach((destFile, index) => {
destFiles.forEach((destFile) => {
if (!/\.js$/.test(destFile.dest)) {
return;
}
@@ -276,33 +280,33 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
}
// Do one pass to record the usage counts for each module id
let useCounts: { [moduleId: string]: number; } = {};
const useCounts: { [moduleId: string]: number; } = {};
destFile.sources.forEach((source) => {
let matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
const matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
if (!matches) {
return;
}
let defineCall = parseDefineCall(matches[1], matches[2]);
const defineCall = parseDefineCall(matches[1], matches[2]);
useCounts[defineCall.module] = (useCounts[defineCall.module] || 0) + 1;
defineCall.deps.forEach((dep) => {
useCounts[dep] = (useCounts[dep] || 0) + 1;
});
});
let sortedByUseModules = Object.keys(useCounts);
const sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort((a, b) => {
return useCounts[b] - useCounts[a];
});
let replacementMap: { [moduleId: string]: number; } = {};
const replacementMap: { [moduleId: string]: number; } = {};
sortedByUseModules.forEach((module, index) => {
replacementMap[module] = index;
});
destFile.sources.forEach((source) => {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, (_, moduleMatch, depsMatch) => {
let defineCall = parseDefineCall(moduleMatch, depsMatch);
const defineCall = parseDefineCall(moduleMatch, depsMatch);
return `define(__m[${replacementMap[defineCall.module]}/*${defineCall.module}*/], __M([${defineCall.deps.map(dep => replacementMap[dep] + '/*' + dep + '*/').join(',')}])`;
});
});
@@ -332,7 +336,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
function removeDuplicateTSBoilerplate(destFiles: IConcatFile[]): IConcatFile[] {
// Taken from typescript compiler => emitFiles
let BOILERPLATE = [
const BOILERPLATE = [
{ start: /^var __extends/, end: /^}\)\(\);$/ },
{ start: /^var __assign/, end: /^};$/ },
{ start: /^var __decorate/, end: /^};$/ },
@@ -343,22 +347,22 @@ function removeDuplicateTSBoilerplate(destFiles: IConcatFile[]): IConcatFile[] {
];
destFiles.forEach((destFile) => {
let SEEN_BOILERPLATE = [];
const SEEN_BOILERPLATE: boolean[] = [];
destFile.sources.forEach((source) => {
let lines = source.contents.split(/\r\n|\n|\r/);
let newLines: string[] = [];
const lines = source.contents.split(/\r\n|\n|\r/);
const newLines: string[] = [];
let IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE: RegExp;
for (let i = 0; i < lines.length; i++) {
let line = lines[i];
const line = lines[i];
if (IS_REMOVING_BOILERPLATE) {
newLines.push('');
if (END_BOILERPLATE.test(line)) {
if (END_BOILERPLATE!.test(line)) {
IS_REMOVING_BOILERPLATE = false;
}
} else {
for (let j = 0; j < BOILERPLATE.length; j++) {
let boilerplate = BOILERPLATE[j];
const boilerplate = BOILERPLATE[j];
if (boilerplate.start.test(line)) {
if (SEEN_BOILERPLATE[j]) {
IS_REMOVING_BOILERPLATE = true;
@@ -398,19 +402,19 @@ function emitEntryPoint(
includedModules: string[],
prepend: string[],
append: string[],
dest: string
dest: string | undefined
): IEmitEntryPointResult {
if (!dest) {
dest = entryPoint + '.js';
}
let mainResult: IConcatFile = {
const mainResult: IConcatFile = {
sources: [],
dest: dest
},
results: IConcatFile[] = [mainResult];
let usedPlugins: IPluginMap = {};
let getLoaderPlugin = (pluginName: string): ILoaderPlugin => {
const usedPlugins: IPluginMap = {};
const getLoaderPlugin = (pluginName: string): ILoaderPlugin => {
if (!usedPlugins[pluginName]) {
usedPlugins[pluginName] = modulesMap[pluginName].exports;
}
@@ -418,39 +422,39 @@ function emitEntryPoint(
};
includedModules.forEach((c: string) => {
let bangIndex = c.indexOf('!');
const bangIndex = c.indexOf('!');
if (bangIndex >= 0) {
let pluginName = c.substr(0, bangIndex);
let plugin = getLoaderPlugin(pluginName);
const pluginName = c.substr(0, bangIndex);
const plugin = getLoaderPlugin(pluginName);
mainResult.sources.push(emitPlugin(entryPoint, plugin, pluginName, c.substr(bangIndex + 1)));
return;
}
let module = modulesMap[c];
const module = modulesMap[c];
if (module.path === 'empty:') {
return;
}
let contents = readFileAndRemoveBOM(module.path);
const contents = readFileAndRemoveBOM(module.path);
if (module.shim) {
mainResult.sources.push(emitShimmedModule(c, deps[c], module.shim, module.path, contents));
} else {
mainResult.sources.push(emitNamedModule(c, deps[c], module.defineLocation, module.path, contents));
mainResult.sources.push(emitNamedModule(c, module.defineLocation, module.path, contents));
}
});
Object.keys(usedPlugins).forEach((pluginName: string) => {
let plugin = usedPlugins[pluginName];
const plugin = usedPlugins[pluginName];
if (typeof plugin.writeFile === 'function') {
let req: ILoaderPluginReqFunc = <any>(() => {
const req: ILoaderPluginReqFunc = <any>(() => {
throw new Error('no-no!');
});
req.toUrl = something => something;
let write = (filename: string, contents: string) => {
const write = (filename: string, contents: string) => {
results.push({
dest: filename,
sources: [{
@@ -463,16 +467,16 @@ function emitEntryPoint(
}
});
let toIFile = (path): IFile => {
let contents = readFileAndRemoveBOM(path);
const toIFile = (path: string): IFile => {
const contents = readFileAndRemoveBOM(path);
return {
path: path,
contents: contents
};
};
let toPrepend = (prepend || []).map(toIFile);
let toAppend = (append || []).map(toIFile);
const toPrepend = (prepend || []).map(toIFile);
const toAppend = (append || []).map(toIFile);
mainResult.sources = toPrepend.concat(mainResult.sources).concat(toAppend);
@@ -483,8 +487,8 @@ function emitEntryPoint(
}
function readFileAndRemoveBOM(path: string): string {
var BOM_CHAR_CODE = 65279;
var contents = fs.readFileSync(path, 'utf8');
const BOM_CHAR_CODE = 65279;
let contents = fs.readFileSync(path, 'utf8');
// Remove BOM
if (contents.charCodeAt(0) === BOM_CHAR_CODE) {
contents = contents.substring(1);
@@ -495,7 +499,7 @@ function readFileAndRemoveBOM(path: string): string {
function emitPlugin(entryPoint: string, plugin: ILoaderPlugin, pluginName: string, moduleName: string): IFile {
let result = '';
if (typeof plugin.write === 'function') {
let write: ILoaderPluginWriteFunc = <any>((what) => {
const write: ILoaderPluginWriteFunc = <any>((what: string) => {
result += what;
});
write.getEntryPoint = () => {
@@ -513,15 +517,15 @@ function emitPlugin(entryPoint: string, plugin: ILoaderPlugin, pluginName: strin
};
}
function emitNamedModule(moduleId: string, myDeps: string[], defineCallPosition: IPosition, path: string, contents: string): IFile {
function emitNamedModule(moduleId: string, defineCallPosition: IPosition, path: string, contents: string): IFile {
// `defineCallPosition` is the position in code: |define()
let defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
const defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
// `parensOffset` is the position in code: define|()
let parensOffset = contents.indexOf('(', defineCallOffset);
const parensOffset = contents.indexOf('(', defineCallOffset);
let insertStr = '"' + moduleId + '", ';
const insertStr = '"' + moduleId + '", ';
return {
path: path,
@@ -530,8 +534,8 @@ function emitNamedModule(moduleId: string, myDeps: string[], defineCallPosition:
}
function emitShimmedModule(moduleId: string, myDeps: string[], factory: string, path: string, contents: string): IFile {
let strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
let strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
const strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
const strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
return {
path: path,
contents: contents + '\n;\n' + strDefine
@@ -546,8 +550,8 @@ function positionToOffset(str: string, desiredLine: number, desiredCol: number):
return desiredCol - 1;
}
let line = 1,
lastNewLineOffset = -1;
let line = 1;
let lastNewLineOffset = -1;
do {
if (desiredLine === line) {
@@ -565,16 +569,16 @@ function positionToOffset(str: string, desiredLine: number, desiredCol: number):
* Return a set of reachable nodes in `graph` starting from `rootNodes`
*/
function visit(rootNodes: string[], graph: IGraph): INodeSet {
let result: INodeSet = {},
queue = rootNodes;
const result: INodeSet = {};
const queue = rootNodes;
rootNodes.forEach((node) => {
result[node] = true;
});
while (queue.length > 0) {
let el = queue.shift();
let myEdges = graph[el] || [];
const el = queue.shift();
const myEdges = graph[el!] || [];
myEdges.forEach((toNode) => {
if (!result[toNode]) {
result[toNode] = true;
@@ -591,7 +595,7 @@ function visit(rootNodes: string[], graph: IGraph): INodeSet {
*/
function topologicalSort(graph: IGraph): string[] {
let allNodes: INodeSet = {},
const allNodes: INodeSet = {},
outgoingEdgeCount: { [node: string]: number; } = {},
inverseEdges: IGraph = {};
@@ -609,7 +613,7 @@ function topologicalSort(graph: IGraph): string[] {
});
// https://en.wikipedia.org/wiki/Topological_sorting
let S: string[] = [],
const S: string[] = [],
L: string[] = [];
Object.keys(allNodes).forEach((node: string) => {
@@ -623,10 +627,10 @@ function topologicalSort(graph: IGraph): string[] {
// Ensure the exact same order all the time with the same inputs
S.sort();
let n: string = S.shift();
const n: string = S.shift()!;
L.push(n);
let myInverseEdges = inverseEdges[n] || [];
const myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach((m: string) => {
outgoingEdgeCount[m]--;
if (outgoingEdgeCount[m] === 0) {
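
The typed bundle() entry point now treats prepend/append/dest as optional and only fills in the vs/nls and vs/css build paths when the caller has not set them. A hedged sketch of invoking it from a plain-JS build script; the entry-point name, loader config, and output handling are illustrative (the real lists live in the product's build files):

const bundle = require('./lib/bundle');

// Illustrative entry point: bundle one AMD module and everything it pulls in,
// except the loader plugins, which stay external.
const entryPoints = [
    { name: 'vs/workbench/example.main', exclude: ['vs/css', 'vs/nls'] }
];
const loaderConfig = { baseUrl: 'out-build', paths: {} }; // minimal, assumed config

bundle.bundle(entryPoints, loaderConfig, (err, result) => {
    if (err) { throw err; }
    // result.files is a list of { dest, sources: [{ path, contents }, ...] } entries.
    console.log(result.files.map(f => f.dest));
});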

View File

@@ -4,44 +4,54 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var gulp = require("gulp");
var tsb = require("gulp-tsb");
var es = require("event-stream");
var watch = require('./watch');
var nls = require("./nls");
var util = require("./util");
var reporter_1 = require("./reporter");
var path = require("path");
var bom = require("gulp-bom");
var sourcemaps = require("gulp-sourcemaps");
var _ = require("underscore");
var monacodts = require("../monaco/api");
var fs = require("fs");
var reporter = reporter_1.createReporter();
const es = require("event-stream");
const fs = require("fs");
const gulp = require("gulp");
const bom = require("gulp-bom");
const sourcemaps = require("gulp-sourcemaps");
const tsb = require("gulp-tsb");
const path = require("path");
const _ = require("underscore");
const monacodts = require("../monaco/api");
const nls = require("./nls");
const reporter_1 = require("./reporter");
const util = require("./util");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const watch = require('./watch');
const reporter = reporter_1.createReporter();
function getTypeScriptCompilerOptions(src) {
var rootDir = path.join(__dirname, "../../" + src);
var options = require("../../" + src + "/tsconfig.json").compilerOptions;
const rootDir = path.join(__dirname, `../../${src}`);
const tsconfig = require(`../../${src}/tsconfig.json`);
let options;
if (tsconfig.extends) {
options = Object.assign({}, require(path.join(rootDir, tsconfig.extends)).compilerOptions, tsconfig.compilerOptions);
}
else {
options = tsconfig.compilerOptions;
}
options.verbose = false;
options.sourceMap = true;
if (process.env['VSCODE_NO_SOURCEMAP']) { // To be used by developers in a hurry
options.sourceMap = false;
}
options.rootDir = rootDir;
options.baseUrl = rootDir;
options.sourceRoot = util.toFileUri(rootDir);
options.newLine = /\r\n/.test(fs.readFileSync(__filename, 'utf8')) ? 'CRLF' : 'LF';
return options;
}
function createCompile(src, build, emitError) {
var opts = _.clone(getTypeScriptCompilerOptions(src));
const opts = _.clone(getTypeScriptCompilerOptions(src));
opts.inlineSources = !!build;
opts.noFilesystemLookup = true;
var ts = tsb.create(opts, null, null, function (err) { return reporter(err.toString()); });
const ts = tsb.create(opts, true, undefined, err => reporter(err.toString()));
return function (token) {
var utf8Filter = util.filter(function (data) { return /(\/|\\)test(\/|\\).*utf8/.test(data.path); });
var tsFilter = util.filter(function (data) { return /\.ts$/.test(data.path); });
var noDeclarationsFilter = util.filter(function (data) { return !(/\.d\.ts$/.test(data.path)); });
var input = es.through();
var output = input
const utf8Filter = util.filter(data => /(\/|\\)test(\/|\\).*utf8/.test(data.path));
const tsFilter = util.filter(data => /\.ts$/.test(data.path));
const noDeclarationsFilter = util.filter(data => !(/\.d\.ts$/.test(data.path)));
const input = es.through();
const output = input
.pipe(utf8Filter)
.pipe(bom())
.pipe(utf8Filter.restore)
@@ -57,91 +67,136 @@ function createCompile(src, build, emitError) {
sourceRoot: opts.sourceRoot
}))
.pipe(tsFilter.restore)
.pipe(reporter.end(emitError));
.pipe(reporter.end(!!emitError));
return es.duplex(input, output);
};
}
const typesDts = [
'node_modules/typescript/lib/*.d.ts',
'node_modules/@types/**/*.d.ts',
'!node_modules/@types/webpack/**/*',
'!node_modules/@types/uglify-js/**/*',
];
function compileTask(src, out, build) {
return function () {
var compile = createCompile(src, build, true);
var srcPipe = es.merge(gulp.src(src + "/**", { base: "" + src }), gulp.src('node_modules/typescript/lib/lib.d.ts'));
// Do not write .d.ts files to disk, as they are not needed there.
var dtsFilter = util.filter(function (data) { return !/\.d\.ts$/.test(data.path); });
const compile = createCompile(src, build, true);
const srcPipe = es.merge(gulp.src(`${src}/**`, { base: `${src}` }), gulp.src(typesDts));
let generator = new MonacoGenerator(false);
if (src === 'src') {
generator.execute();
}
return srcPipe
.pipe(generator.stream)
.pipe(compile())
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(src !== 'src' ? es.through() : monacodtsTask(out, false));
.pipe(gulp.dest(out));
};
}
exports.compileTask = compileTask;
function watchTask(out, build) {
return function () {
var compile = createCompile('src', build);
var src = es.merge(gulp.src('src/**', { base: 'src' }), gulp.src('node_modules/typescript/lib/lib.d.ts'));
var watchSrc = watch('src/**', { base: 'src' });
// Do not write .d.ts files to disk, as they are not needed there.
var dtsFilter = util.filter(function (data) { return !/\.d\.ts$/.test(data.path); });
const compile = createCompile('src', build);
const src = es.merge(gulp.src('src/**', { base: 'src' }), gulp.src(typesDts));
const watchSrc = watch('src/**', { base: 'src' });
let generator = new MonacoGenerator(true);
generator.execute();
return watchSrc
.pipe(generator.stream)
.pipe(util.incremental(compile, src, true))
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(monacodtsTask(out, true));
.pipe(gulp.dest(out));
};
}
exports.watchTask = watchTask;
function monacodtsTask(out, isWatch) {
var basePath = path.resolve(process.cwd(), out);
var neededFiles = {};
monacodts.getFilesToWatch(out).forEach(function (filePath) {
filePath = path.normalize(filePath);
neededFiles[filePath] = true;
});
var inputFiles = {};
for (var filePath in neededFiles) {
if (/\bsrc(\/|\\)vs\b/.test(filePath)) {
// This file is needed from source => simply read it now
inputFiles[filePath] = fs.readFileSync(filePath).toString();
const REPO_SRC_FOLDER = path.join(__dirname, '../../src');
class MonacoGenerator {
constructor(isWatch) {
this._executeSoonTimer = null;
this._isWatch = isWatch;
this.stream = es.through();
this._watchers = [];
this._watchedFiles = {};
let onWillReadFile = (moduleId, filePath) => {
if (!this._isWatch) {
return;
}
if (this._watchedFiles[filePath]) {
return;
}
this._watchedFiles[filePath] = true;
const watcher = fs.watch(filePath);
watcher.addListener('change', () => {
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
watcher.addListener('error', (err) => {
console.error(`Encountered error while watching ${filePath}.`);
console.log(err);
delete this._watchedFiles[filePath];
for (let i = 0; i < this._watchers.length; i++) {
if (this._watchers[i] === watcher) {
this._watchers.splice(i, 1);
break;
}
}
watcher.close();
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
this._watchers.push(watcher);
};
this._fsProvider = new class extends monacodts.FSProvider {
readFileSync(moduleId, filePath) {
onWillReadFile(moduleId, filePath);
return super.readFileSync(moduleId, filePath);
}
};
this._declarationResolver = new monacodts.DeclarationResolver(this._fsProvider);
if (this._isWatch) {
const recipeWatcher = fs.watch(monacodts.RECIPE_PATH);
recipeWatcher.addListener('change', () => {
this._executeSoon();
});
this._watchers.push(recipeWatcher);
}
}
var setInputFile = function (filePath, contents) {
if (inputFiles[filePath] === contents) {
// no change
_executeSoon() {
if (this._executeSoonTimer !== null) {
clearTimeout(this._executeSoonTimer);
this._executeSoonTimer = null;
}
this._executeSoonTimer = setTimeout(() => {
this._executeSoonTimer = null;
this.execute();
}, 20);
}
dispose() {
this._watchers.forEach(watcher => watcher.close());
}
_run() {
let r = monacodts.run3(this._declarationResolver);
if (!r && !this._isWatch) {
// The build must always be able to generate the monaco.d.ts
throw new Error(`monaco.d.ts generation error - Cannot continue`);
}
return r;
}
_log(message, ...rest) {
fancyLog(ansiColors.cyan('[monaco.d.ts]'), message, ...rest);
}
execute() {
const startTime = Date.now();
const result = this._run();
if (!result) {
// nothing really changed
return;
}
inputFiles[filePath] = contents;
var neededInputFilesCount = Object.keys(neededFiles).length;
var availableInputFilesCount = Object.keys(inputFiles).length;
if (neededInputFilesCount === availableInputFilesCount) {
run();
if (result.isTheSame) {
return;
}
};
var run = function () {
var result = monacodts.run(out, inputFiles);
if (!result.isTheSame) {
if (isWatch) {
fs.writeFileSync(result.filePath, result.content);
}
else {
fs.writeFileSync(result.filePath, result.content);
resultStream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
fs.writeFileSync(result.filePath, result.content);
fs.writeFileSync(path.join(REPO_SRC_FOLDER, 'vs/editor/common/standalone/standaloneEnums.ts'), result.enums);
this._log(`monaco.d.ts is changed - total time took ${Date.now() - startTime} ms`);
if (!this._isWatch) {
this.stream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
};
var resultStream;
if (isWatch) {
watch('build/monaco/*').pipe(es.through(function () {
run();
}));
}
resultStream = es.through(function (data) {
var filePath = path.normalize(path.resolve(basePath, data.relative));
if (neededFiles[filePath]) {
setInputFile(filePath, data.contents.toString());
}
this.emit('data', data);
});
return resultStream;
}
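The MonacoGenerator above regenerates monaco.d.ts in response to file-watch events, but coalesces bursts of changes by restarting a short timer on every event (_executeSoon). A minimal standalone sketch of that debounce pattern, assuming an illustrative regenerate callback and watched path that are not part of the build code:

import * as fs from 'fs';

let pending: NodeJS.Timer | null = null;

// Restart the timer on every call so a burst of change events collapses
// into a single regeneration, mirroring _executeSoon above.
function scheduleSoon(regenerate: () => void, delayMs: number = 20): void {
    if (pending !== null) {
        clearTimeout(pending);
    }
    pending = setTimeout(() => {
        pending = null;
        regenerate();
    }, delayMs);
}

// Illustrative usage ('src/some/file.ts' is a made-up path):
// fs.watch('src/some/file.ts').addListener('change', () => scheduleSoon(() => { /* regenerate */ }));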


@@ -5,31 +5,41 @@
'use strict';
import * as gulp from 'gulp';
import * as tsb from 'gulp-tsb';
import * as es from 'event-stream';
const watch = require('./watch');
import * as nls from './nls';
import * as util from './util';
import { createReporter } from './reporter';
import * as path from 'path';
import * as fs from 'fs';
import * as gulp from 'gulp';
import * as bom from 'gulp-bom';
import * as sourcemaps from 'gulp-sourcemaps';
import * as tsb from 'gulp-tsb';
import * as path from 'path';
import * as _ from 'underscore';
import * as monacodts from '../monaco/api';
import * as fs from 'fs';
import * as nls from './nls';
import { createReporter } from './reporter';
import * as util from './util';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
const watch = require('./watch');
const reporter = createReporter();
function getTypeScriptCompilerOptions(src: string) {
const rootDir = path.join(__dirname, `../../${src}`);
const options = require(`../../${src}/tsconfig.json`).compilerOptions;
const tsconfig = require(`../../${src}/tsconfig.json`);
let options: { [key: string]: any };
if (tsconfig.extends) {
options = Object.assign({}, require(path.join(rootDir, tsconfig.extends)).compilerOptions, tsconfig.compilerOptions);
} else {
options = tsconfig.compilerOptions;
}
options.verbose = false;
options.sourceMap = true;
if (process.env['VSCODE_NO_SOURCEMAP']) { // To be used by developers in a hurry
options.sourceMap = false;
}
options.rootDir = rootDir;
options.baseUrl = rootDir;
options.sourceRoot = util.toFileUri(rootDir);
options.newLine = /\r\n/.test(fs.readFileSync(__filename, 'utf8')) ? 'CRLF' : 'LF';
return options;
@@ -40,7 +50,7 @@ function createCompile(src: string, build: boolean, emitError?: boolean): (token
opts.inlineSources = !!build;
opts.noFilesystemLookup = true;
const ts = tsb.create(opts, null, null, err => reporter(err.toString()));
const ts = tsb.create(opts, true, undefined, err => reporter(err.toString()));
return function (token?: util.ICancellationToken) {
@@ -65,12 +75,19 @@ function createCompile(src: string, build: boolean, emitError?: boolean): (token
sourceRoot: opts.sourceRoot
}))
.pipe(tsFilter.restore)
.pipe(reporter.end(emitError));
.pipe(reporter.end(!!emitError));
return es.duplex(input, output);
};
}
const typesDts = [
'node_modules/typescript/lib/*.d.ts',
'node_modules/@types/**/*.d.ts',
'!node_modules/@types/webpack/**/*',
'!node_modules/@types/uglify-js/**/*',
];
export function compileTask(src: string, out: string, build: boolean): () => NodeJS.ReadWriteStream {
return function () {
@@ -78,18 +95,18 @@ export function compileTask(src: string, out: string, build: boolean): () => Nod
const srcPipe = es.merge(
gulp.src(`${src}/**`, { base: `${src}` }),
gulp.src('node_modules/typescript/lib/lib.d.ts'),
gulp.src(typesDts),
);
// Do not write .d.ts files to disk, as they are not needed there.
const dtsFilter = util.filter(data => !/\.d\.ts$/.test(data.path));
let generator = new MonacoGenerator(false);
if (src === 'src') {
generator.execute();
}
return srcPipe
.pipe(generator.stream)
.pipe(compile())
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(src !== 'src' ? es.through() : monacodtsTask(out, false));
.pipe(gulp.dest(out));
};
}
@@ -100,80 +117,128 @@ export function watchTask(out: string, build: boolean): () => NodeJS.ReadWriteSt
const src = es.merge(
gulp.src('src/**', { base: 'src' }),
gulp.src('node_modules/typescript/lib/lib.d.ts'),
gulp.src(typesDts),
);
const watchSrc = watch('src/**', { base: 'src' });
// Do not write .d.ts files to disk, as they are not needed there.
const dtsFilter = util.filter(data => !/\.d\.ts$/.test(data.path));
let generator = new MonacoGenerator(true);
generator.execute();
return watchSrc
.pipe(generator.stream)
.pipe(util.incremental(compile, src, true))
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(monacodtsTask(out, true));
.pipe(gulp.dest(out));
};
}
function monacodtsTask(out: string, isWatch: boolean): NodeJS.ReadWriteStream {
const REPO_SRC_FOLDER = path.join(__dirname, '../../src');
const basePath = path.resolve(process.cwd(), out);
class MonacoGenerator {
private readonly _isWatch: boolean;
public readonly stream: NodeJS.ReadWriteStream;
const neededFiles: { [file: string]: boolean; } = {};
monacodts.getFilesToWatch(out).forEach(function (filePath) {
filePath = path.normalize(filePath);
neededFiles[filePath] = true;
});
private readonly _watchers: fs.FSWatcher[];
private readonly _watchedFiles: { [filePath: string]: boolean; };
private readonly _fsProvider: monacodts.FSProvider;
private readonly _declarationResolver: monacodts.DeclarationResolver;
const inputFiles: { [file: string]: string; } = {};
for (let filePath in neededFiles) {
if (/\bsrc(\/|\\)vs\b/.test(filePath)) {
// This file is needed from source => simply read it now
inputFiles[filePath] = fs.readFileSync(filePath).toString();
constructor(isWatch: boolean) {
this._isWatch = isWatch;
this.stream = es.through();
this._watchers = [];
this._watchedFiles = {};
let onWillReadFile = (moduleId: string, filePath: string) => {
if (!this._isWatch) {
return;
}
if (this._watchedFiles[filePath]) {
return;
}
this._watchedFiles[filePath] = true;
const watcher = fs.watch(filePath);
watcher.addListener('change', () => {
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
watcher.addListener('error', (err) => {
console.error(`Encountered error while watching ${filePath}.`);
console.log(err);
delete this._watchedFiles[filePath];
for (let i = 0; i < this._watchers.length; i++) {
if (this._watchers[i] === watcher) {
this._watchers.splice(i, 1);
break;
}
}
watcher.close();
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
this._watchers.push(watcher);
};
this._fsProvider = new class extends monacodts.FSProvider {
public readFileSync(moduleId: string, filePath: string): Buffer {
onWillReadFile(moduleId, filePath);
return super.readFileSync(moduleId, filePath);
}
};
this._declarationResolver = new monacodts.DeclarationResolver(this._fsProvider);
if (this._isWatch) {
const recipeWatcher = fs.watch(monacodts.RECIPE_PATH);
recipeWatcher.addListener('change', () => {
this._executeSoon();
});
this._watchers.push(recipeWatcher);
}
}
const setInputFile = (filePath: string, contents: string) => {
if (inputFiles[filePath] === contents) {
// no change
private _executeSoonTimer: NodeJS.Timer | null = null;
private _executeSoon(): void {
if (this._executeSoonTimer !== null) {
clearTimeout(this._executeSoonTimer);
this._executeSoonTimer = null;
}
this._executeSoonTimer = setTimeout(() => {
this._executeSoonTimer = null;
this.execute();
}, 20);
}
public dispose(): void {
this._watchers.forEach(watcher => watcher.close());
}
private _run(): monacodts.IMonacoDeclarationResult | null {
let r = monacodts.run3(this._declarationResolver);
if (!r && !this._isWatch) {
// The build must always be able to generate the monaco.d.ts
throw new Error(`monaco.d.ts generation error - Cannot continue`);
}
return r;
}
private _log(message: any, ...rest: any[]): void {
fancyLog(ansiColors.cyan('[monaco.d.ts]'), message, ...rest);
}
public execute(): void {
const startTime = Date.now();
const result = this._run();
if (!result) {
// nothing really changed
return;
}
inputFiles[filePath] = contents;
const neededInputFilesCount = Object.keys(neededFiles).length;
const availableInputFilesCount = Object.keys(inputFiles).length;
if (neededInputFilesCount === availableInputFilesCount) {
run();
if (result.isTheSame) {
return;
}
};
const run = () => {
const result = monacodts.run(out, inputFiles);
if (!result.isTheSame) {
if (isWatch) {
fs.writeFileSync(result.filePath, result.content);
} else {
fs.writeFileSync(result.filePath, result.content);
resultStream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
fs.writeFileSync(result.filePath, result.content);
fs.writeFileSync(path.join(REPO_SRC_FOLDER, 'vs/editor/common/standalone/standaloneEnums.ts'), result.enums);
this._log(`monaco.d.ts is changed - total time took ${Date.now() - startTime} ms`);
if (!this._isWatch) {
this.stream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
};
let resultStream: NodeJS.ReadWriteStream;
if (isWatch) {
watch('build/monaco/*').pipe(es.through(function () {
run();
}));
}
resultStream = es.through(function (data) {
const filePath = path.normalize(path.resolve(basePath, data.relative));
if (neededFiles[filePath]) {
setInputFile(filePath, data.contents.toString());
}
this.emit('data', data);
});
return resultStream;
}
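The extends support added to getTypeScriptCompilerOptions is only a shallow merge of compilerOptions, with the child config winning; tsc itself resolves extends more thoroughly, but the build script only needs this one level. A minimal sketch under that assumption, with hypothetical paths:

import * as path from 'path';

interface TsConfig {
    extends?: string;
    compilerOptions?: { [key: string]: any };
}

// Shallow-merge compilerOptions from the base config referenced by "extends";
// properties set in the child tsconfig take precedence.
function resolveCompilerOptions(rootDir: string, tsconfig: TsConfig): { [key: string]: any } {
    if (tsconfig.extends) {
        const base: TsConfig = require(path.join(rootDir, tsconfig.extends));
        return Object.assign({}, base.compilerOptions, tsconfig.compilerOptions);
    }
    return tsconfig.compilerOptions || {};
}

// Hypothetical usage:
// const rootDir = path.join(__dirname, '../../src');
// resolveCompilerOptions(rootDir, require(path.join(rootDir, 'tsconfig.json')));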


@@ -11,6 +11,7 @@ const root = path.dirname(path.dirname(__dirname));
function getElectronVersion() {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
// @ts-ignore
const target = /^target "(.*)"$/m.exec(yarnrc)[1];
return target;
@@ -19,6 +20,7 @@ function getElectronVersion() {
module.exports.getElectronVersion = getElectronVersion;
// returns 0 if the right version of electron is in .build/electron
// @ts-ignore
if (require.main === module) {
const version = getElectronVersion();
const versionFile = path.join(root, '.build', 'electron', 'version');
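getElectronVersion pulls the Electron version out of the repository's .yarnrc by matching its target line. A small re-statement of that extraction with an illustrative .yarnrc string (the content and version number are made up):

// Illustrative only: the regex is the same as above, the yarnrc content is invented.
const yarnrc = 'disturl "https://atom.io/download/electron"\ntarget "4.2.5"\n';
const match = /^target "(.*)"$/m.exec(yarnrc);
const electronVersion = match ? match[1] : undefined; // "4.2.5" for this input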


@@ -4,116 +4,330 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var es = require("event-stream");
var assign = require("object-assign");
var remote = require("gulp-remote-src");
var flatmap = require('gulp-flatmap');
var vzip = require('gulp-vinyl-zip');
var filter = require('gulp-filter');
var rename = require('gulp-rename');
var util = require('gulp-util');
var buffer = require('gulp-buffer');
var json = require('gulp-json-editor');
var fs = require("fs");
var path = require("path");
var vsce = require("vsce");
var File = require("vinyl");
function fromLocal(extensionPath) {
var result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
.then(function (fileNames) {
var files = fileNames
.map(function (fileName) { return path.join(extensionPath, fileName); })
.map(function (filePath) { return new File({
const es = require("event-stream");
const fs = require("fs");
const glob = require("glob");
const gulp = require("gulp");
const path = require("path");
const File = require("vinyl");
const vsce = require("vsce");
const stats_1 = require("./stats");
const util2 = require("./util");
const remote = require("gulp-remote-src");
const vzip = require('gulp-vinyl-zip');
const filter = require("gulp-filter");
const rename = require("gulp-rename");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const buffer = require('gulp-buffer');
const json = require("gulp-json-editor");
const webpack = require('webpack');
const webpackGulp = require('webpack-stream');
const root = path.resolve(path.join(__dirname, '..', '..'));
// {{SQL CARBON EDIT}}
const _ = require("underscore");
const vfs = require("vinyl-fs");
const deps = require('../dependencies');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
const visxDirectory = path.join(path.dirname(root), 'vsix');
try {
if (!fs.existsSync(visxDirectory)) {
fs.mkdirSync(visxDirectory);
}
}
catch (err) {
// don't fail the build if the output directory already exists
console.warn(err);
}
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
let pkgJson = JSON.parse(fs.readFileSync(path.join(element.path, 'package.json'), { encoding: 'utf8' }));
const packagePath = path.join(visxDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
exports.packageBuiltInExtensions = packageBuiltInExtensions;
function packageExtensionTask(extensionName, platform, arch) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', extensionName);
}
else {
destination = path.join(destination, 'resources', 'app', 'extensions', extensionName);
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '../..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => extensionName === name);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util2.skipDirectories())
.pipe(util2.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
exports.packageExtensionTask = packageExtensionTask;
// {{SQL CARBON EDIT}} - End
function fromLocal(extensionPath, sourceMappingURLBase) {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
if (fs.existsSync(webpackFilename)) {
return fromLocalWebpack(extensionPath, sourceMappingURLBase);
}
else {
return fromLocalNormal(extensionPath);
}
}
function fromLocalWebpack(extensionPath, sourceMappingURLBase) {
const result = es.through();
const packagedDependencies = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
}
}
}
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn, packagedDependencies }).then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath)
}); });
}));
const filesStream = es.readArray(files);
// check for webpack configuration files, then invoke webpack
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = glob.sync(path.join(extensionPath, '/**/extension.webpack.config.js'), { ignore: ['**/node_modules'] });
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json's next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => () => {
const webpackDone = (err, stats) => {
fancyLog(`Bundled extension: ${ansiColors.yellow(path.join(path.basename(extensionPath), path.relative(extensionPath, webpackConfigPath)))}...`);
if (err) {
result.emit('error', err);
}
const { compilation } = stats;
if (compilation.errors.length > 0) {
result.emit('error', compilation.errors.join('\n'));
}
if (compilation.warnings.length > 0) {
result.emit('error', compilation.warnings.join('\n'));
}
};
const webpackConfig = Object.assign({}, require(webpackConfigPath), { mode: 'production' });
const relativeOutputPath = path.relative(extensionPath, webpackConfig.output.path);
return webpackGulp(webpackConfig, webpack, webpackDone)
.pipe(es.through(function (data) {
data.stat = data.stat || {};
data.base = extensionPath;
this.emit('data', data);
}))
.pipe(es.through(function (data) {
// source map handling:
// * rewrite sourceMappingURL
// * save to disk so that upload-task picks this up
if (sourceMappingURLBase) {
const contents = data.contents.toString('utf8');
data.contents = Buffer.from(contents.replace(/\n\/\/# sourceMappingURL=(.*)$/gm, function (_m, g1) {
return `\n//# sourceMappingURL=${sourceMappingURLBase}/extensions/${path.basename(extensionPath)}/${relativeOutputPath}/${g1}`;
}), 'utf8');
if (/\.js\.map$/.test(data.path)) {
if (!fs.existsSync(path.dirname(data.path))) {
fs.mkdirSync(path.dirname(data.path));
}
fs.writeFileSync(data.path, data.contents);
}
}
this.emit('data', data);
}));
});
es.merge(sequence(webpackStreams), patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
// this.emit('data', data);
// }))
.pipe(result);
}).catch(err => {
console.error(extensionPath);
console.error(packagedDependencies);
result.emit('error', err);
});
return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
}
function fromLocalNormal(extensionPath) {
const result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
.then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath)
}));
es.readArray(files).pipe(result);
})
.catch(function (err) { return result.emit('error', err); });
return result;
.catch(err => result.emit('error', err));
return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
}
exports.fromLocal = fromLocal;
function error(err) {
var result = es.through();
setTimeout(function () { return result.emit('error', err); });
return result;
}
var baseHeaders = {
const baseHeaders = {
'X-Market-Client-Id': 'VSCode Build',
'User-Agent': 'VSCode Build',
'X-Market-User-Id': '291C1CD0-051A-4123-9B4B-30D60EF52EE2',
};
function fromMarketplace(extensionName, version) {
var filterType = 7;
var value = extensionName;
var criterium = { filterType: filterType, value: value };
var criteria = [criterium];
var pageNumber = 1;
var pageSize = 1;
var sortBy = 0;
var sortOrder = 0;
var flags = 0x1 | 0x2 | 0x80;
var assetTypes = ['Microsoft.VisualStudio.Services.VSIXPackage'];
var filters = [{ criteria: criteria, pageNumber: pageNumber, pageSize: pageSize, sortBy: sortBy, sortOrder: sortOrder }];
var body = JSON.stringify({ filters: filters, assetTypes: assetTypes, flags: flags });
var headers = assign({}, baseHeaders, {
'Content-Type': 'application/json',
'Accept': 'application/json;api-version=3.0-preview.1',
'Content-Length': body.length
});
var options = {
base: 'https://marketplace.visualstudio.com/_apis/public/gallery',
function fromMarketplace(extensionName, version, metadata) {
const [publisher, name] = extensionName.split('.');
const url = `https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${publisher}/vsextensions/${name}/${version}/vspackage`;
fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: url,
requestOptions: {
method: 'POST',
gzip: true,
headers: headers,
body: body
headers: baseHeaders
}
};
return remote('/extensionquery', options)
.pipe(flatmap(function (stream, f) {
var rawResult = f.contents.toString('utf8');
var result = JSON.parse(rawResult);
var extension = result.results[0].extensions[0];
if (!extension) {
return error("No such extension: " + extension);
}
var metadata = {
id: extension.extensionId,
publisherId: extension.publisher,
publisherDisplayName: extension.publisher.displayName
};
var extensionVersion = extension.versions.filter(function (v) { return v.version === version; })[0];
if (!extensionVersion) {
return error("No such extension version: " + extensionName + " @ " + version);
}
var asset = extensionVersion.files.filter(function (f) { return f.assetType === 'Microsoft.VisualStudio.Services.VSIXPackage'; })[0];
if (!asset) {
return error("No VSIX found for extension version: " + extensionName + " @ " + version);
}
util.log('Downloading extension:', util.colors.yellow(extensionName + "@" + version), '...');
var options = {
base: asset.source,
requestOptions: {
gzip: true,
headers: baseHeaders
}
};
return remote('', options)
.pipe(flatmap(function (stream) {
var packageJsonFilter = filter('package.json', { restore: true });
return stream
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(function (p) { return p.dirname = p.dirname.replace(/^extension\/?/, ''); }))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}));
}));
const packageJsonFilter = filter('package.json', { restore: true });
return remote('', options)
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}
exports.fromMarketplace = fromMarketplace;
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'vscode-test-resolver',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
// {{SQL CARBON EDIT}}
'integration-tests'
];
// {{SQL CARBON EDIT}}
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// the extension will be excluded from the SQLOps package and will have separate vsix packages
'admin-tool-ext-win',
'agent',
'import',
'profiler',
'admin-pack',
'big-data-cluster',
'dacpac',
'schema-compare',
'resource-deployment',
'cms'
];
const builtInExtensions = require('../builtInExtensions.json');
/**
* We're doing way too much stuff at once, with webpack et al. So much stuff
* that while downloading extensions from the marketplace, node js doesn't get enough
* stack frames to complete the download in under 2 minutes, at which point the
* marketplace server cuts off the http request. So, we sequentialize the extension tasks.
*/
function sequence(streamProviders) {
const result = es.through();
function pop() {
if (streamProviders.length === 0) {
result.emit('end');
}
else {
const fn = streamProviders.shift();
fn()
.on('end', function () { setTimeout(pop, 0); })
.pipe(result, { end: false });
}
}
pop();
return result;
}
function packageExtensionsStream(optsIn) {
const opts = optsIn || {};
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1);
const localExtensions = () => sequence([...localExtensionDescriptions.map(extension => () => {
return fromLocal(extension.path, opts.sourceMappingURLBase)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
})]);
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map((d) => path.relative(root, d.path)).map((d) => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = () => gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']));
// Original code commented out here
// const localExtensionDependencies = () => gulp.src('extensions/node_modules/**', { base: '.' });
// const marketplaceExtensions = () => es.merge(
// ...builtInExtensions
// .filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
// .map(extension => {
// return fromMarketplace(extension.name, extension.version, extension.metadata)
// .pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
// })
// );
return sequence([localExtensions, localExtensionDependencies,])
.pipe(util2.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
// {{SQL CARBON EDIT}} - End
}
exports.packageExtensionsStream = packageExtensionsStream;
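The rewritten fromMarketplace no longer POSTs a gallery query and walks the response for a VSIX asset; it splits the publisher.name identifier and downloads the package directly from the gallery's vspackage endpoint. A minimal sketch of just the URL construction (the example id and version are illustrative):

function vsixDownloadUrl(extensionName: string, version: string): string {
    // extensionName is expected in "publisher.name" form.
    const [publisher, name] = extensionName.split('.');
    return `https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${publisher}/vsextensions/${name}/${version}/vspackage`;
}

// Illustrative only:
// vsixDownloadUrl('ms-vscode.node-debug2', '1.33.0');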


@@ -4,22 +4,232 @@
*--------------------------------------------------------------------------------------------*/
import * as es from 'event-stream';
import { Stream } from 'stream';
import assign = require('object-assign');
import remote = require('gulp-remote-src');
const flatmap = require('gulp-flatmap');
const vzip = require('gulp-vinyl-zip');
const filter = require('gulp-filter');
const rename = require('gulp-rename');
const util = require('gulp-util');
const buffer = require('gulp-buffer');
const json = require('gulp-json-editor');
import * as fs from 'fs';
import * as glob from 'glob';
import * as gulp from 'gulp';
import * as path from 'path';
import * as vsce from 'vsce';
import { Stream } from 'stream';
import * as File from 'vinyl';
import * as vsce from 'vsce';
import { createStatsStream } from './stats';
import * as util2 from './util';
import remote = require('gulp-remote-src');
const vzip = require('gulp-vinyl-zip');
import filter = require('gulp-filter');
import rename = require('gulp-rename');
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
const buffer = require('gulp-buffer');
import json = require('gulp-json-editor');
const webpack = require('webpack');
const webpackGulp = require('webpack-stream');
export function fromLocal(extensionPath: string): Stream {
const root = path.resolve(path.join(__dirname, '..', '..'));
// {{SQL CARBON EDIT}}
import * as _ from 'underscore';
import * as vfs from 'vinyl-fs';
const deps = require('../dependencies');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
export function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
const visxDirectory = path.join(path.dirname(root), 'vsix');
try {
if (!fs.existsSync(visxDirectory)) {
fs.mkdirSync(visxDirectory);
}
} catch (err) {
// don't fail the build if the output directory already exists
console.warn(err);
}
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
let pkgJson = JSON.parse(fs.readFileSync(path.join(element.path, 'package.json'), { encoding: 'utf8' }));
const packagePath = path.join(visxDirectory, `${pkgJson.name}-${pkgJson.version}.vsix`);
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
export function packageExtensionTask(extensionName: string, platform: string, arch: string) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', extensionName);
} else {
destination = path.join(destination, 'resources', 'app', 'extensions', extensionName);
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '../..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => extensionName === name);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util2.skipDirectories())
.pipe(util2.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
// {{SQL CARBON EDIT}} - End
function fromLocal(extensionPath: string, sourceMappingURLBase?: string): Stream {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
if (fs.existsSync(webpackFilename)) {
return fromLocalWebpack(extensionPath, sourceMappingURLBase);
} else {
return fromLocalNormal(extensionPath);
}
}
function fromLocalWebpack(extensionPath: string, sourceMappingURLBase: string | undefined): Stream {
const result = es.through();
const packagedDependencies: string[] = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
}
}
}
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn, packagedDependencies }).then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath) as any
}));
const filesStream = es.readArray(files);
// check for webpack configuration files, then invoke webpack
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = (<string[]>glob.sync(
path.join(extensionPath, '/**/extension.webpack.config.js'),
{ ignore: ['**/node_modules'] }
));
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json's next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data: any) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => () => {
const webpackDone = (err: any, stats: any) => {
fancyLog(`Bundled extension: ${ansiColors.yellow(path.join(path.basename(extensionPath), path.relative(extensionPath, webpackConfigPath)))}...`);
if (err) {
result.emit('error', err);
}
const { compilation } = stats;
if (compilation.errors.length > 0) {
result.emit('error', compilation.errors.join('\n'));
}
if (compilation.warnings.length > 0) {
result.emit('error', compilation.warnings.join('\n'));
}
};
const webpackConfig = {
...require(webpackConfigPath),
...{ mode: 'production' }
};
const relativeOutputPath = path.relative(extensionPath, webpackConfig.output.path);
return webpackGulp(webpackConfig, webpack, webpackDone)
.pipe(es.through(function (data) {
data.stat = data.stat || {};
data.base = extensionPath;
this.emit('data', data);
}))
.pipe(es.through(function (data: File) {
// source map handling:
// * rewrite sourceMappingURL
// * save to disk so that upload-task picks this up
if (sourceMappingURLBase) {
const contents = (<Buffer>data.contents).toString('utf8');
data.contents = Buffer.from(contents.replace(/\n\/\/# sourceMappingURL=(.*)$/gm, function (_m, g1) {
return `\n//# sourceMappingURL=${sourceMappingURLBase}/extensions/${path.basename(extensionPath)}/${relativeOutputPath}/${g1}`;
}), 'utf8');
if (/\.js\.map$/.test(data.path)) {
if (!fs.existsSync(path.dirname(data.path))) {
fs.mkdirSync(path.dirname(data.path));
}
fs.writeFileSync(data.path, data.contents);
}
}
this.emit('data', data);
}));
});
es.merge(sequence(webpackStreams), patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
// this.emit('data', data);
// }))
.pipe(result);
}).catch(err => {
console.error(extensionPath);
console.error(packagedDependencies);
result.emit('error', err);
});
return result.pipe(createStatsStream(path.basename(extensionPath)));
}
function fromLocalNormal(extensionPath: string): Stream {
const result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
@@ -37,13 +247,7 @@ export function fromLocal(extensionPath: string): Stream {
})
.catch(err => result.emit('error', err));
return result;
}
function error(err: any): Stream {
const result = es.through();
setTimeout(() => result.emit('error', err));
return result;
return result.pipe(createStatsStream(path.basename(extensionPath)));
}
const baseHeaders = {
@@ -52,82 +256,142 @@ const baseHeaders = {
'X-Market-User-Id': '291C1CD0-051A-4123-9B4B-30D60EF52EE2',
};
export function fromMarketplace(extensionName: string, version: string): Stream {
const filterType = 7;
const value = extensionName;
const criterium = { filterType, value };
const criteria = [criterium];
const pageNumber = 1;
const pageSize = 1;
const sortBy = 0;
const sortOrder = 0;
const flags = 0x1 | 0x2 | 0x80;
const assetTypes = ['Microsoft.VisualStudio.Services.VSIXPackage'];
const filters = [{ criteria, pageNumber, pageSize, sortBy, sortOrder }];
const body = JSON.stringify({ filters, assetTypes, flags });
const headers: any = assign({}, baseHeaders, {
'Content-Type': 'application/json',
'Accept': 'application/json;api-version=3.0-preview.1',
'Content-Length': body.length
});
export function fromMarketplace(extensionName: string, version: string, metadata: any): Stream {
const [publisher, name] = extensionName.split('.');
const url = `https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${publisher}/vsextensions/${name}/${version}/vspackage`;
fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: 'https://marketplace.visualstudio.com/_apis/public/gallery',
base: url,
requestOptions: {
method: 'POST',
gzip: true,
headers,
body: body
headers: baseHeaders
}
};
return remote('/extensionquery', options)
.pipe(flatmap((stream, f) => {
const rawResult = f.contents.toString('utf8');
const result = JSON.parse(rawResult);
const extension = result.results[0].extensions[0];
if (!extension) {
return error(`No such extension: ${extension}`);
}
const packageJsonFilter = filter('package.json', { restore: true });
const metadata = {
id: extension.extensionId,
publisherId: extension.publisher,
publisherDisplayName: extension.publisher.displayName
};
const extensionVersion = extension.versions.filter(v => v.version === version)[0];
if (!extensionVersion) {
return error(`No such extension version: ${extensionName} @ ${version}`);
}
const asset = extensionVersion.files.filter(f => f.assetType === 'Microsoft.VisualStudio.Services.VSIXPackage')[0];
if (!asset) {
return error(`No VSIX found for extension version: ${extensionName} @ ${version}`);
}
util.log('Downloading extension:', util.colors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: asset.source,
requestOptions: {
gzip: true,
headers: baseHeaders
}
};
return remote('', options)
.pipe(flatmap(stream => {
const packageJsonFilter = filter('package.json', { restore: true });
return stream
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}));
}));
return remote('', options)
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname!.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}
interface IPackageExtensionsOptions {
/**
* Set to undefined to package all of them.
*/
desiredExtensions?: string[];
sourceMappingURLBase?: string;
}
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'vscode-test-resolver',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
// {{SQL CARBON EDIT}}
'integration-tests'
];
// {{SQL CARBON EDIT}}
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// the extension will be excluded from the SQLOps package and will have separate vsix packages
'admin-tool-ext-win',
'agent',
'import',
'profiler',
'admin-pack',
'big-data-cluster',
'dacpac',
'schema-compare',
'resource-deployment',
'cms'
];
// {{SQL CARBON EDIT}} - End
interface IBuiltInExtension {
name: string;
version: string;
repo: string;
metadata: any;
}
const builtInExtensions: IBuiltInExtension[] = require('../builtInExtensions.json');
/**
* We're doing way too much stuff at once, with webpack et al. So much stuff
* that while downloading extensions from the marketplace, node js doesn't get enough
* stack frames to complete the download in under 2 minutes, at which point the
* marketplace server cuts off the http request. So, we sequentialize the extension tasks.
*/
function sequence(streamProviders: { (): Stream }[]): Stream {
const result = es.through();
function pop() {
if (streamProviders.length === 0) {
result.emit('end');
} else {
const fn = streamProviders.shift()!;
fn()
.on('end', function () { setTimeout(pop, 0); })
.pipe(result, { end: false });
}
}
pop();
return result;
}
export function packageExtensionsStream(optsIn?: IPackageExtensionsOptions): NodeJS.ReadWriteStream {
const opts = optsIn || {};
const localExtensionDescriptions = (<string[]>glob.sync('extensions/*/package.json'))
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1);
const localExtensions = () => sequence([...localExtensionDescriptions.map(extension => () => {
return fromLocal(extension.path, opts.sourceMappingURLBase)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
})]);
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map((d: any) => path.relative(root, d.path)).map((d: any) => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = () => gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']));
// Original code commented out here
// const localExtensionDependencies = () => gulp.src('extensions/node_modules/**', { base: '.' });
// const marketplaceExtensions = () => es.merge(
// ...builtInExtensions
// .filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
// .map(extension => {
// return fromMarketplace(extension.name, extension.version, extension.metadata)
// .pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
// })
// );
return sequence([localExtensions, localExtensionDependencies, /*marketplaceExtensions*/])
.pipe(util2.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
// {{SQL CARBON EDIT}} - End
}
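As the comment above explains, sequence() exists so that the per-extension packaging streams run one at a time rather than all at once. A sketch of the shape it expects, with made-up extension paths:

import * as gulp from 'gulp';

// Each provider is a function so the next stream is only created after the
// previous one has emitted 'end'; sequence() wires them together in order.
const packA = () => gulp.src('extensions/a/**', { base: '.' });
const packB = () => gulp.src('extensions/b/**', { base: '.' });

// sequence([packA, packB]).pipe(gulp.dest('out'));   // illustrative usage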


@@ -4,47 +4,47 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var fs = require("fs");
const path = require("path");
const fs = require("fs");
/**
* Returns the sha1 commit version of a repository or undefined in case of failure.
*/
function getVersion(repo) {
var git = path.join(repo, '.git');
var headPath = path.join(git, 'HEAD');
var head;
const git = path.join(repo, '.git');
const headPath = path.join(git, 'HEAD');
let head;
try {
head = fs.readFileSync(headPath, 'utf8').trim();
}
catch (e) {
return void 0;
return undefined;
}
if (/^[0-9a-f]{40}$/i.test(head)) {
return head;
}
var refMatch = /^ref: (.*)$/.exec(head);
const refMatch = /^ref: (.*)$/.exec(head);
if (!refMatch) {
return void 0;
return undefined;
}
var ref = refMatch[1];
var refPath = path.join(git, ref);
const ref = refMatch[1];
const refPath = path.join(git, ref);
try {
return fs.readFileSync(refPath, 'utf8').trim();
}
catch (e) {
// noop
}
var packedRefsPath = path.join(git, 'packed-refs');
var refsRaw;
const packedRefsPath = path.join(git, 'packed-refs');
let refsRaw;
try {
refsRaw = fs.readFileSync(packedRefsPath, 'utf8').trim();
}
catch (e) {
return void 0;
return undefined;
}
var refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
var refsMatch;
var refs = {};
const refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
let refsMatch;
let refs = {};
while (refsMatch = refsRegex.exec(refsRaw)) {
refs[refsMatch[2]] = refsMatch[1];
}
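For context on the packed-refs fallback above: .git/packed-refs stores one "<40-char sha> <refname>" pair per line, which is exactly what the /^([0-9a-f]{40})\s+(.+)$/gm loop collects into the refs map. A self-contained sketch with made-up shas and ref names:

// Illustrative only: the shas and ref names below are invented.
const refsRaw = [
    '# pack-refs with: peeled fully-peeled',
    'a'.repeat(40) + ' refs/heads/master',
    'b'.repeat(40) + ' refs/remotes/origin/some-branch'
].join('\n');
const refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
const refs: { [ref: string]: string } = {};
let refsMatch: RegExpExecArray | null;
while (refsMatch = refsRegex.exec(refsRaw)) {
    refs[refsMatch[2]] = refsMatch[1];
}
// refs['refs/heads/master'] === 'a'.repeat(40)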


@@ -10,7 +10,7 @@ import * as fs from 'fs';
/**
* Returns the sha1 commit version of a repository or undefined in case of failure.
*/
export function getVersion(repo: string): string {
export function getVersion(repo: string): string | undefined {
const git = path.join(repo, '.git');
const headPath = path.join(git, 'HEAD');
let head: string;
@@ -18,7 +18,7 @@ export function getVersion(repo: string): string {
try {
head = fs.readFileSync(headPath, 'utf8').trim();
} catch (e) {
return void 0;
return undefined;
}
if (/^[0-9a-f]{40}$/i.test(head)) {
@@ -28,7 +28,7 @@ export function getVersion(repo: string): string {
const refMatch = /^ref: (.*)$/.exec(head);
if (!refMatch) {
return void 0;
return undefined;
}
const ref = refMatch[1];
@@ -46,11 +46,11 @@ export function getVersion(repo: string): string {
try {
refsRaw = fs.readFileSync(packedRefsPath, 'utf8').trim();
} catch (e) {
return void 0;
return undefined;
}
const refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
let refsMatch: RegExpExecArray;
let refsMatch: RegExpExecArray | null;
let refs: { [ref: string]: string } = {};
while (refsMatch = refsRegex.exec(refsRaw)) {


@@ -27,135 +27,159 @@
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/cli",
"name": "vs/workbench/api/common",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/codeEditor",
"name": "vs/workbench/contrib/cli",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/comments",
"name": "vs/workbench/contrib/codeEditor",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/debug",
"name": "vs/workbench/contrib/codeinset",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/emmet",
"name": "vs/workbench/contrib/callHierarchy",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/execution",
"name": "vs/workbench/contrib/comments",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/extensions",
"name": "vs/workbench/contrib/debug",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/feedback",
"name": "vs/workbench/contrib/emmet",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/files",
"name": "vs/workbench/contrib/extensions",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/html",
"name": "vs/workbench/contrib/externalTerminal",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/markers",
"name": "vs/workbench/contrib/feedback",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/localizations",
"name": "vs/workbench/contrib/files",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/logs",
"name": "vs/workbench/contrib/html",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/navigation",
"name": "vs/workbench/contrib/issue",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/output",
"name": "vs/workbench/contrib/markers",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/performance",
"name": "vs/workbench/contrib/localizations",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/preferences",
"name": "vs/workbench/contrib/logs",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/quickopen",
"name": "vs/workbench/contrib/output",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/relauncher",
"name": "vs/workbench/contrib/performance",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/scm",
"name": "vs/workbench/contrib/preferences",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/search",
"name": "vs/workbench/contrib/quickopen",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/snippets",
"name": "vs/workbench/contrib/remote",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/surveys",
"name": "vs/workbench/contrib/relauncher",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/tasks",
"name": "vs/workbench/contrib/scm",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/terminal",
"name": "vs/workbench/contrib/search",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/themes",
"name": "vs/workbench/contrib/snippets",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/trust",
"name": "vs/workbench/contrib/format",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/update",
"name": "vs/workbench/contrib/stats",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/url",
"name": "vs/workbench/contrib/surveys",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/watermark",
"name": "vs/workbench/contrib/tasks",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/webview",
"name": "vs/workbench/contrib/terminal",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/welcome",
"name": "vs/workbench/contrib/themes",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/outline",
"name": "vs/workbench/contrib/trust",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/update",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/url",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/watermark",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/webview",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/welcome",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/outline",
"project": "vscode-workbench"
},
{
@@ -166,6 +190,10 @@
"name": "vs/workbench/services/bulkEdit",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/commands",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/configuration",
"project": "vscode-workbench"
@@ -191,13 +219,17 @@
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/jsonschemas",
"name": "vs/workbench/services/extensionManagement",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/files",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/integrity",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/keybinding",
"project": "vscode-workbench"
@@ -210,6 +242,10 @@
"name": "vs/workbench/services/progress",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/remote",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/textfile",
"project": "vscode-workbench"
@@ -230,6 +266,10 @@
"name": "vs/workbench/services/decorations",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/label",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/preferences",
"project": "vscode-preferences"


@@ -7,25 +7,25 @@ import * as path from 'path';
import * as fs from 'fs';
import { through, readable, ThroughStream } from 'event-stream';
import File = require('vinyl');
import * as File from 'vinyl';
import * as Is from 'is';
import * as xml2js from 'xml2js';
import * as glob from 'glob';
import * as https from 'https';
import * as gulp from 'gulp';
var util = require('gulp-util');
var iconv = require('iconv-lite');
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as iconv from 'iconv-lite';
const NUMBER_OF_CONCURRENT_DOWNLOADS = 4;
function log(message: any, ...rest: any[]): void {
util.log(util.colors.green('[i18n]'), message, ...rest);
fancyLog(ansiColors.green('[i18n]'), message, ...rest);
}
export interface Language {
id: string; // laguage id, e.g. zh-tw, de
transifexId?: string; // language id used in transifex, e.g zh-hant, de (optional, if not set, the id is used)
id: string; // language id, e.g. zh-tw, de
translationId?: string; // language id used in translation tools, e.g zh-hant, de (optional, if not set, the id is used)
folderName?: string; // language specific folder name, e.g. cht, deu (optional, if not set, the id is used)
}
@@ -38,8 +38,8 @@ export interface InnoSetup {
}
export const defaultLanguages: Language[] = [
{ id: 'zh-tw', folderName: 'cht', transifexId: 'zh-hant' },
{ id: 'zh-cn', folderName: 'chs', transifexId: 'zh-hans' },
{ id: 'zh-tw', folderName: 'cht', translationId: 'zh-hant' },
{ id: 'zh-cn', folderName: 'chs', translationId: 'zh-hans' },
{ id: 'ja', folderName: 'jpn' },
{ id: 'ko', folderName: 'kor' },
{ id: 'de', folderName: 'deu' },
@@ -57,7 +57,7 @@ export const extraLanguages: Language[] = [
];
// non built-in extensions also that are transifex and need to be part of the language packs
const externalExtensionsWithTranslations = {
export const externalExtensionsWithTranslations = {
'vscode-chrome-debug': 'msjsdiag.debugger-for-chrome',
'vscode-node-debug': 'ms-vscode.node-debug',
'vscode-node-debug2': 'ms-vscode.node-debug2'
@@ -71,7 +71,7 @@ interface Map<V> {
interface Item {
id: string;
message: string;
comment: string;
comment?: string;
}
export interface Resource {
@@ -137,27 +137,6 @@ module PackageJsonFormat {
}
}
interface ModuleJsonFormat {
messages: string[];
keys: (string | LocalizeInfo)[];
}
module ModuleJsonFormat {
export function is(value: any): value is ModuleJsonFormat {
let candidate = value as ModuleJsonFormat;
return Is.defined(candidate)
&& Is.array(candidate.messages) && candidate.messages.every(message => Is.string(message))
&& Is.array(candidate.keys) && candidate.keys.every(key => Is.string(key) || LocalizeInfo.is(key));
}
}
interface BundledExtensionHeaderFormat {
id: string;
type: string;
hash: string;
outDir: string;
}
interface BundledExtensionFormat {
[key: string]: {
messages: string[];
@@ -165,10 +144,19 @@ interface BundledExtensionFormat {
};
}
interface I18nFormat {
version: string;
contents: {
[module: string]: {
[messageKey: string]: string;
};
};
}
export class Line {
private buffer: string[] = [];
constructor(private indent: number = 0) {
constructor(indent: number = 0) {
if (indent > 0) {
this.buffer.push(new Array(indent + 1).join(' '));
}
@@ -235,8 +223,8 @@ export class XLF {
let existingKeys = new Set<string>();
for (let i = 0; i < keys.length; i++) {
let key = keys[i];
let realKey: string;
let comment: string;
let realKey: string | undefined;
let comment: string | undefined;
if (Is.string(key)) {
realKey = key;
comment = undefined;
@@ -286,17 +274,17 @@ export class XLF {
}
static parsePseudo = function (xlfString: string): Promise<ParsedXLF[]> {
return new Promise((resolve, reject) => {
return new Promise((resolve) => {
let parser = new xml2js.Parser();
let files: { messages: Map<string>, originalFilePath: string, language: string }[] = [];
parser.parseString(xlfString, function (err, result) {
parser.parseString(xlfString, function (_err: any, result: any) {
const fileNodes: any[] = result['xliff']['file'];
fileNodes.forEach(file => {
const originalFilePath = file.$.original;
const messages: Map<string> = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach(unit => {
transUnits.forEach((unit: any) => {
const key = unit.$.id;
const val = pseudify(unit.source[0]['_'].toString());
if (key && val) {
@@ -317,7 +305,7 @@ export class XLF {
let files: { messages: Map<string>, originalFilePath: string, language: string }[] = [];
parser.parseString(xlfString, function (err, result) {
parser.parseString(xlfString, function (err: any, result: any) {
if (err) {
reject(new Error(`XLF parsing error: Failed to parse XLIFF string. ${err}`));
}
@@ -340,17 +328,20 @@ export class XLF {
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach(unit => {
transUnits.forEach((unit: any) => {
const key = unit.$.id;
if (!unit.target) {
return; // No translation available
}
const val = unit.target.toString();
let val = unit.target[0];
if (typeof val !== 'string') {
val = val._;
}
if (key && val) {
messages[key] = decodeEntities(val);
} else {
reject(new Error(`XLF parsing error: XLIFF file does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
reject(new Error(`XLF parsing error: XLIFF file ${originalFilePath} does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: language.toLowerCase() });
@@ -369,7 +360,7 @@ export interface ITask<T> {
interface ILimitedTaskFactory<T> {
factory: ITask<Promise<T>>;
c: (value?: T | Thenable<T>) => void;
c: (value?: T | Promise<T>) => void;
e: (error?: any) => void;
}
@@ -391,7 +382,7 @@ export class Limiter<T> {
private consume(): void {
while (this.outstandingPromises.length && this.runningPromises < this.maxDegreeOfParalellism) {
const iLimitedTask = this.outstandingPromises.shift();
const iLimitedTask = this.outstandingPromises.shift()!;
this.runningPromises++;
const promise = iLimitedTask.factory();
@@ -419,8 +410,8 @@ function stripComments(content: string): string {
* Third matches block comments
* Fourth matches line comments
*/
var regexp: RegExp = /("(?:[^\\\"]*(?:\\.)?)*")|('(?:[^\\\']*(?:\\.)?)*')|(\/\*(?:\r?\n|.)*?\*\/)|(\/{2,}.*?(?:(?:\r?\n)|$))/g;
let result = content.replace(regexp, (match, m1, m2, m3, m4) => {
const regexp = /("(?:[^\\\"]*(?:\\.)?)*")|('(?:[^\\\']*(?:\\.)?)*')|(\/\*(?:\r?\n|.)*?\*\/)|(\/{2,}.*?(?:(?:\r?\n)|$))/g;
let result = content.replace(regexp, (match, _m1, _m2, m3, m4) => {
// Only one of m1, m2, m3, m4 matches
if (m3) {
// A block comment. Replace with nothing
@@ -442,9 +433,9 @@ function stripComments(content: string): string {
}
function escapeCharacters(value: string): string {
var result: string[] = [];
for (var i = 0; i < value.length; i++) {
var ch = value.charAt(i);
const result: string[] = [];
for (let i = 0; i < value.length; i++) {
const ch = value.charAt(i);
switch (ch) {
case '\'':
result.push('\\\'');
@@ -484,7 +475,6 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
let statistics: Map<number> = Object.create(null);
let total: number = 0;
let defaultMessages: Map<Map<string>> = Object.create(null);
let modules = Object.keys(keysSection);
modules.forEach((module) => {
@@ -497,7 +487,6 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
let messageMap: Map<string> = Object.create(null);
defaultMessages[module] = messageMap;
keys.map((key, i) => {
total++;
if (typeof key === 'string') {
messageMap[key] = messages[i];
} else {
@@ -506,7 +495,11 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
});
});
let languageDirectory = path.join(__dirname, '..', '..', 'i18n');
let languageDirectory = path.join(__dirname, '..', '..', '..', 'vscode-loc', 'i18n');
if (!fs.existsSync(languageDirectory)) {
log(`No VS Code localization repository found. Looking at ${languageDirectory}`);
log(`To bundle translations please check out the vscode-loc repository as a sibling of the vscode repository.`);
}
let sortedLanguages = sortLanguages(languages);
sortedLanguages.forEach((language) => {
if (process.env['VSCODE_BUILD_VERBOSE']) {
@@ -515,31 +508,35 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
statistics[language.id] = 0;
let localizedModules: Map<string[]> = Object.create(null);
let languageFolderName = language.folderName || language.id;
let cwd = path.join(languageDirectory, languageFolderName, 'src');
let languageFolderName = language.translationId || language.id;
let i18nFile = path.join(languageDirectory, `vscode-language-pack-${languageFolderName}`, 'translations', 'main.i18n.json');
let allMessages: I18nFormat | undefined;
if (fs.existsSync(i18nFile)) {
let content = stripComments(fs.readFileSync(i18nFile, 'utf8'));
allMessages = JSON.parse(content);
}
modules.forEach((module) => {
let order = keysSection[module];
let i18nFile = path.join(cwd, module) + '.i18n.json';
let messages: Map<string> = null;
if (fs.existsSync(i18nFile)) {
let content = stripComments(fs.readFileSync(i18nFile, 'utf8'));
messages = JSON.parse(content);
} else {
let moduleMessage: { [messageKey: string]: string } | undefined;
if (allMessages) {
moduleMessage = allMessages.contents[module];
}
if (!moduleMessage) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log(`No localized messages found for module ${module}. Using default messages.`);
}
messages = defaultMessages[module];
statistics[language.id] = statistics[language.id] + Object.keys(messages).length;
moduleMessage = defaultMessages[module];
statistics[language.id] = statistics[language.id] + Object.keys(moduleMessage).length;
}
let localizedMessages: string[] = [];
order.forEach((keyInfo) => {
let key: string = null;
let key: string | null = null;
if (typeof keyInfo === 'string') {
key = keyInfo;
} else {
key = keyInfo.key;
}
let message: string = messages[key];
let message: string = moduleMessage![key];
if (!message) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log(`No localized message found for key ${key} in module ${module}. Using default message.`);
@@ -625,7 +622,7 @@ export function getResource(sourceFile: string): Resource {
return { name: 'vs/base', project: editorProject };
} else if (/^vs\/code/.test(sourceFile)) {
return { name: 'vs/code', project: workbenchProject };
} else if (/^vs\/workbench\/parts/.test(sourceFile)) {
} else if (/^vs\/workbench\/contrib/.test(sourceFile)) {
resource = sourceFile.split('/', 4).join('/');
return { name: resource, project: workbenchProject };
} else if (/^vs\/workbench\/services/.test(sourceFile)) {
@@ -712,7 +709,7 @@ export function createXlfFilesForExtensions(): ThroughStream {
}
return _xlf;
}
gulp.src([`./extensions/${extensionName}/package.nls.json`, `./extensions/${extensionName}/**/nls.metadata.json`]).pipe(through(function (file: File) {
gulp.src([`./extensions/${extensionName}/package.nls.json`, `./extensions/${extensionName}/**/nls.metadata.json`], { allowEmpty: true }).pipe(through(function (file: File) {
if (file.isBuffer()) {
const buffer: Buffer = file.contents as Buffer;
const basename = path.basename(file.path);
@@ -824,8 +821,8 @@ export function createXlfFilesForIsl(): ThroughStream {
}
export function pushXlfFiles(apiHostname: string, username: string, password: string): ThroughStream {
let tryGetPromises = [];
let updateCreatePromises = [];
let tryGetPromises: Array<Promise<boolean>> = [];
let updateCreatePromises: Array<Promise<boolean>> = [];
return through(function (this: ThroughStream, file: File) {
const project = path.dirname(file.relative);
@@ -890,7 +887,7 @@ function getAllResources(project: string, apiHostname: string, username: string,
export function findObsoleteResources(apiHostname: string, username: string, password: string): ThroughStream {
let resourcesByProject: Map<string[]> = Object.create(null);
resourcesByProject[extensionsProject] = [].concat(externalExtensionsWithTranslations); // clone
resourcesByProject[extensionsProject] = ([] as any[]).concat(externalExtensionsWithTranslations); // clone
return through(function (this: ThroughStream, file: File) {
const project = path.dirname(file.relative);
@@ -907,7 +904,7 @@ export function findObsoleteResources(apiHostname: string, username: string, pas
const json = JSON.parse(fs.readFileSync('./build/lib/i18n.resources.json', 'utf8'));
let i18Resources = [...json.editor, ...json.workbench].map((r: Resource) => r.project + '/' + r.name.replace(/\//g, '_'));
let extractedResources = [];
let extractedResources: string[] = [];
for (let project of [workbenchProject, editorProject]) {
for (let resource of resourcesByProject[project]) {
if (resource !== 'setup_messages') {
@@ -920,7 +917,7 @@ export function findObsoleteResources(apiHostname: string, username: string, pas
console.log(`[i18n] Missing resources in file 'build/lib/i18n.resources.json': JSON.stringify(${extractedResources.filter(p => i18Resources.indexOf(p) === -1)})`);
}
let promises = [];
let promises: Array<Promise<void>> = [];
for (let project in resourcesByProject) {
promises.push(
getAllResources(project, apiHostname, username, password).then(resources => {
@@ -965,7 +962,7 @@ function tryGetResource(project: string, slug: string, apiHostname: string, cred
}
function createResource(project: string, slug: string, xlfFile: File, apiHostname: string, credentials: any): Promise<any> {
return new Promise((resolve, reject) => {
return new Promise((_resolve, reject) => {
const data = JSON.stringify({
'content': xlfFile.contents.toString(),
'name': slug,
@@ -1056,8 +1053,8 @@ export function pullCoreAndExtensionsXlfFiles(apiHostname: string, username: str
// extensions
let extensionsToLocalize = Object.create(null);
glob.sync('./extensions/**/*.nls.json', ).forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/*/node_modules/vscode-nls', ).forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/**/*.nls.json').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/*/node_modules/vscode-nls').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
Object.keys(extensionsToLocalize).forEach(extension => {
_coreAndExtensionResources.push({ name: extension, project: extensionsProject });
@@ -1085,7 +1082,7 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
let expectedTranslationsCount = resources.length;
let translationsRetrieved = 0, called = false;
return readable(function (count, callback) {
return readable(function (_count: any, callback: any) {
// Mark end of stream when all resources were retrieved
if (translationsRetrieved === expectedTranslationsCount) {
return this.emit('end');
@@ -1095,7 +1092,7 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
called = true;
const stream = this;
resources.map(function (resource) {
retrieveResource(language, resource, apiHostname, credentials).then((file: File) => {
retrieveResource(language, resource, apiHostname, credentials).then((file: File | null) => {
if (file) {
stream.emit('data', file);
}
@@ -1107,13 +1104,13 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
callback();
});
}
const limiter = new Limiter<File>(NUMBER_OF_CONCURRENT_DOWNLOADS);
const limiter = new Limiter<File | null>(NUMBER_OF_CONCURRENT_DOWNLOADS);
function retrieveResource(language: Language, resource: Resource, apiHostname, credentials): Promise<File> {
return limiter.queue(() => new Promise<File>((resolve, reject) => {
function retrieveResource(language: Language, resource: Resource, apiHostname: string, credentials: string): Promise<File | null> {
return limiter.queue(() => new Promise<File | null>((resolve, reject) => {
const slug = resource.name.replace(/\//g, '_');
const project = resource.project;
let transifexLanguageId = language.id === 'ps' ? 'en' : language.transifexId || language.id;
let transifexLanguageId = language.id === 'ps' ? 'en' : language.translationId || language.id;
const options = {
hostname: apiHostname,
path: `/api/2/project/${project}/resource/${slug}/translation/${transifexLanguageId}?file&mode=onlyreviewed`,
@@ -1212,10 +1209,10 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
let parsePromises: Promise<ParsedXLF[]>[] = [];
let mainPack: I18nPack = { version: i18nPackVersion, contents: {} };
let extensionsPacks: Map<I18nPack> = {};
let errors: any[] = [];
return through(function (this: ThroughStream, xlf: File) {
let stream = this;
let project = path.dirname(xlf.path);
let resource = path.basename(xlf.path, '.xlf');
let project = path.basename(path.dirname(xlf.relative));
let resource = path.basename(xlf.relative, '.xlf');
let contents = xlf.contents.toString();
let parsePromise = pseudo ? XLF.parsePseudo(contents) : XLF.parse(contents);
parsePromises.push(parsePromise);
@@ -1242,10 +1239,15 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
}
});
}
);
).catch(reason => {
errors.push(reason);
});
}, function () {
Promise.all(parsePromises)
.then(() => {
if (errors.length > 0) {
throw errors;
}
const translatedMainFile = createI18nFile('./main', mainPack);
resultingTranslationPaths.push({ id: 'vscode', resourceName: 'main.i18n.json' });
@@ -1264,7 +1266,9 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
}
this.queue(null);
})
.catch(reason => { throw new Error(reason); });
.catch((reason) => {
this.emit('error', reason);
});
});
}
@@ -1285,11 +1289,15 @@ export function prepareIslFiles(language: Language, innoSetupConfig: InnoSetup):
stream.queue(translatedFile);
});
}
);
).catch(reason => {
this.emit('error', reason);
});
}, function () {
Promise.all(parsePromises)
.then(() => { this.queue(null); })
.catch(reason => { throw new Error(reason); });
.catch(reason => {
this.emit('error', reason);
});
});
}
@@ -1306,7 +1314,7 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
let firstChar = line.charAt(0);
if (firstChar === '[' || firstChar === ';') {
if (line === '; *** Inno Setup version 5.5.3+ English messages ***') {
content.push(`; *** Inno Setup version 5.5.3+ ${innoSetup.defaultInfo.name} messages ***`);
content.push(`; *** Inno Setup version 5.5.3+ ${innoSetup.defaultInfo!.name} messages ***`);
} else {
content.push(line);
}
@@ -1316,9 +1324,9 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
let translated = line;
if (key) {
if (key === 'LanguageName') {
translated = `${key}=${innoSetup.defaultInfo.name}`;
translated = `${key}=${innoSetup.defaultInfo!.name}`;
} else if (key === 'LanguageID') {
translated = `${key}=${innoSetup.defaultInfo.id}`;
translated = `${key}=${innoSetup.defaultInfo!.id}`;
} else if (key === 'LanguageCodePage') {
translated = `${key}=${innoSetup.codePage.substr(2)}`;
} else {
@@ -1339,14 +1347,14 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
return new File({
path: filePath,
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8'), innoSetup.codePage)
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage)
});
}
function encodeEntities(value: string): string {
var result: string[] = [];
for (var i = 0; i < value.length; i++) {
var ch = value[i];
let result: string[] = [];
for (let i = 0; i < value.length; i++) {
let ch = value[i];
switch (ch) {
case '<':
result.push('&lt;');
@@ -1370,4 +1378,4 @@ function decodeEntities(value: string): string {
function pseudify(message: string) {
return '\uFF3B' + message.replace(/[aouei]/g, '$&$&') + '\uFF3D';
}
}

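The localization (i18n) build hunks above replace the old per-module *.i18n.json lookup with a single main.i18n.json per language pack, read from a vscode-loc checkout that sits next to the repository, and they introduce an I18nFormat interface (a version plus a contents map keyed by module and message key). A minimal sketch of reading a pack in that shape follows; the helper name loadI18nPack, the path layout relative to repoRoot, and the sample module id are illustrative, not part of the diff.

import * as fs from 'fs';
import * as path from 'path';

interface I18nFormat {
	version: string;
	contents: {
		[module: string]: {
			[messageKey: string]: string;
		};
	};
}

// Read the language pack for one language folder (e.g. 'zh-hans'), assuming
// vscode-loc is checked out as a sibling of the repository root (illustrative layout).
function loadI18nPack(repoRoot: string, languageFolderName: string): I18nFormat | undefined {
	const i18nFile = path.join(
		repoRoot, '..', 'vscode-loc', 'i18n',
		`vscode-language-pack-${languageFolderName}`, 'translations', 'main.i18n.json');
	if (!fs.existsSync(i18nFile)) {
		return undefined; // caller falls back to the default (English) messages
	}
	return JSON.parse(fs.readFileSync(i18nFile, 'utf8')) as I18nFormat;
}

// Usage: fetch the localized messages for a single module, if present.
const pack = loadI18nPack(process.cwd(), 'zh-hans');
const terminalMessages = pack && pack.contents['vs/workbench/contrib/terminal'];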

@@ -3,13 +3,12 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
var ts = require("typescript");
var lazy = require("lazy.js");
var event_stream_1 = require("event-stream");
var File = require("vinyl");
var sm = require("source-map");
var assign = require("object-assign");
var path = require("path");
const ts = require("typescript");
const lazy = require("lazy.js");
const event_stream_1 = require("event-stream");
const File = require("vinyl");
const sm = require("source-map");
const path = require("path");
var CollectStepResult;
(function (CollectStepResult) {
CollectStepResult[CollectStepResult["Yes"] = 0] = "Yes";
@@ -18,9 +17,9 @@ var CollectStepResult;
CollectStepResult[CollectStepResult["NoAndRecurse"] = 3] = "NoAndRecurse";
})(CollectStepResult || (CollectStepResult = {}));
function collect(node, fn) {
var result = [];
const result = [];
function loop(node) {
var stepResult = fn(node);
const stepResult = fn(node);
if (stepResult === CollectStepResult.Yes || stepResult === CollectStepResult.YesAndRecurse) {
result.push(node);
}
@@ -32,43 +31,45 @@ function collect(node, fn) {
return result;
}
function clone(object) {
var result = {};
for (var id in object) {
const result = {};
for (const id in object) {
result[id] = object[id];
}
return result;
}
function template(lines) {
var indent = '', wrap = '';
let indent = '', wrap = '';
if (lines.length > 1) {
indent = '\t';
wrap = '\n';
}
return "/*---------------------------------------------------------\n * Copyright (C) Microsoft Corporation. All rights reserved.\n *--------------------------------------------------------*/\ndefine([], [" + (wrap + lines.map(function (l) { return indent + l; }).join(',\n') + wrap) + "]);";
return `/*---------------------------------------------------------
* Copyright (C) Microsoft Corporation. All rights reserved.
*--------------------------------------------------------*/
define([], [${wrap + lines.map(l => indent + l).join(',\n') + wrap}]);`;
}
/**
* Returns a stream containing the patched JavaScript and source maps.
*/
function nls() {
var input = event_stream_1.through();
var output = input.pipe(event_stream_1.through(function (f) {
var _this = this;
const input = event_stream_1.through();
const output = input.pipe(event_stream_1.through(function (f) {
if (!f.sourceMap) {
return this.emit('error', new Error("File " + f.relative + " does not have sourcemaps."));
return this.emit('error', new Error(`File ${f.relative} does not have sourcemaps.`));
}
var source = f.sourceMap.sources[0];
let source = f.sourceMap.sources[0];
if (!source) {
return this.emit('error', new Error("File " + f.relative + " does not have a source in the source map."));
return this.emit('error', new Error(`File ${f.relative} does not have a source in the source map.`));
}
var root = f.sourceMap.sourceRoot;
const root = f.sourceMap.sourceRoot;
if (root) {
source = path.join(root, source);
}
var typescript = f.sourceMap.sourcesContent[0];
const typescript = f.sourceMap.sourcesContent[0];
if (!typescript) {
return this.emit('error', new Error("File " + f.relative + " does not have the original content in the source map."));
return this.emit('error', new Error(`File ${f.relative} does not have the original content in the source map.`));
}
nls.patchFiles(f, typescript).forEach(function (f) { return _this.emit('data', f); });
nls.patchFiles(f, typescript).forEach(f => this.emit('data', f));
}));
return event_stream_1.duplex(input, output);
}
@@ -76,8 +77,7 @@ function isImportNode(node) {
return node.kind === ts.SyntaxKind.ImportDeclaration || node.kind === ts.SyntaxKind.ImportEqualsDeclaration;
}
(function (nls_1) {
function fileFrom(file, contents, path) {
if (path === void 0) { path = file.path; }
function fileFrom(file, contents, path = file.path) {
return new File({
contents: Buffer.from(contents),
base: file.base,
@@ -87,29 +87,27 @@ function isImportNode(node) {
}
nls_1.fileFrom = fileFrom;
function mappedPositionFrom(source, lc) {
return { source: source, line: lc.line + 1, column: lc.character };
return { source, line: lc.line + 1, column: lc.character };
}
nls_1.mappedPositionFrom = mappedPositionFrom;
function lcFrom(position) {
return { line: position.line - 1, character: position.column };
}
nls_1.lcFrom = lcFrom;
var SingleFileServiceHost = /** @class */ (function () {
function SingleFileServiceHost(options, filename, contents) {
var _this = this;
class SingleFileServiceHost {
constructor(options, filename, contents) {
this.options = options;
this.filename = filename;
this.getCompilationSettings = function () { return _this.options; };
this.getScriptFileNames = function () { return [_this.filename]; };
this.getScriptVersion = function () { return '1'; };
this.getScriptSnapshot = function (name) { return name === _this.filename ? _this.file : _this.lib; };
this.getCurrentDirectory = function () { return ''; };
this.getDefaultLibFileName = function () { return 'lib.d.ts'; };
this.getCompilationSettings = () => this.options;
this.getScriptFileNames = () => [this.filename];
this.getScriptVersion = () => '1';
this.getScriptSnapshot = (name) => name === this.filename ? this.file : this.lib;
this.getCurrentDirectory = () => '';
this.getDefaultLibFileName = () => 'lib.d.ts';
this.file = ts.ScriptSnapshot.fromString(contents);
this.lib = ts.ScriptSnapshot.fromString('');
}
return SingleFileServiceHost;
}());
}
nls_1.SingleFileServiceHost = SingleFileServiceHost;
function isCallExpressionWithinTextSpanCollectStep(textSpan, node) {
if (!ts.textSpanContainsTextSpan({ start: node.pos, length: node.end - node.pos }, textSpan)) {
@@ -117,97 +115,96 @@ function isImportNode(node) {
}
return node.kind === ts.SyntaxKind.CallExpression ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse;
}
function analyze(contents, options) {
if (options === void 0) { options = {}; }
var filename = 'file.ts';
var serviceHost = new SingleFileServiceHost(assign(clone(options), { noResolve: true }), filename, contents);
var service = ts.createLanguageService(serviceHost);
var sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
function analyze(contents, options = {}) {
const filename = 'file.ts';
const serviceHost = new SingleFileServiceHost(Object.assign(clone(options), { noResolve: true }), filename, contents);
const service = ts.createLanguageService(serviceHost);
const sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
// all imports
var imports = lazy(collect(sourceFile, function (n) { return isImportNode(n) ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse; }));
const imports = lazy(collect(sourceFile, n => isImportNode(n) ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse));
// import nls = require('vs/nls');
var importEqualsDeclarations = imports
.filter(function (n) { return n.kind === ts.SyntaxKind.ImportEqualsDeclaration; })
.map(function (n) { return n; })
.filter(function (d) { return d.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference; })
.filter(function (d) { return d.moduleReference.expression.getText() === '\'vs/nls\''; });
const importEqualsDeclarations = imports
.filter(n => n.kind === ts.SyntaxKind.ImportEqualsDeclaration)
.map(n => n)
.filter(d => d.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference)
.filter(d => d.moduleReference.expression.getText() === '\'vs/nls\'');
// import ... from 'vs/nls';
var importDeclarations = imports
.filter(function (n) { return n.kind === ts.SyntaxKind.ImportDeclaration; })
.map(function (n) { return n; })
.filter(function (d) { return d.moduleSpecifier.kind === ts.SyntaxKind.StringLiteral; })
.filter(function (d) { return d.moduleSpecifier.getText() === '\'vs/nls\''; })
.filter(function (d) { return !!d.importClause && !!d.importClause.namedBindings; });
var nlsExpressions = importEqualsDeclarations
.map(function (d) { return d.moduleReference.expression; })
.concat(importDeclarations.map(function (d) { return d.moduleSpecifier; }))
.map(function (d) { return ({
const importDeclarations = imports
.filter(n => n.kind === ts.SyntaxKind.ImportDeclaration)
.map(n => n)
.filter(d => d.moduleSpecifier.kind === ts.SyntaxKind.StringLiteral)
.filter(d => d.moduleSpecifier.getText() === '\'vs/nls\'')
.filter(d => !!d.importClause && !!d.importClause.namedBindings);
const nlsExpressions = importEqualsDeclarations
.map(d => d.moduleReference.expression)
.concat(importDeclarations.map(d => d.moduleSpecifier))
.map(d => ({
start: ts.getLineAndCharacterOfPosition(sourceFile, d.getStart()),
end: ts.getLineAndCharacterOfPosition(sourceFile, d.getEnd())
}); });
}));
// `nls.localize(...)` calls
var nlsLocalizeCallExpressions = importDeclarations
.filter(function (d) { return d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport; })
.map(function (d) { return d.importClause.namedBindings.name; })
.concat(importEqualsDeclarations.map(function (d) { return d.name; }))
const nlsLocalizeCallExpressions = importDeclarations
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport))
.map(d => d.importClause.namedBindings.name)
.concat(importEqualsDeclarations.map(d => d.name))
// find read-only references to `nls`
.map(function (n) { return service.getReferencesAtPosition(filename, n.pos + 1); })
.map(n => service.getReferencesAtPosition(filename, n.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; })
.filter(r => !r.isWriteAccess)
// find the deepest call expressions AST nodes that contain those references
.map(function (r) { return collect(sourceFile, function (n) { return isCallExpressionWithinTextSpanCollectStep(r.textSpan, n); }); })
.map(function (a) { return lazy(a).last(); })
.filter(function (n) { return !!n; })
.map(function (n) { return n; })
.map(r => collect(sourceFile, n => isCallExpressionWithinTextSpanCollectStep(r.textSpan, n)))
.map(a => lazy(a).last())
.filter(n => !!n)
.map(n => n)
// only `localize` calls
.filter(function (n) { return n.expression.kind === ts.SyntaxKind.PropertyAccessExpression && n.expression.name.getText() === 'localize'; });
.filter(n => n.expression.kind === ts.SyntaxKind.PropertyAccessExpression && n.expression.name.getText() === 'localize');
// `localize` named imports
var allLocalizeImportDeclarations = importDeclarations
.filter(function (d) { return d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports; })
.map(function (d) { return [].concat(d.importClause.namedBindings.elements); })
const allLocalizeImportDeclarations = importDeclarations
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports))
.map(d => [].concat(d.importClause.namedBindings.elements))
.flatten();
// `localize` read-only references
var localizeReferences = allLocalizeImportDeclarations
.filter(function (d) { return d.name.getText() === 'localize'; })
.map(function (n) { return service.getReferencesAtPosition(filename, n.pos + 1); })
const localizeReferences = allLocalizeImportDeclarations
.filter(d => d.name.getText() === 'localize')
.map(n => service.getReferencesAtPosition(filename, n.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; });
.filter(r => !r.isWriteAccess);
// custom named `localize` read-only references
var namedLocalizeReferences = allLocalizeImportDeclarations
.filter(function (d) { return d.propertyName && d.propertyName.getText() === 'localize'; })
.map(function (n) { return service.getReferencesAtPosition(filename, n.name.pos + 1); })
const namedLocalizeReferences = allLocalizeImportDeclarations
.filter(d => d.propertyName && d.propertyName.getText() === 'localize')
.map(n => service.getReferencesAtPosition(filename, n.name.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; });
.filter(r => !r.isWriteAccess);
// find the deepest call expressions AST nodes that contain those references
var localizeCallExpressions = localizeReferences
const localizeCallExpressions = localizeReferences
.concat(namedLocalizeReferences)
.map(function (r) { return collect(sourceFile, function (n) { return isCallExpressionWithinTextSpanCollectStep(r.textSpan, n); }); })
.map(function (a) { return lazy(a).last(); })
.filter(function (n) { return !!n; })
.map(function (n) { return n; });
.map(r => collect(sourceFile, n => isCallExpressionWithinTextSpanCollectStep(r.textSpan, n)))
.map(a => lazy(a).last())
.filter(n => !!n)
.map(n => n);
// collect everything
var localizeCalls = nlsLocalizeCallExpressions
const localizeCalls = nlsLocalizeCallExpressions
.concat(localizeCallExpressions)
.map(function (e) { return e.arguments; })
.filter(function (a) { return a.length > 1; })
.sort(function (a, b) { return a[0].getStart() - b[0].getStart(); })
.map(function (a) { return ({
.map(e => e.arguments)
.filter(a => a.length > 1)
.sort((a, b) => a[0].getStart() - b[0].getStart())
.map(a => ({
keySpan: { start: ts.getLineAndCharacterOfPosition(sourceFile, a[0].getStart()), end: ts.getLineAndCharacterOfPosition(sourceFile, a[0].getEnd()) },
key: a[0].getText(),
valueSpan: { start: ts.getLineAndCharacterOfPosition(sourceFile, a[1].getStart()), end: ts.getLineAndCharacterOfPosition(sourceFile, a[1].getEnd()) },
value: a[1].getText()
}); });
}));
return {
localizeCalls: localizeCalls.toArray(),
nlsExpressions: nlsExpressions.toArray()
};
}
nls_1.analyze = analyze;
var TextModel = /** @class */ (function () {
function TextModel(contents) {
var regex = /\r\n|\r|\n/g;
var index = 0;
var match;
class TextModel {
constructor(contents) {
const regex = /\r\n|\r|\n/g;
let index = 0;
let match;
this.lines = [];
this.lineEndings = [];
while (match = regex.exec(contents)) {
@@ -220,85 +217,80 @@ function isImportNode(node) {
this.lineEndings.push('');
}
}
TextModel.prototype.get = function (index) {
get(index) {
return this.lines[index];
};
TextModel.prototype.set = function (index, line) {
}
set(index, line) {
this.lines[index] = line;
};
Object.defineProperty(TextModel.prototype, "lineCount", {
get: function () {
return this.lines.length;
},
enumerable: true,
configurable: true
});
}
get lineCount() {
return this.lines.length;
}
/**
* Applies patch(es) to the model.
* Multiple patches must be ordered.
* Does not support patches spanning multiple lines.
*/
TextModel.prototype.apply = function (patch) {
var startLineNumber = patch.span.start.line;
var endLineNumber = patch.span.end.line;
var startLine = this.lines[startLineNumber] || '';
var endLine = this.lines[endLineNumber] || '';
apply(patch) {
const startLineNumber = patch.span.start.line;
const endLineNumber = patch.span.end.line;
const startLine = this.lines[startLineNumber] || '';
const endLine = this.lines[endLineNumber] || '';
this.lines[startLineNumber] = [
startLine.substring(0, patch.span.start.character),
patch.content,
endLine.substring(patch.span.end.character)
].join('');
for (var i = startLineNumber + 1; i <= endLineNumber; i++) {
for (let i = startLineNumber + 1; i <= endLineNumber; i++) {
this.lines[i] = '';
}
};
TextModel.prototype.toString = function () {
}
toString() {
return lazy(this.lines).zip(this.lineEndings)
.flatten().toArray().join('');
};
return TextModel;
}());
}
}
nls_1.TextModel = TextModel;
function patchJavascript(patches, contents, moduleId) {
var model = new nls.TextModel(contents);
const model = new nls.TextModel(contents);
// patch the localize calls
lazy(patches).reverse().each(function (p) { return model.apply(p); });
lazy(patches).reverse().each(p => model.apply(p));
// patch the 'vs/nls' imports
var firstLine = model.get(0);
var patchedFirstLine = firstLine.replace(/(['"])vs\/nls\1/g, "$1vs/nls!" + moduleId + "$1");
const firstLine = model.get(0);
const patchedFirstLine = firstLine.replace(/(['"])vs\/nls\1/g, `$1vs/nls!${moduleId}$1`);
model.set(0, patchedFirstLine);
return model.toString();
}
nls_1.patchJavascript = patchJavascript;
function patchSourcemap(patches, rsm, smc) {
var smg = new sm.SourceMapGenerator({
const smg = new sm.SourceMapGenerator({
file: rsm.file,
sourceRoot: rsm.sourceRoot
});
patches = patches.reverse();
var currentLine = -1;
var currentLineDiff = 0;
var source = null;
smc.eachMapping(function (m) {
var patch = patches[patches.length - 1];
var original = { line: m.originalLine, column: m.originalColumn };
var generated = { line: m.generatedLine, column: m.generatedColumn };
let currentLine = -1;
let currentLineDiff = 0;
let source = null;
smc.eachMapping(m => {
const patch = patches[patches.length - 1];
const original = { line: m.originalLine, column: m.originalColumn };
const generated = { line: m.generatedLine, column: m.generatedColumn };
if (currentLine !== generated.line) {
currentLineDiff = 0;
}
currentLine = generated.line;
generated.column += currentLineDiff;
if (patch && m.generatedLine - 1 === patch.span.end.line && m.generatedColumn === patch.span.end.character) {
var originalLength = patch.span.end.character - patch.span.start.character;
var modifiedLength = patch.content.length;
var lengthDiff = modifiedLength - originalLength;
const originalLength = patch.span.end.character - patch.span.start.character;
const modifiedLength = patch.content.length;
const lengthDiff = modifiedLength - originalLength;
currentLineDiff += lengthDiff;
generated.column += lengthDiff;
patches.pop();
}
source = rsm.sourceRoot ? path.relative(rsm.sourceRoot, m.source) : m.source;
source = source.replace(/\\/g, '/');
smg.addMapping({ source: source, name: m.name, original: original, generated: generated });
smg.addMapping({ source, name: m.name, original, generated });
}, null, sm.SourceMapConsumer.GENERATED_ORDER);
if (source) {
smg.setSourceContent(source, smc.sourceContentFor(source));
@@ -307,47 +299,47 @@ function isImportNode(node) {
}
nls_1.patchSourcemap = patchSourcemap;
function patch(moduleId, typescript, javascript, sourcemap) {
var _a = analyze(typescript), localizeCalls = _a.localizeCalls, nlsExpressions = _a.nlsExpressions;
const { localizeCalls, nlsExpressions } = analyze(typescript);
if (localizeCalls.length === 0) {
return { javascript: javascript, sourcemap: sourcemap };
return { javascript, sourcemap };
}
var nlsKeys = template(localizeCalls.map(function (lc) { return lc.key; }));
var nls = template(localizeCalls.map(function (lc) { return lc.value; }));
var smc = new sm.SourceMapConsumer(sourcemap);
var positionFrom = mappedPositionFrom.bind(null, sourcemap.sources[0]);
var i = 0;
const nlsKeys = template(localizeCalls.map(lc => lc.key));
const nls = template(localizeCalls.map(lc => lc.value));
const smc = new sm.SourceMapConsumer(sourcemap);
const positionFrom = mappedPositionFrom.bind(null, sourcemap.sources[0]);
let i = 0;
// build patches
var patches = lazy(localizeCalls)
.map(function (lc) { return ([
const patches = lazy(localizeCalls)
.map(lc => ([
{ range: lc.keySpan, content: '' + (i++) },
{ range: lc.valueSpan, content: 'null' }
]); })
]))
.flatten()
.map(function (c) {
var start = lcFrom(smc.generatedPositionFor(positionFrom(c.range.start)));
var end = lcFrom(smc.generatedPositionFor(positionFrom(c.range.end)));
return { span: { start: start, end: end }, content: c.content };
.map(c => {
const start = lcFrom(smc.generatedPositionFor(positionFrom(c.range.start)));
const end = lcFrom(smc.generatedPositionFor(positionFrom(c.range.end)));
return { span: { start, end }, content: c.content };
})
.toArray();
javascript = patchJavascript(patches, javascript, moduleId);
// since imports are not within the sourcemap information,
// we must do this MacGyver style
if (nlsExpressions.length) {
javascript = javascript.replace(/^define\(.*$/m, function (line) {
return line.replace(/(['"])vs\/nls\1/g, "$1vs/nls!" + moduleId + "$1");
javascript = javascript.replace(/^define\(.*$/m, line => {
return line.replace(/(['"])vs\/nls\1/g, `$1vs/nls!${moduleId}$1`);
});
}
sourcemap = patchSourcemap(patches, sourcemap, smc);
return { javascript: javascript, sourcemap: sourcemap, nlsKeys: nlsKeys, nls: nls };
return { javascript, sourcemap, nlsKeys, nls };
}
nls_1.patch = patch;
function patchFiles(javascriptFile, typescript) {
// hack?
var moduleId = javascriptFile.relative
const moduleId = javascriptFile.relative
.replace(/\.js$/, '')
.replace(/\\/g, '/');
var _a = patch(moduleId, typescript, javascriptFile.contents.toString(), javascriptFile.sourceMap), javascript = _a.javascript, sourcemap = _a.sourcemap, nlsKeys = _a.nlsKeys, nls = _a.nls;
var result = [fileFrom(javascriptFile, javascript)];
const { javascript, sourcemap, nlsKeys, nls } = patch(moduleId, typescript, javascriptFile.contents.toString(), javascriptFile.sourceMap);
const result = [fileFrom(javascriptFile, javascript)];
result[0].sourceMap = sourcemap;
if (nlsKeys) {
result.push(fileFrom(javascriptFile, nlsKeys, javascriptFile.path.replace(/\.js$/, '.nls.keys.js')));


@@ -6,10 +6,9 @@
import * as ts from 'typescript';
import * as lazy from 'lazy.js';
import { duplex, through } from 'event-stream';
import File = require('vinyl');
import * as File from 'vinyl';
import * as sm from 'source-map';
import assign = require('object-assign');
import path = require('path');
import * as path from 'path';
declare class FileSourceMap extends File {
public sourceMap: sm.RawSourceMap;
@@ -26,7 +25,7 @@ function collect(node: ts.Node, fn: (node: ts.Node) => CollectStepResult): ts.No
const result: ts.Node[] = [];
function loop(node: ts.Node) {
var stepResult = fn(node);
const stepResult = fn(node);
if (stepResult === CollectStepResult.Yes || stepResult === CollectStepResult.YesAndRecurse) {
result.push(node);
@@ -42,8 +41,8 @@ function collect(node: ts.Node, fn: (node: ts.Node) => CollectStepResult): ts.No
}
function clone<T>(object: T): T {
var result = <T>{};
for (var id in object) {
const result = <T>{};
for (const id in object) {
result[id] = object[id];
}
return result;
@@ -67,8 +66,8 @@ define([], [${ wrap + lines.map(l => indent + l).join(',\n') + wrap}]);`;
* Returns a stream containing the patched JavaScript and source maps.
*/
function nls(): NodeJS.ReadWriteStream {
var input = through();
var output = input.pipe(through(function (f: FileSourceMap) {
const input = through();
const output = input.pipe(through(function (f: FileSourceMap) {
if (!f.sourceMap) {
return this.emit('error', new Error(`File ${f.relative} does not have sourcemaps.`));
}
@@ -83,7 +82,7 @@ function nls(): NodeJS.ReadWriteStream {
source = path.join(root, source);
}
const typescript = f.sourceMap.sourcesContent[0];
const typescript = f.sourceMap.sourcesContent![0];
if (!typescript) {
return this.emit('error', new Error(`File ${f.relative} does not have the original content in the source map.`));
}
@@ -174,7 +173,7 @@ module nls {
export function analyze(contents: string, options: ts.CompilerOptions = {}): ILocalizeAnalysisResult {
const filename = 'file.ts';
const serviceHost = new SingleFileServiceHost(assign(clone(options), { noResolve: true }), filename, contents);
const serviceHost = new SingleFileServiceHost(Object.assign(clone(options), { noResolve: true }), filename, contents);
const service = ts.createLanguageService(serviceHost);
const sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
@@ -206,8 +205,8 @@ module nls {
// `nls.localize(...)` calls
const nlsLocalizeCallExpressions = importDeclarations
.filter(d => d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport)
.map(d => (<ts.NamespaceImport>d.importClause.namedBindings).name)
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport))
.map(d => (<ts.NamespaceImport>d.importClause!.namedBindings).name)
.concat(importEqualsDeclarations.map(d => d.name))
// find read-only references to `nls`
@@ -226,8 +225,8 @@ module nls {
// `localize` named imports
const allLocalizeImportDeclarations = importDeclarations
.filter(d => d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports)
.map(d => [].concat((<ts.NamedImports>d.importClause.namedBindings).elements))
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports))
.map(d => ([] as any[]).concat((<ts.NamedImports>d.importClause!.namedBindings!).elements))
.flatten();
// `localize` read-only references
@@ -279,7 +278,7 @@ module nls {
constructor(contents: string) {
const regex = /\r\n|\r|\n/g;
let index = 0;
let match: RegExpExecArray;
let match: RegExpExecArray | null;
this.lines = [];
this.lineEndings = [];
@@ -360,7 +359,7 @@ module nls {
patches = patches.reverse();
let currentLine = -1;
let currentLineDiff = 0;
let source = null;
let source: string | null = null;
smc.eachMapping(m => {
const patch = patches[patches.length - 1];

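The two nls hunks above (the compiled JavaScript and its TypeScript source) keep the same patching strategy while moving the helpers to const/let, classes and arrow functions; the central rewrite is still the substitution of 'vs/nls' with a module-specific plugin id on the first line of each compiled file. Below is a standalone sketch of that substitution using the same regular expression as patchJavascript; the function name and the sample module id are illustrative.

// Rewrite 'vs/nls' to 'vs/nls!<moduleId>' so the AMD loader resolves the
// module-specific message bundle at runtime (same regex as patchJavascript above).
function rewriteNlsImport(firstLine: string, moduleId: string): string {
	return firstLine.replace(/(['"])vs\/nls\1/g, `$1vs/nls!${moduleId}$1`);
}

// Usage with an illustrative module id:
const patched = rewriteNlsImport(
	'define(["require", "exports", "vs/nls"], function (require, exports, nls) {',
	'vs/workbench/contrib/files/browser/explorerViewlet');
// patched now references "vs/nls!vs/workbench/contrib/files/browser/explorerViewlet".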

@@ -4,29 +4,32 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var gulp = require("gulp");
var sourcemaps = require("gulp-sourcemaps");
var filter = require("gulp-filter");
var minifyCSS = require("gulp-cssnano");
var uglify = require("gulp-uglify");
var composer = require("gulp-uglify/composer");
var uglifyes = require("uglify-es");
var es = require("event-stream");
var concat = require("gulp-concat");
var VinylFile = require("vinyl");
var bundle = require("./bundle");
var util = require("./util");
var gulpUtil = require("gulp-util");
var flatmap = require("gulp-flatmap");
var pump = require("pump");
var REPO_ROOT_PATH = path.join(__dirname, '../..');
const es = require("event-stream");
const gulp = require("gulp");
const concat = require("gulp-concat");
const minifyCSS = require("gulp-cssnano");
const filter = require("gulp-filter");
const flatmap = require("gulp-flatmap");
const sourcemaps = require("gulp-sourcemaps");
const uglify = require("gulp-uglify");
const composer = require("gulp-uglify/composer");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const path = require("path");
const pump = require("pump");
const uglifyes = require("uglify-es");
const VinylFile = require("vinyl");
const bundle = require("./bundle");
const i18n_1 = require("./i18n");
const stats_1 = require("./stats");
const util = require("./util");
const REPO_ROOT_PATH = path.join(__dirname, '../..');
function log(prefix, message) {
gulpUtil.log(gulpUtil.colors.cyan('[' + prefix + ']'), message);
fancyLog(ansiColors.cyan('[' + prefix + ']'), message);
}
// {{SQL CARBON EDIT}}
function loaderConfig(emptyPaths) {
var result = {
const result = {
paths: {
'vs': 'out-build/vs',
'sql': 'out-build/sql',
@@ -38,26 +41,26 @@ function loaderConfig(emptyPaths) {
return result;
}
exports.loaderConfig = loaderConfig;
var IS_OUR_COPYRIGHT_REGEXP = /Copyright \(C\) Microsoft Corporation/i;
const IS_OUR_COPYRIGHT_REGEXP = /Copyright \(C\) Microsoft Corporation/i;
function loader(src, bundledFileHeader, bundleLoader) {
var sources = [
src + "/vs/loader.js"
let sources = [
`${src}/vs/loader.js`
];
if (bundleLoader) {
sources = sources.concat([
src + "/vs/css.js",
src + "/vs/nls.js"
`${src}/vs/css.js`,
`${src}/vs/nls.js`
]);
}
var isFirst = true;
let isFirst = true;
return (gulp
.src(sources, { base: "" + src })
.src(sources, { base: `${src}` })
.pipe(es.through(function (data) {
if (isFirst) {
isFirst = false;
this.emit('data', new VinylFile({
path: 'fake',
base: '',
base: undefined,
contents: Buffer.from(bundledFileHeader)
}));
this.emit('data', data);
@@ -74,12 +77,12 @@ function loader(src, bundledFileHeader, bundleLoader) {
})));
}
function toConcatStream(src, bundledFileHeader, sources, dest) {
var useSourcemaps = /\.js$/.test(dest) && !/\.nls\.js$/.test(dest);
const useSourcemaps = /\.js$/.test(dest) && !/\.nls\.js$/.test(dest);
// If a bundle ends up including in any of the sources our copyright, then
// insert a fake source at the beginning of each bundle with our copyright
var containsOurCopyright = false;
for (var i = 0, len = sources.length; i < len; i++) {
var fileContents = sources[i].contents;
let containsOurCopyright = false;
for (let i = 0, len = sources.length; i < len; i++) {
const fileContents = sources[i].contents;
if (IS_OUR_COPYRIGHT_REGEXP.test(fileContents)) {
containsOurCopyright = true;
break;
@@ -91,9 +94,9 @@ function toConcatStream(src, bundledFileHeader, sources, dest) {
contents: bundledFileHeader
});
}
var treatedSources = sources.map(function (source) {
var root = source.path ? REPO_ROOT_PATH.replace(/\\/g, '/') : '';
var base = source.path ? root + ("/" + src) : '';
const treatedSources = sources.map(function (source) {
const root = source.path ? REPO_ROOT_PATH.replace(/\\/g, '/') : '';
const base = source.path ? root + `/${src}` : undefined;
return new VinylFile({
path: source.path ? root + '/' + source.path.replace(/\\/g, '/') : 'fake',
base: base,
@@ -102,7 +105,8 @@ function toConcatStream(src, bundledFileHeader, sources, dest) {
});
return es.readArray(treatedSources)
.pipe(useSourcemaps ? util.loadSourcemaps() : es.through())
.pipe(concat(dest));
.pipe(concat(dest))
.pipe(stats_1.createStatsStream(dest));
}
function toBundleStream(src, bundledFileHeader, bundles) {
return es.merge(bundles.map(function (bundle) {
@@ -110,33 +114,32 @@ function toBundleStream(src, bundledFileHeader, bundles) {
}));
}
function optimizeTask(opts) {
var src = opts.src;
var entryPoints = opts.entryPoints;
var otherSources = opts.otherSources;
var resources = opts.resources;
var loaderConfig = opts.loaderConfig;
var bundledFileHeader = opts.header;
var bundleLoader = (typeof opts.bundleLoader === 'undefined' ? true : opts.bundleLoader);
var out = opts.out;
const src = opts.src;
const entryPoints = opts.entryPoints;
const resources = opts.resources;
const loaderConfig = opts.loaderConfig;
const bundledFileHeader = opts.header;
const bundleLoader = (typeof opts.bundleLoader === 'undefined' ? true : opts.bundleLoader);
const out = opts.out;
return function () {
var bundlesStream = es.through(); // this stream will contain the bundled files
var resourcesStream = es.through(); // this stream will contain the resources
var bundleInfoStream = es.through(); // this stream will contain bundleInfo.json
const bundlesStream = es.through(); // this stream will contain the bundled files
const resourcesStream = es.through(); // this stream will contain the resources
const bundleInfoStream = es.through(); // this stream will contain bundleInfo.json
bundle.bundle(entryPoints, loaderConfig, function (err, result) {
if (err) {
if (err || !result) {
return bundlesStream.emit('error', JSON.stringify(err));
}
toBundleStream(src, bundledFileHeader, result.files).pipe(bundlesStream);
// Remove css inlined resources
var filteredResources = resources.slice();
const filteredResources = resources.slice();
result.cssInlinedResources.forEach(function (resource) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log('optimizer', 'excluding inlined: ' + resource);
}
filteredResources.push('!' + resource);
});
gulp.src(filteredResources, { base: "" + src }).pipe(resourcesStream);
var bundleInfoArray = [];
gulp.src(filteredResources, { base: `${src}`, allowEmpty: true }).pipe(resourcesStream);
const bundleInfoArray = [];
if (opts.bundleInfo) {
bundleInfoArray.push(new VinylFile({
path: 'bundleInfo.json',
@@ -146,26 +149,17 @@ function optimizeTask(opts) {
}
es.readArray(bundleInfoArray).pipe(bundleInfoStream);
});
var otherSourcesStream = es.through();
var otherSourcesStreamArr = [];
gulp.src(otherSources, { base: "" + src })
.pipe(es.through(function (data) {
otherSourcesStreamArr.push(toConcatStream(src, bundledFileHeader, [data], data.relative));
}, function () {
if (!otherSourcesStreamArr.length) {
setTimeout(function () { otherSourcesStream.emit('end'); }, 0);
}
else {
es.merge(otherSourcesStreamArr).pipe(otherSourcesStream);
}
}));
var result = es.merge(loader(src, bundledFileHeader, bundleLoader), bundlesStream, otherSourcesStream, resourcesStream, bundleInfoStream);
const result = es.merge(loader(src, bundledFileHeader, bundleLoader), bundlesStream, resourcesStream, bundleInfoStream);
return result
.pipe(sourcemaps.write('./', {
sourceRoot: null,
sourceRoot: undefined,
addComment: true,
includeContent: true
}))
.pipe(opts.languages && opts.languages.length ? i18n_1.processNlsFiles({
fileHeader: bundledFileHeader,
languages: opts.languages
}) : es.through())
.pipe(gulp.dest(out));
};
}
@@ -175,14 +169,14 @@ exports.optimizeTask = optimizeTask;
* to have a file "context" to include our copyright only once per file.
*/
function uglifyWithCopyrights() {
var preserveComments = function (f) {
return function (node, comment) {
var text = comment.value;
var type = comment.type;
const preserveComments = (f) => {
return (_node, comment) => {
const text = comment.value;
const type = comment.type;
if (/@minifier_do_not_preserve/.test(text)) {
return false;
}
var isOurCopyright = IS_OUR_COPYRIGHT_REGEXP.test(text);
const isOurCopyright = IS_OUR_COPYRIGHT_REGEXP.test(text);
if (isOurCopyright) {
if (f.__hasOurCopyright) {
return false;
@@ -200,10 +194,10 @@ function uglifyWithCopyrights() {
return false;
};
};
var minify = composer(uglifyes);
var input = es.through();
var output = input
.pipe(flatmap(function (stream, f) {
const minify = composer(uglifyes);
const input = es.through();
const output = input
.pipe(flatmap((stream, f) => {
return stream.pipe(minify({
output: {
comments: preserveComments(f),
@@ -214,18 +208,23 @@ function uglifyWithCopyrights() {
return es.duplex(input, output);
}
function minifyTask(src, sourceMapBaseUrl) {
var sourceMappingURL = sourceMapBaseUrl && (function (f) { return sourceMapBaseUrl + "/" + f.relative + ".map"; });
return function (cb) {
var jsFilter = filter('**/*.js', { restore: true });
var cssFilter = filter('**/*.css', { restore: true });
pump(gulp.src([src + '/**', '!' + src + '/**/*.map']), jsFilter, sourcemaps.init({ loadMaps: true }), uglifyWithCopyrights(), jsFilter.restore, cssFilter, minifyCSS({ reduceIdents: false }), cssFilter.restore, sourcemaps.write('./', {
sourceMappingURL: sourceMappingURL,
sourceRoot: null,
const sourceMappingURL = sourceMapBaseUrl ? ((f) => `${sourceMapBaseUrl}/${f.relative}.map`) : undefined;
return cb => {
const jsFilter = filter('**/*.js', { restore: true });
const cssFilter = filter('**/*.css', { restore: true });
pump(gulp.src([src + '/**', '!' + src + '/**/*.map']), jsFilter, sourcemaps.init({ loadMaps: true }), uglifyWithCopyrights(), jsFilter.restore, cssFilter, minifyCSS({ reduceIdents: false }), cssFilter.restore, sourcemaps.mapSources((sourcePath) => {
if (sourcePath === 'bootstrap-fork.js') {
return 'bootstrap-fork.orig.js';
}
return sourcePath;
}), sourcemaps.write('./', {
sourceMappingURL,
sourceRoot: undefined,
includeContent: true,
addComment: true
}), gulp.dest(src + '-min'), function (err) {
}), gulp.dest(src + '-min'), (err) => {
if (err instanceof uglify.GulpUglifyError) {
console.error("Uglify error in '" + (err.cause && err.cause.filename) + "'");
console.error(`Uglify error in '${err.cause && err.cause.filename}'`);
}
cb(err);
});

Some files were not shown because too many files have changed in this diff