Compare commits

327 Commits
1.4.5 ... 1.6.0

Author SHA1 Message Date
Karl Burtram
eb35dae1d1 Turn off failing Insights tests 2019-04-16 22:59:03 -07:00
Karl Burtram
cc0a144169 Remove gap from bottom of Server view (#5072) 2019-04-16 19:54:30 -07:00
Anthony Dresser
b973f9e0ec changes strings for data explorer (#4946) 2019-04-16 18:49:59 -07:00
Alan Ren
d62068025c Merge fix the selectbox issue for chart 2019-04-16 13:40:24 -07:00
Aditya Bist
b2952d2ddf fix job action context (#5053) 2019-04-16 13:36:30 -07:00
Gene Lee
a21244816d Add support for new endpoint key string 'gateway' (#4954) 2019-04-16 10:45:00 -07:00
Chris LaFreniere
a9aeb57dc4 Fix to ensure that we rewrite spark ui links correctly (#4962) 2019-04-16 10:34:01 -07:00
Gene Lee
00537ed199 Fixed bug: CheckboxTreeNode label overflows, and node icon disappears (#5022) 2019-04-16 10:31:40 -07:00
Alex Ross
c2df3e0e0a Merge 51b0b28134d51361cf996d2f0a1c698247aeabd8 2019-04-15 10:15:55 -07:00
Charles Gagnon
e3afb1cffc Fix wrong release notes being loaded (#5008) 2019-04-11 20:56:19 -07:00
Kevin Cunnane
b475311f85 Fix #4500 Untitled notebook reopen doesn't show dirty (#5005)
* Fix 2 notebook issues
- Do not create notebook model twice on start
- Do not cause disposed warnings due to markdown cell deserialization

* Fix notebook dirty on open issue
Before the model is resolved we weren't getting dirty events.
The solution is to use the backing text model until it's ready.
Must hook into the dirty event & notify to get the dot to appear

# Conflicts:
#	src/sql/workbench/parts/notebook/cellViews/code.component.ts
2019-04-11 20:56:03 -07:00
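A minimal TypeScript sketch of the approach this commit describes, forwarding dirty events from the backing text model until the notebook model resolves; all names here are illustrative, not the actual ADS code:

```typescript
// Illustrative sketch: until the notebook model resolves, rely on the backing
// text model for dirty state and re-notify so the dirty dot appears.
interface BackingTextModel {
    onDidChangeDirty(listener: (dirty: boolean) => void): void;
}

class NotebookInputSketch {
    private dirty = false;

    constructor(backingTextModel: BackingTextModel) {
        backingTextModel.onDidChangeDirty(dirty => {
            this.dirty = dirty;
            this.notifyDirtyChanged(); // tell the editor shell so the dot renders
        });
    }

    isDirty(): boolean {
        return this.dirty;
    }

    private notifyDirtyChanged(): void {
        // fire an editor-level dirty-changed event here
    }
}
```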
Maddy
45005d61e0 add wrap to the <pre> tag (#5002)
* add wrap to the <pre> tag

* removed styles for browser support
2019-04-11 20:46:41 -07:00
Anthony Dresser
adfddeae27 restore line height to account list renderer (#4999) 2019-04-11 20:46:27 -07:00
Anthony Dresser
a7429267bb revert data explorer id to connections (#5003) 2019-04-11 20:46:08 -07:00
kisantia
ff415d6a03 bump SQL Tools to 1.5.0-alpha.85 to get invalid dacpac version fix (#5001) 2019-04-11 15:25:55 -07:00
udeeshagautam
b4dc35a4de Fix for 4104 : Multiple consecutive spaces in query results cells are condensed into one (#4983)
* preserving spaces in query results - all leading, trailing, and middle spaces will be shown as-is

* removed the formatting-based change and replaced it with a CSS change; the formatting approach left a special char behind while removing nbsp
2019-04-11 15:25:41 -07:00
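The actual fix here was a CSS change; a minimal sketch of the idea (the function and hook point are assumptions, not the real grid code):

```typescript
// Illustrative sketch: style the result cell so consecutive spaces render
// as-is, instead of replacing spaces with nbsp (which left special chars).
function preserveSpacesInResultCell(cell: HTMLElement): void {
    // 'pre-wrap' keeps runs of spaces while still wrapping long values
    cell.style.whiteSpace = 'pre-wrap';
}
```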
Aditya Bist
29c7ccad39 Added some usage details (#4711)
* added some details

* remove unused import

* added new metrics, removed churn

* merged master and code review comments

* code review comments

* normalized days to calendar days/weeks/months

* cleaned up code

* changed comment to start required check for PR

* fix failing test

* fix test

* removed null assignment

* fix null test script
2019-04-11 15:25:24 -07:00
Yurong He
1da3635d03 Fix #3479 ctrl+a select active cell output or preview markdown (#4981)
* Enable ctrl+a to select the output or markdown content when the cell is active

* Moved toggleUserSelect into ngOnChanges

* Resolve PR comments
2019-04-11 15:25:02 -07:00
Chris LaFreniere
1f50015ed2 Add New Notebook from Server Dashboard (#4971) 2019-04-11 15:24:47 -07:00
Aditya Bist
01892422cb Azure extension changes (#4987)
* removed search box

* removed commented code

# Conflicts:
#	src/sql/parts/objectExplorer/viewlet/serverTreeView.ts
2019-04-11 15:24:31 -07:00
Cory Rivera
afce60b06f Add additional error handling to Python installation for Notebooks (#4891)
* Also enabled integration tests for python installation.
2019-04-11 15:01:02 -07:00
Chris LaFreniere
a2b87f6158 Fix for relative markdown image paths (#4889)
* Fix for relative markdown image paths

* PR comments
2019-04-11 15:00:45 -07:00
Chris LaFreniere
76a2f92daf Notebooks: Potential Fix for "Notebook Provider does not Exist" Error (#4848)
* Fallback to SQL

* Fix providers not found issue

* await whenInstalledExtensionsRegistered

* PR comments
2019-04-11 15:00:21 -07:00
Gene Lee
5b09d57196 Fixed Broken Notebook Preview (#4895) 2019-04-11 14:59:09 -07:00
Karl Burtram
95bf18f859 Fix merge build break 2019-04-10 16:16:43 -07:00
Kevin Cunnane
1fc648ff37 Mitigate (but not fully fix) Run Cell from disconnected notebook (#4960)
This is a partial fix that lays groundwork for full "Prompt to connect" if a kernel needs a connection.
I am waiting on Yurong's refactoring of connection handling before doing any of the prompt work.

- Adds kernel metadata about whether a connection is required.
- For Jupyter, only Spark kernels are listed as requiring a connection
- If this is true and there's no active connection, a notification is shown and execute isn't called

In the future, this path will still be used if the user is prompted to connect and cancels out.
The future change will be to inject a "connect" handler from notebook.component into the cell callback and use it to set the connection context
2019-04-10 13:50:23 -07:00
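A minimal sketch of the guard this commit describes: skip execution and notify when a kernel requires a connection and none is active. Names like requiresConnection are placeholders for whatever the real metadata is called:

```typescript
// Illustrative sketch of the "requires connection" check before running a cell.
interface KernelMetadata {
    requiresConnection: boolean; // e.g. true for Spark kernels under Jupyter
}

async function runCell(
    kernel: KernelMetadata,
    activeConnection: object | undefined,
    execute: () => Promise<void>,
    showNotification: (msg: string) => void
): Promise<void> {
    if (kernel.requiresConnection && !activeConnection) {
        // also the path taken if the user is later prompted to connect and cancels out
        showNotification('This kernel requires a connection before cells can run.');
        return;
    }
    await execute();
}
```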
Kevin Cunnane
37ba956bad Fix #4893 New Notebook Can Open Existing Notebook (#4959)
Add back check for textDocuments with same name, should've been there anyhow

On rehydration, files show as text docs before clicking, as they only get
changed by the customInputConverter code path.
We should look at this long term - ideally we'd update notebookDocuments
with correct values on initial start. #4958 opened to track this.
2019-04-10 13:47:49 -07:00
Kevin Cunnane
697f887539 Fix #4930 Text cells are referred to as text and markdown in commands (#4956) 2019-04-10 13:44:20 -07:00
Anthony Dresser
7b42141958 Merge 2de47c2a50 2019-04-10 13:44:11 -07:00
Chris LaFreniere
158b00f9b4 always serialize execution count (#4864) 2019-04-10 13:39:54 -07:00
Matt Bierner
43ac8dfd20 Adopt TS 3.4.3
Fixes #72005
2019-04-10 13:30:11 -07:00
Sandeep Somavarapu
34d8d52e7a Fix #71947 2019-04-09 13:09:10 -07:00
Sandeep Somavarapu
c738b26c04 Fix #71585 2019-04-09 13:09:03 -07:00
Johannes Rieken
6090e7173f properly check picked formatter, #71988 2019-04-09 13:06:45 -07:00
Johannes Rieken
7ebf746584 use editor for format on save and honor silent mode 2019-04-09 13:06:36 -07:00
Matt Bierner
e9d04d75ac Fixes #71688 2019-04-09 13:06:20 -07:00
Joao Moreno
605160a1ba Cherry pick 3773487012c98ef7974a0182771ea19178f2a525 2019-04-09 13:04:37 -07:00
Johannes Rieken
82b8750b63 show notification when formatter is disabled/uninstalled, message otherwise 2019-04-09 12:59:00 -07:00
Rob Lourens
61f7f19d12 Fix #71465 - "find in files shortcut broken" 2019-04-09 12:58:48 -07:00
Anthony Dresser
9bd7e30d18 revert vscode changes (#4879) 2019-04-05 09:15:38 -07:00
Aditya Bist
572010ded1 fixed azure extension issues (#4859) 2019-04-04 15:32:04 -07:00
David Shiflet
1f22326e78 Reuse existing saved connection that matches args (#4839)
* Reuse existing saved connection that matches args

* search subgroups for matches
2019-04-04 11:11:18 -07:00
Yurong He
504d5c91bc Fixed #4800 need to use ConnectionProfile in order to get the correct… (#4812)
* Fixed #4800 need to use ConnectionProfile in order to get the correct connection

* Went back to creating a connection in run cell to avoid failing to run the cell or closing a connection used by others.
2019-04-03 19:00:56 -07:00
Chris LaFreniere
a34692b6f2 Fix for dispose method of undefined (#4843) 2019-04-03 18:08:42 -07:00
Karl Burtram
73b5d23210 Update bug template labels (#4840) 2019-04-03 16:47:55 -07:00
Anthony Dresser
e31de8b137 fixed more null references (#4841) 2019-04-03 16:30:58 -07:00
Anthony Dresser
cef5bbb2be Strict null pass on some base ui files (#4832)
* more strict null checks in base browser code

* revert changes to radiobutton

* fix some more minor things, enable strict null check in pipelines

* formatting

* fix compile errors

* make null undefined

* more null to undefined
2019-04-03 16:18:33 -07:00
Anthony Dresser
80a8e1a4da More builder removal (#4810)
* remove more builder references

* formatting
2019-04-03 16:18:03 -07:00
Anthony Dresser
fcb8fe50fe Advanced Description On Bottom (#4836)
* move description to the bottom

* formatting
2019-04-03 15:49:53 -07:00
Karl Burtram
5235c8aad6 Turn-off classifier bot while doing label refactoring (#4834) 2019-04-03 13:59:48 -07:00
Karl Burtram
5b67525211 Bump SQL Tools to 1.5.0-alpha.84 (#4830) 2019-04-03 11:43:43 -07:00
Anthony Dresser
fb697729c0 Data Explorer Icons (#4806)
* initial icon support

* add necessary classes for icons

* initialize icon to blank string
2019-04-03 11:10:48 -07:00
Chris LaFreniere
76fe0fef49 Fix broken test merge conflicts (#4823) 2019-04-03 10:44:01 -07:00
Karl Burtram
6295d03801 Update XLF files (#4811) 2019-04-02 17:35:18 -07:00
Chris LaFreniere
07166fb3cd Run All Cells Notebook Implementation (#4713)
* runAllCells API

* add comment

* more run cells fixes

* Add integration test

* Add multiple cell SQL notebook test

* Comment out python tests as they fail in the lab

* remove unused imports

* PR comments

* Remove localize

* Return true instead of promise.resolve(true)
2019-04-02 16:47:00 -07:00
Yurong He
219dfe66d0 Make connection for new Notebook (#4770)
* Make connection for new Notebook

* Resolve merge issue

* Use connectionUri to track the connections that need to be disconnected

* Removed debugging log
2019-04-02 15:52:01 -07:00
Yurong He
22c62fb524 Added setup and teardown for test; add variable to control run python… (#4782)
* Added setup and teardown for tests; add variable to control running python/pyspark tests; remove dup code in clearAllOutput tests

* Resolve PR comments
2019-04-02 15:16:54 -07:00
Karl Burtram
f8706abebe Merge from vscode b8c2e7108b3cae7aa2782112da654bedd8bb3a52 (#4808) 2019-04-02 14:35:06 -07:00
Alan Ren
e83a6f9c2e profile page and summary page (#4769)
* add cluster name to page

* implement profile page -1

* fix compilation error due to new method

* profile page 0328

* summary page

* make divcontainer accessible

* handle disposable

* add support for "coming soon" cards
2019-04-02 13:52:39 -07:00
Anthony Dresser
63485c8c78 Remove builder references from options dialog (#4774)
* remove more builder references; remove $ from declarations

* fix jquery references

* formatting

* fixing backup

* fix backup box
2019-04-02 13:49:50 -07:00
Karl Burtram
72ef024678 Check for null ref in query statusbar timer (#4804) 2019-04-02 11:56:45 -07:00
Chris LaFreniere
414c736655 Allow output area to be selectable again (#4714) 2019-04-01 23:09:00 -07:00
Yurong He
fdbfbb9238 Change attach to show user name. Use connectionProfile.title to displ… (#4794)
* Change attach to show user name. Use connectionProfile.title to display the same info as OE server node

* Fixed some potential profile leak

* Removed unused import

* Resolve PR comments
2019-04-01 21:18:49 -07:00
Alan Ren
a766e5d334 fix error for download extension (#4793) 2019-04-01 16:44:15 -07:00
Yurong He
2faf01eb9d Add MouseWheelSupport , AdditionalKeyBindings and AutoColumnSize plugins to sql notebook grid (#4790) 2019-04-01 12:44:40 -07:00
Yurong He
a4c2463b2f Add azure and standalone instance config to environment variables (#4745)
* change to fs to write file

* change random range to 1 to 100

* Added more env variables check as well

* Add azure and standalone env variables

* Add Azure instance to unblock mac pipeline testing

* Fixed some merge issue and improve delete file

* Added standalone tests that only run on Windows
2019-03-29 20:14:44 -07:00
Cory Rivera
ddbd8033f9 Remove an outdated showErrorMessage assert from Jupyter Server Manager unit tests. (#4777) 2019-03-29 15:18:01 -07:00
Yurong He
6da66cf367 change to fs to write file (#4736)
* change to fs to write file

* Added more env variables check as well

* Resolve PR comments

* Merge master

* Add file name exist check and delete it after test is done
2019-03-29 15:04:11 -07:00
Cory Rivera
c7bc37d010 Add quotes around python paths to guard against spaces. (#4775) 2019-03-29 14:55:08 -07:00
Charles Gagnon
e0ec3c5035 Fix workspaceRoot macro for insights (#4686)
* Fix workspaceRoot macro for insights

The workspaceRoot macro wasn't working correctly for finding the queryFile. There were a couple of issues:

1. The path separators were hardcoded as / which wasn't xplat-compatible
2. It required the first section of the path to be one of the folders in the workspace - e.g. if the workspace contained a folder named foo you'd have to specify ${workspaceRoot}\foo\myfile.sql. This is inconsistent with the folder logic, which just appends the path after ${workspaceRoot} to the folder that's currently open

I changed the logic to just append the relative part of the path to every folder currently open in the workspace and choose the first one found that contained the file we were told to look for - which follows the convention the folder logic uses. If the file doesn't exist it'll just fall back to using the path without the macro (which is likely not to resolve and thus will display an error, but there's nothing we can do at that point anyway)

* Switch to using VS Code resolver (support for more than just workspaceRoot) and move resolution code into helper method so it can be used by the multiple places it's called. Added tests for the methods.

* Add test for invalid param

* Change resolveQueryFilePath to be a standalone exported function. Change it to throw if the file can't be resolved/found so callers can display errors correctly. Added more tests to cover new scenarios. Switched to using pfs instead of fs for file existence checks.

* Add extra param to InsightsDialogController construction in test

* Fix formatting and test errors.

* Change to suiteSetup and suiteTeardown so the setup/teardown is only run once instead of once per test - we don't need unique files and this stops a race condition error when deleting the test folder.

* spaces -> tabs
2019-03-29 14:53:49 -07:00
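A minimal sketch of the resolution strategy described in the first bullet (this is not the actual resolveQueryFilePath implementation, and the real code uses pfs rather than fs):

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Illustrative sketch: append the relative part of the path to each open
// workspace folder and pick the first folder that actually contains the file.
function resolveQueryFilePath(configuredPath: string, workspaceFolders: string[]): string {
    const macro = '${workspaceRoot}';
    if (!configuredPath.startsWith(macro)) {
        return configuredPath;
    }
    const relativePart = configuredPath.slice(macro.length);
    for (const folder of workspaceFolders) {
        const candidate = path.join(folder, relativePart);
        if (fs.existsSync(candidate)) {
            return candidate;
        }
    }
    // per the later bullets, the real function throws so callers can show an error
    throw new Error(`Unable to resolve query file path: ${configuredPath}`);
}
```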
Aditya Bist
d4f287298f Agent - committer work (#4758)
* fix delete job

* added the ability to change and retrieve jobowner

* fixed UX for delete step

* improved operator actions

* fixed operators and proxies

* added errors for failures
2019-03-29 13:46:52 -07:00
Aditya Bist
e70d5838a8 Agent - stop job (#4410)
* stop job behavior similar to ssms

* removed agent from sqlops

* fix couple UX issues

* let sqlops remain unchanged
2019-03-29 13:42:34 -07:00
Aditya Bist
b04ca0fdbd Agent - refresh refactor (#4773)
* refactored refresh views

* removed refresh from jobs view
2019-03-29 13:42:02 -07:00
Anthony Dresser
a064da642d Merge from vscode f5044f0910e4aa7e7e06cb509781f3d56e729959 (#4759) 2019-03-29 10:54:38 -07:00
Charles Gagnon
37ce37979a Fix broken notebook test (#4766)
* Fix broken test

The test would always fail if it was doing the right thing - since it was asserting that there WEREN'T any cells with outputs (the opposite of what it should have been checking).

* Update error message to be clearer

* Fix spelling error
2019-03-29 10:25:52 -07:00
Karl Burtram
cb3cbd0d78 Revert Messages Collapse behavior (#4757)
* Revert "change sizing behavior to allow the messages to fulling collapse down (set results to have no max height) (#4313)"

This reverts commit 7de294a58e.

* Revert "Grid scrolling bugs (#4396)"

This reverts commit ace6012c1c.
2019-03-28 18:09:58 -07:00
Raj
1415aa1c03 #4586: Clear Results feature in Notebooks (#4705)
* #4586: Clear all outputs feature

* text change

* Adding extensible method with integration tests

* Misc change

* Misc change

* Adding more logging

* Change to test

* Adding outputs condition in integration tests
2019-03-28 15:20:28 -07:00
Karl Burtram
7eb17f6abc Add Query Editor null checks (#4753) 2019-03-28 15:18:44 -07:00
Charles Gagnon
fdb471d506 Windows extension fixes/improvements (#4740)
* Windows extension fixes/improvements

Remove error message when activating extension on non-win platforms (instead just don't do anything). Add hooks to kill child processes on exit if they haven't ended already. Formatting fixes.

* Remove shelljs

* Remove error messages from telemetry until I can follow up on whether we're allowed to send these up. Fix some issues with the exec callback - the child_process exec function returns an ExecException object instead of the code directly like shelljs did, so check for that correctly.
2019-03-28 14:58:18 -07:00
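A small sketch of the exec-callback difference called out in the last bullet: Node's child_process.exec reports failures through an ExecException (with an optional .code) rather than returning the exit code directly the way shelljs did. The command here is a placeholder:

```typescript
import { exec } from 'child_process';

exec('some-command --version', (error, stdout, stderr) => {
    if (error) {
        // the exit code lives on the ExecException, not in a return value
        console.error(`command failed with code ${error.code}: ${stderr}`);
        return;
    }
    console.log(stdout);
});
```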
Anthony Dresser
e6785ffe95 Merge from vscode de81ccf04849309f843db21130c806a5783678f7 (#4738) 2019-03-28 13:06:16 -07:00
Karl Burtram
cc2951265e Add query execution plan extensibility APIs (#4072)
* WIP 1

* WIP 2

* Fix typos

* Iterate on API a bit

* Query Tab WIP

* More dynamic query tab impl

* Fix merge breaks

* Update interfaces

* Update to single event handler for query events

* Remove query plan extension

* Add generated JS file
2019-03-28 10:59:02 -07:00
Chris LaFreniere
ee413f3b24 Ensure only one cell shows as running at a time (#4715) 2019-03-27 19:04:07 -07:00
Karl Burtram
fc664a850d Fix query plan styles to correct rendering (#4739) 2019-03-27 18:00:33 -07:00
Raj
55efe76b2e #3585: Canceled message when changing kernel (#4673)
* #3585: Canceled message when changing kernel

* Indentation

* Cleanup
2019-03-27 15:43:27 -07:00
Raj
dd8922ce4d #4511: Streamline untitled notebook count (#4727) 2019-03-27 15:29:45 -07:00
Yurong He
102b48c302 Change standalone integration test to use sql login (#4728) 2019-03-27 15:09:45 -07:00
David Shiflet
a360bebd9d Add status messages during command line processing (#4725)
* add status bar messages

* missed semicolon
2019-03-27 15:44:57 -04:00
Cory Rivera
37ab493b78 Use getErrorMessage to get exception messages in python installation code. (#4730) 2019-03-27 12:14:42 -07:00
Anthony Dresser
46b7afe558 Merge from vscode 3bd60b2ba753e7fe39b42f99184bc6c5881d3551 (#4712) 2019-03-27 11:36:01 -07:00
Chris LaFreniere
eac3420583 Minimum grid height set when grid returns 0 rows (#4716) 2019-03-27 10:57:48 -07:00
Chris LaFreniere
e1e9c08242 Disable Python 3 notebook integration test due to timeout (#4726)
* Disable Python 3 notebook integration test due to timeout

* Remove unused imports
2019-03-27 10:18:42 -07:00
Chris LaFreniere
5ac6cf3b74 Ensure SQL is the first kernel in the kernels dropdown (#4692)
* Ensure SQL is the first kernel shown in the dropdown

* cleanup to prevent sql from registering twice
2019-03-26 19:39:28 -07:00
Yurong He
d6a58136da Add log to help debug failure in the lab (#4706) 2019-03-26 11:55:14 -07:00
Alan Ren
819b7b93d1 render xml properly in the grid (#4674)
* render xml properly in the grid

* fix html injection

* fix tslint error

* add comments and fix an issue

* fix comment
2019-03-26 11:48:55 -07:00
Anthony Dresser
bceeda1cfd Remove dev flags (#4707)
* remove more dev flags

* formatting
2019-03-26 11:44:59 -07:00
Anthony Dresser
0d8ef9583b Merge from vscode 966b87dd4013be1a9c06e2b8334522ec61905cc2 (#4696) 2019-03-26 11:43:38 -07:00
Yurong He
b1393ae615 Disable pySpark3 notebook test to unblock insider build (#4695)
* Disable pySpark3 notebook test to unblock insider build

* Removed unused import
2019-03-25 17:36:49 -07:00
Alan Ren
464109313b fix context menu test due to new SSMSMin feature (#4693) 2019-03-25 16:49:12 -07:00
Raj
83c8baf8e3 #4418: Notebook file icon doesn't show for new notebook (#4676)
* #4418: Notebook icon doesn't load for untitled

* Indentation

* Indentation
2019-03-25 16:05:04 -07:00
Chris LaFreniere
1bac929ab3 Fix for SQL Kernel only showing up (#4691)
* Ensure that notebook providers that are registered "early" are shown in kernels dropdown

* cleanup
2019-03-25 15:24:57 -07:00
Charles Gagnon
b27417da41 Add ADS Windows support extension with LaunchSsmsDialog command (#4248)
* ADS Windows support extension with LaunchSsmsDialog command

* Update readme

* Fix spacing

* Update download with new file location and name

* Update SsmsMin package with bits from latest RC build and addressed some comments.

* Update extension name. Add Context menu extension for launching server properties dialog. Remove params interface from public API

* Rename folder and update README

* Correct README title

* Fix a few issues and clean up some stuff.

* Update to azdata namespace

* Refactor to use async/await and add some more telemetry

* Add .bat for running extension tests (currently only Notebook) and set up launch.json with 2 new launch configs for running & debugging extension tests.

* Rename files to make it clear these aren't the integration tests

* Update launch.config too

* Fix spacing and missed file name update

* Fix some bugs in buildSsmsMinCommandArgs and add unit tests
2019-03-25 14:19:11 -07:00
Charles Gagnon
ef1f72f69b Remove popular extensions view (#4679)
We don't track installs for our extensions so the "Popular Extensions" category is just a duplicate of the full list. Removing it to reduce clutter until we can add that functionality in. This also removes the command associated with the view.
2019-03-25 13:21:48 -07:00
Raj
784fd57410 confirmSave using UntitledEditorInput (#4611) 2019-03-25 10:25:28 -07:00
Charles Gagnon
1dd0afcf80 Add ADS Startup Telemetry events back in (#4675)
They were removed with a VS Code merge. Changed to using an extension contribution to log the telemetry to reduce the number of VS Code edits.
2019-03-23 11:02:34 -07:00
udeeshagautam
ddce7731b9 Feature/viewlet cmsapis (#4312)
* first set of changes to experiment with the registration of cms related apis

* Adding cms service entry to workbench

* Adding basic functionality for add remove reg servers and group

* Returning relative path as part of RegServerResult as string

* Removing the cms apis from core. Having mssql extension expose them for cms extension

* Propagating the backend name changes to apis

* Fixing some missing sqlops references

* Adding a sqltools service version with CMS apis available
2019-03-22 17:24:45 -07:00
Alan Ren
d00c3780a6 fix for issue 4596 (#4670) 2019-03-22 14:11:26 -07:00
Anthony Dresser
4a87a24235 Merge from vscode 011858832762aaff245b2336fb1c38166e7a10fb (#4663) 2019-03-22 13:07:54 -07:00
Anthony Dresser
f5c9174c2f Revert "Connection Store Refactor (#4632)" (#4671)
This reverts commit 756f77063a.
2019-03-22 11:30:20 -07:00
Alan Ren
8d5f676039 a few deploy cluster wizard changes (#4644)
* update icon for target types

* Revert "update icon for target types"

This reverts commit 79bd7674f2c09602430a0b10829f7b0d3234eb98.

* update target type icons

* update eula and privacy policy links

* existing cluster page

* adjust the loading indicator position
2019-03-22 11:29:51 -07:00
Yurong He
71db7e10b6 Add notebook integration tests (#4652)
* Add notebook integration tests
2019-03-22 10:39:44 -07:00
Anthony Dresser
756f77063a Connection Store Refactor (#4632)
* various clean ups

* formatting

* remove linting

* formatting

* IConfigurationService is even better

* messing with connection config tests

* update tests

* formatting

* formatting

* remove unused code

* add more tests

* working through tests

* formatting

* more factoring of connection store and increase code coverage

* formatting

* fix tests
2019-03-22 01:07:32 -07:00
Matt Irvine
5f637036bc Fix query tab color regression (#4660) 2019-03-21 16:22:03 -07:00
Anthony Dresser
24b5f41065 add data explorer default (#4653) 2019-03-21 16:10:54 -07:00
Chris LaFreniere
b00352570b Fix the "Failed to show notebook document" errors (#4649) 2019-03-21 16:06:17 -07:00
Matt Irvine
0b44b7d384 Fix issue reporter 'Visual Studio Code' text (#4658) 2019-03-21 15:38:25 -07:00
Chris LaFreniere
82da64d66d Handle future done in SQL for multiple batches (#4654) 2019-03-21 13:47:01 -07:00
Matt Irvine
ee5a76bb0c Go back to LICENSE.txt for Windows installers (#4655) 2019-03-21 13:28:51 -07:00
Anthony Dresser
b65ee5b42e Merge from vscode fc10e26ea50f82cdd84e9141491357922e6f5fba (#4639) 2019-03-21 10:58:16 -07:00
Anthony Dresser
8298db7d13 Ignore the users title bar settings (#4625)
* ignore the users title bar settings

* ignore in more places
2019-03-20 19:39:43 -07:00
Chris LaFreniere
5b0e86b179 Have notebook grids respond to theme change events (#4624) 2019-03-20 18:47:57 -07:00
Matt Irvine
ed2641ea02 Update how we create services (#4638) 2019-03-20 17:08:15 -07:00
Matt Irvine
66939636dc Bring over code-url-handler.desktop (#4629) 2019-03-20 14:56:50 -07:00
Anthony Dresser
2c331d929a remove duplicate launch task (#4631) 2019-03-20 12:34:27 -07:00
Anthony Dresser
4472764f3a extensions tslint cleanup/Connection config refactor (#4370)
* various clean ups

* formatting

* remove linting

* formatting

* IConfigurationService is even better

* messing with connection config tests

* update tests

* formatting

* formatting

* remove unused code

* add more tests
2019-03-20 11:59:07 -07:00
Karl Burtram
6cb7153bdd VBump ADS to 1.6.0 2019-03-20 11:41:41 -07:00
Anthony Dresser
5142f69655 Cleanup dependencies (#4546)
* Merge VS Code 1.31.1

* Fix missed merge conflict

* Fix license in new files

* Remove extra extension files

* Fix compile error

* Fix TSLint errors

* Fix integration tests

* Fixed saved and recent connections list

* Fix tests

* move dependencies and delete unused ones

* fix test compile
2019-03-20 11:30:20 -07:00
Anthony Dresser
dfe23f7bfe Remove some CARBON Edits (#4571)
* remove some unnecessary sql carbon edits to vs source and add correct fixes where necessary

* revert bad change
2019-03-20 11:30:06 -07:00
Raj
6873353cd4 #4618: Notebook JSON has extra } after save (#4627) 2019-03-20 11:12:02 -07:00
Matt Irvine
025c97673f Reapply extension management edits that were removed in 1.31 merge (#4606) 2019-03-20 10:40:09 -07:00
Anthony Dresser
c814b92557 VSCode merge (#4610)
* Merge from vscode e388c734f30757875976c7e326d6cfeee77710de

* fix yarn locks

* remove small issue
2019-03-20 10:39:09 -07:00
Anthony Dresser
87765e8673 Vscode merge (#4582)
* Merge from vscode 37cb23d3dd4f9433d56d4ba5ea3203580719a0bd

* fix issues with merges

* bump node version in azpipe

* replace license headers

* remove duplicate launch task

* fix build errors

* fix build errors

* fix tslint issues

* working through package and linux build issues

* more work

* wip

* fix packaged builds

* working through linux build errors

* wip

* wip

* wip

* fix mac and linux file limits

* iterate linux pipeline

* disable editor typing

* revert series to parallel

* remove optimize vscode from linux

* fix linting issues

* revert testing change

* add workaround for new node

* readd packaging for extensions

* fix issue with angular not resolving decorator dependencies
2019-03-19 17:44:35 -07:00
Raj
833d197412 #3973: Persist scroll position when tab notebooks (#4531)
* #3973: Persist scroll position when tab notebooks

* Remove getter and setter
2019-03-19 14:01:24 -07:00
Maddy
5e72cd12d1 Maddy/prompt password when different (#4537)
* Prompt for password once when the sql instance password doesn't work for hdfs. If the user provides the correct password, connect and continue; else show the Unauthorized node.

* Removed the hardcoded bad password

* Added check for empty folder scenarios

* Added ErrorStatusCode as property of TreeNode. Checking for the error code instead of the error string to avoid localization issues

* type fixed

* implemented hasExpansionError
2019-03-19 13:46:55 -07:00
Charles Gagnon
330fb6dff5 Remove tooltip from editable dropdown (#4542)
* Fix our custom dropdown control to update the tooltip text correctly when a new value is selected. Previous behavior was to always keep the initial text (<Default> for example). Now we'll update it as appropriate (and default back to Placeholder text when we're clearing the value)

* Spaces -> tabs

* Remove extra ;

* Remove tooltips from the text box part of the dropdown

* Remove tooltips from dropdown arrows

* Revert "Remove tooltips from dropdown arrows"

This reverts commit 31a0748aaea42d5009eb9752bd075ce49a6716f5.
2019-03-19 13:46:36 -07:00
Raj
6d7d485a38 #4331: Set notebook dirty when change 'trusted' (#4569) 2019-03-18 10:56:04 -07:00
Karl Burtram
0160901060 Update readme\changelog for March (#4575) 2019-03-18 10:23:28 -07:00
Chris LaFreniere
9313140c59 Fix install packages not always showing on startup (#4566) 2019-03-18 10:18:03 -07:00
Raj
25b1d4b673 #4565: Open notebook from Dashboard - can't close dirty notebook (#4568)
* #4565:"Don't save" doesn't close editor -Dashboard

* Misc change
2019-03-17 08:03:43 -07:00
Chris LaFreniere
b9d0602f55 Only show filename for notebook titlebar when opening from Open Notebook (#4556) 2019-03-15 17:45:06 -07:00
Charles Gagnon
ec0a3bbc95 Fix recent connections list to use <default> DB if no DB is specified by the user when a connection is made (#4564) 2019-03-15 17:28:56 -07:00
Matt Irvine
50f63a2f72 Fix extension bugs from VS Code merge (#4563) 2019-03-15 17:22:56 -07:00
Raj
cb47cb7dbf #4365: Make notebook dirty when changing kernel (#4547) 2019-03-15 17:02:00 -07:00
Cory Rivera
c9ac49c758 Check error in webhdfs.sendRequest before trying to check response code. (#4561) 2019-03-15 16:52:08 -07:00
Karl Burtram
c4e8aba1c9 Fix missing Azure account name (#4555)
* Fix Azure account picker row height

* Remove unneeded styles
2019-03-15 15:37:00 -07:00
Karl Burtram
ba6b8b1f69 Fix Azure account picker row height 2019-03-15 15:29:25 -07:00
Anthony Dresser
09af2fc2cb Add needs repro closer bot (#4517)
* change message of needs repro and enable

* update needs repro to needs info

* update to needs more info
2019-03-15 14:53:15 -07:00
Alan Ren
5caf0b02f0 make connectiondialog react to provider event (#4544)
* make connectiondialog react to provider event

* fix unit test error

* code review comments
2019-03-15 14:47:23 -07:00
Matt Irvine
86bac90001 Merge VS Code 1.31.1 (#4283) 2019-03-15 13:09:45 -07:00
Charles Gagnon
7d31575149 Fix dropdowns flickering every other time they're opened. The hide message code was being invoked which called hideContextView (the actual dropdown part) even if no message was ever displayed. Now we'll delay setting the message to null and only call hideContextView if we actually had a message to display. (#4528)
Also fixed a small issue where messages that didn't have a container would throw an error when trying to call removeClass (since this.element is pulled from the container and thus was undefined).

Tested that the flicker is gone and that messages still show up correctly
2019-03-15 11:19:14 -07:00
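A minimal sketch of the fix described above: clear the message lazily and only tear down the context view when a message was actually shown. Class and method names are illustrative:

```typescript
// Illustrative sketch of the flicker fix for the dropdown's message handling.
class DropdownMessageSketch {
    private message: string | undefined;

    showMessage(message: string): void {
        this.message = message;
        // render the message in the context view here
    }

    hideMessage(): void {
        // previously hideContextView() ran unconditionally, closing the
        // dropdown even when no message was displayed - hence the flicker
        if (this.message !== undefined) {
            this.message = undefined;
            this.hideContextView();
        }
    }

    private hideContextView(): void {
        // close the context view (the actual dropdown part)
    }
}
```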
Charles Gagnon
7223b28829 Add config for running extension tests (#4495)
* Add .bat for running extension tests (currently only Notebook) and set up launch.json with 2 new launch configs for running & debugging extension tests.

* Rename files to make it clear these aren't the integration tests

* Update launch.config too

* Fix spacing and missed file name update
2019-03-15 09:12:04 -07:00
Anthony Dresser
4014c1d0ab Small strict null checking pass on a few files (#4293)
* fix some null checking

* fix various null strict checks

* move location fo sql files in json

* fix compile and more unused properties

* formatting

* small formatting changes

* readd types

* add comments for angular components

* formatting

* remove any decl
2019-03-14 18:18:32 -07:00
Anthony Dresser
0bf0e795ca More data explorer actions (#4307)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code

* add node context on data explorer

* formatting

* add new Query action

* fix various errors with how the context menus work

* add manage and new query

* add refresh command

* formatting
2019-03-14 17:19:37 -07:00
Kevin Cunnane
0bc3716f74 FIX #4513 Notebook stuck at changing kernel (#4518)
* FIX #4513 Notebook stuck at changing kernel
- Intra-provider kernel change didn't happen because we only tried changing kernel on new session creation.
- Inverted the logic (i.e. did the right thing) and renamed the method so it's clearer what we're doing & what the boolean value should be
- Manually tested all the known scenarios
2019-03-14 15:49:14 -07:00
Alan Ren
ca23ea0f69 use the new dataprotocol client (#4515) 2019-03-14 15:48:49 -07:00
Kevin Cunnane
efaa2c0e3f Fix #4505 Notebooks: New Notebook will not work if existing untitled notebooks are rehydrated (#4506)
* Fix #4505 Notebooks: New Notebook will not work if existing untitled notebooks are rehydrated
* Also fixes #4508
* Unify behavior across New Notebook entry points
- Use Notebook-{n} as the standard in both entry points
- Use SQL as default provider in both
- Ensure both check for other names and only use free number
2019-03-14 14:27:08 -07:00
udeeshagautam
d91f4d5748 Fix for 4471: Backup/Restore shows in Context Menu for Azure DB (#4498)
* Remove Backup/Restore from Cloud DBs' context menu

* checking for null

* Cleaning up the check
2019-03-14 14:06:11 -07:00
Kevin Cunnane
6f1a03587a Fix #4029 Ensure changeKernels always resolves, even in error states (#4488)
* Fix #4029 Ensure changeKernels always resolves, even in error states
This is necessary to unblock reverting the kernel on canceling Python install
- startSession now correctly sets up kernel information, since a kernel is loaded there.
- Remove call to change kernel on session initialize. This isn't needed due to the refactor
- Handle kernel change failure by attempting to fall back to the old kernel
- ExtensionHost $startNewSession now ensures errors are sent across the wire.
- Update AttachTo and Kernel dropdowns so they handle the kernel already being available. This is needed since other changes mean the session is likely ready before these get going

* Fix to handle python cancel and load existing scenarios
- Made changes to handle failure flow when Python dialog is canceled
- Made changes to handle initial load fail. Kernel and Attach To dropdowns show No Kernel / None and you can choose a kernel
- Added error wrapping in ext host so that string errors make it across and aren't lost.
2019-03-14 13:07:08 -07:00
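A minimal sketch of the "always resolve, fall back on failure" behavior this commit describes; the session shape is a placeholder, not the real ADS/Jupyter API:

```typescript
// Illustrative sketch: a failed kernel change attempts to restore the old
// kernel, and secondary failures are swallowed so callers always resolve.
interface SessionLike {
    changeKernel(kernelName: string): Promise<void>;
}

async function changeKernelSafely(
    session: SessionLike,
    newKernel: string,
    oldKernel: string | undefined
): Promise<void> {
    try {
        await session.changeKernel(newKernel);
    } catch {
        if (oldKernel) {
            try {
                await session.changeKernel(oldKernel); // fall back to old kernel
            } catch {
                // ignore - never leave the caller with a rejected promise
            }
        }
    }
}
```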
Matt Irvine
7973f0f178 Disable custom title bar popup on Linux (#4478) 2019-03-14 09:28:19 -07:00
Chris LaFreniere
11f0ca371b Show running state in cell toolbar immediately when cell is run (#4484) 2019-03-13 20:04:46 -07:00
Kevin Cunnane
0131746919 Fix #4452 Notebook is reloaded with wrong kernel (#4480)
* Fix #4452 Notebook is reloaded with wrong kernel
- Await all extension registration before getting providers and providerId
To do this, we need to await way up in the NotebookInput, and promise the model that we'll have values eventually
2019-03-13 20:03:01 -07:00
Chris LaFreniere
5774b1f69a remove ugly border around text cells (#4481) 2019-03-13 19:24:39 -07:00
Cory Rivera
34d36c1de1 Prompt for Python installation after choosing a Jupyter kernel in notebook (#4453) 2019-03-13 18:44:54 -07:00
Chris LaFreniere
cca84e6455 Fix disappearing notebook table (#4466) 2019-03-13 18:32:31 -07:00
Matt Irvine
82ce1ace28 Put back code to refresh tree when capabilities provider gets new capabilities (#4475) 2019-03-13 17:41:29 -07:00
Anthony Dresser
e59d5a766f Enable similarity and locker (#4435)
* turn on similarity

* turn on locker
2019-03-13 16:51:01 -07:00
Anthony Dresser
98a8103f5a fix spacing in yml (#4436) 2019-03-13 16:49:26 -07:00
Alan Ren
81a8593eb6 bump up the extension version for new release (#4456)
* bump up the extension version for new release

* change url for dacpac ext
2019-03-13 15:46:03 -07:00
Anthony Dresser
b6584c9ddf Change shutdown listeners (#4282)
* change shutdown to use proper notification

* change to use storage service

* remove unused imports

* fix test

* change shutdown methods to private

* remove unusde imports

* fix tests

* formatting
2019-03-13 15:15:51 -07:00
Raj
08d4cc9690 #4441:Remove notebook editor - don't save selected (#4449) 2019-03-13 14:25:45 -07:00
kisantia
0565162fde remove broken link from dacpac extension readme (#4448) 2019-03-13 14:03:32 -07:00
Charles Gagnon
31614247cf #3127 #478 Fix 2 Edit Data grid issues (#4429)
* 2 fixes :

1. Fix it so that edits made to a table with a single column will correctly create a new row when done editing. This is done by caching whether the cell is dirty in the onCellEditEnd callback, and then when onCellSelect is called to switch cells we only return early if the cell isn't dirty. Before, we were returning early and thus not going into any of the code that creates the new row (submitCurrentCellChange)

2. Fix it so we don't throw an error when deleting the NULL (new) row. As noted in the code this should really be handled by not displaying the context menu to begin with for that row but that's a bit more involved of a fix so for now I'm at least making it not display an error to the user - it'll just do nothing.

Manual testing for now - I tried to add tests but the core logic and the UI layer are too intertwined so this will likely need to be a full end-to-end UI test. I'll continue working on adding some for this dialog but for now I'd like to at least submit these fixes since they're pretty painful to deal with.

* Undo changes to addRow

* Clean up resetCurrentCell and add comment to setCurrentCell

* Use reset method instead of directly initializing currentCell

* Add issue # to comment
2019-03-13 13:52:35 -07:00
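A minimal sketch of fix 1: cache whether the current cell is dirty when editing ends, and only return early from cell selection when it is clean. All names are placeholders for the real grid callbacks:

```typescript
// Illustrative sketch of the dirty-cell caching described in fix 1.
class EditDataGridSketch {
    private currentCellDirty = false;

    onCellEditEnd(isDirty: boolean): void {
        this.currentCellDirty = isDirty; // remember dirtiness for the next select
    }

    onCellSelect(): void {
        if (!this.currentCellDirty) {
            return; // clean cell: safe to return early
        }
        this.submitCurrentCellChange(); // this is what creates the new row
        this.currentCellDirty = false;
    }

    private submitCurrentCellChange(): void {
        // commit the pending cell edit here
    }
}
```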
Karl Burtram
e9390dcd48 Update "Preview features" notification text (#4444) 2019-03-13 13:50:42 -07:00
Gene Lee
c9b3e2b156 Added 'serverMajorVersion' key to context, which is currently missing and caused a failure loading the 'Create External Table' menu (#4423) 2019-03-13 13:32:04 -08:00
Anthony Dresser
ace6012c1c Grid scrolling bugs (#4396)
* better maintenance of heights in scrollable splitview

* formatting

* fix issue around views resizing messing with render logic

* remove commented code
2019-03-13 11:51:47 -07:00
Chris LaFreniere
17901fbf3d Fix Issue when OE connection Disconnects on Notebook Close (#4425)
* Fix OE connection closing when notebook closes

* handle connections created through Add new connection
2019-03-13 11:32:12 -07:00
udeeshagautam
2c7b5578e7 Bug/3731 results scroll (#4408)
* Bug 3731: Minor issues saving scroll position in results

Fixing the second part, i.e. the scroll position of the results view while switching tabs. The resize call was not respecting the old scrollbar state. Making an explicit call to set the scroll position to ensure it is set at the end.
2019-03-13 11:03:41 -07:00
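A one-function sketch of the fix: reapply the saved scroll position explicitly after the resize call, since resize does not respect the old scrollbar state. Names are illustrative:

```typescript
// Illustrative sketch: restore the saved scroll position after resizing.
function layoutResults(viewport: HTMLElement, savedScrollTop: number, resize: () => void): void {
    resize();                            // resize resets the scrollbar state
    viewport.scrollTop = savedScrollTop; // explicitly set it back at the end
}
```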
Chris LaFreniere
b2b7d18802 Change Kernels Dropdown to show Switching instead of Loading On Kernel Change (#4426)
Change Kernels Dropdown to show Changing instead of Loading On Kernel Change
2019-03-13 09:59:19 -07:00
Geoff Young
965da0535a Fix sqlDropColumn description (#4422) 2019-03-13 01:06:16 -07:00
Raj
ebd187ec06 #4363: Reopen notebook editors when ADS launched (#4424)
* #4363: Reopen notebook editors when ADS launched

* Code review changes
2019-03-12 23:02:37 -07:00
Kevin Cunnane
b495fb7a37 Add notebook file icon support (#4419)
- Added to both built-in themes. Long term would be good to contribute this back to Seti so it works without a carbon edit tag.
- Issue #4418 tracks remaining problem where for new notebooks it's not set initially. This is blocked by Raj's work so will update once that lands.
2019-03-12 21:23:43 -07:00
David Shiflet
58e5125cde Add scripted object name to query editor tab display (#4403)
* Add scripted object name to query editor tab display

* Update src/sql/parts/query/common/queryInput.ts

Co-Authored-By: shueybubbles <shueybubbles@hotmail.com>
2019-03-12 22:07:11 -04:00
Anthony Dresser
dadfbd2ddb add check to remove errors (#4412) 2019-03-12 15:32:41 -07:00
Charles Gagnon
9fdeec6128 #3224 Fix extra connections populating MRU List (#4368)
* Modify where we add active connections so that we can use the saveTheConnection option to decide whether to add a connection to the MRU. This was necessary because the old location was called from onConnectionComplete which is sent by the SqlToolsService - and that doesn't get any UI related information like the options. The new location is still only called after the connection completes and will be added only if the connection succeeds.

Added new test and updated existing tests to handle new logic (plus a bit of async-refactoring).

* Fix couple spacing issues

* Add logic back in to short-circuit if we already have the connection in the active connections list.

* Fix spaces -> tabs
2019-03-12 14:05:50 -07:00
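A minimal sketch of the described flow: the MRU is updated only after a successful connection, and only when the UI's saveTheConnection option asked for it. Names are assumptions:

```typescript
// Illustrative sketch of gating MRU updates on the saveTheConnection option.
interface ConnectionUiOptions {
    saveTheConnection: boolean;
}

function onConnectionSucceeded(
    profile: object,
    options: ConnectionUiOptions,
    addToMru: (profile: object) => void
): void {
    // called only after the connection completes successfully
    if (options.saveTheConnection) {
        addToMru(profile);
    }
}
```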
Matt Irvine
e783aeab66 Use max column width when auto-sizing columns (#4394) 2019-03-12 14:03:37 -07:00
Karl Burtram
53a94cc7bb Revert SQL Tools Service to 1.5.0-alpha.71 (#4413) 2019-03-12 13:38:44 -07:00
Anthony Dresser
839a9b6cb8 move event to better understand when things are happening (#4393) 2019-03-12 13:19:08 -07:00
Anthony Dresser
2557d77ae3 readd row height; add font styles to message panel as well (#4388) 2019-03-12 13:18:25 -07:00
Raj
6a7df2f1ae Adding back save api (#4407)
* #4339: Kernel change event occurs after model load

* #4347: Code cleanup - Notebooks Save

* Remove save method from sqlops

* Adding save method to api's

* Adding save method to ext host

* Misc change
2019-03-12 13:07:10 -07:00
Maddy
2db83b3892 Added localize for the warning string (#4411) 2019-03-12 13:06:16 -07:00
Yurong He
08c7cc0918 Fixed #4384 add await on disconnect (#4389)
* Fixed #4384 add await on disconnect

* Resolve PR comment
2019-03-12 13:06:02 -07:00
Kevin Cunnane
d555dcb6d7 Fix minor error in snippet (#4398)
- Found a snippet missing a `,`
2019-03-12 12:14:24 -07:00
Kevin Cunnane
7226f25c67 Fix #4356 New Notebook from connection doesn't connect (#4364)
* Fix #4356 New Notebook from connection doesn't connect
Fix new notebook error by passing profile instead of ID.
- I could've just sent the ID over, but this fix sets the stage for disconnected connections to work (since we have enough info to properly connect).
- There's a bug in NotebookModel blocking the disconnected connection part working, but Yurong's in progress fixes will unblock this. Hence checking in as-is and working to properly unblock once that's in.

* Support connection profile in commandline service
- Added new context API for things that want to work on commandline and object explorer
- Refactored commandlineservice slightly to be async & have a simpler execution flow (far fewer if/else statements)

* Fix unit tests
- Fixed 2 issues raised by tests (shouldn't do new query if no profile passed, shouldn't error on new query failing)
- Updated unit tests to pass as expected given changes to the APIs.
2019-03-12 12:14:08 -07:00
Maddy
77a3be6fd7 pulling max bytes of data through the webhdfs api (#4314)
* pulling max bytes of data through the webhdfs api

* Added warning message for the Users to inform the data truncation.

* Updated the warning message on the notification flyer.
2019-03-12 10:38:34 -07:00
Raj
2397df7f22 #4339: Kernel change event occurs after model load (#4366) 2019-03-12 08:21:00 -07:00
Yurong He
118d2c7273 Fix #4047 Redesign notebook model to handle single client session (#4371)
* Start single client session based on the default kernel or saved kernel in NB.

* Added kernel displayName to standardKernel.
Modified name to align with Jupyter Kernel.name.
So we can show the displayName during startup and use the name to start the session.

* Change session.OnSessionReady event in KernelDropDown

* Added model.KernelChanged for switching kernel in the same provider

* Fixed session.Ready sequence

* Fixed merge issues

* Solved merge issue

* Fixed wrong kernel name in saved NB

* Added new event in Model to notify kernel change.
Toolbar depends on ModelReady to load

* Change attachTo to wait for ModelReady like KernelDropDown

* sanitizeSavedKernelInfo to fix invalid kernel and display_name. For example: PySpark1111 and PySpark 1111

* Added _contextsChangingEmitter to change loadContext msg when changing kernel

* Resolve PR comments
2019-03-11 17:59:13 -07:00
David Shiflet
b44d2b1bb3 Re-enabled command line service tests (#4387) 2019-03-11 19:31:11 -04:00
Chris LaFreniere
9867d88067 check for changeRef not destroyed before detecting changes (#4385) 2019-03-11 14:23:37 -07:00
Chris LaFreniere
037c49e2c6 Warning for table max rows displayed (#4357)
* Warning for top rows

* change single to double quotes when calling localize method
2019-03-08 19:01:21 -08:00
Chris LaFreniere
1e989060f9 Rewrite Spark UI link when using unified connection (#4362)
* Rewrite Spark UI link when using unified connection

* Add more robust error checking
2019-03-08 17:34:47 -08:00
Raj
4d6271c161 'Confirm save' implementation while closing untitled/existing notebooks (#4349)
* #4326: 'Confirm save' while closing both notebook

* Adding comment
2019-03-08 13:28:15 -08:00
Kevin Cunnane
c3900f6984 Notebooks: fix AttachTo showed only Localhost (#4354)
The Attach To was showing only localhost for SQL, since we overrode the standard kernels from SQL with the ones from Jupyter.
Fix is to save all standard kernels.
Also, added dispose handling for some events I found during debugging and removed unused imports
2019-03-08 13:27:14 -08:00
Gene Lee
aa1a036f66 Fixed bug: tree in extension does not show icon (#4348) 2019-03-08 12:34:13 -08:00
Karl Burtram
26274e6c5d Fix connection dialog Saved Connections refresh timing (#4346) 2019-03-08 11:45:11 -08:00
kisantia
bcfbe5a284 fix flatfile and dacfx wizard not defaulting to selected connection when launched from command palette (#4344) 2019-03-08 09:19:03 -08:00
Charles Gagnon
496243fbc7 Fix extra spacing in the file search view (#4343)
Recent changes in the VS Code layout for that viewlet caused the min-height property to affect the display (the property was set on that class before but the layout prevented it from actually being displayed as it was).
I removed the style for the messages class completely since I couldn't find a place where we actually used it ourselves - and anyway, having styling on such a common name as messages isn't really good practice since it applies to everything in ADS.
2019-03-08 07:32:04 -08:00
Raj
036ffe595a #3920: Notebooks file save/save all/cache - for existing files (#4286)
* #3920: Notebooks file save

* Missed in merge

* #4290: Untitled save and native dirty implementation

* Misc changes

* Content Manager, notebooks extension and commented failed unit tests

* Removing modelLoaded event
2019-03-07 18:07:20 -08:00
Maddy
2a903e9f03 handle non ascii characters hdfs filename (#4340)
* encoding the URL so that special characters don't make the request a bad request.
2019-03-07 16:21:03 -08:00
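A minimal sketch of the encoding fix: encode each path segment so non-ASCII file names survive the WebHDFS request. The helper name is illustrative:

```typescript
// Illustrative sketch: percent-encode each HDFS path segment.
function encodeHdfsPath(hdfsPath: string): string {
    return hdfsPath
        .split('/')
        .map(segment => encodeURIComponent(segment))
        .join('/');
}

// e.g. encodeHdfsPath('/data/días.csv') === '/data/d%C3%ADas.csv'
```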
Gene Lee
36e5bbb752 added 'fireOnTextChange' field to azdata.proposed.d.ts (#4341) 2019-03-07 16:03:50 -08:00
Anthony Dresser
96a976d826 add new release yml (#4333) 2019-03-07 15:33:13 -08:00
Kevin Cunnane
ff514a0568 Remove Analyze in Notebook from command palette (#4330) 2019-03-07 14:26:42 -08:00
Kevin Cunnane
a4d99b78d5 cluster deploy extension: Add localization support and fix " to ' strings (#4332)
* Add localization support and fix " to ' strings

* Fix ${ usage
2019-03-07 14:26:31 -08:00
Gene Lee
029c69ecd3 Fixed issue: input change on dropdownbox not reflected to 'dropdownbox.… (#4316) 2019-03-07 13:35:10 -08:00
Alan Ren
9e1f04e476 Enforce vscode and ads version check when installing extensions (#4267)
* engine check when install extension

* gallery install/update and vsix install

* FIX COMMENTS

* Fix the detail not loading issue when version is invalid

* add more comments and address PR comments

* add install telemetry for install from vsix scenario

* correct the name of the version property for telemetry
2019-03-07 13:04:50 -08:00
Aditya Bist
c8bde41451 Disable edit step until all steps are loaded (#4327)
* disable edit step until all steps are loaded

* job check
2019-03-07 12:58:58 -08:00
Maddy
e16c01623d Added the new hdfs icon for the web HDFS folder. (#4317)
* Added the new hdfs icon for the web HDFS folder.

* overriding the getNodeInfo() in the ConnectionNode
2019-03-07 00:57:12 -08:00
Anthony Dresser
7de294a58e change sizing behavior to allow the messages to fully collapse down (set results to have no max height) (#4313) 2019-03-06 22:44:49 -08:00
Alan Ren
060343f096 fix a couple of build issues due to merge issues (#4324)
* fix the build error on linux and mac

* one more fix
2019-03-06 21:53:09 -08:00
Alan Ren
addba0d007 remove the modelviewdialog namespace from azdata (#4301) 2019-03-06 21:32:05 -08:00
Alan Ren
68418f2c8f update target environment type page based on latest design (#4311) 2019-03-06 21:21:47 -08:00
Karl Burtram
428dd17d54 Set dashboard DB to master only for MSSQL provider (#4321) 2019-03-06 19:40:46 -08:00
Matt Irvine
7344b41f47 Fix bug where git extension fails in packaged builds (#4318) 2019-03-06 18:56:04 -08:00
kisantia
5f003b0dd7 Move DacFx wizard into separate extension (#4115)
* Moved dacfx wizard into separate extension

* updating to use azdata

* one more azdata change

* bump import extension version

* renaming extension to dacpac
2019-03-06 17:45:30 -08:00
Kevin Cunnane
b8f454b8ac Add request dependency to correct package.json (#4306)
- This was needed in mssql extension, not the main package.json
2019-03-06 14:33:07 -08:00
Anthony Dresser
b45e03a45a enable classifier (#4296) 2019-03-05 23:49:58 -08:00
Anup N. Kamath
0a268f35bc Fix backup (#4274)
* check to see whether options is available

* removing options state and using _modelOptions

* removed additional space

* making options getter private as not intended to expose
2019-03-05 19:17:42 -08:00
Chris LaFreniere
e3709533b4 Add New Notebook to File Menu (#4287) 2019-03-05 17:42:35 -08:00
Gene Lee
acc8d5f7b2 Added request dependency in mssql extension (#4297) 2019-03-05 17:33:28 -08:00
Anthony Dresser
da06a96630 Add Data Explorer Context and apply to disconnect (#4265)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code

* add node context on data explorer

* formatting

* fix various errors with how the context menus work
2019-03-05 17:09:00 -08:00
Yurong He
7eaf8cfd2f Add check OE node tests (#4273)
* Added verify OE child nodes tests for standalone and BDC instances

* msg changes

* Added standalone OE node test

* Resolved PR comments

* Change env name

* Added scripts to set env for integration test
2019-03-05 14:42:05 -08:00
Cory Rivera
5248c8f78d Convert caught error to string in notebook onLoad error message. (#4276) 2019-03-04 16:49:23 -08:00
Gene Lee
f4365dbd3a Added WebHDFS rewritten to provide correct Error object and localized error messages (#4223) 2019-03-04 15:23:50 -08:00
Karl Burtram
2309b16bd4 Remove watch script and use 'yarn watch' instead (#4277) 2019-03-04 13:47:31 -08:00
Karl Burtram
c65af61a68 Bump ADS to 1.5.1 2019-03-04 13:01:37 -08:00
Yurong He
a48e9bc64c Fixed #4206 and #4207 open and close notebook quickly issue #4240 (#4255)
* Fixed #4206 and #4207

* Check if it makes my PR run tests

* Add isReady to JupyterSessionManager
2019-03-04 12:46:27 -08:00
Karl Burtram
b4984d7f2d Update 'sqlops' to 'azdata' in Notebook Manager (#4275) 2019-03-04 12:45:34 -08:00
Yurong He
1017d62f0d Move sql related code to sqlNotebook folder (#4254)
* Move sql related code to sqlNotebook folder

* Resolve PR comments: rename folder to sql.

* Fixed the import path after rename folder
2019-03-04 09:45:32 -08:00
Yurong He
ebc208cacd Fixed #4181 and #4167 change kernel issue between SQL and Sparks #4238 (#4256)
* Fixed #4181 and #4167

The problems are:
- When changing Kernel, setProviderIdForKernel switches the Kernel first. So when we switch kernel across providers, for example from PySpark3 to SQL, the Kernel is already set to SQL in ClientSession.changeKernel. Then we lost the oldKernel info.
  The fix is to cache the old session in the model before the switch and use it to get the correct context
- The SQL Kernel could make multiple connections from "Add new connection". Since we didn't track them, those connections didn't close properly when closing the notebook
  The fix is saving the connections made from "Add new connection" in the model, and closing them and activeConnection when closing the notebook

Problem not solved yet in this PR:
- Didn't shut down Jupyter when switching kernel from a spark kernel to SQL.
2019-03-01 22:11:45 -08:00
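A minimal sketch of the connection tracking described in the second fix: remember connections made via "Add new connection" and close them, along with the active connection, when the notebook closes. The model shape is a placeholder:

```typescript
// Illustrative sketch: track notebook connections so they can all be closed.
class NotebookModelSketch {
    private trackedConnectionUris: string[] = [];
    private activeConnectionUri: string | undefined;

    addConnection(connectionUri: string): void {
        this.trackedConnectionUris.push(connectionUri); // from "Add new connection"
        this.activeConnectionUri = connectionUri;
    }

    async dispose(disconnect: (uri: string) => Promise<void>): Promise<void> {
        for (const uri of this.trackedConnectionUris) {
            await disconnect(uri); // close every tracked connection
        }
        this.trackedConnectionUris = [];
        this.activeConnectionUri = undefined;
    }
}
```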
Anthony Dresser
0236c8e7f8 Data Explorer Disconnect and Error Handling (#4243)
* adding context

* apply extension changes

* shimming disconnect

* add data explorer context menu and add disconnect to it

* clean up shim code; better handle errors

* remove tpromise

* simplify code
2019-03-01 17:47:28 -08:00
Cory Rivera
db8a92f5c2 Update LabeledMenuItemActionItem to match new vscode behavior. (#4264) 2019-03-01 17:37:12 -08:00
Charles Gagnon
c1e5408492 Change vscode folder name to azuredatastudio 2019-03-01 14:32:42 -08:00
Karl Burtram
84890eb1b4 Update product references from 'sqlops' to 'azdata' (#4259)
* Update extensions to use azdata

* Switch core code to use azdata
2019-03-01 13:59:37 -08:00
Alan Ren
220685a522 fix the undefined error when uninstalling extension (#4258) 2019-03-01 13:58:00 -08:00
Karl Burtram
8ebf5dbcb4 Add azdata.d.ts for new extensibility APIs (#4247)
* Add azdata.d.ts for new extensibility APIs

* Update azdata typing files for connection API proposal

* Add implementation for azdata module

* Fix build break in agent
2019-03-01 11:58:32 -08:00
Alan Ren
dad807d62d select cluster page and status update for tool when installing (#4251) 2019-03-01 11:12:57 -08:00
Karl Burtram
8e52ffa30e Fix copywrite headers in notebook extension (#4253) 2019-03-01 10:34:26 -08:00
Raj
18970ff0b9 #4225: Markdown content disappears when edit (#4249) 2019-02-28 19:18:40 -08:00
Cory Rivera
630698459b Update output channel name for Jupyter Notebooks. (#4246) 2019-02-28 16:45:56 -08:00
Karl Burtram
f8e854a087 Turn-on auto-size columns by default (#4241) 2019-02-28 15:24:38 -08:00
Karl Burtram
3b2274b0aa Update Azure resource explorer section title (#4237) 2019-02-28 15:22:40 -08:00
Ronald Quan
0d1ebce1a1 Ron/bdc script (#4221)
* WIP adding scripting support.

* Adding deploy command along with additional env vars needed.

* Adding script generation that sets envars, kube context, and mssqlctl

* Adding test email for docker email envar until we update UI.

* Adding cluster platform detection and disabling generate script after first click.

* Fix spacing and adding comment.
2019-02-28 14:26:50 -08:00
Alan Ren
70d86ce9a2 bump the version of import extension for a new package (#4244) 2019-02-28 13:41:01 -08:00
Cory Rivera
291f591af3 Use upper case PATH for jupyter environment variables. (#4222) 2019-02-27 14:21:34 -08:00
Cory Rivera
5a48c52a95 Delete duplicate path variables when setting up Jupyter environment config. Also added additional error message info on jupyter start. (#4212) 2019-02-27 13:30:27 -08:00
Raj
5625ef956d mkdir under notebook extension folder (#4214) 2019-02-27 13:29:01 -08:00
PaulRandal
969733ab77 Added VDI_CLIENT_OTHER to the list of ignored waits (#4197)
* Update waits_paul_randal.sql

Added the PREEMPTIVE_OS_FLUSHFILEBUFFERS wait to the ignore list

* Update waits_detail_paul_randal.sql

Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list.

* Update waits_paul_randal.sql

* Update waits_detail_paul_randal.sql

* Update package.json

* Update README.md

* Update CHANGELOG.md
2019-02-27 09:07:41 -08:00
Cory Rivera
12a1ac1a7d Remove leftover merge tags from windows integration test script. (#4213) 2019-02-26 18:02:13 -08:00
Chris LaFreniere
4da322a03f Stop cell content from moving around on hover (#4202) 2019-02-26 17:01:20 -08:00
Anthony Dresser
d5754c00e2 Add bot configs to enable bot features (#4209)
* add bot configs to enable bot features

* update classifier label strings

* initial check-in: 'perform' should be false to double-ensure things are working properly
2019-02-26 16:23:48 -08:00
Anthony Dresser
2e9b5f3a2b Fixes azure sql expansion (#4185)
* fixes azure sql expansion

* remove unneeded changes

* remove unused code
2019-02-26 15:29:02 -08:00
Alan Ren
b11a8e9c0c add placeholder for container username/password input box (#4210)
* add placeholder for container username/password input box

* spacing
2019-02-26 15:19:36 -08:00
Karl Burtram
e37533afbb Add null check to extensionIdentifier to show reload notification (#4208) 2019-02-26 14:18:05 -08:00
Karl Burtram
4a476673ac Update SQL Tools Service to 1.5.0-alpha.73
Update SQL Tools Service to 1.5.0-alpha.73 to pickup the managed batch parser assembly.
2019-02-26 13:35:50 -08:00
Raj
fe5386cc08 display_name undefined error in javascript (#4187) 2019-02-26 13:10:40 -08:00
Alan Ren
8bfb1a9d39 add restore default values button for ports and container settings (#4195)
* add restore default values button for ports and container settings

* change some resource strings
2019-02-26 12:48:35 -08:00
Kevin Cunnane
109aafcbc0 Update vscode-nls in notebook and samples, plus fix samples compilation (#4203)
- Update vscode-nls to 4.0.0 in notebook extension. Should be a fix for the Insiders build failure due to the localization package failing
- Updated samples to remove vscode-nls if not needed, or update it if still needed
- Updated samples using `npm audit fix`
- Fixed compile errors in all the samples
2019-02-26 12:45:45 -08:00
Karl Burtram
e289c06d8d Add no-op debug extensibility APIs (follow-up) (#4192)
* Add no-op debug extensibility APIs

* Remove unneeded SQL EDIT tags
2019-02-26 08:51:55 -08:00
Karl Burtram
78c1c318c5 Add no-op debug extensibility APIs (#4188) 2019-02-26 08:47:10 -08:00
Matt Irvine
07983d75ea Use correct new line character when copying query results (#4170) 2019-02-25 16:39:57 -08:00
Ronald Quan
d0a4a4242d Feature/mssql-big-data-cluster (#4107)
* Adding kubernetes installer.

* Adding variety of kubectl support and integrating into the kubeconfig target cluster page.

* Addressing PR comments, refactored utility file locations and added missing license headers.
2019-02-25 15:09:22 -08:00
Alan Ren
a71be2b193 Alanren/bdc (#4161)
* wip

* target cluster type page

* finish target cluster type page

* remove commented line
2019-02-25 14:04:12 -08:00
Anthony Dresser
779ca13d48 remove builder references from panel (#4149) 2019-02-25 12:43:00 -08:00
Anthony Dresser
f3b0a50db7 remove builder from button (#4146) 2019-02-25 12:42:33 -08:00
Matt Irvine
c831596e02 Fix Windows issues with menu, packaging, and test script (#4144) 2019-02-25 11:06:31 -08:00
Kevin Cunnane
2ae369fbdb Notebook fixes: Fix #4129, fix #4116, Fix #3913, fix empty results error (#4150)
- Fixes #4129 Overlapping command help windows in notebook
  - Do not show parameter hints for inactive cells, to avoid them hanging around when no longer selected
- Fixes #4116 Notebooks: Intellisense Doesn't Work using Add New Connection
  - Move connect/disconnect logic to 1 place (code component) instead of 2
  - Handle the case where you connect after choosing the active cell. We now hook to the event and update the connection (see the sketch after this entry)
- Fix issues in the sql session manager where a result outputs 0 rows. This was failing to show the empty resultset contents, which is a regression vs. the query editor. It also put an unhandled error on the debug console
- Fix #3913 Notebook: words selected in other cells should be unselected on cell change

Note: after the fix, it now looks as follows. Need to do a follow-up to get the correct grid min height

![image](https://user-images.githubusercontent.com/10819925/53280226-9e629580-36cc-11e9-8f40-59cd913caeee.png)
2019-02-25 10:52:07 -08:00
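
For the connect-after-activation case called out above, a rough sketch of the hook-the-event pattern; the class and event names here are hypothetical, not the real code component's API:

```typescript
// Illustrative only: subscribe once to a connection-changed event so a
// connection that arrives after the cell became active still updates it.
import { EventEmitter } from 'events';

class CodeCell {
    private connectionId: string | undefined;

    constructor(private events: EventEmitter) {
        // Hook the event so a late connection still reaches the active cell.
        this.events.on('connectionChanged', (id: string) => {
            this.connectionId = id;
            this.refreshIntellisense();
        });
    }

    private refreshIntellisense(): void {
        console.log(`refreshing completions for connection ${this.connectionId}`);
    }
}
```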
Aditya Bist
f2c9d968a4 fix Object explorer tests (#4135) 2019-02-25 09:51:43 -08:00
Raj
c3f02980a0 Fix #3734 - Code cell content disappears when tabbing between editors (#4153) 2019-02-24 19:31:42 -08:00
Kevin Cunnane
7d5ce7b5d7 Fix #4145 Possible for loading icon to appear with rendered widget (#4147)
- For cached insights, it was going down the checkStorage path which wasn't covered
2019-02-22 18:06:00 -08:00
Cory Rivera
8bb71eeb51 Fix lingering bugs from notebook code merge. (#4143)
* Add request to notebook dependencies.
* Use offline python package for windows installation.
2019-02-22 17:31:24 -08:00
Kevin Cunnane
5a88598811 Fix #3778 intellisense is delayed (#4134)
- Jupyter completion item support was awaiting info before responding. The fix is to check whether this is even a notebook cell first, and only await that info if it is (sketched below)
2019-02-22 16:22:40 -08:00
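
A sketch of the fix pattern this commit describes, with hypothetical helper names: return early for non-notebook documents so only notebook cells pay the await cost.

```typescript
// Fast path for regular editors: nothing is awaited unless the document
// is actually a notebook cell.
async function provideCompletionItems(document: { uri: string }): Promise<string[]> {
    if (!isNotebookCell(document.uri)) {
        return []; // regular editors are never blocked on Jupyter
    }
    const info = await fetchKernelInfo(document.uri); // only notebooks pay this cost
    return info.completions;
}

function isNotebookCell(uri: string): boolean {
    return uri.startsWith('notebook-cell:'); // hypothetical scheme check
}

async function fetchKernelInfo(uri: string): Promise<{ completions: string[] }> {
    return { completions: [] }; // stand-in for the real kernel round trip
}
```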
Raj
da3fbd386d #4132 fix for displayError (#4133) 2019-02-22 16:00:59 -08:00
Chris LaFreniere
1e915aad20 Make "Double-click to edit" in empty md cell a bit nicer looking (#4102) 2019-02-22 16:00:35 -08:00
Kevin Cunnane
889d5e5b28 Add loading spinner for insight widgets while they're in a loading state (#4136)
- This was very confusing without a status indicator: is it complete or in progress?

Please let me know if you see issues with this, e.g. was there a reason we didn't do this until now?

Initial state:

![image](https://user-images.githubusercontent.com/10819925/52883300-4e5d5f00-311f-11e9-988c-f7a70dc0a899.png)

On loading / error:
![image](https://user-images.githubusercontent.com/10819925/52883203-faeb1100-311e-11e9-9757-13a5fd4104c6.png)
2019-02-22 16:00:06 -08:00
Matt Irvine
51cf2df2f8 Reapply changes to publish.ts script (#4138) 2019-02-22 15:52:10 -08:00
Matt Irvine
e690285d9d Undo accidental merge commits (#4137) 2019-02-22 14:52:43 -08:00
Raj Musuku
81d6423f24 Merge branch 'master' of https://github.com/Microsoft/azuredatastudio 2019-02-22 12:58:25 -08:00
Alan Ren
046dee7389 add link area support for text component (#4103)
* add link area support for text component

* add comment for localizable resources

* address comments
2019-02-22 12:53:09 -08:00
Raj Musuku
a33a24dddf Merge branch 'master' of https://github.com/Microsoft/azuredatastudio 2019-02-22 08:45:17 -08:00
Anthony Dresser
636bdbd12c copycat bot yml (#4125) 2019-02-21 22:51:13 -08:00
Raj Musuku
0ab3492afd Merge from master 2019-02-21 18:07:18 -08:00
Raj Musuku
666ae11639 Merge from master 2019-02-21 17:56:04 -08:00
Matt Irvine
826856c390 Merge VS Code 1.30.1 (#4092) 2019-02-21 17:17:23 -08:00
Chris LaFreniere
a764a481f3 Ensure that we preserve rest of PATH when starting Jupyter (#4109) 2019-02-21 11:54:08 -10:00
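
A minimal sketch of the preserve-the-rest-of-PATH idea, assuming the Jupyter bin directory is simply prepended to the inherited value (illustrative only, not the extension's code):

```typescript
// Prepend the Jupyter install directory to PATH instead of replacing PATH
// outright, so the rest of the user's environment keeps working.
import * as path from 'path';

function buildJupyterEnv(jupyterBinDir: string): NodeJS.ProcessEnv {
    const env = { ...process.env };
    const existing = env.PATH ?? '';
    env.PATH = existing
        ? jupyterBinDir + path.delimiter + existing
        : jupyterBinDir;
    return env;
}
```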
Anthony Dresser
e345090015 Data Explorer Sourcing (#4033)
* added initial data explorer viewlet

* added dataexplorer contribution point

* removed test view

* remove unused imports

* initial data source, needs work

* add shim for ext host

* formatting

* making the necessary changes to use OE for tree view; need to look at TreeUpdateUtils.connectAndCreateOeSession

* formatting

* shimming oe more

* update to add correct context

* working cross provider; need to fix connection

* connection works but it adds the connection to the oe for some reason

* formatting

* add custom connection dialog code path

* hashing between trees

* added flag and tests

* add id maps to handle multiple nodepaths

* add necessary car edit parts

* keep current behavior in prod

* fix tests

* address comments

* update comments to be more informative

* finish merge

* update comments

* fix whitespace
2019-02-21 13:47:59 -08:00
Aditya Bist
cc97198fe4 Agent improvements and fixes (#4077)
* fixed new/edit job break when no start step given

* bump agent version

* agent version bump
2019-02-20 15:56:54 -08:00
Aditya Bist
5a146e34fa fixed scrolling for connection viewlet (#4073)
* fixed scrolling for connection viewlet

* removed horizontal scrolling
2019-02-20 13:51:56 -08:00
Cory Rivera
70838c3e24 Move SQL 2019 extension's notebook code into Azure Data Studio (#4090) 2019-02-20 10:55:49 -08:00
Chris LaFreniere
2dd71cbe26 Change SQL kernel to check queryManagementService instead of hardcoding (#4098)
* Change SQL kernel to check queryManagementService instead of hardcoding

* addressing PR comments
2019-02-19 15:55:11 -10:00
Chris LaFreniere
32c013a72c Add total execution time message for SQL notebooks (#4093)
* Display total execution time for sql notebooks

* remove handlebatchstart since it was unnecessary
2019-02-19 15:43:16 -10:00
Chris LaFreniere
db6e1ae558 Merge branch 'master' of https://github.com/microsoft/azuredatastudio 2019-02-19 17:38:10 -08:00
Chris LaFreniere
4117da6e93 undo remove sql kernel setting 2019-02-19 17:38:00 -08:00
Chris LaFreniere
02b1ba03ed remove sql kernel setting 2019-02-19 17:35:33 -08:00
Kevin Cunnane
1f501f4553 Improve cell language detection and add support for language magics (#4081)
* Move to using notebook language by default, with override in cell
* Update cell language on kernel change
* Tweak language logic so that it prefers the CodeMirror mode, then falls back, since this was failing for some notebooks
* Add new package.json contribution to define language magics. These result in cell language changing. Language is cleared out on removing the language magic
* Added support for executing Python, R and Java in the SQL Kernel to prove this out. It converts to the sp_execute_external_script format (a sketch of the conversion follows below)

TODO in future PR:

* Need to hook up completion item support for magics (issue #4078)
* Should add indicator at the bottom of a cell when an alternate language has been detected (issue #4079)
* On executing Python, R or Java, should add some output showing the generated code (issue #4080)
2019-02-19 17:05:56 -08:00
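
One plausible shape of the magic-to-sp_execute_external_script conversion mentioned above; the regex, escaping, and function name are illustrative simplifications, not the shipped implementation:

```typescript
// Convert a cell that starts with a language magic (%%python, %%r, %%java)
// into an sp_execute_external_script call. Escaping is simplified here.
function convertMagicCell(cellSource: string): string {
    const match = cellSource.match(/^%%(python|r|java)\s*\n([\s\S]*)$/i);
    if (!match) {
        return cellSource; // no magic: run as plain T-SQL
    }
    const language = match[1][0].toUpperCase() + match[1].slice(1).toLowerCase();
    const script = match[2].replace(/'/g, "''"); // escape single quotes for T-SQL
    return `EXEC sp_execute_external_script @language = N'${language}', @script = N'${script}'`;
}

// convertMagicCell("%%python\nprint(1)") produces
// EXEC sp_execute_external_script @language = N'Python', @script = N'print(1)'
```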
Chris LaFreniere
0205d0afb5 Change default max table rows returned in notebook to 5000, make it user configurable (#4084) 2019-02-19 15:01:47 -10:00
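
A minimal sketch of a user-configurable row cap with a 5000 default; the `notebook.maxTableRows` key and the reader interface shown here are hypothetical:

```typescript
// Read the configured cap, falling back to 5000 when unset or invalid.
interface ConfigReader {
    get<T>(key: string): T | undefined;
}

function getMaxTableRows(config: ConfigReader): number {
    const configured = config.get<number>('notebook.maxTableRows'); // hypothetical key
    return configured !== undefined && configured > 0 ? configured : 5000;
}
```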
Anthony Dresser
ccf9bf4613 change rendering in panel to fix event handlers (#4082) 2019-02-19 12:48:07 -08:00
Anthony Dresser
d4704e39ac Another code layering (#4037)
* working on formatting

* fixed basic lint errors; started moving things to their appropriate location

* formatting

* update tslint to match the version of vscode we have

* remove unused code

* work in progress fixing layering

* formatting

* moved connection management service to platform

* formatting

* add missing file

* moving more services

* formatting

* moving more services

* formatting

* wip

* moving more services

* formatting

* move css file

* add missing svgs

* moved the rest of services

* formatting

* changing around some references

* formatting

* revert tslint

* revert some changes that break things

* formatting

* fix tests

* fix tests

* fix tests

* fix tests

* fix compile issue
2019-02-19 12:11:54 -08:00
Chris LaFreniere
4a82abc19b Notebooks: Greatly Reduce Time to Generate HTML Table String (#4086)
* Greatly reduce time to generate html table string

* change outer tag to table instead of html

* address PR feedback for more descriptive variable name
2019-02-19 09:03:58 -10:00
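
One common way to get this kind of speedup is to avoid repeated string concatenation; a sketch of the accumulate-then-join pattern (illustrative, not necessarily the shipped renderer):

```typescript
// Accumulate <tr>/<td> fragments in an array and join once at the end,
// instead of growing a single string, which can degrade badly for large
// result sets in some engines. Outer tag is <table>, as the commit notes.
function renderTable(rows: string[][]): string {
    const parts: string[] = ['<table>'];
    for (const row of rows) {
        parts.push('<tr>');
        for (const cell of row) {
            parts.push(`<td>${cell}</td>`);
        }
        parts.push('</tr>');
    }
    parts.push('</table>');
    return parts.join('');
}
```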
Chris LaFreniere
3ae32ab0a0 Fix the Attempt to use a destroyed view Errors (#4087) 2019-02-19 08:28:55 -10:00
Alan Ren
1cc6a108a7 Deprecate the modelviewdialog namespace (#4075)
* Deprecate the modelviewdialog namespace

* use @deprecated tag
2019-02-19 09:43:37 -08:00
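
For reference, a sketch of how the `@deprecated` JSDoc tag is typically applied; the member shown is illustrative, not the real API surface:

```typescript
// Marking an API as deprecated: editors strike through usages and surface
// the message, while the call still works.
namespace modelviewdialog {
    /**
     * @deprecated Use the replacement API instead.
     */
    export function createDialog(title: string): { title: string } {
        return { title };
    }
}
```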
Chris LaFreniere
d4ffe53dbd Add database name to attach to (if not connected to master) (#4076) 2019-02-15 16:50:14 -10:00
Yurong He
f43b3508e9 Added SQL notebook IntelliSense (#4064)
* Added SQL notebook intelliSense

* Resolved PR comments.

* catch disconnect error
2019-02-15 15:53:06 -08:00
Chris LaFreniere
ea0c326d3d Allow code coverage command to succeed again (#4054) 2019-02-15 12:07:52 -10:00
Kevin Cunnane
4843480fbf Update server reports extension version, fix its build breaks, and reduce its size to 86Kb (#4062)
* Updated version
* On building, found it had a build break due to unused imports. Turns out none of the code is used (it's just a package.json + SQL file extension), so removed it all
* Removed all unnecessary node module imports, reducing size from 5.4MB to 86KB. We should probably do this for all extensions
2019-02-15 10:33:22 -08:00
Chris LaFreniere
930e14e258 Add bottom margin to notebook table, fix python highlighting (#4055) 2019-02-15 08:32:27 -10:00
David Shiflet
87bbb41fb6 window reuse for connections (#4049)
* window reuse for connections

* space after colon

* use undefined instead of null
2019-02-15 09:44:06 -05:00
Kevin Cunnane
e767f68f89 Move New Notebook to the connection node in MSSQL server OE connections (#4053) 2019-02-14 15:47:36 -10:00
Raj
b4d304c21e 'Attach to' with Spark kernel resets to sql connection on cancelling connection dialog (#4024)
* Sql connection resets to Select Connection on cancelling dialog

* Hiding the error message when cancelling the connection dialog
2019-02-14 16:39:23 -08:00
Chris LaFreniere
db1f412dae show errors and messages in output (#4031) 2019-02-13 14:54:51 -10:00
PaulRandal
a73f5e83ec Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list in waits script (#4030)
* Update waits_paul_randal.sql

Added the PREEMPTIVE_OS_FLUSHFILEBUFFERS wait to the list to ignore

* Update waits_detail_paul_randal.sql

Add PREEMPTIVE_OS_FLUSHFILEBUFFERS to ignore list.
2019-02-13 13:21:17 -08:00
Karl Burtram
49b2e98a8f Update readme and changelog for Feb release (#4025) 2019-02-13 09:33:40 -08:00
Alan Ren
b3a16fd0ce Feature/bdc create (#4012)
* initial checkin

* rename

* wizard pages

* target cluster radio button group

* resource strings

* existing cluster picker

* revert changes to unwanted file

* revert unwanted changes-2

* update cluster icon

* settings page

* fix group container

* hyperlink component

* address review comments

* comments part 2
2019-02-12 22:13:30 -08:00
Karl Burtram
dd6735ec04 Bump Azure Data Studio to 1.5.0 2019-02-12 21:10:55 -08:00
Chris LaFreniere
9ebf1436d2 Ensure we always switch to a kernel that exists in the session manager (#4015)
* Ensure we always switch to a kernel that exists in the session manager

* PR feedback, create new helper method
2019-02-12 12:46:58 -10:00
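
A sketch of the guard this commit describes: only switch to a kernel the session manager actually reports, otherwise fall back to its default (types here are hypothetical):

```typescript
// Never switch to a kernel the session manager doesn't know about.
interface KernelSpecs {
    defaultKernel: string;
    kernels: string[];
}

function resolveKernel(requested: string, specs: KernelSpecs): string {
    return specs.kernels.includes(requested) ? requested : specs.defaultKernel;
}

// resolveKernel('PySpark', { defaultKernel: 'SQL', kernels: ['SQL', 'Python 3'] })
// returns 'SQL'.
```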
Kevin Cunnane
0aa71b5237 Fix kernel name check bug, double-event hooking, and other Notebook issues (#4016)
* Fix error where kernel name was compared to itself. This doesn't break anything right now since we happen to have special handling of Python3, and for other kernels they share the same set of supported providers (which is what the check is used for). However, it needs fixing for the next release.
* Fix console error due to queryTextEditor trying to access model before it's ready
* Fix issues where the notebook model hooked to session events multiple times (see the sketch below)
* Removed calls that weren't needed such as loadActiveContexts (if undefined, does nothing) and passing connection to initialize method
2019-02-12 14:26:22 -08:00
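
For the double-hooking item above, a sketch of the dispose-before-rehook pattern, with hypothetical types (not the notebook model's real code):

```typescript
// Keep the previous subscription's disposable and dispose it before
// attaching to a new session, so each event fires exactly one handler.
interface Disposable { dispose(): void; }

class NotebookModel {
    private sessionListener: Disposable | undefined;

    attachToSession(onEvent: (handler: () => void) => Disposable): void {
        this.sessionListener?.dispose(); // drop the old hook first
        this.sessionListener = onEvent(() => {
            // handle the session event exactly once per firing
        });
    }
}
```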
Alan Ren
7ef2d52efd add admin pack to recommended ext list (#4019) 2019-02-12 13:00:31 -08:00
Karl Burtram
a5c6dfe62b Bump agent and import extension versions (#4018) 2019-02-12 12:53:52 -08:00
Alan Ren
4a606e0cb2 Alanren/admin pack (#4014)
* admin pack

* formatting and typo fixes

* remove changelog.md
2019-02-12 12:04:19 -08:00
Chris LaFreniere
27370c655d fix left table border to be dotted, no longer have table border on div (#4010) 2019-02-12 08:49:44 -10:00
13039 changed files with 315639 additions and 395325 deletions


@@ -1,4 +1,4 @@
# EditorConfig is awesome: http://EditorConfig.org
# EditorConfig is awesome: https://EditorConfig.org
# top-most EditorConfig file
root = true
@@ -6,7 +6,6 @@ root = true
# Tab indentation
[*]
indent_style = tab
indent_size = 4
trim_trailing_whitespace = true
# The indent size used in the `package.json` file cannot be changed


@@ -1,19 +0,0 @@
{
"env": {
"node": true,
"es6": true
},
"rules": {
"no-console": 0,
"no-cond-assign": 0,
"no-unused-vars": 1,
"no-extra-semi": "warn",
"semi": "warn"
},
"extends": "eslint:recommended",
"parserOptions": {
"ecmaFeatures": {
"experimentalObjectRestSpread": true
}
}
}

20
.eslintrc.json Normal file

@@ -0,0 +1,20 @@
{
"root": true,
"env": {
"node": true,
"es6": true
},
"rules": {
"no-console": 0,
"no-cond-assign": 0,
"no-unused-vars": 1,
"no-extra-semi": "warn",
"semi": "warn"
},
"extends": "eslint:recommended",
"parserOptions": {
"ecmaFeatures": {
"experimentalObjectRestSpread": true
}
}
}

3
.gitattributes vendored

@@ -6,4 +6,5 @@ ThirdPartyNotices.txt eol=crlf
*.bat eol=crlf
*.cmd eol=crlf
*.ps1 eol=lf
*.sh eol=lf
*.sh eol=lf
*.rtf -text


@@ -2,7 +2,7 @@
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
labels: Bug
assignees: ''
---


@@ -2,7 +2,7 @@
name: Feature request
about: Suggest an idea for this project
title: ''
labels: feature request
labels: Enhancement
assignees: ''
---

49
.github/classifier.yml vendored Normal file

@@ -0,0 +1,49 @@
{
perform: false,
alwaysRequireAssignee: false,
labelsRequiringAssignee: [],
autoAssignees: {
accessibility: [],
acquisition: [],
agent: [],
azure: [],
backup: [],
bcdr: [],
'chart viewer': [],
connection: [],
dacfx: [],
dashboard: [],
'data explorer': [],
documentation: [],
'edit data': [],
export: [],
extensibility: [],
extensionManager: [],
globalization: [],
grid: [],
import: [],
insights: [],
intellisense: [],
localization: [],
'managed instance': [],
notebooks: [],
'object explorer': [],
performance: [],
profiler: [],
'query editor': [],
'query execution': [],
reliability: [],
restore: [],
scripting: [],
'server group': [],
settings: [],
setup: [],
shell: [],
showplan: [],
snippet: [],
sql2019Preview: [],
sqldw: [],
supportability: [],
ux: []
}
}

12
.github/commands.yml vendored Normal file

@@ -0,0 +1,12 @@
{
perform: false,
commands: [
{
type: 'label',
name: 'duplicate',
allowTriggerByBot: true,
action: 'close',
comment: "Thanks for creating this issue! We figured it's covering the same as another one we already have. Thus, we closed this one as a duplicate. You can search for existing issues [here](https://aka.ms/vscodeissuesearch). See also our [issue reporting](https://aka.ms/vscodeissuereporting) guidelines.\n\nHappy Coding!"
}
]
}

5
.github/copycat.yml vendored Normal file

@@ -0,0 +1,5 @@
{
perform: true,
target_owner: 'anthonydresser',
target_repo: 'testissues'
}

6
.github/locker.yml vendored Normal file

@@ -0,0 +1,6 @@
{
daysAfterClose: 45,
daysSinceLastUpdate: 3,
ignoredLabels: [],
perform: true
}

6
.github/needs_more_info.yml vendored Normal file

@@ -0,0 +1,6 @@
{
daysUntilClose: 7,
needsMoreInfoLabel: 'needs more info',
perform: true,
closeComment: "This issue has been closed automatically because it needs more information and has not had recent activity in the last 7 days. If you have more info to help resolve the issue, leave a comment"
}

6
.github/new_release.yml vendored Normal file

@@ -0,0 +1,6 @@
{
newReleaseLabel: 'new-release',
newReleaseColor: '006b75',
daysAfterRelease: 5,
perform: true
}

5
.github/similarity.yml vendored Normal file

@@ -0,0 +1,5 @@
{
perform: true,
whenCreatedByTeam: true,
comment: "Thanks for submitting this issue. Please also check if it is already covered by an existing one, like:\n${potentialDuplicates}"
}

3
.gitignore vendored

@@ -3,6 +3,7 @@ npm-debug.log
Thumbs.db
node_modules/
.build/
extensions/**/dist/
out/
out-build/
out-editor/
@@ -17,4 +18,4 @@ build/node_modules
coverage/
test_data/
test-results/
yarn-error.log
yarn-error.log

2
.nvmrc

@@ -1 +1 @@
8.9.2
10

23
.vscode/cglicenses.schema.json vendored Normal file

@@ -0,0 +1,23 @@
{
"type": "array",
"items": {
"type": "object",
"required": [
"name",
"licenseDetail"
],
"properties": {
"name": {
"type": "string",
"description": "The name of the dependency"
},
"licenseDetail": {
"type": "array",
"description": "The complete license text of the dependency",
"items": {
"type": "string"
}
}
}
}
}

142
.vscode/cgmanifest.schema.json vendored Normal file

@@ -0,0 +1,142 @@
{
"type": "object",
"properties": {
"registrations": {
"type": "array",
"items": {
"type": "object",
"properties": {
"component": {
"oneOf": [
{
"type": "object",
"required": [
"type",
"git"
],
"properties": {
"type": {
"type": "string",
"enum": [
"git"
]
},
"git": {
"type": "object",
"required": [
"name",
"repositoryUrl",
"commitHash"
],
"properties": {
"name": {
"type": "string"
},
"repositoryUrl": {
"type": "string"
},
"commitHash": {
"type": "string"
}
}
}
}
},
{
"type": "object",
"required": [
"type",
"npm"
],
"properties": {
"type": {
"type": "string",
"enum": [
"npm"
]
},
"npm": {
"type": "object",
"required": [
"name",
"version"
],
"properties": {
"name": {
"type": "string"
},
"version": {
"type": "string"
}
}
}
}
},
{
"type": "object",
"required": [
"type",
"other"
],
"properties": {
"type": {
"type": "string",
"enum": [
"other"
]
},
"other": {
"type": "object",
"required": [
"name",
"downloadUrl",
"version"
],
"properties": {
"name": {
"type": "string"
},
"downloadUrl": {
"type": "string"
},
"version": {
"type": "string"
}
}
}
}
}
]
},
"repositoryUrl": {
"type": "string",
"description": "The git url of the component"
},
"version": {
"type": "string",
"description": "The version of the component"
},
"license": {
"type": "string",
"description": "The name of the license"
},
"developmentDependency": {
"type": "boolean",
"description": "This component is inlined in the vscode repo and **is not shipped**."
},
"isOnlyProductionDependency": {
"type": "boolean",
"description": "This component is shipped and **is not inlined in the vscode repo**."
},
"licenseDetail": {
"type": "array",
"items": {
"type": "string"
},
"description": "The license text"
}
}
}
}
}
}


@@ -2,7 +2,7 @@
// See https://go.microsoft.com/fwlink/?LinkId=827846
// for the documentation about the extensions.json format
"recommendations": [
"eg2.tslint",
"ms-vscode.vscode-typescript-tslint-plugin",
"dbaeumer.vscode-eslint",
"msjsdiag.debugger-for-chrome"
]

125
.vscode/launch.json vendored

@@ -9,14 +9,12 @@
"stopOnEntry": true,
"args": [
"hygiene"
],
"cwd": "${workspaceFolder}"
]
},
{
"type": "node",
"request": "attach",
"name": "Attach to Extension Host",
"protocol": "inspector",
"port": 5870,
"restart": true,
"outFiles": [
@@ -24,19 +22,15 @@
]
},
{
"type": "node",
"type": "chrome",
"request": "attach",
"name": "Attach to Shared Process",
"protocol": "inspector",
"port": 5871,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
"port": 9222,
"urlFilter": "*"
},
{
"type": "node",
"request": "attach",
"protocol": "inspector",
"name": "Attach to Search Process",
"port": 5876,
"outFiles": [
@@ -47,7 +41,6 @@
"type": "node",
"request": "attach",
"name": "Attach to CLI Process",
"protocol": "inspector",
"port": 5874,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -57,7 +50,6 @@
"type": "node",
"request": "attach",
"name": "Attach to Main Process",
"protocol": "inspector",
"port": 5875,
"outFiles": [
"${workspaceFolder}/out/**/*.js"
@@ -74,7 +66,8 @@
"request": "launch",
"name": "Launch azuredatastudio",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat"
"runtimeExecutable": "${workspaceFolder}/scripts/sql.bat",
"timeout": 20000
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
@@ -82,15 +75,28 @@
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh"
},
"urlFilter": "*index.html*",
"env": {
"VSCODE_EXTHOST_WILL_SEND_SOCKET": null
},
"breakOnLoad": false,
"urlFilter": "*workbench.html*",
"runtimeArgs": [
"--inspect=5875"
"--inspect=5875",
"--no-cached-data"
],
"skipFiles": [
"**/winjs*.js"
"webRoot": "${workspaceFolder}"
},
{
"type": "node",
"request": "launch",
"name": "Launch ADS (Main Process)",
"runtimeExecutable": "${workspaceFolder}/scripts/sql.sh",
"runtimeArgs": [
"--no-cached-data"
],
"webRoot": "${workspaceFolder}",
"timeout": 45000
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "chrome",
@@ -116,34 +122,6 @@
"webRoot": "${workspaceFolder}",
"timeout": 45000
},
{
"type": "node",
"request": "launch",
"name": "Unit Tests",
"protocol": "inspector",
"program": "${workspaceFolder}/node_modules/mocha/bin/_mocha",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"stopOnEntry": false,
"outputCapture": "std",
"args": [
"--delay",
"--timeout",
"2000"
],
"cwd": "${workspaceFolder}",
"env": {
"ELECTRON_RUN_AS_NODE": "true"
},
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"name": "Launch Built-in Extension",
"type": "extensionHost",
@@ -162,9 +140,60 @@
"env": {
"BUILD_ARTIFACTSTAGINGDIRECTORY": "${workspaceFolder}"
}
}
},
{
"type": "node",
"request": "launch",
"name": "Run Unit Tests",
"program": "${workspaceFolder}/test/electron/index.js",
"runtimeExecutable": "${workspaceFolder}/.build/electron/Azure Data Studio.app/Contents/MacOS/Electron",
"windows": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio.exe"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/.build/electron/azuredatastudio"
},
"outputCapture": "std",
"args": [
"--remote-debugging-port=9222"
],
"cwd": "${workspaceFolder}",
"outFiles": [
"${workspaceFolder}/out/**/*.js"
]
},
{
"type": "chrome",
"request": "launch",
"name": "Run Extension Unit Tests",
"windows": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.bat"
},
"osx": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"linux": {
"runtimeExecutable": "${workspaceFolder}/scripts/test-extensions-unit.sh"
},
"webRoot": "${workspaceFolder}",
"timeout": 45000
},
],
"compounds": [
{
"name": "Debug Unit Tests",
"configurations": [
"Attach to azuredatastudio",
"Run Unit Tests"
]
},
{
"name": "Debug Extension Unit Tests",
"configurations": [
"Attach to Extension Host",
"Run Extension Unit Tests"
]
},
{
"name": "Debug azuredatastudio Main and Renderer",
"configurations": [

26
.vscode/settings.json vendored

@@ -11,7 +11,7 @@
}
},
"files.associations": {
"OSSREADME.json": "jsonc"
"cglicenses.json": "jsonc"
},
"search.exclude": {
"**/node_modules": true,
@@ -22,9 +22,9 @@
"out-vscode/**": true,
"i18n/**": true,
"extensions/**/out/**": true,
"test/smoke/out/**": true
"test/smoke/out/**": true,
"src/vs/base/test/node/uri.test.data.txt": true
},
"tslint.enable": true,
"lcov.path": [
"./.build/coverage/lcov.info",
"./.build/coverage-single/lcov.info"
@@ -43,6 +43,20 @@
"git.ignoreLimitWarning": true,
"emmet.excludeLanguages": [],
"typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.quoteStyle": "single"
}
"typescript.preferences.quoteStyle": "single",
"json.schemas": [
{
"fileMatch": [
"cgmanifest.json"
],
"url": "./.vscode/cgmanifest.schema.json"
},
{
"fileMatch": [
"cglicenses.json"
],
"url": "./.vscode/cglicenses.schema.json"
}
],
"git.ignoreLimitWarning": true
}

40
.vscode/shared.code-snippets vendored Normal file

@@ -0,0 +1,40 @@
{
// Each snippet is defined under a snippet name and has a scope, prefix, body and
// description. The scope defines in which languages the snippet is applicable. The prefix is what is
// used to trigger the snippet and the body will be expanded and inserted. Possible variables are:
// $1, $2 for tab stops, $0 for the final cursor position, and ${1:label}, ${2:another} for placeholders.
// Placeholders with the same ids are connected.
// Example:
"MSFT Copyright Header": {
"scope": "javascript,typescript,css",
"prefix": [
"header",
"stub",
"copyright"
],
"body": [
"/*---------------------------------------------------------------------------------------------",
" * Copyright (c) Microsoft Corporation. All rights reserved.",
" * Licensed under the Source EULA. See License.txt in the project root for license information.",
" *--------------------------------------------------------------------------------------------*/",
"",
"$0"
],
"description": "Insert Copyright Statement"
},
"TS -> Inject Service": {
"scope": "typescript",
"description": "Constructor Injection Pattern",
"prefix": "@inject",
"body": "@$1 private readonly _$2: ${1},$0"
},
"TS -> Event & Emitter": {
"scope": "typescript",
"prefix": "emitter",
"description": "Add emitter and event properties",
"body": [
"private readonly _onDid$1 = new Emitter<$2>();",
"readonly onDid$1: Event<$2> = this._onDid$1.event;"
],
}
}

2
.vscode/tasks.json vendored

@@ -69,4 +69,4 @@
"problemMatcher": []
}
]
}
}


@@ -1,3 +1,3 @@
disturl "https://atom.io/download/electron"
target "2.0.9"
target "3.1.6"
runtime "electron"


@@ -1,5 +1,45 @@
# Change Log
## Version 1.5.1
* Release date: March 18, 2019
* Release status: General Availability
## What's new in this version
* Announcing T-SQL Notebooks
* Announcing PostgreSQL extension
* Announcing SQL Server Dacpac extension
* Resolved [bugs and issues](https://github.com/Microsoft/azuredatastudio/milestone/25?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* GeoffYoung for `Fix sqlDropColumn description #4422`
## Version 1.4.5
* Release date: February 13, 2019
* Release status: General Availability
## What's new in this version
* Added **Admin pack for SQL Server** extension pack to make it easier to install SQL Server admin-related extensions. This includes:
* [SQL Server Agent](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-agent-extension?view=sql-server-2017)
* [SQL Server Profiler](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-profiler-extension?view=sql-server-2017)
* [SQL Server Import](https://docs.microsoft.com/en-us/sql/azure-data-studio/sql-server-import-extension?view=sql-server-2017)
* Added filtering extended event support in Profiler extension
* Added Save as XML feature that can save T-SQL results as XML
* Added Data-Tier Application Wizard improvements
* Added Generate script button
* Added view to give warnings of possible data loss during deployment
* Updates to the [SQL Server 2019 Preview extension](https://docs.microsoft.com/sql/azure-data-studio/sql-server-2019-extension?view=sql-server-ver15)
* Results streaming enabled by default for long running queries
* Resolved [bugs and issues](https://github.com/Microsoft/azuredatastudio/milestone/23?closed=1).
## Contributions and "thank you"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* AlexFsmn for `Added context menu for DBs in explorer view to backup & restore db. #2277`
* sadedil for `Missing feature request: Save as XML #3729`
* gbritton1 for `Removed reference to object explorer #3463`
## Version 1.3.8
* Release date: January 9, 2019
* Release status: General Availability


@@ -18,11 +18,15 @@ File a single issue per problem and feature request.
* Do not enumerate multiple bugs or feature requests in the same issue.
* Do not add your issue as a comment to an existing issue unless it's for the identical input. Many issues look similar, but have different causes.
The more information you can provide, the more likely someone will be successful reproducing the issue and finding a fix.
The more information you can provide, the more likely someone will be successful at reproducing the issue and finding a fix.
The built-in tool for reporting an issue, which you can access by using `Report Issue` in Azure Data Studio's Help menu, can help streamline this process by automatically providing the version of Azure Data Studio, all your installed extensions, and your system info.
Please include the following with each issue.
* Version of Azure Data Studio (formerly SQL Operations Studio).
* Version of Azure Data Studio (formerly SQL Operations Studio)
* Your operating system
> **Tip:** You can easily create an issue using `Report Issues` from Azure Data Studio Help menu.

File diff suppressed because it is too large


@@ -9,13 +9,13 @@ Azure Data Studio is a data management tool that enables you to work with SQL Se
Platform | Link
-- | --
Windows User Installer | https://go.microsoft.com/fwlink/?linkid=2049972
Windows System Installer | https://go.microsoft.com/fwlink/?linkid=2049975
Windows ZIP | https://go.microsoft.com/fwlink/?linkid=2050146
macOS ZIP | https://go.microsoft.com/fwlink/?linkid=2049981
Linux TAR.GZ | https://go.microsoft.com/fwlink/?linkid=2049986
Linux RPM | https://go.microsoft.com/fwlink/?linkid=2049989
Linux DEB | https://go.microsoft.com/fwlink/?linkid=2050157
Windows User Installer | https://go.microsoft.com/fwlink/?linkid=2083322
Windows System Installer | https://go.microsoft.com/fwlink/?linkid=2083323
Windows ZIP | https://go.microsoft.com/fwlink/?linkid=2083324
macOS ZIP | https://go.microsoft.com/fwlink/?linkid=2083325
Linux TAR.GZ | https://go.microsoft.com/fwlink/?linkid=2083424
Linux RPM | https://go.microsoft.com/fwlink/?linkid=2083326
Linux DEB | https://go.microsoft.com/fwlink/?linkid=2083327
Go to our [download page](https://aka.ms/azuredatastudio) for more specific instructions.
@@ -68,6 +68,10 @@ The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.micro
## Contributions and "Thank You"
We would like to thank all our users who raised issues, and in particular the following users who helped contribute fixes:
* GeoffYoung for `Fix sqlDropColumn description #4422`
* AlexFsmn for `Added context menu for DBs in explorer view to backup & restore db. #2277`
* sadedil for `Missing feature request: Save as XML #3729`
* gbritton1 for `Removed reference to object explorer #3463`
* Tarig0 for `Add Routine_Type to CreateStoredProc fixes #3257 (#3286)`
* oltruong for `typo fix #3025'`
* Thomas-S-B for `Removed unnecessary IErrorDetectionStrategy #749`


@@ -17,10 +17,12 @@ expressly granted herein, whether by implication, estoppel or otherwise.
chokidar: https://github.com/paulmillr/chokidar
comment-json: https://github.com/kaelzhang/node-comment-json
core-js: https://github.com/zloirock/core-js
decompress: https://github.com/kevva/decompress
emmet: https://github.com/emmetio/emmet
error-ex: https://github.com/Qix-/node-error-ex
escape-string-regexp: https://github.com/sindresorhus/escape-string-regexp
fast-plist: https://github.com/Microsoft/node-fast-plist
figures: https://github.com/sindresorhus/figures
find-remove: https://www.npmjs.com/package/find-remove
fs-extra: https://github.com/jprichardson/node-fs-extra
gc-signals: https://github.com/Microsoft/node-gc-signals
@@ -41,22 +43,28 @@ expressly granted herein, whether by implication, estoppel or otherwise.
native-keymap: https://github.com/Microsoft/node-native-keymap
native-watchdog: https://github.com/Microsoft/node-native-watchdog
ng2-charts: https://github.com/valor-software/ng2-charts
node-fetch: https://github.com/bitinn/node-fetch
node-pty: https://github.com/Tyriar/node-pty
nsfw: https://github.com/Axosoft/nsfw
pretty-data: https://github.com/vkiryukhin/pretty-data
primeng: https://github.com/primefaces/primeng
process-nextick-args: https://github.com/calvinmetcalf/process-nextick-args
pty.js: https://github.com/chjj/pty.js
reflect-metadata: https://github.com/rbuckton/reflect-metadata
request: https://github.com/request/request
rxjs: https://github.com/ReactiveX/RxJS
semver: https://github.com/npm/node-semver
slickgrid: https://github.com/6pac/SlickGrid
sqltoolsservice: https://github.com/Microsoft/sqltoolsservice
svg.js: https://github.com/svgdotjs/svg.js
systemjs: https://github.com/systemjs/systemjs
temp-write: https://github.com/sindresorhus/temp-write
underscore: https://github.com/jashkenas/underscore
v8-profiler: https://github.com/node-inspector/v8-profiler
vscode: https://github.com/microsoft/vscode
vscode-debugprotocol: https://github.com/Microsoft/vscode-debugadapter-node
vscode-languageclient: https://github.com/Microsoft/vscode-languageserver-node
vscode-nls: https://github.com/Microsoft/vscode-nls
vscode-ripgrep: https://github.com/roblourens/vscode-ripgrep
vscode-textmate: https://github.com/Microsoft/vscode-textmate
winreg: https://github.com/fresc81/node-winreg
@@ -64,10 +72,9 @@ expressly granted herein, whether by implication, estoppel or otherwise.
yauzl: https://github.com/thejoshwolfe/yauzl
zone.js: https://www.npmjs.com/package/zone
Microsoft PROSE SDK: https://microsoft.github.io/prose
%% angular NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License
Copyright (c) 2014-2017 Google, Inc. http://angular.io
@@ -293,6 +300,20 @@ THE SOFTWARE.
=========================================
END OF core-js NOTICES AND INFORMATION
%% decompress NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Kevin Mårtensson <kevinmartensson@gmail.com> (github.com/kevva)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF decompress NOTICES AND INFORMATION
%% emmet NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
@@ -322,32 +343,6 @@ END OF emmet NOTICES AND INFORMATION
=========================================
The MIT License (MIT)
Copyright (c) 2015 JD Ballard
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
=========================================
END OF error-ex NOTICES AND INFORMATION
%% escape-string-regexp NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy
@@ -394,6 +389,20 @@ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEAL
=========================================
END OF fast-plist NOTICES AND INFORMATION
%% figures NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF figures NOTICES AND INFORMATION
%% fs-extra NOTICES AND INFORMATION BEGIN HERE
=========================================
(The MIT License)
@@ -1335,6 +1344,32 @@ SOFTWARE.
=========================================
END OF ng2-charts NOTICES AND INFORMATION
%% node-fetch NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) 2016 David Frank
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
=========================================
END OF node-fetch NOTICES AND INFORMATION
%% node-pty NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
@@ -1409,6 +1444,30 @@ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLI
=========================================
END OF primeng NOTICES AND INFORMATION
%% process-nextick-args NOTICES AND INFORMATION BEGIN HERE
=========================================
# Copyright (c) 2015 Calvin Metcalf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.**
=========================================
END OF process-nextick-args NOTICES AND INFORMATION
%% pty.js NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2012-2015, Christopher Jeffrey (https://github.com/chjj/)
@@ -1493,6 +1552,66 @@ END OF TERMS AND CONDITIONS
=========================================
END OF reflect-metadata NOTICES AND INFORMATION
%% request NOTICES AND INFORMATION BEGIN HERE
=========================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
You must give any other recipients of the Work or Derivative Works a copy of this License; and
You must cause any modified files to carry prominent notices stating that You changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
=========================================
END OF request NOTICES AND INFORMATION
%% rxjs NOTICES AND INFORMATION BEGIN HERE
=========================================
Apache License
@@ -1818,6 +1937,20 @@ ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEAL
=========================================
END OF systemjs NOTICES AND INFORMATION
%% temp-write NOTICES AND INFORMATION BEGIN HERE
=========================================
MIT License
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF temp-write NOTICES AND INFORMATION
%% underscore NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) 2009-2017 Jeremy Ashkenas, DocumentCloud and Investigative
@@ -1920,6 +2053,50 @@ OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWA
=========================================
END OF vscode-debugprotocol NOTICES AND INFORMATION
%% vscode-languageclient NOTICES AND INFORMATION BEGIN HERE
=========================================
Copyright (c) Microsoft Corporation
All rights reserved.
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF vscode-languageclient NOTICES AND INFORMATION
%% vscode-nls NOTICES AND INFORMATION BEGIN HERE
=========================================
The MIT License (MIT)
Copyright (c) Microsoft Corporation
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT
OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
=========================================
END OF vscode-nls NOTICES AND INFORMATION
%% vscode-ripgrep NOTICES AND INFORMATION BEGIN HERE
=========================================
vscode-ripgrep
@@ -2079,3 +2256,187 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
=========================================
END OF zone.js NOTICES AND INFORMATION
%% Microsoft.ProgramSynthesis.Common NOTICES AND INFORMATION BEGIN HERE
=========================================
NOTICES AND INFORMATION
Do Not Translate or Localize
This software incorporates material from third parties. Microsoft makes certain
open source code available at http://3rdpartysource.microsoft.com, or you may
send a check or money order for US $5.00, including the product name, the open
source component name, and version number, to:
Source Code Compliance Team
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052
USA
Notwithstanding any other terms, you may reverse engineer this software to the
extent required to debug changes to any libraries licensed under the GNU Lesser
General Public License.
-------------------------------START OF THIRD-PARTY NOTICES-------------------------------------------
===================================CoreFx (BEGIN)
The MIT License (MIT)
Copyright (c) .NET Foundation and Contributors
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================CoreFx (END)
===================================CoreFxLab (BEGIN)
The MIT License (MIT)
Copyright (c) Microsoft Corporation
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================CoreFxLab (END)
===================================Reactive Extensions (BEGIN)
Copyright (c) .NET Foundation and Contributors
All Rights Reserved
Licensed under the Apache License, Version 2.0 (the "License"); you
may not use this file except in compliance with the License. You may
obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied. See the License for the specific language governing permissions
and limitations under the License.
List of contributors to the Rx libraries
Rx and Ix.NET:
Wes Dyer
Jeffrey van Gogh
Matthew Podwysocki
Bart De Smet
Danny van Velzen
Erik Meijer
Brian Beckman
Aaron Lahman
Georgi Chkodrov
Arthur Watson
Gert Drapers
Mark Shields
Eric Rozell
Rx.js and Ix.js:
Matthew Podwysocki
Jeffrey van Gogh
Bart De Smet
Brian Beckman
Wes Dyer
Erik Meijer
Tx:
Georgi Chkodrov
Bart De Smet
Aaron Lahman
Erik Meijer
Brian Grunkemeyer
Beysim Sezgin
Tiho Tarnavski
Collin Meek
Sajay Anthony
Karen Albrecht
John Allen
Zach Kramer
Rx++ and Ix++:
Aaron Lahman
===================================Reactive Extensions (END)
-------------------------------END OF THIRD-PARTY NOTICES-------------------------------------------
=========================================
END OF Microsoft.ProgramSynthesis.Common NOTICES AND INFORMATION
%% Microsoft.ProgramSynthesis.Detection NOTICES AND INFORMATION BEGIN HERE
=========================================
NOTICES AND INFORMATION
Do Not Translate or Localize
This software incorporates material from third parties. Microsoft makes certain
open source code available at http://3rdpartysource.microsoft.com, or you may
send a check or money order for US $5.00, including the product name, the open
source component name, and version number, to:
Source Code Compliance Team
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052
USA
Notwithstanding any other terms, you may reverse engineer this software to the
extent required to debug changes to any libraries licensed under the GNU Lesser
General Public License.
-------------------------------START OF THIRD-PARTY NOTICES-------------------------------------------
The MIT License (MIT)
Copyright (c) 2014 ExcelDataReader
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
===================================ExcelDataReader (END)
-------------------------------END OF THIRD-PARTY NOTICES-------------------------------------------
=========================================
END OF Microsoft.ProgramSynthesis.Detection NOTICES AND INFORMATION

View File

@@ -6,15 +6,19 @@ steps:
- script: |
git submodule update --init --recursive
nvm install 8.9.1
nvm use 8.9.1
nvm install 10.15.1
nvm use 10.15.1
npm i -g yarn
displayName: 'preinstall'
- script: |
export CXX="g++-4.9" CC="gcc-4.9" DISPLAY=:99.0
sh -e /etc/init.d/xvfb start
sleep 3
export CXX="g++-4.9" CC="gcc-4.9" DISPLAY=:10
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
# sh -e /etc/init.d/xvfb start
# sleep 3
displayName: 'Linux preinstall'
condition: eq(variables['Agent.OS'], 'Linux')
@@ -23,13 +27,12 @@ steps:
displayName: 'Install'
- script: |
node_modules/.bin/gulp electron --silent
node_modules/.bin/gulp compile --silent --max_old_space_size=4096
node_modules/.bin/gulp optimize-vscode --silent --max_old_space_size=4096
node_modules/.bin/gulp electron
node_modules/.bin/gulp compile --max_old_space_size=4096
displayName: 'Scripts'
- script: |
./scripts/test.sh --reporter mocha-junit-reporter
DISPLAY=:10 ./scripts/test.sh --reporter mocha-junit-reporter
displayName: 'Tests'
- task: PublishTestResults@2
@@ -38,5 +41,9 @@ steps:
condition: succeededOrFailed()
- script: |
yarn run tslint
displayName: 'Run TSLint'
yarn tslint
displayName: 'Run TSLint'
- script: |
yarn strict-null-check
displayName: 'Run Strict Null Check'

View File

@@ -1,7 +1,7 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: '8.9'
versionSpec: '10.15.1'
displayName: 'Install Node.js'
- script: |
@@ -26,5 +26,9 @@ steps:
condition: succeededOrFailed()
- script: |
yarn run tslint
displayName: 'Run TSLint'
yarn tslint
displayName: 'Run TSLint'
- script: |
yarn strict-null-check
displayName: 'Run Strict Null Check'

View File

@@ -3,10 +3,10 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
const cp = require('child_process');
import * as cp from 'child_process';
function yarnInstall(package: string): void {
cp.execSync(`yarn add --no-lockfile ${package}`);
function yarnInstall(packageName: string): void {
cp.execSync(`yarn add --no-lockfile ${packageName}`);
}
const product = require('../../../product.json');
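A side note on the rename above: it is not cosmetic. `package` is a reserved word in strict-mode JavaScript, and switching the file from `require()` to an ES-module `import` makes it compile in strict mode, so the old parameter name stops being a legal identifier. A minimal illustration (names are illustrative, behavior unchanged):

// Fails once the file is a module: "'package' is a reserved word in strict mode"
// function install(package: string): void { ... }
// The rename keeps the call site identical:
function install(packageName: string): void {
	console.log(`would run: yarn add --no-lockfile ${packageName}`);
}
install('typescript');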

View File

@@ -6,7 +6,6 @@
'use strict';
import * as fs from 'fs';
import { execSync } from 'child_process';
import { Readable } from 'stream';
import * as crypto from 'crypto';
import * as azure from 'azure-storage';
@@ -44,7 +43,7 @@ function createDefaultConfig(quality: string): Config {
}
function getConfig(quality: string): Promise<Config> {
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
@@ -66,7 +65,8 @@ interface Asset {
platform: string;
type: string;
url: string;
mooncakeUrl: string;
// {{SQL CARBON EDIT}}
mooncakeUrl: string | undefined;
hash: string;
sha256hash: string;
size: number;
@@ -74,7 +74,7 @@ interface Asset {
}
function createOrUpdate(commit: string, quality: string, platform: string, type: string, release: NewDocument, asset: Asset, isUpdate: boolean): Promise<void> {
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const client = new DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT']!, { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/' + quality;
const updateQuery = {
query: 'SELECT TOP 1 * FROM c WHERE c.id = @id',
@@ -128,7 +128,7 @@ async function assertContainer(blobService: azure.BlobService, quality: string):
await new Promise((c, e) => blobService.createContainerIfNotExists(quality, { publicAccessLevel: 'blob' }, err => err ? e(err) : c()));
}
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean> {
async function doesAssetExist(blobService: azure.BlobService, quality: string, blobName: string): Promise<boolean | undefined> {
const existsResult = await new Promise<azure.BlobService.BlobResult>((c, e) => blobService.doesBlobExist(quality, blobName, (err, r) => err ? e(err) : c(r)));
return existsResult.exists;
}
@@ -151,11 +151,15 @@ interface PublishOptions {
async function publish(commit: string, quality: string, platform: string, type: string, name: string, version: string, _isUpdate: string, file: string, opts: PublishOptions): Promise<void> {
const isUpdate = _isUpdate === 'true';
const queuedBy = process.env['BUILD_QUEUEDBY'];
const sourceBranch = process.env['BUILD_SOURCEBRANCH'];
const isReleased = quality === 'insider'
&& /^master$|^refs\/heads\/master$/.test(sourceBranch)
&& /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy);
const queuedBy = process.env['BUILD_QUEUEDBY']!;
const sourceBranch = process.env['BUILD_SOURCEBRANCH']!;
const isReleased = (
// Insiders: nightly build from master
(quality === 'insider' && /^master$|^refs\/heads\/master$/.test(sourceBranch) && /Project Collection Service Accounts|Microsoft.VisualStudio.Services.TFS/.test(queuedBy)) ||
// Exploration: any build from electron-4.0.x branch
(quality === 'exploration' && /^electron-4.0.x$|^refs\/heads\/electron-4.0.x$/.test(sourceBranch))
);
console.log('Publishing...');
console.log('Quality:', quality);
@@ -180,9 +184,9 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log('SHA256:', sha256hash);
const blobName = commit + '/' + name;
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2'];
const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2'])
const blobService = azure.createBlobService(storageAccount, process.env['AZURE_STORAGE_ACCESS_KEY_2']!)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// {{SQL CARBON EDIT}}
@@ -198,7 +202,7 @@ async function publish(commit: string, quality: string, platform: string, type:
// {{SQL CARBON EDIT}}
if (process.env['MOONCAKE_STORAGE_ACCESS_KEY']) {
const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY'], `${storageAccount}.blob.core.chinacloudapi.cn`)
const mooncakeBlobService = azure.createBlobService(storageAccount, process.env['MOONCAKE_STORAGE_ACCESS_KEY']!, `${storageAccount}.blob.core.chinacloudapi.cn`)
.withFilter(new azure.ExponentialRetryPolicyFilter(20));
// mooncake is fussy and far away, this is needed!
@@ -214,7 +218,7 @@ async function publish(commit: string, quality: string, platform: string, type:
doesAssetExist(mooncakeBlobService, quality, blobName)
]);
const promises = [];
const promises: Array<Promise<void>> = [];
if (!blobExists) {
promises.push(uploadBlob(blobService, quality, blobName, file));
@@ -227,7 +231,6 @@ async function publish(commit: string, quality: string, platform: string, type:
console.log('Skipping Mooncake publishing.');
}
if (promises.length === 0) {
console.log(`Blob ${quality}, ${blobName} already exists, not publishing again.`);
return;
@@ -268,7 +271,7 @@ async function publish(commit: string, quality: string, platform: string, type:
isReleased: config.frozen ? false : isReleased,
sourceBranch,
queuedBy,
assets: [],
assets: [] as Array<Asset>,
updates: {} as any
};
@@ -284,15 +287,23 @@ async function publish(commit: string, quality: string, platform: string, type:
}
function main(): void {
if (process.env['VSCODE_BUILD_SKIP_PUBLISH']) {
console.warn('Skipping publish due to VSCODE_BUILD_SKIP_PUBLISH');
return;
}
const commit = process.env['BUILD_SOURCEVERSION'];
if (!commit) {
console.warn('Skipping publish due to missing BUILD_SOURCEVERSION');
return;
}
const opts = minimist<PublishOptions>(process.argv.slice(2), {
boolean: ['upload-only']
});
// {{SQL CARBON EDIT}}
let [quality, platform, type, name, version, _isUpdate, file, commit] = opts._;
if (!commit) {
commit = execSync('git rev-parse HEAD', { encoding: 'utf8' }).trim();
}
const [quality, platform, type, name, version, _isUpdate, file] = opts._;
publish(commit, quality, platform, type, name, version, _isUpdate, file, opts).catch(err => {
console.error(err);
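The `!` suffixes threaded through the publish script above line up with the new `yarn strict-null-check` CI step added earlier in this compare: under strict null checks, `process.env[...]` is typed `string | undefined`, so every read must be guarded or asserted before use as a `string`. A minimal sketch of both options (the helper name is illustrative):

// Under --strictNullChecks, process.env values are string | undefined.
function requireEnv(name: string): string {
	const value = process.env[name];
	if (value === undefined) {
		throw new Error(`Missing environment variable: ${name}`);
	}
	return value; // narrowed to string from here on
}

// The diff instead uses the non-null assertion where the pipeline
// guarantees the variable, e.g.:
// const storageAccount = process.env['AZURE_STORAGE_ACCOUNT_2']!;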

View File

@@ -97,7 +97,7 @@ function updateVersion(accessor: IVersionAccessor, symbolsPath: string) {
function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Promise<T> {
return new Promise<T>((resolve, reject) => {
request(options, (error, response, body) => {
request(options, (error, _response, body) => {
if (error) {
reject(error);
} else {
@@ -107,17 +107,17 @@ function asyncRequest<T>(options: request.UrlOptions & request.CoreOptions): Pro
});
}
function downloadAsset(repository, assetName: string, targetPath: string, electronVersion: string) {
function downloadAsset(repository: any, assetName: string, targetPath: string, electronVersion: string) {
return new Promise((resolve, reject) => {
repository.getReleases({ tag_name: `v${electronVersion}` }, (err, releases) => {
repository.getReleases({ tag_name: `v${electronVersion}` }, (err: any, releases: any) => {
if (err) {
reject(err);
} else {
const asset = releases[0].assets.filter(asset => asset.name === assetName)[0];
const asset = releases[0].assets.filter((asset: any) => asset.name === assetName)[0];
if (!asset) {
reject(new Error(`Asset with name ${assetName} not found`));
} else {
repository.downloadAsset(asset, (err, reader) => {
repository.downloadAsset(asset, (err: any, reader: any) => {
if (err) {
reject(err);
} else {
@@ -156,7 +156,7 @@ async function ensureVersionAndSymbols(options: IOptions) {
const symbolsName = symbolsZipName(options.platform, options.versions.electron, options.versions.insiders);
const symbolsPath = await tmpFile('symbols.zip');
console.log(`HockeyApp: downloading symbols ${symbolsName} for electron ${options.versions.electron} (${options.platform}) into ${symbolsPath}`);
await downloadAsset(new github({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
await downloadAsset(new (github as any)({ repo: options.repository, token: options.access.githubToken }), symbolsName, symbolsPath, options.versions.electron);
// Create version
console.log(`HockeyApp: creating new version ${options.versions.code} (${options.platform})`);
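The edits above are the same strictness plumbing applied to an untyped dependency: the `github` client and its callback arguments get explicit `any` annotations (and a `(github as any)` construction cast) so the file compiles under `noImplicitAny` without writing full type declarations. A minimal sketch of the idiom, with an assumed client shape:

// `repository` stands in for any untyped callback-style client.
declare const repository: any;

repository.getReleases({ tag_name: 'v4.0.0' }, (err: any, releases: any) => {
	if (err) {
		console.error(err);
		return;
	}
	console.log(`found ${releases.length} releases`);
});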

View File

@@ -0,0 +1,50 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- script: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- script: |
yarn gulp electron-x64
displayName: Download Electron
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- script: |
./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests
- script: |
./scripts/test-integration.sh --tfs "Integration Tests"
displayName: Run Integration Tests
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

View File

@@ -0,0 +1,101 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- script: |
set -e
cat << EOF > ~/.netrc
machine monacotools.visualstudio.com
password $(VSO_PAT)
machine github.com
login vscode
password $(VSCODE_MIXIN_PASSWORD)
EOF
yarn
VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)" yarn gulp -- mixin
yarn gulp -- hygiene
yarn monaco-compile-check
node build/azure-pipelines/common/installDistro.js
node build/lib/builtInExtensions.js
displayName: Prepare build
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)" \
AZURE_STORAGE_ACCESS_KEY="$(AZURE_STORAGE_ACCESS_KEY)" \
yarn gulp -- vscode-darwin-min
VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)" \
AZURE_STORAGE_ACCESS_KEY="$(AZURE_STORAGE_ACCESS_KEY)" \
yarn gulp -- upload-vscode-sourcemaps
displayName: Build
- script: |
set -e
./scripts/test.sh --build --tfs "Unit Tests"
# APP_NAME="`ls $(agent.builddirectory)/VSCode-darwin | head -n 1`"
# yarn smoketest -- --build "$(agent.builddirectory)/VSCode-darwin/$APP_NAME"
displayName: Run unit tests
- script: |
set -e
pushd ../VSCode-darwin && zip -r -X -y ../VSCode-darwin.zip * && popd
displayName: Archive build
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)'
Pattern: 'VSCode-darwin.zip'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-401337-Apple",
"operationSetCode": "MacAppDeveloperSign",
"parameters": [ ],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
displayName: Codesign
- script: |
set -e
# remove pkg from archive
zip -d ../VSCode-darwin.zip "*.pkg"
# publish the build
PACKAGEJSON=`ls ../VSCode-darwin/*.app/Contents/Resources/app/package.json`
VERSION=`node -p "require(\"$PACKAGEJSON\").version"`
AZURE_DOCUMENTDB_MASTERKEY="$(AZURE_DOCUMENTDB_MASTERKEY)" \
AZURE_STORAGE_ACCESS_KEY_2="$(AZURE_STORAGE_ACCESS_KEY_2)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(MOONCAKE_STORAGE_ACCESS_KEY)" \
node build/azure-pipelines/common/publish.js \
"$(VSCODE_QUALITY)" \
darwin \
archive \
"VSCode-darwin-$(VSCODE_QUALITY).zip" \
$VERSION \
true \
../VSCode-darwin.zip
# publish hockeyapp symbols
node build/azure-pipelines/common/symbols.js "$(VSCODE_MIXIN_PASSWORD)" "$(VSCODE_HOCKEYAPP_TOKEN)" "$(VSCODE_ARCH)" "$(VSCODE_HOCKEYAPP_ID_MACOS)"
# upload configuration
AZURE_STORAGE_ACCESS_KEY="$(AZURE_STORAGE_ACCESS_KEY)" \
yarn gulp -- upload-vscode-configuration
displayName: Publish
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -0,0 +1 @@
pat

View File

@@ -0,0 +1,55 @@
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y libxkbfile-dev pkg-config libsecret-1-dev libxss1 dbus xvfb libgtk-3-0
sudo cp build/azure-pipelines/linux/xvfb.init /etc/init.d/xvfb
sudo chmod +x /etc/init.d/xvfb
sudo update-rc.d xvfb defaults
sudo service xvfb start
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- script: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- script: |
yarn gulp electron-x64
displayName: Download Electron
- script: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- script: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- script: |
yarn compile
displayName: Compile Sources
- script: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- script: |
DISPLAY=:10 ./scripts/test.sh --tfs "Unit Tests"
displayName: Run Unit Tests
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

View File

@@ -0,0 +1,40 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const documentdb_1 = require("documentdb");
function createDefaultConfig(quality) {
return {
id: quality,
frozen: false
};
}
function getConfig(quality) {
const client = new documentdb_1.DocumentClient(process.env['AZURE_DOCUMENTDB_ENDPOINT'], { masterKey: process.env['AZURE_DOCUMENTDB_MASTERKEY'] });
const collection = 'dbs/builds/colls/config';
const query = {
query: `SELECT TOP 1 * FROM c WHERE c.id = @quality`,
parameters: [
{ name: '@quality', value: quality }
]
};
return new Promise((c, e) => {
client.queryDocuments(collection, query).toArray((err, results) => {
if (err && err.code !== 409) {
return e(err);
}
c(!results || results.length === 0 ? createDefaultConfig(quality) : results[0]);
});
});
}
getConfig(process.argv[2])
.then(config => {
console.log(config.frozen);
process.exit(0);
})
.catch(err => {
console.error(err);
process.exit(1);
});

View File

@@ -0,0 +1,118 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- script: |
set -e
export npm_config_arch="$(VSCODE_ARCH)"
if [[ "$(VSCODE_ARCH)" == "ia32" ]]; then
export PKG_CONFIG_PATH="/usr/lib/i386-linux-gnu/pkgconfig"
fi
cat << EOF > ~/.netrc
machine monacotools.visualstudio.com
password $(VSO_PAT)
machine github.com
login vscode
password $(VSCODE_MIXIN_PASSWORD)
EOF
CHILD_CONCURRENCY=1 yarn
VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)" npm run gulp -- mixin
npm run gulp -- hygiene
npm run monaco-compile-check
node build/azure-pipelines/common/installDistro.js
node build/lib/builtInExtensions.js
- script: |
set -e
VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)" npm run gulp -- vscode-linux-$(VSCODE_ARCH)-min
name: build
- script: |
set -e
npm run gulp -- "electron-$(VSCODE_ARCH)"
# xvfb seems to be crashing often, let's make sure it's always up
service xvfb start
DISPLAY=:10 ./scripts/test.sh --build --tfs "Unit Tests"
# yarn smoketest -- --build "$(agent.builddirectory)/VSCode-linux-$(VSCODE_ARCH)"
name: test
- script: |
set -e
REPO="$(pwd)"
ROOT="$REPO/.."
ARCH="$(VSCODE_ARCH)"
# Publish tarball
PLATFORM_LINUX="linux-$(VSCODE_ARCH)"
[[ "$ARCH" == "ia32" ]] && DEB_ARCH="i386" || DEB_ARCH="amd64"
[[ "$ARCH" == "ia32" ]] && RPM_ARCH="i386" || RPM_ARCH="x86_64"
BUILDNAME="VSCode-$PLATFORM_LINUX"
BUILD="$ROOT/$BUILDNAME"
BUILD_VERSION="$(date +%s)"
[ -z "$VSCODE_QUALITY" ] && TARBALL_FILENAME="code-$BUILD_VERSION.tar.gz" || TARBALL_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.tar.gz"
TARBALL_PATH="$ROOT/$TARBALL_FILENAME"
PACKAGEJSON="$BUILD/resources/app/package.json"
VERSION=$(node -p "require(\"$PACKAGEJSON\").version")
rm -rf $ROOT/code-*.tar.*
(cd $ROOT && tar -czf $TARBALL_PATH $BUILDNAME)
AZURE_DOCUMENTDB_MASTERKEY="$(AZURE_DOCUMENTDB_MASTERKEY)" \
AZURE_STORAGE_ACCESS_KEY_2="$(AZURE_STORAGE_ACCESS_KEY_2)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(MOONCAKE_STORAGE_ACCESS_KEY)" \
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_LINUX" archive-unsigned "$TARBALL_FILENAME" "$VERSION" true "$TARBALL_PATH"
# Publish hockeyapp symbols
node build/azure-pipelines/common/symbols.js "$(VSCODE_MIXIN_PASSWORD)" "$(VSCODE_HOCKEYAPP_TOKEN)" "$(VSCODE_ARCH)" "$(VSCODE_HOCKEYAPP_ID_LINUX64)"
# Publish DEB
npm run gulp -- "vscode-linux-$(VSCODE_ARCH)-build-deb"
PLATFORM_DEB="linux-deb-$ARCH"
[[ "$ARCH" == "ia32" ]] && DEB_ARCH="i386" || DEB_ARCH="amd64"
DEB_FILENAME="$(ls $REPO/.build/linux/deb/$DEB_ARCH/deb/)"
DEB_PATH="$REPO/.build/linux/deb/$DEB_ARCH/deb/$DEB_FILENAME"
AZURE_DOCUMENTDB_MASTERKEY="$(AZURE_DOCUMENTDB_MASTERKEY)" \
AZURE_STORAGE_ACCESS_KEY_2="$(AZURE_STORAGE_ACCESS_KEY_2)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(MOONCAKE_STORAGE_ACCESS_KEY)" \
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_DEB" package "$DEB_FILENAME" "$VERSION" true "$DEB_PATH"
# Publish RPM
npm run gulp -- "vscode-linux-$(VSCODE_ARCH)-build-rpm"
PLATFORM_RPM="linux-rpm-$ARCH"
[[ "$ARCH" == "ia32" ]] && RPM_ARCH="i386" || RPM_ARCH="x86_64"
RPM_FILENAME="$(ls $REPO/.build/linux/rpm/$RPM_ARCH/ | grep .rpm)"
RPM_PATH="$REPO/.build/linux/rpm/$RPM_ARCH/$RPM_FILENAME"
AZURE_DOCUMENTDB_MASTERKEY="$(AZURE_DOCUMENTDB_MASTERKEY)" \
AZURE_STORAGE_ACCESS_KEY_2="$(AZURE_STORAGE_ACCESS_KEY_2)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(MOONCAKE_STORAGE_ACCESS_KEY)" \
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "$PLATFORM_RPM" package "$RPM_FILENAME" "$VERSION" true "$RPM_PATH"
# Publish Snap
npm run gulp -- "vscode-linux-$(VSCODE_ARCH)-prepare-snap"
# Pack snap tarball artifact, in order to preserve file perms
mkdir -p $REPO/.build/linux/snap-tarball
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$(VSCODE_ARCH).tar.gz"
rm -rf $SNAP_TARBALL_PATH
(cd .build/linux && tar -czf $SNAP_TARBALL_PATH snap)
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true
- task: PublishPipelineArtifact@0
displayName: 'Publish Pipeline Artifact'
inputs:
artifactName: snap-$(VSCODE_ARCH)
targetPath: .build/linux/snap-tarball

View File

@@ -0,0 +1,50 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: DownloadPipelineArtifact@0
displayName: 'Download Pipeline Artifact'
inputs:
artifactName: snap-$(VSCODE_ARCH)
targetPath: .build/linux/snap-tarball
- script: |
set -e
# Get snapcraft version
snapcraft --version
# Make sure we get latest packages
sudo apt-get update
sudo apt-get upgrade -y
# Define variables
REPO="$(pwd)"
ARCH="$(VSCODE_ARCH)"
SNAP_ROOT="$REPO/.build/linux/snap/$ARCH"
# Install build dependencies
(cd build && yarn)
# Unpack snap tarball artifact, in order to preserve file perms
SNAP_TARBALL_PATH="$REPO/.build/linux/snap-tarball/snap-$ARCH.tar.gz"
(cd .build/linux && tar -xzf $SNAP_TARBALL_PATH)
# Create snap package
BUILD_VERSION="$(date +%s)"
SNAP_FILENAME="code-$VSCODE_QUALITY-$BUILD_VERSION.snap"
PACKAGEJSON="$(ls $SNAP_ROOT/code*/usr/share/code*/resources/app/package.json)"
VERSION=$(node -p "require(\"$PACKAGEJSON\").version")
SNAP_PATH="$SNAP_ROOT/$SNAP_FILENAME"
(cd $SNAP_ROOT/code-* && sudo snapcraft snap --output "$SNAP_PATH")
# Publish snap package
AZURE_DOCUMENTDB_MASTERKEY="$(AZURE_DOCUMENTDB_MASTERKEY)" \
AZURE_STORAGE_ACCESS_KEY_2="$(AZURE_STORAGE_ACCESS_KEY_2)" \
MOONCAKE_STORAGE_ACCESS_KEY="$(MOONCAKE_STORAGE_ACCESS_KEY)" \
node build/azure-pipelines/common/publish.js "$VSCODE_QUALITY" "linux-snap-$ARCH" package "$SNAP_FILENAME" "$VERSION" true "$SNAP_PATH"

View File

@@ -0,0 +1,65 @@
resources:
containers:
- container: vscode-x64
image: joaomoreno/vscode-linux-build-agent:x64
- container: vscode-ia32
image: joaomoreno/vscode-linux-build-agent:ia32
- container: snapcraft
image: snapcore/snapcraft
jobs:
- job: Windows
condition: eq(variables['VSCODE_BUILD_WIN32'], 'true')
pool:
vmImage: VS2017-Win2016
variables:
VSCODE_ARCH: x64
steps:
- template: win32/product-build-win32.yml
- job: Windows32
condition: eq(variables['VSCODE_BUILD_WIN32_32BIT'], 'true')
pool:
vmImage: VS2017-Win2016
variables:
VSCODE_ARCH: ia32
steps:
- template: win32/product-build-win32.yml
- job: Linux
condition: eq(variables['VSCODE_BUILD_LINUX'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: x64
container: vscode-x64
steps:
- template: linux/product-build-linux.yml
- job: LinuxSnap
condition: eq(variables['VSCODE_BUILD_LINUX'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: x64
container: snapcraft
dependsOn: Linux
steps:
- template: linux/snap-build-linux.yml
- job: Linux32
condition: eq(variables['VSCODE_BUILD_LINUX_32BIT'], 'true')
pool:
vmImage: 'Ubuntu-16.04'
variables:
VSCODE_ARCH: ia32
container: vscode-ia32
steps:
- template: linux/product-build-linux.yml
- job: macOS
condition: eq(variables['VSCODE_BUILD_MACOS'], 'true')
pool:
vmImage: macOS 10.13
steps:
- template: darwin/product-build-darwin.yml

View File

@@ -0,0 +1,54 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
# - task: 1ESLighthouseEng.PipelineArtifactCaching.RestoreCacheV1.RestoreCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: eq(variables['System.PullRequest.PullRequestId'], '')
- powershell: |
yarn
displayName: Install Dependencies
# condition: or(ne(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
# - task: 1ESLighthouseEng.PipelineArtifactCaching.SaveCacheV1.SaveCache@1
# inputs:
# keyfile: '**/yarn.lock, !**/node_modules/**/yarn.lock, !**/.*/**/yarn.lock'
# targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
# vstsFeed: '$(ArtifactFeed)'
# condition: and(succeeded(), eq(variables['System.PullRequest.PullRequestId'], ''), ne(variables['CacheRestored'], 'true'))
- powershell: |
yarn gulp electron
displayName: Download Electron
- powershell: |
yarn gulp hygiene
displayName: Run Hygiene Checks
- powershell: |
yarn monaco-compile-check
displayName: Run Monaco Editor Checks
- powershell: |
yarn compile
displayName: Compile Sources
- powershell: |
yarn download-builtin-extensions
displayName: Download Built-in Extensions
- powershell: |
.\scripts\test.bat --tfs "Unit Tests"
displayName: Run Unit Tests
- powershell: |
.\scripts\test-integration.bat --tfs "Integration Tests"
displayName: Run Integration Tests
- task: PublishTestResults@2
displayName: Publish Tests Results
inputs:
testResultsFiles: '*-results.xml'
searchFolder: '$(Build.ArtifactStagingDirectory)/test-results'
condition: succeededOrFailed()

View File

@@ -0,0 +1,151 @@
steps:
- task: NodeTool@0
inputs:
versionSpec: "10.15.1"
- task: geeklearningio.gl-vsts-tasks-yarn.yarn-installer-task.YarnInstaller@2
inputs:
versionSpec: "1.10.1"
- task: UsePythonVersion@0
inputs:
versionSpec: '2.x'
addToPath: true
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
"machine monacotools.visualstudio.com`npassword $(VSO_PAT)`nmachine github.com`nlogin vscode`npassword $(VSCODE_MIXIN_PASSWORD)" | Out-File "$env:USERPROFILE\_netrc" -Encoding ASCII
$env:npm_config_arch="$(VSCODE_ARCH)"
$env:CHILD_CONCURRENCY="1"
$env:VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)"
exec { yarn }
exec { npm run gulp -- mixin }
exec { npm run gulp -- hygiene }
exec { npm run monaco-compile-check }
exec { node build/azure-pipelines/common/installDistro.js }
exec { node build/lib/builtInExtensions.js }
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
$env:VSCODE_MIXIN_PASSWORD="$(VSCODE_MIXIN_PASSWORD)"
exec { npm run gulp -- "vscode-win32-$(VSCODE_ARCH)-min" }
exec { npm run gulp -- "vscode-win32-$(VSCODE_ARCH)-inno-updater" }
name: build
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { npm run gulp -- "electron-$(VSCODE_ARCH)" }
exec { .\scripts\test.bat --build --tfs "Unit Tests" }
# yarn smoketest -- --build "$(agent.builddirectory)\VSCode-win32-$(VSCODE_ARCH)"
name: test
- task: SFP.build-tasks.custom-build-task-1.EsrpCodeSigning@1
inputs:
ConnectedServiceName: 'ESRP CodeSign'
FolderPath: '$(agent.builddirectory)/VSCode-win32-$(VSCODE_ARCH)'
Pattern: '*.dll,*.exe,*.node'
signConfigType: inlineSignParams
inlineOperation: |
[
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolSign",
"parameters": [
{
"parameterName": "OpusName",
"parameterValue": "VS Code"
},
{
"parameterName": "OpusInfo",
"parameterValue": "https://code.visualstudio.com/"
},
{
"parameterName": "Append",
"parameterValue": "/as"
},
{
"parameterName": "FileDigest",
"parameterValue": "/fd \"SHA256\""
},
{
"parameterName": "PageHash",
"parameterValue": "/NPH"
},
{
"parameterName": "TimeStamp",
"parameterValue": "/tr \"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer\" /td sha256"
}
],
"toolName": "sign",
"toolVersion": "1.0"
},
{
"keyCode": "CP-230012",
"operationSetCode": "SigntoolVerify",
"parameters": [
{
"parameterName": "VerifyAll",
"parameterValue": "/all"
}
],
"toolName": "sign",
"toolVersion": "1.0"
}
]
SessionTimeout: 120
- task: NuGetCommand@2
displayName: Install ESRPClient.exe
inputs:
restoreSolution: 'build\azure-pipelines\win32\ESRPClient\packages.config'
feedsToUse: config
nugetConfigPath: 'build\azure-pipelines\win32\ESRPClient\NuGet.config'
externalFeedCredentials: 3fc0b7f7-da09-4ae7-a9c8-d69824b1819b
restoreDirectory: packages
- task: ESRPImportCertTask@1
displayName: Import ESRP Request Signing Certificate
inputs:
ESRP: 'ESRP CodeSign'
- powershell: |
$ErrorActionPreference = "Stop"
.\build\azure-pipelines\win32\import-esrp-auth-cert.ps1 -AuthCertificateBase64 $(ESRP_AUTH_CERTIFICATE) -AuthCertificateKey $(ESRP_AUTH_CERTIFICATE_KEY)
displayName: Import ESRP Auth Certificate
- powershell: |
. build/azure-pipelines/win32/exec.ps1
$ErrorActionPreference = "Stop"
exec { npm run gulp -- "vscode-win32-$(VSCODE_ARCH)-archive" "vscode-win32-$(VSCODE_ARCH)-system-setup" "vscode-win32-$(VSCODE_ARCH)-user-setup" --sign }
$Repo = "$(pwd)"
$Root = "$Repo\.."
$SystemExe = "$Repo\.build\win32-$(VSCODE_ARCH)\system-setup\VSCodeSetup.exe"
$UserExe = "$Repo\.build\win32-$(VSCODE_ARCH)\user-setup\VSCodeSetup.exe"
$Zip = "$Repo\.build\win32-$(VSCODE_ARCH)\archive\VSCode-win32-$(VSCODE_ARCH).zip"
$Build = "$Root\VSCode-win32-$(VSCODE_ARCH)"
# get version
$PackageJson = Get-Content -Raw -Path "$Build\resources\app\package.json" | ConvertFrom-Json
$Version = $PackageJson.version
$Quality = "$env:VSCODE_QUALITY"
$env:AZURE_STORAGE_ACCESS_KEY_2 = "$(AZURE_STORAGE_ACCESS_KEY_2)"
$env:MOONCAKE_STORAGE_ACCESS_KEY = "$(MOONCAKE_STORAGE_ACCESS_KEY)"
$env:AZURE_DOCUMENTDB_MASTERKEY = "$(AZURE_DOCUMENTDB_MASTERKEY)"
$assetPlatform = if ("$(VSCODE_ARCH)" -eq "ia32") { "win32" } else { "win32-x64" }
exec { node build/azure-pipelines/common/publish.js $Quality "$global:assetPlatform-archive" archive "VSCode-win32-$(VSCODE_ARCH)-$Version.zip" $Version true $Zip }
exec { node build/azure-pipelines/common/publish.js $Quality "$global:assetPlatform" setup "VSCodeSetup-$(VSCODE_ARCH)-$Version.exe" $Version true $SystemExe }
exec { node build/azure-pipelines/common/publish.js $Quality "$global:assetPlatform-user" setup "VSCodeUserSetup-$(VSCODE_ARCH)-$Version.exe" $Version true $UserExe }
# publish hockeyapp symbols
$hockeyAppId = if ("$(VSCODE_ARCH)" -eq "ia32") { "$(VSCODE_HOCKEYAPP_ID_WIN32)" } else { "$(VSCODE_HOCKEYAPP_ID_WIN64)" }
exec { node build/azure-pipelines/common/symbols.js "$(VSCODE_MIXIN_PASSWORD)" "$(VSCODE_HOCKEYAPP_TOKEN)" "$(VSCODE_ARCH)" $hockeyAppId }
- task: ms.vss-governance-buildtask.governance-build-task-component-detection.ComponentGovernanceComponentDetection@0
displayName: 'Component Detection'
continueOnError: true

View File

@@ -0,0 +1,70 @@
function Create-TmpJson($Obj) {
$FileName = [System.IO.Path]::GetTempFileName()
ConvertTo-Json -Depth 100 $Obj | Out-File -Encoding UTF8 $FileName
return $FileName
}
$Auth = Create-TmpJson @{
Version = "1.0.0"
AuthenticationType = "AAD_CERT"
ClientId = $env:ESRPClientId
AuthCert = @{
SubjectName = $env:ESRPAuthCertificateSubjectName
StoreLocation = "LocalMachine"
StoreName = "My"
}
RequestSigningCert = @{
SubjectName = $env:ESRPCertificateSubjectName
StoreLocation = "LocalMachine"
StoreName = "My"
}
}
$Policy = Create-TmpJson @{
Version = "1.0.0"
}
$Input = Create-TmpJson @{
Version = "1.0.0"
SignBatches = @(
@{
SourceLocationType = "UNC"
SignRequestFiles = @(
@{
SourceLocation = $args[0]
}
)
SigningInfo = @{
Operations = @(
@{
KeyCode = "CP-230012"
OperationCode = "SigntoolSign"
Parameters = @{
OpusName = "VS Code"
OpusInfo = "https://code.visualstudio.com/"
Append = "/as"
FileDigest = "/fd `"SHA256`""
PageHash = "/NPH"
TimeStamp = "/tr `"http://rfc3161.gtm.corp.microsoft.com/TSS/HttpTspServer`" /td sha256"
}
ToolName = "sign"
ToolVersion = "1.0"
},
@{
KeyCode = "CP-230012"
OperationCode = "SigntoolVerify"
Parameters = @{
VerifyAll = "/all"
}
ToolName = "sign"
ToolVersion = "1.0"
}
)
}
}
)
}
$Output = [System.IO.Path]::GetTempFileName()
$ScriptPath = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
& "$ScriptPath\ESRPClient\packages\EsrpClient.1.0.27\tools\ESRPClient.exe" Sign -a $Auth -p $Policy -i $Input -o $Output

View File

@@ -1,12 +1,2 @@
[
{
"name": "ms-vscode.node-debug",
"version": "1.26.7",
"repo": "https://github.com/Microsoft/vscode-node-debug"
},
{
"name": "ms-vscode.node-debug2",
"version": "1.26.8",
"repo": "https://github.com/Microsoft/vscode-node-debug2"
}
]

View File

@@ -43,7 +43,7 @@ function asYarnDependency(prefix, tree) {
}
function getYarnProductionDependencies(cwd) {
const raw = cp.execSync('yarn list --json', { cwd, encoding: 'utf8', env: { ...process.env, NODE_ENV: 'production' }, stdio: [null, null, 'ignore'] });
const raw = cp.execSync('yarn list --json', { cwd, encoding: 'utf8', env: { ...process.env, NODE_ENV: 'production' }, stdio: [null, null, 'inherit'] });
const match = /^{"type":"tree".*$/m.exec(raw);
if (!match || match.length !== 1) {
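The one-line change above is about visibility: `stdio` is the child's [stdin, stdout, stderr] triple, where `null` falls back to a pipe (so stdout is still captured into `raw`), and the third slot decides what happens to yarn's warnings. Switching it from 'ignore' to 'inherit' forwards them to the build log instead of silently discarding them. A hedged sketch of the same pattern:

import * as cp from 'child_process';

// [stdin, stdout, stderr]: keep stdout piped for capture, but let the
// child's stderr flow straight through to this process's stderr.
const raw = cp.execSync('yarn list --json', {
	encoding: 'utf8',
	stdio: [null, null, 'inherit']
});
console.log(raw.length, 'bytes of JSON tree output');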

View File

@@ -0,0 +1,91 @@
"use strict";
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
const https = require("https");
const fs = require("fs");
const path = require("path");
const cp = require("child_process");
function ensureDir(filepath) {
if (!fs.existsSync(filepath)) {
ensureDir(path.dirname(filepath));
fs.mkdirSync(filepath);
}
}
function download(options, destination) {
ensureDir(path.dirname(destination));
return new Promise((c, e) => {
const fd = fs.openSync(destination, 'w');
const req = https.get(options, (res) => {
res.on('data', (chunk) => {
fs.writeSync(fd, chunk);
});
res.on('end', () => {
fs.closeSync(fd);
c();
});
});
req.on('error', (reqErr) => {
console.error(`request to ${options.host}${options.path} failed.`);
console.error(reqErr);
e(reqErr);
});
});
}
const MARKER_ARGUMENT = `_download_fork_`;
function base64encode(str) {
return Buffer.from(str, 'utf8').toString('base64');
}
function base64decode(str) {
return Buffer.from(str, 'base64').toString('utf8');
}
function downloadInExternalProcess(options) {
const url = `https://${options.requestOptions.host}${options.requestOptions.path}`;
console.log(`Downloading ${url}...`);
return new Promise((c, e) => {
const child = cp.fork(__filename, [MARKER_ARGUMENT, base64encode(JSON.stringify(options))], {
stdio: ['pipe', 'pipe', 'pipe', 'ipc']
});
let stderr = [];
child.stderr.on('data', (chunk) => {
stderr.push(typeof chunk === 'string' ? Buffer.from(chunk) : chunk);
});
child.on('exit', (code) => {
if (code === 0) {
// normal termination
console.log(`Finished downloading ${url}.`);
c();
}
else {
// abnormal termination
console.error(Buffer.concat(stderr).toString());
e(new Error(`Download of ${url} failed.`));
}
});
});
}
exports.downloadInExternalProcess = downloadInExternalProcess;
function _downloadInExternalProcess() {
let options;
try {
options = JSON.parse(base64decode(process.argv[3]));
}
catch (err) {
console.error(`Cannot read arguments`);
console.error(err);
process.exit(-1);
return;
}
download(options.requestOptions, options.destinationPath).then(() => {
process.exit(0);
}, (err) => {
console.error(err);
process.exit(-2);
});
}
if (process.argv.length >= 4 && process.argv[2] === MARKER_ARGUMENT) {
// running as forked download script
_downloadInExternalProcess();
}

build/download/download.ts Normal file
View File

@@ -0,0 +1,111 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import * as https from 'https';
import * as fs from 'fs';
import * as path from 'path';
import * as cp from 'child_process';
function ensureDir(filepath: string) {
if (!fs.existsSync(filepath)) {
ensureDir(path.dirname(filepath));
fs.mkdirSync(filepath);
}
}
function download(options: https.RequestOptions, destination: string): Promise<void> {
ensureDir(path.dirname(destination));
return new Promise<void>((c, e) => {
const fd = fs.openSync(destination, 'w');
const req = https.get(options, (res) => {
res.on('data', (chunk) => {
fs.writeSync(fd, chunk);
});
res.on('end', () => {
fs.closeSync(fd);
c();
});
});
req.on('error', (reqErr) => {
console.error(`request to ${options.host}${options.path} failed.`);
console.error(reqErr);
e(reqErr);
});
});
}
const MARKER_ARGUMENT = `_download_fork_`;
function base64encode(str: string): string {
return Buffer.from(str, 'utf8').toString('base64');
}
function base64decode(str: string): string {
return Buffer.from(str, 'base64').toString('utf8');
}
export interface IDownloadRequestOptions {
host: string;
path: string;
}
export interface IDownloadOptions {
requestOptions: IDownloadRequestOptions;
destinationPath: string;
}
export function downloadInExternalProcess(options: IDownloadOptions): Promise<void> {
const url = `https://${options.requestOptions.host}${options.requestOptions.path}`;
console.log(`Downloading ${url}...`);
return new Promise<void>((c, e) => {
const child = cp.fork(
__filename,
[MARKER_ARGUMENT, base64encode(JSON.stringify(options))],
{
stdio: ['pipe', 'pipe', 'pipe', 'ipc']
}
);
let stderr: Buffer[] = [];
child.stderr.on('data', (chunk) => {
stderr.push(typeof chunk === 'string' ? Buffer.from(chunk) : chunk);
});
child.on('exit', (code) => {
if (code === 0) {
// normal termination
console.log(`Finished downloading ${url}.`);
c();
} else {
// abnormal termination
console.error(Buffer.concat(stderr).toString());
e(new Error(`Download of ${url} failed.`));
}
});
});
}
function _downloadInExternalProcess() {
let options: IDownloadOptions;
try {
options = JSON.parse(base64decode(process.argv[3]));
} catch (err) {
console.error(`Cannot read arguments`);
console.error(err);
process.exit(-1);
return;
}
download(options.requestOptions, options.destinationPath).then(() => {
process.exit(0);
}, (err) => {
console.error(err);
process.exit(-2);
});
}
if (process.argv.length >= 4 && process.argv[2] === MARKER_ARGUMENT) {
// running as forked download script
_downloadInExternalProcess();
}
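This file is both a library and its own worker: `downloadInExternalProcess` forks the same file, passing `MARKER_ARGUMENT` plus the options base64-encoded so arbitrary JSON survives argv, and the guard above detects the forked invocation. The payoff is isolation: a hung or crashing download cannot take the main build process down with it. A minimal caller, with placeholder host, path, and destination values:

import { downloadInExternalProcess } from './download'; // module path is illustrative

downloadInExternalProcess({
	requestOptions: { host: 'example.com', path: '/assets/electron.zip' },
	destinationPath: '.build/downloads/electron.zip'
}).then(
	() => console.log('download finished'),
	err => console.error('download failed:', err)
);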

build/gulpfile.compile.js Normal file
View File

@@ -0,0 +1,18 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
const util = require('./lib/util');
const task = require('./lib/task');
const compilation = require('./lib/compilation');
const { compileExtensionsBuildTask } = require('./gulpfile.extensions');
// Full compile, including nls and inline sources in sourcemaps, for build
const compileClientBuildTask = task.define('compile-client-build', task.series(util.rimraf('out-build'), compilation.compileTask('src', 'out-build', true)));
// All Build
const compileBuildTask = task.define('compile-build', task.parallel(compileClientBuildTask, compileExtensionsBuildTask));
exports.compileBuildTask = compileBuildTask;
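This new gulpfile captures the refactoring pattern that runs through the rest of the build changes in this compare: gulp 3 style dependency arrays (`gulp.task(name, [deps], fn)`) are replaced by explicitly composed task values. A hedged sketch of the shape, with made-up task names and assuming `task`, `util`, and `compilation` behave as in build/lib above:

// Illustrative composition only.
const cleanOutTask = task.define('clean-out', util.rimraf('out'));
const compileSrcTask = task.define('compile-src', compilation.compileTask('src', 'out', false));

// series() runs tasks in order; parallel() runs them concurrently.
// Because composed tasks are plain values, they can be exported and
// reused by other gulpfiles instead of being referenced by string name.
const buildTask = task.define('build', task.series(cleanOutTask, compileSrcTask));
exports.buildTask = buildTask;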

View File

@@ -6,6 +6,7 @@
const gulp = require('gulp');
const path = require('path');
const util = require('./lib/util');
const task = require('./lib/task');
const common = require('./lib/optimize');
const es = require('event-stream');
const File = require('vinyl');
@@ -28,7 +29,7 @@ var editorEntryPoints = [
name: 'vs/editor/editor.main',
include: [],
exclude: ['vs/css', 'vs/nls'],
prepend: ['out-build/vs/css.js', 'out-build/vs/nls.js'],
prepend: ['out-editor-build/vs/css.js', 'out-editor-build/vs/nls.js'],
},
{
name: 'vs/base/common/worker/simpleWorker',
@@ -48,9 +49,6 @@ var editorResources = [
'!**/test/**'
];
var editorOtherSources = [
];
var BUNDLED_FILE_HEADER = [
'/*!-----------------------------------------------------------',
' * Copyright (c) Microsoft Corporation. All rights reserved.',
@@ -63,8 +61,7 @@ var BUNDLED_FILE_HEADER = [
const languages = i18n.defaultLanguages.concat([]); // i18n.defaultLanguages.concat(process.env.VSCODE_QUALITY !== 'stable' ? i18n.extraLanguages : []);
gulp.task('clean-editor-src', util.rimraf('out-editor-src'));
gulp.task('extract-editor-src', ['clean-editor-src'], function () {
const extractEditorSrcTask = task.define('extract-editor-src', () => {
console.log(`If the build fails, consider tweaking shakeLevel below to a lower value.`);
const apiusages = monacoapi.execute().usageContent;
const extrausages = fs.readFileSync(path.join(root, 'build', 'monaco', 'monaco.usage.recipe')).toString();
@@ -79,35 +76,39 @@ gulp.task('extract-editor-src', ['clean-editor-src'], function () {
apiusages,
extrausages
],
typings: [
'typings/lib.ie11_safe_es6.d.ts',
'typings/thenable.d.ts',
'typings/es6-promise.d.ts',
'typings/require-monaco.d.ts',
"typings/lib.es2018.promise.d.ts",
'vs/monaco.d.ts'
],
libs: [
`lib.d.ts`,
`lib.es2015.collection.d.ts`
`lib.es5.d.ts`,
`lib.dom.d.ts`,
`lib.webworker.importscripts.d.ts`
],
redirects: {
'vs/base/browser/ui/octiconLabel/octiconLabel': 'vs/base/browser/ui/octiconLabel/octiconLabel.mock',
},
compilerOptions: {
module: 2, // ModuleKind.AMD
},
shakeLevel: 2, // 0-Files, 1-InnerFile, 2-ClassMembers
importIgnorePattern: /^vs\/css!/,
importIgnorePattern: /(^vs\/css!)|(promise-polyfill\/polyfill)/,
destRoot: path.join(root, 'out-editor-src')
});
});
// Full compile, including nls and inline sources in sourcemaps, for build
gulp.task('clean-editor-build', util.rimraf('out-editor-build'));
gulp.task('compile-editor-build', ['clean-editor-build', 'extract-editor-src'], compilation.compileTask('out-editor-src', 'out-editor-build', true));
const compileEditorAMDTask = task.define('compile-editor-amd', compilation.compileTask('out-editor-src', 'out-editor-build', true));
gulp.task('clean-optimized-editor', util.rimraf('out-editor'));
gulp.task('optimize-editor', ['clean-optimized-editor', 'compile-editor-build'], common.optimizeTask({
const optimizeEditorAMDTask = task.define('optimize-editor-amd', common.optimizeTask({
src: 'out-editor-build',
entryPoints: editorEntryPoints,
otherSources: editorOtherSources,
resources: editorResources,
loaderConfig: {
paths: {
'vs': 'out-editor-build/vs',
'vs/css': 'out-editor-build/vs/css.build',
'vs/nls': 'out-editor-build/vs/nls.build',
'vscode': 'empty:'
}
},
@@ -118,29 +119,45 @@ gulp.task('optimize-editor', ['clean-optimized-editor', 'compile-editor-build'],
languages: languages
}));
gulp.task('clean-minified-editor', util.rimraf('out-editor-min'));
gulp.task('minify-editor', ['clean-minified-editor', 'optimize-editor'], common.minifyTask('out-editor'));
const minifyEditorAMDTask = task.define('minify-editor-amd', common.minifyTask('out-editor'));
gulp.task('clean-editor-esm', util.rimraf('out-editor-esm'));
gulp.task('extract-editor-esm', ['clean-editor-esm', 'clean-editor-distro'], function () {
standalone.createESMSourcesAndResources({
entryPoints: [
'vs/editor/editor.main',
'vs/editor/editor.worker'
],
outFolder: './out-editor-esm/src',
const createESMSourcesAndResourcesTask = task.define('extract-editor-esm', () => {
standalone.createESMSourcesAndResources2({
srcFolder: './out-editor-src',
outFolder: './out-editor-esm',
outResourcesFolder: './out-monaco-editor-core/esm',
redirects: {
'vs/base/browser/ui/octiconLabel/octiconLabel': 'vs/base/browser/ui/octiconLabel/octiconLabel.mock',
'vs/nls': 'vs/nls.mock',
ignores: [
'inlineEntryPoint:0.ts',
'inlineEntryPoint:1.ts',
'vs/loader.js',
'vs/nls.ts',
'vs/nls.build.js',
'vs/nls.d.ts',
'vs/css.js',
'vs/css.build.js',
'vs/css.d.ts',
'vs/base/worker/workerMain.ts',
],
renames: {
'vs/nls.mock.ts': 'vs/nls.ts'
}
});
});
gulp.task('compile-editor-esm', ['extract-editor-esm', 'clean-editor-distro'], function () {
const result = cp.spawnSync(`node`, [`../node_modules/.bin/tsc`], {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
const compileEditorESMTask = task.define('compile-editor-esm', () => {
if (process.platform === 'win32') {
const result = cp.spawnSync(`..\\node_modules\\.bin\\tsc.cmd`, {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
console.log(result.stderr.toString());
} else {
const result = cp.spawnSync(`node`, [`../node_modules/.bin/tsc`], {
cwd: path.join(__dirname, '../out-editor-esm')
});
console.log(result.stdout.toString());
console.log(result.stderr.toString());
}
});
function toExternalDTS(contents) {
@@ -178,8 +195,16 @@ function toExternalDTS(contents) {
return lines.join('\n');
}
gulp.task('clean-editor-distro', util.rimraf('out-monaco-editor-core'));
gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify-editor', 'optimize-editor'], function () {
function filterStream(testFunc) {
return es.through(function (data) {
if (!testFunc(data.relative)) {
return;
}
this.emit('data', data);
});
}
const finalEditorResourcesTask = task.define('final-editor-resources', () => {
return es.merge(
// other assets
es.merge(
@@ -194,7 +219,7 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
this.emit('data', new File({
path: data.path.replace(/monaco\.d\.ts/, 'editor.api.d.ts'),
base: data.base,
contents: new Buffer(toExternalDTS(data.contents.toString()))
contents: Buffer.from(toExternalDTS(data.contents.toString()))
}));
}))
.pipe(gulp.dest('out-monaco-editor-core/esm/vs/editor')),
@@ -209,6 +234,14 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
}))
.pipe(gulp.dest('out-monaco-editor-core')),
// version.txt
gulp.src('build/monaco/version.txt')
.pipe(es.through(function (data) {
data.contents = Buffer.from(`monaco-editor-core: https://github.com/Microsoft/vscode/tree/${sha1}`);
this.emit('data', data);
}))
.pipe(gulp.dest('out-monaco-editor-core')),
// README.md
gulp.src('build/monaco/README-npm.md')
.pipe(es.through(function (data) {
@@ -242,7 +275,7 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
var strContents = data.contents.toString();
var newStr = '//# sourceMappingURL=' + relativePathToMap.replace(/\\/g, '/');
strContents = strContents.replace(/\/\/\# sourceMappingURL=[^ ]+$/, newStr);
strContents = strContents.replace(/\/\/# sourceMappingURL=[^ ]+$/, newStr);
data.contents = Buffer.from(strContents);
this.emit('data', data);
@@ -258,59 +291,31 @@ gulp.task('editor-distro', ['clean-editor-distro', 'compile-editor-esm', 'minify
);
});
gulp.task('analyze-editor-distro', function () {
// @ts-ignore
var bundleInfo = require('../out-editor/bundleInfo.json');
var graph = bundleInfo.graph;
var bundles = bundleInfo.bundles;
var inverseGraph = {};
Object.keys(graph).forEach(function (module) {
var dependencies = graph[module];
dependencies.forEach(function (dep) {
inverseGraph[dep] = inverseGraph[dep] || [];
inverseGraph[dep].push(module);
});
});
var detailed = {};
Object.keys(bundles).forEach(function (entryPoint) {
var included = bundles[entryPoint];
var includedMap = {};
included.forEach(function (included) {
includedMap[included] = true;
});
var explanation = [];
included.map(function (included) {
if (included.indexOf('!') >= 0) {
return;
}
var reason = (inverseGraph[included] || []).filter(function (mod) {
return !!includedMap[mod];
});
explanation.push({
module: included,
reason: reason
});
});
detailed[entryPoint] = explanation;
});
console.log(JSON.stringify(detailed, null, '\t'));
});
function filterStream(testFunc) {
return es.through(function (data) {
if (!testFunc(data.relative)) {
return;
}
this.emit('data', data);
});
}
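filterStream adapts a predicate over a vinyl file's relative path into a pass-through stream, silently dropping anything the predicate rejects. For example:

// Usage sketch: keep only .js files flowing through a pipeline.
gulp.src('out-monaco-editor-core/**')
    .pipe(filterStream((relativePath) => /\.js$/.test(relativePath)))
    .pipe(gulp.dest('out-sketch')); // hypothetical destination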
gulp.task('editor-distro',
task.series(
task.parallel(
util.rimraf('out-editor-src'),
util.rimraf('out-editor-build'),
util.rimraf('out-editor-esm'),
util.rimraf('out-monaco-editor-core'),
util.rimraf('out-editor'),
util.rimraf('out-editor-min')
),
extractEditorSrcTask,
task.parallel(
task.series(
compileEditorAMDTask,
optimizeEditorAMDTask,
minifyEditorAMDTask
),
task.series(
createESMSourcesAndResourcesTask,
compileEditorESMTask
)
),
finalEditorResourcesTask
)
);
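The registration above replaces gulp 3's string names and dependency arrays with explicit composition: `task.parallel` for the clean steps, nested `task.series` for the AMD and ESM pipelines. Assuming `task.series`/`task.parallel` behave like gulp 4's combinators, the shape reduces to this runnable sketch:

const gulp = require('gulp');

// Stand-in steps; each real step is a clean, compile, or packaging task.
const clean = (cb) => cb();
const extractSrc = (cb) => cb();
const amdPipeline = (cb) => cb();   // compile -> optimize -> minify
const esmPipeline = (cb) => cb();   // create ESM sources -> tsc
const finalResources = (cb) => cb();

gulp.task('editor-distro-sketch', gulp.series(
    clean,
    extractSrc,
    gulp.parallel(amdPipeline, esmPipeline),
    finalResources
));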
//#region monaco type checking
@@ -330,6 +335,7 @@ function createTscCompileTask(watch) {
let errors = [];
let reporter = createReporter();
let report;
// eslint-disable-next-line no-control-regex
let magic = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; // https://stackoverflow.com/questions/25245716/remove-all-ansi-colors-styles-from-strings
child.stdout.on('data', data => {
@@ -363,7 +369,10 @@ function createTscCompileTask(watch) {
};
}
gulp.task('monaco-typecheck-watch', createTscCompileTask(true));
gulp.task('monaco-typecheck', createTscCompileTask(false));
const monacoTypecheckWatchTask = task.define('monaco-typecheck-watch', createTscCompileTask(true));
exports.monacoTypecheckWatchTask = monacoTypecheckWatchTask;
const monacoTypecheckTask = task.define('monaco-typecheck', createTscCompileTask(false));
exports.monacoTypecheckTask = monacoTypecheckTask;
//#endregion
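The typecheck reporter strips ANSI escape sequences from tsc's output before matching errors; the `magic` regex above can be exercised on its own:

// eslint-disable-next-line no-control-regex -- \u001b is a control character
const ansi = /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g;

// '\u001b[31m' colors terminal text red; stripping it leaves plain text behind.
console.log('\u001b[31merror\u001b[0m TS2304: Cannot find name.'.replace(ansi, ''));
// -> error TS2304: Cannot find name.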

View File

@@ -11,8 +11,8 @@ const path = require('path');
const tsb = require('gulp-tsb');
const es = require('event-stream');
const filter = require('gulp-filter');
const rimraf = require('rimraf');
const util = require('./lib/util');
const task = require('./lib/task');
const watcher = require('./lib/watch');
const createReporter = require('./lib/reporter').createReporter;
const glob = require('glob');
@@ -21,6 +21,7 @@ const nlsDev = require('vscode-nls-dev');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
const plumber = require('gulp-plumber');
const _ = require('underscore');
const extensionsPath = path.join(path.dirname(__dirname), 'extensions');
@@ -35,22 +36,13 @@ const tasks = compilations.map(function (tsconfigFile) {
const absolutePath = path.join(extensionsPath, tsconfigFile);
const relativeDirname = path.dirname(tsconfigFile);
const tsOptions = require(absolutePath).compilerOptions;
const tsconfig = require(absolutePath);
const tsOptions = _.assign({}, tsconfig.extends ? require(path.join(extensionsPath, relativeDirname, tsconfig.extends)).compilerOptions : {}, tsconfig.compilerOptions);
tsOptions.verbose = false;
tsOptions.sourceMap = true;
const name = relativeDirname.replace(/\//g, '-');
// Tasks
const clean = 'clean-extension:' + name;
const compile = 'compile-extension:' + name;
const watch = 'watch-extension:' + name;
// Build Tasks
const cleanBuild = 'clean-extension-build:' + name;
const compileBuild = 'compile-extension-build:' + name;
const watchBuild = 'watch-extension-build:' + name;
const root = path.join('extensions', relativeDirname);
const srcBase = path.join(root, 'src');
const src = path.join(srcBase, '**');
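The new lookup resolves one level of tsconfig `extends`: the base file's `compilerOptions` come first and the extending file's own options win via `_.assign`. Factored out as a standalone sketch (the helper name is illustrative):

const path = require('path');
const _ = require('underscore');

// One-level `extends` resolution, as in the diff above; a real tsconfig
// chain can nest deeper, which this sketch deliberately ignores.
function resolveCompilerOptions(absoluteTsconfigPath) {
    const tsconfig = require(absoluteTsconfigPath);
    const base = tsconfig.extends
        ? require(path.join(path.dirname(absoluteTsconfigPath), tsconfig.extends)).compilerOptions
        : {};
    return _.assign({}, base, tsconfig.compilerOptions);
}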
@@ -109,18 +101,18 @@ const tasks = compilations.map(function (tsconfigFile) {
const srcOpts = { cwd: path.dirname(__dirname), base: srcBase };
gulp.task(clean, cb => rimraf(out, cb));
const cleanTask = task.define(`clean-extension-${name}`, util.rimraf(out));
gulp.task(compile, [clean], () => {
const compileTask = task.define(`compile-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false, true);
const input = gulp.src(src, srcOpts);
return input
.pipe(pipeline())
.pipe(gulp.dest(out));
});
}));
gulp.task(watch, [clean], () => {
const watchTask = task.define(`watch-extension:${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(false);
const input = gulp.src(src, srcOpts);
const watchInput = watcher(src, srcOpts);
@@ -128,43 +120,35 @@ const tasks = compilations.map(function (tsconfigFile) {
return watchInput
.pipe(util.incremental(pipeline, input))
.pipe(gulp.dest(out));
});
}));
gulp.task(cleanBuild, cb => rimraf(out, cb));
gulp.task(compileBuild, [clean], () => {
const compileBuildTask = task.define(`compile-build-extension-${name}`, task.series(cleanTask, () => {
const pipeline = createPipeline(true, true);
const input = gulp.src(src, srcOpts);
return input
.pipe(pipeline())
.pipe(gulp.dest(out));
});
}));
gulp.task(watchBuild, [clean], () => {
const pipeline = createPipeline(true);
const input = gulp.src(src, srcOpts);
const watchInput = watcher(src, srcOpts);
return watchInput
.pipe(util.incremental(() => pipeline(), input))
.pipe(gulp.dest(out));
});
// Tasks
gulp.task(compileTask);
gulp.task(watchTask);
return {
clean: clean,
compile: compile,
watch: watch,
cleanBuild: cleanBuild,
compileBuild: compileBuild,
watchBuild: watchBuild
compileTask: compileTask,
watchTask: watchTask,
compileBuildTask: compileBuildTask
};
});
gulp.task('clean-extensions', tasks.map(t => t.clean));
gulp.task('compile-extensions', tasks.map(t => t.compile));
gulp.task('watch-extensions', tasks.map(t => t.watch));
const compileExtensionsTask = task.define('compile-extensions', task.parallel(...tasks.map(t => t.compileTask)));
gulp.task(compileExtensionsTask);
exports.compileExtensionsTask = compileExtensionsTask;
gulp.task('clean-extensions-build', tasks.map(t => t.cleanBuild));
gulp.task('compile-extensions-build', tasks.map(t => t.compileBuild));
gulp.task('watch-extensions-build', tasks.map(t => t.watchBuild));
const watchExtensionsTask = task.define('watch-extensions', task.parallel(...tasks.map(t => t.watchTask)));
gulp.task(watchExtensionsTask);
exports.watchExtensionsTask = watchExtensionsTask;
const compileExtensionsBuildTask = task.define('compile-extensions-build', task.parallel(...tasks.map(t => t.compileBuildTask)));
exports.compileExtensionsBuildTask = compileExtensionsBuildTask;

View File

@@ -42,12 +42,15 @@ const indentationFilter = [
// except specific files
'!ThirdPartyNotices.txt',
'!LICENSE.txt',
'!LICENSE.{txt,rtf}',
'!LICENSES.chromium.html',
'!**/LICENSE',
'!src/vs/nls.js',
'!src/vs/nls.build.js',
'!src/vs/css.js',
'!src/vs/css.build.js',
'!src/vs/loader.js',
'!src/vs/base/common/marked/marked.js',
'!src/vs/base/common/winjs.base.js',
'!src/vs/base/node/terminateProcess.sh',
'!src/vs/base/node/cpuUsage.sh',
'!test/assert.js',
@@ -78,12 +81,14 @@ const indentationFilter = [
'!src/vs/*/**/*.d.ts',
'!src/typings/**/*.d.ts',
'!extensions/**/*.d.ts',
'!**/*.{svg,exe,png,bmp,scpt,bat,cmd,cur,ttf,woff,eot,md,ps1,template,yaml,yml,d.ts.recipe}',
'!build/{lib,tslintRules}/**/*.js',
'!**/*.{svg,exe,png,bmp,scpt,bat,cmd,cur,ttf,woff,eot,md,ps1,template,yaml,yml,d.ts.recipe,ico,icns}',
'!build/{lib,tslintRules,download}/**/*.js',
'!build/**/*.sh',
'!build/tfs/**/*.js',
'!build/tfs/**/*.config',
'!build/azure-pipelines/**/*.js',
'!build/azure-pipelines/**/*.config',
'!**/Dockerfile',
'!**/*.Dockerfile',
'!**/*.dockerfile',
'!extensions/markdown-language-features/media/*.js'
];
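These entries are negative globs layered over a catch-all include: every file enters the hygiene stream, and each `!` pattern then carves out an exemption, with later patterns winning. A minimal sketch with `gulp-filter` and illustrative patterns:

const gulp = require('gulp');
const filter = require('gulp-filter');

// Include everything, then subtract exclusions, as indentationFilter does.
const patterns = ['**', '!**/*.{svg,exe,png}', '!build/**/*.sh'];

gulp.src('**/*', { base: '.' })
    .pipe(filter(patterns))
    .pipe(gulp.dest('.build/hygiene-sketch')); // hypothetical output folder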
@@ -96,6 +101,8 @@ const copyrightFilter = [
'!**/*.md',
'!**/*.bat',
'!**/*.cmd',
'!**/*.ico',
'!**/*.icns',
'!**/*.xml',
'!**/*.sh',
'!**/*.txt',
@@ -103,10 +110,12 @@ const copyrightFilter = [
'!**/*.opts',
'!**/*.disabled',
'!**/*.code-workspace',
'!**/promise-polyfill/polyfill.js',
'!build/**/*.init',
'!resources/linux/snap/snapcraft.yaml',
'!resources/linux/snap/electron-launch',
'!resources/win32/bin/code.js',
'!resources/completions/**',
'!extensions/markdown-language-features/media/highlight.css',
'!extensions/html-language-features/server/src/modes/typescript/*',
'!extensions/*/server/bin/*'
@@ -120,7 +129,6 @@ const eslintFilter = [
'!src/vs/nls.js',
'!src/vs/css.build.js',
'!src/vs/nls.build.js',
'!src/**/winjs.base.js',
'!src/**/marked.js',
'!**/test/**'
];
@@ -223,7 +231,7 @@ function hygiene(some) {
let formatted = result.dest.replace(/\r\n/gm, '\n');
if (original !== formatted) {
console.error('File not formatted:', file.relative);
console.error("File not formatted. Run the 'Format Document' command to fix it:", file.relative);
errorCount++;
}
cb(null, file);

View File

@@ -10,16 +10,15 @@ const json = require('gulp-json-editor');
const buffer = require('gulp-buffer');
const filter = require('gulp-filter');
const es = require('event-stream');
const util = require('./lib/util');
const remote = require('gulp-remote-src');
const zip = require('gulp-vinyl-zip');
const assign = require('object-assign');
const vfs = require('vinyl-fs');
const pkg = require('../package.json');
const cp = require('child_process');
const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');
// {{SQL CARBON EDIT}}
const jeditor = require('gulp-json-editor');
const pkg = require('../package.json');
gulp.task('mixin', function () {
// {{SQL CARBON EDIT}}
const updateUrl = process.env['SQLOPS_UPDATEURL'];

View File

@@ -1,15 +0,0 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
const gulp = require('gulp');
const mocha = require('gulp-mocha');
gulp.task('test', function () {
return gulp.src('test/all.js')
.pipe(mocha({ ui: 'tdd', delay: true }))
.once('end', function () { process.exit(); });
});

View File

@@ -20,6 +20,7 @@ const filter = require('gulp-filter');
const json = require('gulp-json-editor');
const _ = require('underscore');
const util = require('./lib/util');
const task = require('./lib/task');
const ext = require('./lib/extensions');
const buildfile = require('../src/buildfile');
const common = require('./lib/optimize');
@@ -33,16 +34,18 @@ const i18n = require('./lib/i18n');
const serviceDownloader = require('service-downloader').ServiceDownloadProvider;
const platformInfo = require('service-downloader/out/platform').PlatformInformation;
const glob = require('glob');
// {{SQL CARBON EDIT}} - End
const deps = require('./dependencies');
const getElectronVersion = require('./lib/electron').getElectronVersion;
const createAsar = require('./lib/asar').createAsar;
const minimist = require('minimist');
const { compileBuildTask } = require('./gulpfile.compile');
const productionDependencies = deps.getProductionDependencies(path.dirname(__dirname));
// @ts-ignore
// {{SQL CARBON EDIT}}
var del = require('del');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
const baseModules = Object.keys(process.binding('natives')).filter(n => !/^_|\//.test(n));
// {{SQL CARBON EDIT}}
const nodeModules = [
@@ -56,30 +59,7 @@ const nodeModules = [
.concat(_.uniq(productionDependencies.map(d => d.name)))
.concat(baseModules);
// Build
const builtInExtensions = require('./builtInExtensions.json');
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
// {{SQL CARBON EDIT}}
'integration-tests',
];
// {{SQL CARBON EDIT}}
const vsce = require('vsce');
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// the extension will be excluded from SQLOps package and will have separate vsix packages
'agent',
'import',
'profiler'
];
var azureExtensions = [ 'azurecore', 'mssql'];
const vscodeEntryPoints = _.flatten([
buildfile.entrypoint('vs/workbench/workbench.main'),
buildfile.base,
@@ -92,22 +72,25 @@ const vscodeResources = [
'out-build/cli.js',
'out-build/driver.js',
'out-build/bootstrap.js',
'out-build/bootstrap-fork.js',
'out-build/bootstrap-amd.js',
'out-build/bootstrap-window.js',
'out-build/paths.js',
'out-build/vs/**/*.{svg,png,cur,html}',
'out-build/vs/base/common/performance.js',
'out-build/vs/base/node/languagePacks.js',
'out-build/vs/base/node/{stdForkStart.js,terminateProcess.sh,cpuUsage.sh}',
'out-build/vs/base/browser/ui/octiconLabel/octicons/**',
'out-build/vs/workbench/browser/media/*-theme.css',
'out-build/vs/workbench/electron-browser/bootstrap/**',
'out-build/vs/workbench/parts/debug/**/*.json',
'out-build/vs/workbench/parts/execution/**/*.scpt',
'out-build/vs/workbench/parts/webview/electron-browser/webview-pre.js',
'out-build/vs/workbench/contrib/debug/**/*.json',
'out-build/vs/workbench/contrib/externalTerminal/**/*.scpt',
'out-build/vs/workbench/contrib/webview/electron-browser/webview-pre.js',
'out-build/vs/**/markdown.css',
'out-build/vs/workbench/parts/tasks/**/*.json',
'out-build/vs/workbench/parts/welcome/walkThrough/**/*.md',
'out-build/vs/workbench/contrib/tasks/**/*.json',
'out-build/vs/workbench/contrib/welcome/walkThrough/**/*.md',
'out-build/vs/workbench/services/files/**/*.exe',
'out-build/vs/workbench/services/files/**/*.md',
'out-build/vs/code/electron-browser/workbench/**',
'out-build/vs/code/electron-browser/sharedProcess/sharedProcess.js',
'out-build/vs/code/electron-browser/issue/issueReporter.js',
'out-build/vs/code/electron-browser/processExplorer/processExplorer.js',
@@ -131,7 +114,7 @@ const vscodeResources = [
'out-build/sql/parts/jobManagement/common/media/*.svg',
'out-build/sql/media/objectTypes/*.svg',
'out-build/sql/media/icons/*.svg',
'out-build/sql/parts/notebook/media/**/*.svg',
'out-build/sql/workbench/parts/notebook/media/**/*.svg',
'!**/test/**'
];
@@ -141,65 +124,84 @@ const BUNDLED_FILE_HEADER = [
' *--------------------------------------------------------*/'
].join('\n');
gulp.task('clean-optimized-vscode', util.rimraf('out-vscode'));
gulp.task('optimize-vscode', ['clean-optimized-vscode', 'compile-build', 'compile-extensions-build'], common.optimizeTask({
src: 'out-build',
entryPoints: vscodeEntryPoints,
otherSources: [],
resources: vscodeResources,
loaderConfig: common.loaderConfig(nodeModules),
header: BUNDLED_FILE_HEADER,
out: 'out-vscode',
bundleInfo: undefined
}));
const optimizeVSCodeTask = task.define('optimize-vscode', task.series(
task.parallel(
util.rimraf('out-vscode'),
compileBuildTask
),
common.optimizeTask({
src: 'out-build',
entryPoints: vscodeEntryPoints,
resources: vscodeResources,
loaderConfig: common.loaderConfig(nodeModules),
header: BUNDLED_FILE_HEADER,
out: 'out-vscode',
bundleInfo: undefined
})
));
gulp.task('optimize-index-js', ['optimize-vscode'], () => {
const fullpath = path.join(process.cwd(), 'out-vscode/vs/workbench/electron-browser/bootstrap/index.js');
const contents = fs.readFileSync(fullpath).toString();
const newContents = contents.replace('[/*BUILD->INSERT_NODE_MODULES*/]', JSON.stringify(nodeModules));
fs.writeFileSync(fullpath, newContents);
});
const optimizeIndexJSTask = task.define('optimize-index-js', task.series(
optimizeVSCodeTask,
() => {
const fullpath = path.join(process.cwd(), 'out-vscode/bootstrap-window.js');
const contents = fs.readFileSync(fullpath).toString();
const newContents = contents.replace('[/*BUILD->INSERT_NODE_MODULES*/]', JSON.stringify(nodeModules));
fs.writeFileSync(fullpath, newContents);
}
));
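The follow-up step is plain text substitution: the optimized `bootstrap-window.js` ships a `[/*BUILD->INSERT_NODE_MODULES*/]` placeholder that is replaced with the concrete module list at build time. In isolation (the module names are an illustrative subset):

const fs = require('fs');
const path = require('path');

const nodeModules = ['vscode-ripgrep', 'node-pty']; // illustrative subset
const fullpath = path.join(process.cwd(), 'out-vscode/bootstrap-window.js');
const contents = fs.readFileSync(fullpath, 'utf8');
fs.writeFileSync(fullpath,
    contents.replace('[/*BUILD->INSERT_NODE_MODULES*/]', JSON.stringify(nodeModules)));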
const baseUrl = `https://ticino.blob.core.windows.net/sourcemaps/${commit}/core`;
gulp.task('clean-minified-vscode', util.rimraf('out-vscode-min'));
gulp.task('minify-vscode', ['clean-minified-vscode', 'optimize-index-js'], common.minifyTask('out-vscode', baseUrl));
const sourceMappingURLBase = `https://ticino.blob.core.windows.net/sourcemaps/${commit}`;
const minifyVSCodeTask = task.define('minify-vscode', task.series(
task.parallel(
util.rimraf('out-vscode-min'),
optimizeIndexJSTask
),
common.minifyTask('out-vscode', `${sourceMappingURLBase}/core`)
));
// Package
// @ts-ignore JSON checking: darwinCredits is optional
const darwinCreditsTemplate = product.darwinCredits && _.template(fs.readFileSync(path.join(root, product.darwinCredits), 'utf8'));
function darwinBundleDocumentType(extensions, icon) {
return {
name: product.nameLong + ' document',
role: 'Editor',
ostypes: ["TEXT", "utxt", "TUTX", "****"],
extensions: extensions,
iconFile: icon
};
}
const config = {
version: getElectronVersion(),
productAppName: product.nameLong,
companyName: 'Microsoft Corporation',
copyright: 'Copyright (C) 2018 Microsoft. All rights reserved',
copyright: 'Copyright (C) 2019 Microsoft. All rights reserved',
darwinIcon: 'resources/darwin/code.icns',
darwinBundleIdentifier: product.darwinBundleIdentifier,
darwinApplicationCategoryType: 'public.app-category.developer-tools',
darwinHelpBookFolder: 'VS Code HelpBook',
darwinHelpBookName: 'VS Code HelpBook',
darwinBundleDocumentTypes: [{
name: product.nameLong + ' document',
role: 'Editor',
ostypes: ["TEXT", "utxt", "TUTX", "****"],
// {{SQL CARBON EDIT}}
extensions: ["csv", "json", "sqlplan", "sql", "xml"],
iconFile: 'resources/darwin/code_file.icns'
}],
darwinBundleDocumentTypes: [
// {{SQL CARBON EDIT}} - Remove most document types and replace with ours
darwinBundleDocumentType(["csv", "json", "sqlplan", "sql", "xml"], 'resources/darwin/code_file.icns'),
],
darwinBundleURLTypes: [{
role: 'Viewer',
name: product.nameLong,
urlSchemes: [product.urlProtocol]
}],
darwinCredits: darwinCreditsTemplate ? Buffer.from(darwinCreditsTemplate({ commit: commit, date: new Date().toISOString() })) : void 0,
darwinForceDarkModeSupport: true,
darwinCredits: darwinCreditsTemplate ? Buffer.from(darwinCreditsTemplate({ commit: commit, date: new Date().toISOString() })) : undefined,
linuxExecutableName: product.applicationName,
winIcon: 'resources/win32/code.ico',
token: process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || void 0,
token: process.env['VSCODE_MIXIN_PASSWORD'] || process.env['GITHUB_TOKEN'] || undefined,
// @ts-ignore JSON checking: electronRepository is optional
repo: product.electronRepository || void 0
repo: product.electronRepository || undefined
};
function getElectron(arch) {
@@ -212,18 +214,18 @@ function getElectron(arch) {
});
return gulp.src('package.json')
.pipe(json({ name: product.nameShort }))
.pipe(json({ name: product.nameShort }))
.pipe(electron(electronOpts))
.pipe(filter(['**', '!**/app/package.json']))
.pipe(vfs.dest('.build/electron'));
};
}
gulp.task('clean-electron', util.rimraf('.build/electron'));
gulp.task('electron', ['clean-electron'], getElectron(process.arch));
gulp.task('electron-ia32', ['clean-electron'], getElectron('ia32'));
gulp.task('electron-x64', ['clean-electron'], getElectron('x64'));
gulp.task(task.define('electron', task.series(util.rimraf('.build/electron'), getElectron(process.arch))));
gulp.task(task.define('electron-ia32', task.series(util.rimraf('.build/electron'), getElectron('ia32'))));
gulp.task(task.define('electron-x64', task.series(util.rimraf('.build/electron'), getElectron('x64'))));
gulp.task(task.define('electron-arm', task.series(util.rimraf('.build/electron'), getElectron('armv7l'))));
gulp.task(task.define('electron-arm64', task.series(util.rimraf('.build/electron'), getElectron('arm64'))));
/**
* Compute checksums for some files.
@@ -259,115 +261,36 @@ function computeChecksum(filename) {
return hash;
}
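Only the tail of computeChecksum is visible in this hunk. A plausible shape hashes file contents down to a short digest; the md5/base64 choice below is an assumption, not taken from the diff:

const crypto = require('crypto');
const fs = require('fs');

// Assumed implementation: the visible context only shows `return hash;`.
function computeChecksum(filename) {
    const contents = fs.readFileSync(filename);
    return crypto.createHash('md5').update(contents).digest('base64');
}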
function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
const packagePath = path.join(path.dirname(root), element.name + '.vsix');
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
function packageExtensionTask(extensionName, platform, arch) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', extensionName);
} else {
destination = path.join(destination, 'resources', 'app', 'extensions', extensionName);
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => extensionName === name);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return ext.fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util.skipDirectories())
.pipe(util.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
function packageTask(platform, arch, opts) {
function packageTask(platform, arch, sourceFolderName, destinationFolderName, opts) {
opts = opts || {};
// {{SQL CARBON EDIT}}
const destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
const destination = path.join(path.dirname(root), destinationFolderName);
platform = platform || process.platform;
return () => {
const out = opts.minified ? 'out-vscode-min' : 'out-vscode';
const out = sourceFolderName;
const checksums = computeChecksums(out, [
'vs/workbench/workbench.main.js',
'vs/workbench/workbench.main.css',
'vs/workbench/electron-browser/bootstrap/index.html',
'vs/workbench/electron-browser/bootstrap/index.js',
'vs/workbench/electron-browser/bootstrap/preload.js'
'vs/code/electron-browser/workbench/workbench.html',
'vs/code/electron-browser/workbench/workbench.js'
]);
const src = gulp.src(out + '/**', { base: '.' })
.pipe(rename(function (path) { path.dirname = path.dirname.replace(new RegExp('^' + out), 'out'); }));
const root = path.resolve(path.join(__dirname, '..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1)
.filter(({ name }) => azureExtensions.indexOf(name) === -1);
packageBuiltInExtensions();
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return ext.fromLocal(extension.path)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
}));
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map(d => path.relative(root, d.path)).map(d => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util.cleanNodeModule('account-provider-azure', ['node_modules/date-utils/doc/**', 'node_modules/adal_node/node_modules/**'], undefined))
.pipe(util.cleanNodeModule('typescript', ['**/**'], undefined));
const sources = es.merge(src, localExtensions, localExtensionDependencies)
.pipe(rename(function (path) { path.dirname = path.dirname.replace(new RegExp('^' + out), 'out'); }))
.pipe(util.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
const root = path.resolve(path.join(__dirname, '..'));
// {{SQL CARBON EDIT}}
ext.packageBuiltInExtensions();
const sources = es.merge(src, ext.packageExtensionsStream({
sourceMappingURLBase: sourceMappingURLBase
}));
let version = packageJson.version;
// @ts-ignore JSON checking: quality is optional
const quality = product.quality;
@@ -378,8 +301,15 @@ function packageTask(platform, arch, opts) {
// {{SQL CARBON EDIT}}
const name = (platform === 'darwin') ? 'Azure Data Studio' : product.nameShort;
const packageJsonUpdates = { name, version };
// for linux url handling
if (platform === 'linux') {
packageJsonUpdates.desktopName = `${product.applicationName}-url-handler.desktop`;
}
const packageJsonStream = gulp.src(['package.json'], { base: '.' })
.pipe(json({ name, version }));
.pipe(json(packageJsonUpdates));
const date = new Date().toISOString();
const productJsonUpdate = { commit, date, checksums };
@@ -391,14 +321,13 @@ function packageTask(platform, arch, opts) {
const productJsonStream = gulp.src(['product.json'], { base: '.' })
.pipe(json(productJsonUpdate));
const license = gulp.src(['LICENSES.chromium.html', 'LICENSE.txt', 'ThirdPartyNotices.txt', 'licenses/**'], { base: '.' });
const watermark = gulp.src(['resources/letterpress.svg', 'resources/letterpress-dark.svg', 'resources/letterpress-hc.svg'], { base: '.' });
const license = gulp.src(['LICENSES.chromium.html', product.licenseFileName, 'ThirdPartyNotices.txt', 'licenses/**'], { base: '.', allowEmpty: true });
// TODO the API should be copied to `out` during compile, not here
const api = gulp.src('src/vs/vscode.d.ts').pipe(rename('out/vs/vscode.d.ts'));
// {{SQL CARBON EDIT}}
const dataApi = gulp.src('src/vs/data.d.ts').pipe(rename('out/sql/data.d.ts'));
// {{SQL CARBON EDIT}}
const dataApi = gulp.src('src/sql/azdata.d.ts').pipe(rename('out/sql/azdata.d.ts'));
const sqlopsAPI = gulp.src('src/sql/sqlops.d.ts').pipe(rename('out/sql/sqlops.d.ts'));
const depsSrc = [
..._.flatten(productionDependencies.map(d => path.relative(root, d.path)).map(d => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
@@ -409,16 +338,17 @@ function packageTask(platform, arch, opts) {
const deps = gulp.src(depsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util.cleanNodeModule('fsevents', ['binding.gyp', 'fsevents.cc', 'build/**', 'src/**', 'test/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('oniguruma', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node', 'src/*.js']))
.pipe(util.cleanNodeModule('vscode-sqlite3', ['binding.gyp', 'benchmark/**', 'cloudformation/**', 'deps/**', 'test/**', 'build/**', 'src/**'], ['build/Release/*.node']))
.pipe(util.cleanNodeModule('oniguruma', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['build/Release/*.node', 'src/*.js']))
.pipe(util.cleanNodeModule('windows-mutex', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-keymap', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-is-elevated', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-watchdog', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('spdlog', ['binding.gyp', 'build/**', 'deps/**', 'src/**', 'test/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('native-keymap', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['build/Release/*.node']))
.pipe(util.cleanNodeModule('native-is-elevated', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['build/Release/*.node']))
.pipe(util.cleanNodeModule('native-watchdog', ['binding.gyp', 'build/**', 'src/**'], ['build/Release/*.node']))
.pipe(util.cleanNodeModule('spdlog', ['binding.gyp', 'build/**', 'deps/**', 'src/**', 'test/**'], ['build/Release/*.node']))
.pipe(util.cleanNodeModule('jschardet', ['dist/**']))
.pipe(util.cleanNodeModule('windows-foreground-love', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('windows-process-tree', ['binding.gyp', 'build/**', 'src/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('gc-signals', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['**/*.node', 'src/index.js']))
.pipe(util.cleanNodeModule('gc-signals', ['binding.gyp', 'build/**', 'src/**', 'deps/**'], ['build/Release/*.node', 'src/index.js']))
.pipe(util.cleanNodeModule('keytar', ['binding.gyp', 'build/**', 'src/**', 'script/**', 'node_modules/**'], ['**/*.node']))
.pipe(util.cleanNodeModule('node-pty', ['binding.gyp', 'build/**', 'src/**', 'tools/**'], ['build/Release/*.exe', 'build/Release/*.dll', 'build/Release/*.node']))
// {{SQL CARBON EDIT}}
@@ -430,7 +360,10 @@ function packageTask(platform, arch, opts) {
.pipe(util.cleanNodeModule('slickgrid', ['node_modules/**', 'examples/**'], undefined))
.pipe(util.cleanNodeModule('nsfw', ['binding.gyp', 'build/**', 'src/**', 'openpa/**', 'includes/**'], ['**/*.node', '**/*.a']))
.pipe(util.cleanNodeModule('vscode-nsfw', ['binding.gyp', 'build/**', 'src/**', 'openpa/**', 'includes/**'], ['**/*.node', '**/*.a']))
// {{SQL CARBON EDIT}} - End
.pipe(util.cleanNodeModule('vsda', ['binding.gyp', 'README.md', 'build/**', '*.bat', '*.sh', '*.cpp', '*.h'], ['build/Release/vsda.node']))
.pipe(util.cleanNodeModule('vscode-windows-ca-certs', ['**/*'], ['package.json', '**/*.node']))
.pipe(util.cleanNodeModule('node-addon-api', ['**/*']))
.pipe(createAsar(path.join(process.cwd(), 'node_modules'), ['**/*.node', '**/vscode-ripgrep/bin/*', '**/node-pty/build/Release/*'], 'app/node_modules.asar'));
// {{SQL CARBON EDIT}}
@@ -440,24 +373,33 @@ function packageTask(platform, arch, opts) {
'node_modules/slickgrid/**/*.*',
'node_modules/underscore/**/*.*',
'node_modules/zone.js/**/*.*',
'node_modules/chart.js/**/*.*'
'node_modules/chart.js/**/*.*',
'node_modules/chartjs-color/**/*.*',
'node_modules/chartjs-color-string/**/*.*',
'node_modules/color-convert/**/*.*',
'node_modules/color-name/**/*.*',
'node_modules/moment/**/*.*'
], { base: '.', dot: true });
let all = es.merge(
packageJsonStream,
productJsonStream,
license,
watermark,
api,
// {{SQL CARBON EDIT}}
copiedModules,
dataApi,
sqlopsAPI,
sources,
deps
);
if (platform === 'win32') {
all = es.merge(all, gulp.src(['resources/win32/code_file.ico', 'resources/win32/code_70x70.png', 'resources/win32/code_150x150.png'], { base: '.' }));
all = es.merge(all, gulp.src([
// {{SQL CARBON EDIT}} remove unused icons
'resources/win32/code_70x70.png',
'resources/win32/code_150x150.png'
], { base: '.' }));
} else if (platform === 'linux') {
all = es.merge(all, gulp.src('resources/linux/code.png', { base: '.' }));
} else if (platform === 'darwin') {
@@ -473,8 +415,10 @@ function packageTask(platform, arch, opts) {
.pipe(electron(_.extend({}, config, { platform, arch, ffmpegChromium: true })))
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
// result = es.merge(result, gulp.src('resources/completions/**', { base: '.' }));
if (platform === 'win32') {
result = es.merge(result, gulp.src('resources/win32/bin/code.js', { base: 'resources/win32' }));
result = es.merge(result, gulp.src('resources/win32/bin/code.js', { base: 'resources/win32', allowEmpty: true }));
result = es.merge(result, gulp.src('resources/win32/bin/code.cmd', { base: 'resources/win32' })
.pipe(replace('@@NAME@@', product.nameShort))
@@ -482,6 +426,8 @@ function packageTask(platform, arch, opts) {
result = es.merge(result, gulp.src('resources/win32/bin/code.sh', { base: 'resources/win32' })
.pipe(replace('@@NAME@@', product.nameShort))
.pipe(replace('@@COMMIT@@', commit))
.pipe(replace('@@APPNAME@@', product.applicationName))
.pipe(rename(function (f) { f.basename = product.applicationName; f.extname = ''; })));
result = es.merge(result, gulp.src('resources/win32/VisualElementsManifest.xml', { base: 'resources/win32' })
@@ -492,41 +438,52 @@ function packageTask(platform, arch, opts) {
.pipe(rename('bin/' + product.applicationName)));
}
// submit all stats that have been collected
// during the build phase
if (opts.stats) {
result.on('end', () => {
const { submitAllStats } = require('./lib/stats');
submitAllStats(product, commit).then(() => console.log('Submitted bundle stats!'));
});
}
return result.pipe(vfs.dest(destination));
};
}
const buildRoot = path.dirname(root);
// {{SQL CARBON EDIT}}
gulp.task('vscode-win32-x64-azurecore', ['optimize-vscode'], packageExtensionTask('azurecore', 'win32', 'x64'));
gulp.task('vscode-darwin-azurecore', ['optimize-vscode'], packageExtensionTask('azurecore', 'darwin'));
gulp.task('vscode-linux-x64-azurecore', ['optimize-vscode'], packageExtensionTask('azurecore', 'linux', 'x64'));
const BUILD_TARGETS = [
{ platform: 'win32', arch: 'ia32' },
{ platform: 'win32', arch: 'x64' },
{ platform: 'darwin', arch: null, opts: { stats: true } },
{ platform: 'linux', arch: 'ia32' },
{ platform: 'linux', arch: 'x64' },
{ platform: 'linux', arch: 'arm' },
{ platform: 'linux', arch: 'arm64' },
];
BUILD_TARGETS.forEach(buildTarget => {
const dashed = (str) => (str ? `-${str}` : ``);
const platform = buildTarget.platform;
const arch = buildTarget.arch;
const opts = buildTarget.opts;
gulp.task('vscode-win32-x64-mssql', ['vscode-linux-x64-azurecore', 'optimize-vscode'], packageExtensionTask('mssql', 'win32', 'x64'));
gulp.task('vscode-darwin-mssql', ['vscode-linux-x64-azurecore', 'optimize-vscode'], packageExtensionTask('mssql', 'darwin'));
gulp.task('vscode-linux-x64-mssql', ['vscode-linux-x64-azurecore', 'optimize-vscode'], packageExtensionTask('mssql', 'linux', 'x64'));
['', 'min'].forEach(minified => {
const sourceFolderName = `out-vscode${dashed(minified)}`;
const destinationFolderName = `azuredatastudio${dashed(platform)}${dashed(arch)}`;
gulp.task('clean-vscode-win32-ia32', util.rimraf(path.join(buildRoot, 'azuredatastudio-win32-ia32')));
gulp.task('clean-vscode-win32-x64', util.rimraf(path.join(buildRoot, 'azuredatastudio-win32-x64')));
gulp.task('clean-vscode-darwin', util.rimraf(path.join(buildRoot, 'azuredatastudio-darwin')));
gulp.task('clean-vscode-linux-ia32', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-ia32')));
gulp.task('clean-vscode-linux-x64', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-x64')));
gulp.task('clean-vscode-linux-arm', util.rimraf(path.join(buildRoot, 'azuredatastudio-linux-arm')));
gulp.task('vscode-win32-ia32', ['optimize-vscode', 'clean-vscode-win32-ia32'], packageTask('win32', 'ia32'));
gulp.task('vscode-win32-x64', ['vscode-win32-x64-azurecore', 'vscode-win32-x64-mssql', 'optimize-vscode', 'clean-vscode-win32-x64'], packageTask('win32', 'x64'));
gulp.task('vscode-darwin', ['vscode-darwin-azurecore', 'vscode-darwin-mssql', 'optimize-vscode', 'clean-vscode-darwin'], packageTask('darwin'));
gulp.task('vscode-linux-ia32', ['optimize-vscode', 'clean-vscode-linux-ia32'], packageTask('linux', 'ia32'));
gulp.task('vscode-linux-x64', ['vscode-linux-x64-azurecore', 'vscode-linux-x64-mssql', 'optimize-vscode', 'clean-vscode-linux-x64'], packageTask('linux', 'x64'));
gulp.task('vscode-linux-arm', ['optimize-vscode', 'clean-vscode-linux-arm'], packageTask('linux', 'arm'));
gulp.task('vscode-win32-ia32-min', ['minify-vscode', 'clean-vscode-win32-ia32'], packageTask('win32', 'ia32', { minified: true }));
gulp.task('vscode-win32-x64-min', ['minify-vscode', 'clean-vscode-win32-x64'], packageTask('win32', 'x64', { minified: true }));
gulp.task('vscode-darwin-min', ['minify-vscode', 'clean-vscode-darwin'], packageTask('darwin', null, { minified: true }));
gulp.task('vscode-linux-ia32-min', ['minify-vscode', 'clean-vscode-linux-ia32'], packageTask('linux', 'ia32', { minified: true }));
gulp.task('vscode-linux-x64-min', ['minify-vscode', 'clean-vscode-linux-x64'], packageTask('linux', 'x64', { minified: true }));
gulp.task('vscode-linux-arm-min', ['minify-vscode', 'clean-vscode-linux-arm'], packageTask('linux', 'arm', { minified: true }));
const vscodeTask = task.define(`vscode${dashed(platform)}${dashed(arch)}${dashed(minified)}`, task.series(
task.parallel(
minified ? minifyVSCodeTask : optimizeVSCodeTask,
util.rimraf(path.join(buildRoot, destinationFolderName))
),
ext.packageExtensionTask('mssql', platform, arch),
ext.packageExtensionTask('azurecore', platform, arch),
packageTask(platform, arch, sourceFolderName, destinationFolderName, opts)
));
gulp.task(vscodeTask);
});
});
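Each (platform, arch, minified) combination registers one task whose name `dashed` assembles, prefixing a hyphen only for non-empty parts, so a null arch or the unminified variant simply drops out:

const dashed = (str) => (str ? `-${str}` : ``);

console.log(`vscode${dashed('win32')}${dashed('x64')}${dashed('min')}`);
// -> vscode-win32-x64-min
console.log(`vscode${dashed('darwin')}${dashed(null)}${dashed('')}`);
// -> vscode-darwin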
// Transifex Localizations
@@ -549,68 +506,78 @@ const apiHostname = process.env.TRANSIFEX_API_URL;
const apiName = process.env.TRANSIFEX_API_NAME;
const apiToken = process.env.TRANSIFEX_API_TOKEN;
gulp.task('vscode-translations-push', ['optimize-vscode'], function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
gulp.task(task.define(
'vscode-translations-push',
task.series(
optimizeVSCodeTask,
function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(i18n.pushXlfFiles(apiHostname, apiName, apiToken));
});
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(i18n.pushXlfFiles(apiHostname, apiName, apiToken));
}
)
));
gulp.task('vscode-translations-push-test', ['optimize-vscode'], function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
gulp.task(task.define(
'vscode-translations-export',
task.series(
optimizeVSCodeTask,
function () {
const pathToMetadata = './out-vscode/nls.metadata.json';
const pathToExtensions = './extensions/*';
const pathToSetup = 'build/win32/**/{Default.isl,messages.en.isl}';
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
// {{SQL CARBON EDIT}}
// disable since function makes calls to VS Code Transifex API
// ).pipe(i18n.findObsoleteResources(apiHostname, apiName, apiToken)
).pipe(vfs.dest('../vscode-transifex-input'));
});
return es.merge(
gulp.src(pathToMetadata).pipe(i18n.createXlfFilesForCoreBundle()),
gulp.src(pathToSetup).pipe(i18n.createXlfFilesForIsl()),
gulp.src(pathToExtensions).pipe(i18n.createXlfFilesForExtensions())
).pipe(vfs.dest('../vscode-translations-export'));
}
)
));
gulp.task('vscode-translations-pull', function () {
[...i18n.defaultLanguages, ...i18n.extraLanguages].forEach(language => {
i18n.pullCoreAndExtensionsXlfFiles(apiHostname, apiName, apiToken, language).pipe(vfs.dest(`../vscode-localization/${language.id}/build`));
return es.merge([...i18n.defaultLanguages, ...i18n.extraLanguages].map(language => {
let includeDefault = !!innoSetupConfig[language.id].defaultInfo;
i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-localization/${language.id}/setup`));
});
return i18n.pullSetupXlfFiles(apiHostname, apiName, apiToken, language, includeDefault).pipe(vfs.dest(`../vscode-translations-import/${language.id}/setup`));
}));
});
gulp.task('vscode-translations-import', function () {
// {{SQL CARBON EDIT}} - Replace function body with our own
[...i18n.defaultLanguages, ...i18n.extraLanguages].forEach(language => {
gulp.src(`../vscode-localization/${language.id}/build/*/*.xlf`)
.pipe(i18n.prepareI18nFiles())
.pipe(vfs.dest(`./i18n/${language.folderName}`));
// {{SQL CARBON EDIT}}
// gulp.src(`../vscode-localization/${language.id}/setup/*/*.xlf`)
// .pipe(i18n.prepareIslFiles(language, innoSetupConfig[language.id]))
// .pipe(vfs.dest(`./build/win32/i18n`));
});
// {{SQL CARBON EDIT}} - End
});
// Sourcemaps
gulp.task('upload-vscode-sourcemaps', ['minify-vscode'], () => {
gulp.task('upload-vscode-sourcemaps', () => {
const vs = gulp.src('out-vscode-min/**/*.map', { base: 'out-vscode-min' })
.pipe(es.mapSync(f => {
f.path = `${f.base}/core/${f.relative}`;
return f;
}));
const extensions = gulp.src('extensions/**/out/**/*.map', { base: '.' });
const extensionsOut = gulp.src('extensions/**/out/**/*.map', { base: '.' });
const extensionsDist = gulp.src('extensions/**/dist/**/*.map', { base: '.' });
return es.merge(vs, extensions)
return es.merge(vs, extensionsOut, extensionsDist)
.pipe(es.through(function (data) {
// debug
console.log('Uploading Sourcemap', data.relative);
this.emit('data', data);
}))
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
@@ -619,57 +586,8 @@ gulp.task('upload-vscode-sourcemaps', ['minify-vscode'], () => {
}));
});
const allConfigDetailsPath = path.join(os.tmpdir(), 'configuration.json');
gulp.task('upload-vscode-configuration', ['generate-vscode-configuration'], () => {
if (!shouldSetupSettingsSearch()) {
const branch = process.env.BUILD_SOURCEBRANCH;
console.log(`Only runs on master and release branches, not ${branch}`);
return;
}
if (!fs.existsSync(allConfigDetailsPath)) {
throw new Error(`configuration file at ${allConfigDetailsPath} does not exist`);
}
const settingsSearchBuildId = getSettingsSearchBuildId(packageJson);
if (!settingsSearchBuildId) {
throw new Error('Failed to compute build number');
}
return gulp.src(allConfigDetailsPath)
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'configuration',
prefix: `${settingsSearchBuildId}/${commit}/`
}));
});
function shouldSetupSettingsSearch() {
const branch = process.env.BUILD_SOURCEBRANCH;
return branch && (/\/master$/.test(branch) || branch.indexOf('/release/') >= 0);
}
function getSettingsSearchBuildId(packageJson) {
try {
const branch = process.env.BUILD_SOURCEBRANCH;
const branchId = branch.indexOf('/release/') >= 0 ? 0 :
/\/master$/.test(branch) ? 1 :
2; // Some unexpected branch
const out = cp.execSync(`git rev-list HEAD --count`);
const count = parseInt(out.toString());
// <version number><commit count><branchId (avoid unlikely conflicts)>
// 1.25.1, 1,234,567 commits, master = 1250112345671
return util.versionStringToNumber(packageJson.version) * 1e8 + count * 10 + branchId;
} catch (e) {
throw new Error('Could not determine build number: ' + e.toString());
}
}
// This task is only run for the MacOS build
gulp.task('generate-vscode-configuration', () => {
const generateVSCodeConfigurationTask = task.define('generate-vscode-configuration', () => {
return new Promise((resolve, reject) => {
const buildDir = process.env['AGENT_BUILDDIRECTORY'];
if (!buildDir) {
@@ -706,6 +624,61 @@ gulp.task('generate-vscode-configuration', () => {
});
});
const allConfigDetailsPath = path.join(os.tmpdir(), 'configuration.json');
gulp.task(task.define(
'upload-vscode-configuration',
task.series(
generateVSCodeConfigurationTask,
() => {
if (!shouldSetupSettingsSearch()) {
const branch = process.env.BUILD_SOURCEBRANCH;
console.log(`Only runs on master and release branches, not ${branch}`);
return;
}
if (!fs.existsSync(allConfigDetailsPath)) {
throw new Error(`configuration file at ${allConfigDetailsPath} does not exist`);
}
const settingsSearchBuildId = getSettingsSearchBuildId(packageJson);
if (!settingsSearchBuildId) {
throw new Error('Failed to compute build number');
}
return gulp.src(allConfigDetailsPath)
.pipe(azure.upload({
account: process.env.AZURE_STORAGE_ACCOUNT,
key: process.env.AZURE_STORAGE_ACCESS_KEY,
container: 'configuration',
prefix: `${settingsSearchBuildId}/${commit}/`
}));
}
)
));
function shouldSetupSettingsSearch() {
const branch = process.env.BUILD_SOURCEBRANCH;
return branch && (/\/master$/.test(branch) || branch.indexOf('/release/') >= 0);
}
function getSettingsSearchBuildId(packageJson) {
try {
const branch = process.env.BUILD_SOURCEBRANCH;
const branchId = branch.indexOf('/release/') >= 0 ? 0 :
/\/master$/.test(branch) ? 1 :
2; // Some unexpected branch
const out = cp.execSync(`git rev-list HEAD --count`);
const count = parseInt(out.toString());
// <version number><commit count><branchId (avoid unlikely conflicts)>
// 1.25.1, 1,234,567 commits, master = 1250112345671
return util.versionStringToNumber(packageJson.version) * 1e8 + count * 10 + branchId;
} catch (e) {
throw new Error('Could not determine build number: ' + e.toString());
}
}
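The packing in getSettingsSearchBuildId is positional: the version number occupies the high digits, the commit count is multiplied by ten, and the branch id fills the last digit. Reproducing the comment's own example, assuming `util.versionStringToNumber('1.25.1')` yields 12501:

// version 1.25.1, 1,234,567 commits, master (branchId 1):
//   12501 * 1e8  = 1250100000000
// + 1234567 * 10 =      12345670
// + 1            =             1
console.log(12501 * 1e8 + 1234567 * 10 + 1); // -> 1250112345671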
// {{SQL CARBON EDIT}}
// Install service locally before building carbon
@@ -728,6 +701,5 @@ function installService() {
}
gulp.task('install-sqltoolsservice', () => {
return installService();
return installService();
});

View File

@@ -12,14 +12,18 @@ const shell = require('gulp-shell');
const es = require('event-stream');
const vfs = require('vinyl-fs');
const util = require('./lib/util');
const task = require('./lib/task');
const packageJson = require('../package.json');
const product = require('../product.json');
const rpmDependencies = require('../resources/linux/rpm/dependencies.json');
const path = require('path');
const root = path.dirname(__dirname);
const commit = util.getVersion(root);
const linuxPackageRevision = Math.floor(new Date().getTime() / 1000);
function getDebPackageArch(arch) {
return { x64: 'amd64', ia32: 'i386', arm: 'armhf' }[arch];
return { x64: 'amd64', ia32: 'i386', arm: 'armhf', arm64: "arm64" }[arch];
}
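With the arm64 entry added, Node's process.arch values map onto Debian package architectures as follows:

getDebPackageArch('x64');   // -> 'amd64'
getDebPackageArch('ia32');  // -> 'i386'
getDebPackageArch('arm');   // -> 'armhf'
getDebPackageArch('arm64'); // -> 'arm64'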
function prepareDebPackage(arch) {
@@ -30,11 +34,17 @@ function prepareDebPackage(arch) {
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename('usr/share/applications/' + product.applicationName + '.desktop'));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename('usr/share/applications/' + product.applicationName + '-url-handler.desktop'));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', product.applicationName))
.pipe(rename('usr/share/applications/' + product.applicationName + '.desktop'));
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const appdata = gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
@@ -43,7 +53,13 @@ function prepareDebPackage(arch) {
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('usr/share/pixmaps/' + product.applicationName + '.png'));
.pipe(rename('usr/share/pixmaps/' + product.linuxIconName + '.png'));
// const bash_completion = gulp.src('resources/completions/bash/code')
// .pipe(rename('usr/share/bash-completion/completions/code'));
// const zsh_completion = gulp.src('resources/completions/zsh/_code')
// .pipe(rename('usr/share/zsh/vendor-completions/_code'));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'usr/share/' + product.applicationName + '/' + p.dirname; }));
@@ -79,7 +95,7 @@ function prepareDebPackage(arch) {
.pipe(replace('@@UPDATEURL@@', product.updateUrl || '@@UPDATEURL@@'))
.pipe(rename('DEBIAN/postinst'));
const all = es.merge(control, postinst, postrm, prerm, desktop, appdata, icon, code);
const all = es.merge(control, postinst, postrm, prerm, desktops, appdata, icon, /* bash_completion, zsh_completion, */ code);
return all.pipe(vfs.dest(destination));
};
@@ -99,7 +115,7 @@ function getRpmBuildPath(rpmArch) {
}
function getRpmPackageArch(arch) {
return { x64: 'x86_64', ia32: 'i386', arm: 'armhf' }[arch];
return { x64: 'x86_64', ia32: 'i386', arm: 'armhf', arm64: "arm64" }[arch];
}
function prepareRpmPackage(arch) {
@@ -109,11 +125,17 @@ function prepareRpmPackage(arch) {
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '.desktop'));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '-url-handler.desktop'));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', product.applicationName))
.pipe(rename('BUILD/usr/share/applications/' + product.applicationName + '.desktop'));
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const appdata = gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
@@ -122,7 +144,13 @@ function prepareRpmPackage(arch) {
.pipe(rename('usr/share/appdata/' + product.applicationName + '.appdata.xml'));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename('BUILD/usr/share/pixmaps/' + product.applicationName + '.png'));
.pipe(rename('BUILD/usr/share/pixmaps/' + product.linuxIconName + '.png'));
// const bash_completion = gulp.src('resources/completions/bash/code')
// .pipe(rename('BUILD/usr/share/bash-completion/completions/code'));
// const zsh_completion = gulp.src('resources/completions/zsh/_code')
// .pipe(rename('BUILD/usr/share/zsh/site-functions/_code'));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'BUILD/usr/share/' + product.applicationName + '/' + p.dirname; }));
@@ -130,6 +158,7 @@ function prepareRpmPackage(arch) {
const spec = gulp.src('resources/linux/rpm/code.spec.template', { base: '.' })
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@ICON@@', product.linuxIconName))
.pipe(replace('@@VERSION@@', packageJson.version))
.pipe(replace('@@RELEASE@@', linuxPackageRevision))
.pipe(replace('@@ARCHITECTURE@@', rpmArch))
@@ -144,7 +173,7 @@ function prepareRpmPackage(arch) {
const specIcon = gulp.src('resources/linux/rpm/code.xpm', { base: '.' })
.pipe(rename('SOURCES/' + product.applicationName + '.xpm'));
const all = es.merge(code, desktop, appdata, icon, spec, specIcon);
const all = es.merge(code, desktops, appdata, icon, /* bash_completion, zsh_completion, */ spec, specIcon);
return all.pipe(vfs.dest(getRpmBuildPath(rpmArch)));
};
@@ -162,37 +191,45 @@ function buildRpmPackage(arch) {
'cp "' + rpmOut + '/$(ls ' + rpmOut + ')" ' + destination + '/'
]);
}
function getSnapBuildPath(arch) {
return `.build/linux/snap/${arch}/${product.applicationName}-${arch}`;
}
function prepareSnapPackage(arch) {
const binaryDir = '../VSCode-linux-' + arch;
// {{SQL CARBON EDIT}}
const binaryDir = '../azuredatastudio-linux-' + arch;
const destination = getSnapBuildPath(arch);
return function () {
const desktop = gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(rename(`usr/share/applications/${product.applicationName}.desktop`));
const desktopUrlHandler = gulp.src('resources/linux/code-url-handler.desktop', { base: '.' })
.pipe(rename(`usr/share/applications/${product.applicationName}-url-handler.desktop`));
const desktops = es.merge(desktop, desktopUrlHandler)
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@ICON@@', `/usr/share/pixmaps/${product.applicationName}.png`))
.pipe(rename(`usr/share/applications/${product.applicationName}.desktop`));
.pipe(replace('@@ICON@@', `/usr/share/pixmaps/${product.linuxIconName}.png`))
.pipe(replace('@@URLPROTOCOL@@', product.urlProtocol));
const icon = gulp.src('resources/linux/code.png', { base: '.' })
.pipe(rename(`usr/share/pixmaps/${product.applicationName}.png`));
.pipe(rename(`usr/share/pixmaps/${product.linuxIconName}.png`));
const code = gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) { p.dirname = 'usr/share/' + product.applicationName + '/' + p.dirname; }));
.pipe(rename(function (p) { p.dirname = `usr/share/${product.applicationName}/${p.dirname}`; }));
const snapcraft = gulp.src('resources/linux/snap/snapcraft.yaml', { base: '.' })
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(replace('@@VERSION@@', packageJson.version))
.pipe(replace('@@VERSION@@', commit.substr(0, 8)))
.pipe(rename('snap/snapcraft.yaml'));
const electronLaunch = gulp.src('resources/linux/snap/electron-launch', { base: '.' })
.pipe(rename('electron-launch'));
const all = es.merge(desktop, icon, code, snapcraft, electronLaunch);
const all = es.merge(desktops, icon, code, snapcraft, electronLaunch);
return all.pipe(vfs.dest(destination));
};
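All three Linux flavors (deb, rpm, snap) build their `.desktop` entries the same way: source a template, chain `gulp-replace` substitutions for the `@@...@@` tokens, and rename into the target tree. The pattern in isolation; the substituted values come from product.json, so those shown here are placeholders:

const gulp = require('gulp');
const replace = require('gulp-replace');
const rename = require('gulp-rename');

gulp.src('resources/linux/code.desktop', { base: '.' })
    .pipe(replace('@@NAME_LONG@@', 'Azure Data Studio'))   // product.nameLong
    .pipe(replace('@@NAME@@', 'azuredatastudio'))          // product.applicationName
    .pipe(replace('@@ICON@@', 'azuredatastudio'))          // product.linuxIconName (assumed value)
    .pipe(replace('@@URLPROTOCOL@@', 'azuredatastudio'))   // product.urlProtocol (assumed value)
    .pipe(rename('usr/share/applications/azuredatastudio.desktop'))
    .pipe(gulp.dest('.build/linux/desktop-sketch'));       // hypothetical destination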
@@ -200,117 +237,39 @@ function prepareSnapPackage(arch) {
function buildSnapPackage(arch) {
const snapBuildPath = getSnapBuildPath(arch);
const snapFilename = `${product.applicationName}-${packageJson.version}-${linuxPackageRevision}-${arch}.snap`;
return shell.task([
`chmod +x ${snapBuildPath}/electron-launch`,
`cd ${snapBuildPath} && snapcraft snap --output ../${snapFilename}`
]);
return shell.task(`cd ${snapBuildPath} && snapcraft build`);
}
function getFlatpakArch(arch) {
return { x64: 'x86_64', ia32: 'i386', arm: 'arm' }[arch];
}
const BUILD_TARGETS = [
{ arch: 'ia32' },
{ arch: 'x64' },
{ arch: 'arm' },
{ arch: 'arm64' },
];
function prepareFlatpak(arch) {
// {{SQL CARBON EDIT}}
const binaryDir = '../azuredatastudio-linux-' + arch;
const flatpakArch = getFlatpakArch(arch);
const destination = '.build/linux/flatpak/' + flatpakArch;
BUILD_TARGETS.forEach((buildTarget) => {
const arch = buildTarget.arch;
return function () {
// This is not imported in the global scope to avoid requiring ImageMagick
// (or GraphicsMagick) when not building Flatpak bundles.
const imgResize = require('gulp-image-resize');
const all = [16, 24, 32, 48, 64, 128, 192, 256, 512].map(function (size) {
return gulp.src('resources/linux/code.png', { base: '.' })
.pipe(imgResize({ width: size, height: size, format: "png", noProfile: true }))
.pipe(rename('share/icons/hicolor/' + size + 'x' + size + '/apps/' + flatpakManifest.appId + '.png'));
});
all.push(gulp.src('resources/linux/code.desktop', { base: '.' })
.pipe(replace('Exec=/usr/share/@@NAME@@/@@NAME@@', 'Exec=' + product.applicationName))
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME_SHORT@@', product.nameShort))
.pipe(replace('@@NAME@@', product.applicationName))
.pipe(rename('share/applications/' + flatpakManifest.appId + '.desktop')));
all.push(gulp.src('resources/linux/code.appdata.xml', { base: '.' })
.pipe(replace('@@NAME_LONG@@', product.nameLong))
.pipe(replace('@@NAME@@', flatpakManifest.appId))
.pipe(replace('@@LICENSE@@', product.licenseName))
.pipe(rename('share/appdata/' + flatpakManifest.appId + '.appdata.xml')));
all.push(gulp.src(binaryDir + '/**/*', { base: binaryDir })
.pipe(rename(function (p) {
p.dirname = 'share/' + product.applicationName + '/' + p.dirname;
})));
return es.merge(all).pipe(vfs.dest(destination));
};
}
function buildFlatpak(arch) {
const flatpakArch = getFlatpakArch(arch);
const manifest = {};
for (var k in flatpakManifest) {
manifest[k] = flatpakManifest[k];
}
manifest.files = [
['.build/linux/flatpak/' + flatpakArch, '/'],
];
const buildOptions = {
arch: flatpakArch,
subject: product.nameLong + ' ' + packageJson.version + '.' + linuxPackageRevision,
};
// If requested, use the configured path for the OSTree repository.
if (process.env.FLATPAK_REPO) {
buildOptions.repoDir = process.env.FLATPAK_REPO;
} else {
buildOptions.bundlePath = manifest.appId + '-' + flatpakArch + '.flatpak';
}
// Set up PGP signing if requested.
if (process.env.GPG_KEY_ID !== undefined) {
buildOptions.gpgSign = process.env.GPG_KEY_ID;
if (process.env.GPG_HOMEDIR) {
buildOptions.gpgHomedir = process.env.GPG_HOMEDIR;
}
}
return function (cb) {
require('flatpak-bundler').bundle(manifest, buildOptions, cb);
};
}
BUILD_TARGETS.forEach((buildTarget) => {
const arch = buildTarget.arch;
{
const debArch = getDebPackageArch(arch);
const prepareDebTask = task.define(`vscode-linux-${arch}-prepare-deb`, task.series(util.rimraf(`.build/linux/deb/${debArch}`), prepareDebPackage(arch)));
// gulp.task(prepareDebTask);
const buildDebTask = task.define(`vscode-linux-${arch}-build-deb`, task.series(prepareDebTask, buildDebPackage(arch)));
gulp.task(buildDebTask);
}
{
const rpmArch = getRpmPackageArch(arch);
const prepareRpmTask = task.define(`vscode-linux-${arch}-prepare-rpm`, task.series(util.rimraf(`.build/linux/rpm/${rpmArch}`), prepareRpmPackage(arch)));
// gulp.task(prepareRpmTask);
const buildRpmTask = task.define(`vscode-linux-${arch}-build-rpm`, task.series(prepareRpmTask, buildRpmPackage(arch)));
gulp.task(buildRpmTask);
}
{
const prepareSnapTask = task.define(`vscode-linux-${arch}-prepare-snap`, task.series(util.rimraf(`.build/linux/snap/${arch}`), prepareSnapPackage(arch)));
gulp.task(prepareSnapTask);
const buildSnapTask = task.define(`vscode-linux-${arch}-build-snap`, task.series(prepareSnapTask, buildSnapPackage(arch)));
gulp.task(buildSnapTask);
}
});
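As a usage note, the BUILD_TARGETS loop registers one build task per packaging format and architecture. A quick sketch (mirroring the template strings above, not querying gulp's registry) of the names it produces:

const BUILD_TARGETS = [{ arch: 'ia32' }, { arch: 'x64' }, { arch: 'arm' }, { arch: 'arm64' }];
for (const { arch } of BUILD_TARGETS) {
	for (const format of ['deb', 'rpm', 'snap']) {
		console.log(`vscode-linux-${arch}-build-${format}`); // e.g. vscode-linux-x64-build-deb
	}
}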
gulp.task('clean-vscode-linux-ia32-deb', util.rimraf('.build/linux/deb/i386'));
gulp.task('clean-vscode-linux-x64-deb', util.rimraf('.build/linux/deb/amd64'));
gulp.task('clean-vscode-linux-arm-deb', util.rimraf('.build/linux/deb/armhf'));
gulp.task('clean-vscode-linux-ia32-rpm', util.rimraf('.build/linux/rpm/i386'));
gulp.task('clean-vscode-linux-x64-rpm', util.rimraf('.build/linux/rpm/x86_64'));
gulp.task('clean-vscode-linux-arm-rpm', util.rimraf('.build/linux/rpm/armhf'));
gulp.task('clean-vscode-linux-ia32-snap', util.rimraf('.build/linux/snap/ia32'));
gulp.task('clean-vscode-linux-x64-snap', util.rimraf('.build/linux/snap/x64'));
gulp.task('clean-vscode-linux-arm-snap', util.rimraf('.build/linux/snap/arm'));
gulp.task('clean-vscode-linux-ia32-flatpak', util.rimraf('.build/linux/flatpak/i386'));
gulp.task('clean-vscode-linux-x64-flatpak', util.rimraf('.build/linux/flatpak/x86_64'));
gulp.task('clean-vscode-linux-arm-flatpak', util.rimraf('.build/linux/flatpak/arm'));
gulp.task('vscode-linux-ia32-prepare-deb', ['clean-vscode-linux-ia32-deb'], prepareDebPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-deb', ['clean-vscode-linux-x64-deb'], prepareDebPackage('x64'));
gulp.task('vscode-linux-arm-prepare-deb', ['clean-vscode-linux-arm-deb'], prepareDebPackage('arm'));
gulp.task('vscode-linux-ia32-build-deb', ['vscode-linux-ia32-prepare-deb'], buildDebPackage('ia32'));
gulp.task('vscode-linux-x64-build-deb', ['vscode-linux-x64-prepare-deb'], buildDebPackage('x64'));
gulp.task('vscode-linux-arm-build-deb', ['vscode-linux-arm-prepare-deb'], buildDebPackage('arm'));
gulp.task('vscode-linux-ia32-prepare-rpm', ['clean-vscode-linux-ia32-rpm'], prepareRpmPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-rpm', ['clean-vscode-linux-x64-rpm'], prepareRpmPackage('x64'));
gulp.task('vscode-linux-arm-prepare-rpm', ['clean-vscode-linux-arm-rpm'], prepareRpmPackage('arm'));
gulp.task('vscode-linux-ia32-build-rpm', ['vscode-linux-ia32-prepare-rpm'], buildRpmPackage('ia32'));
gulp.task('vscode-linux-x64-build-rpm', ['vscode-linux-x64-prepare-rpm'], buildRpmPackage('x64'));
gulp.task('vscode-linux-arm-build-rpm', ['vscode-linux-arm-prepare-rpm'], buildRpmPackage('arm'));
gulp.task('vscode-linux-ia32-prepare-snap', ['clean-vscode-linux-ia32-snap'], prepareSnapPackage('ia32'));
gulp.task('vscode-linux-x64-prepare-snap', ['clean-vscode-linux-x64-snap'], prepareSnapPackage('x64'));
gulp.task('vscode-linux-arm-prepare-snap', ['clean-vscode-linux-arm-snap'], prepareSnapPackage('arm'));
gulp.task('vscode-linux-ia32-build-snap', ['vscode-linux-ia32-prepare-snap'], buildSnapPackage('ia32'));
gulp.task('vscode-linux-x64-build-snap', ['vscode-linux-x64-prepare-snap'], buildSnapPackage('x64'));
gulp.task('vscode-linux-arm-build-snap', ['vscode-linux-arm-prepare-snap'], buildSnapPackage('arm'));

View File

@@ -12,9 +12,11 @@ const assert = require('assert');
const cp = require('child_process');
const _7z = require('7zip')['7z'];
const util = require('./lib/util');
const task = require('./lib/task');
const pkg = require('../package.json');
const product = require('../product.json');
const vfs = require('vinyl-fs');
const rcedit = require('rcedit');
const mkdirp = require('mkdirp');
const repoPath = path.dirname(__dirname);
@@ -25,18 +27,21 @@ const zipPath = arch => path.join(zipDir(arch), `VSCode-win32-${arch}.zip`);
const setupDir = (arch, target) => path.join(repoPath, '.build', `win32-${arch}`, `${target}-setup`);
const issPath = path.join(__dirname, 'win32', 'code.iss');
const innoSetupPath = path.join(path.dirname(path.dirname(require.resolve('innosetup-compiler'))), 'bin', 'ISCC.exe');
const signPS1 = path.join(repoPath, 'build', 'tfs', 'win32', 'sign.ps1');
const signPS1 = path.join(repoPath, 'build', 'azure-pipelines', 'win32', 'sign.ps1');
function packageInnoSetup(iss, options, cb) {
options = options || {};
const definitions = options.definitions || {};
if (process.argv.some(arg => arg === '--debug-inno')) {
definitions['Debug'] = 'true';
}
if (process.argv.some(arg => arg === '--sign')) {
definitions['Sign'] = 'true';
}
const keys = Object.keys(definitions);
keys.forEach(key => assert(typeof definitions[key] === 'string', `Missing value for '${key}' in Inno Setup package step`));
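Those definitions end up as preprocessor defines for the Inno Setup compiler. A hedged sketch of how such a map is typically flattened into ISCC's /D switches (the helper name is made up):

// ISCC accepts /Dname=value preprocessor defines on its command line.
function toInnoDefines(definitions) {
	return Object.keys(definitions).map(key => `/D${key}=${definitions[key]}`);
}
// toInnoDefines({ Debug: 'true', Sign: 'true' }) -> ['/DDebug=true', '/DSign=true']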
@@ -103,8 +108,8 @@ function buildWin32Setup(arch, target) {
}
function defineWin32SetupTasks(arch, target) {
gulp.task(`clean-vscode-win32-${arch}-${target}-setup`, util.rimraf(setupDir(arch, target)));
gulp.task(`vscode-win32-${arch}-${target}-setup`, [`clean-vscode-win32-${arch}-${target}-setup`], buildWin32Setup(arch, target));
const cleanTask = util.rimraf(setupDir(arch, target));
gulp.task(task.define(`vscode-win32-${arch}-${target}-setup`, task.series(cleanTask, buildWin32Setup(arch, target))));
}
defineWin32SetupTasks('ia32', 'system');
@@ -122,11 +127,8 @@ function archiveWin32Setup(arch) {
};
}
gulp.task('clean-vscode-win32-ia32-archive', util.rimraf(zipDir('ia32')));
gulp.task('vscode-win32-ia32-archive', ['clean-vscode-win32-ia32-archive'], archiveWin32Setup('ia32'));
gulp.task('clean-vscode-win32-x64-archive', util.rimraf(zipDir('x64')));
gulp.task('vscode-win32-x64-archive', ['clean-vscode-win32-x64-archive'], archiveWin32Setup('x64'));
gulp.task(task.define('vscode-win32-ia32-archive', task.series(util.rimraf(zipDir('ia32')), archiveWin32Setup('ia32'))));
gulp.task(task.define('vscode-win32-x64-archive', task.series(util.rimraf(zipDir('x64')), archiveWin32Setup('x64'))));
function copyInnoUpdater(arch) {
return () => {
@@ -135,5 +137,12 @@ function copyInnoUpdater(arch) {
};
}
gulp.task('vscode-win32-ia32-copy-inno-updater', copyInnoUpdater('ia32'));
gulp.task('vscode-win32-x64-copy-inno-updater', copyInnoUpdater('x64'));
function patchInnoUpdater(arch) {
return cb => {
const icon = path.join(repoPath, 'resources', 'win32', 'code.ico');
rcedit(path.join(buildPath(arch), 'tools', 'inno_updater.exe'), { icon }, cb);
};
}
gulp.task(task.define('vscode-win32-ia32-inno-updater', task.series(copyInnoUpdater('ia32'), patchInnoUpdater('ia32'))));
gulp.task(task.define('vscode-win32-x64-inno-updater', task.series(copyInnoUpdater('x64'), patchInnoUpdater('x64'))));
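The pattern above recurs throughout this diff: gulp 3 string-and-dependency registrations give way to explicitly composed task values. A rough approximation (not the real build/lib/task helpers) of what task.series and task.define provide:

// series runs async task functions in order; define attaches the name under
// which gulp.task() registers the composite.
const series = (...tasks) => async function () {
	for (const t of tasks) {
		await t();
	}
};
const define = (name, fn) => Object.assign(fn, { displayName: name });
// usage: gulp.task(define('vscode-win32-x64-archive', series(cleanTask, archiveTask)));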

build/jsconfig.json Normal file
View File

@@ -0,0 +1,15 @@
{
"compilerOptions": {
"module": "commonjs",
"target": "es2017",
"jsx": "preserve",
"checkJs": true
},
"include": [
"**/*.js"
],
"exclude": [
"node_modules",
"**/node_modules/*"
]
}
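The new build/jsconfig.json enables checkJs, so the TypeScript language service type-checks the build folder's plain JavaScript against the es2017 target. A hypothetical example of the kind of mistake this surfaces without any annotations:

// With "checkJs": true the language service reports errors in .js files:
const path = require('path');
path.join(42); // error TS2345: Argument of type 'number' is not assignable to parameter of type 'string'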

View File

@@ -4,33 +4,33 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var es = require("event-stream");
var pickle = require("chromium-pickle-js");
var Filesystem = require("asar/lib/filesystem");
var VinylFile = require("vinyl");
var minimatch = require("minimatch");
const path = require("path");
const es = require("event-stream");
const pickle = require('chromium-pickle-js');
const Filesystem = require('asar/lib/filesystem');
const VinylFile = require("vinyl");
const minimatch = require("minimatch");
function createAsar(folderPath, unpackGlobs, destFilename) {
var shouldUnpackFile = function (file) {
for (var i = 0; i < unpackGlobs.length; i++) {
const shouldUnpackFile = (file) => {
for (let i = 0; i < unpackGlobs.length; i++) {
if (minimatch(file.relative, unpackGlobs[i])) {
return true;
}
}
return false;
};
var filesystem = new Filesystem(folderPath);
var out = [];
const filesystem = new Filesystem(folderPath);
const out = [];
// Keep track of pending inserts
var pendingInserts = 0;
var onFileInserted = function () { pendingInserts--; };
let pendingInserts = 0;
let onFileInserted = () => { pendingInserts--; };
// Do not insert the same directory twice
var seenDir = {};
var insertDirectoryRecursive = function (dir) {
const seenDir = {};
const insertDirectoryRecursive = (dir) => {
if (seenDir[dir]) {
return;
}
var lastSlash = dir.lastIndexOf('/');
let lastSlash = dir.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = dir.lastIndexOf('\\');
}
@@ -40,8 +40,8 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
seenDir[dir] = true;
filesystem.insertDirectory(dir);
};
var insertDirectoryForFile = function (file) {
var lastSlash = file.lastIndexOf('/');
const insertDirectoryForFile = (file) => {
let lastSlash = file.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = file.lastIndexOf('\\');
}
@@ -49,7 +49,7 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
insertDirectoryRecursive(file.substring(0, lastSlash));
}
};
var insertFile = function (relativePath, stat, shouldUnpack) {
const insertFile = (relativePath, stat, shouldUnpack) => {
insertDirectoryForFile(relativePath);
pendingInserts++;
filesystem.insertFile(relativePath, shouldUnpack, { stat: stat }, {}, onFileInserted);
@@ -59,13 +59,13 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
return;
}
if (!file.stat.isFile()) {
throw new Error("unknown item in stream!");
throw new Error(`unknown item in stream!`);
}
var shouldUnpack = shouldUnpackFile(file);
const shouldUnpack = shouldUnpackFile(file);
insertFile(file.relative, { size: file.contents.length, mode: file.stat.mode }, shouldUnpack);
if (shouldUnpack) {
// The file goes outside of xx.asar, in a folder xx.asar.unpacked
var relative = path.relative(folderPath, file.path);
const relative = path.relative(folderPath, file.path);
this.queue(new VinylFile({
cwd: folderPath,
base: folderPath,
@@ -79,34 +79,33 @@ function createAsar(folderPath, unpackGlobs, destFilename) {
out.push(file.contents);
}
}, function () {
var _this = this;
var finish = function () {
let finish = () => {
{
var headerPickle = pickle.createEmpty();
const headerPickle = pickle.createEmpty();
headerPickle.writeString(JSON.stringify(filesystem.header));
var headerBuf = headerPickle.toBuffer();
var sizePickle = pickle.createEmpty();
const headerBuf = headerPickle.toBuffer();
const sizePickle = pickle.createEmpty();
sizePickle.writeUInt32(headerBuf.length);
var sizeBuf = sizePickle.toBuffer();
const sizeBuf = sizePickle.toBuffer();
out.unshift(headerBuf);
out.unshift(sizeBuf);
}
var contents = Buffer.concat(out);
const contents = Buffer.concat(out);
out.length = 0;
_this.queue(new VinylFile({
this.queue(new VinylFile({
cwd: folderPath,
base: folderPath,
path: destFilename,
contents: contents
}));
_this.queue(null);
this.queue(null);
};
// Call finish() only when all file inserts have finished...
if (pendingInserts === 0) {
finish();
}
else {
onFileInserted = function () {
onFileInserted = () => {
pendingInserts--;
if (pendingInserts === 0) {
finish();
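For orientation, createAsar is a stream transform meant to sit in a gulp/vinyl pipeline. A hypothetical usage (globs and paths are made up, and the export shape is assumed):

const vfs = require('vinyl-fs');
const { createAsar } = require('./lib/asar');

// Files matching the unpack globs are emitted beside the archive in an
// app.asar.unpacked folder instead of inside it.
vfs.src('out-build/**', { base: 'out-build' })
	.pipe(createAsar('out-build', ['**/*.node'], 'app.asar'))
	.pipe(vfs.dest('.build/app'));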

View File

@@ -7,8 +7,8 @@
import * as path from 'path';
import * as es from 'event-stream';
import * as pickle from 'chromium-pickle-js';
import * as Filesystem from 'asar/lib/filesystem';
const pickle = require('chromium-pickle-js');
const Filesystem = require('asar/lib/filesystem');
import * as VinylFile from 'vinyl';
import * as minimatch from 'minimatch';

View File

@@ -14,7 +14,8 @@ const es = require('event-stream');
const rename = require('gulp-rename');
const vfs = require('vinyl-fs');
const ext = require('./extensions');
const util = require('gulp-util');
const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');
const root = path.dirname(path.dirname(__dirname));
const builtInExtensions = require('../builtInExtensions.json');
@@ -43,22 +44,22 @@ function isUpToDate(extension) {
function syncMarketplaceExtension(extension) {
if (isUpToDate(extension)) {
util.log(util.colors.blue('[marketplace]'), `${extension.name}@${extension.version}`, util.colors.green('✔︎'));
fancyLog(ansiColors.blue('[marketplace]'), `${extension.name}@${extension.version}`, ansiColors.green('✔︎'));
return es.readArray([]);
}
rimraf.sync(getExtensionPath(extension));
return ext.fromMarketplace(extension.name, extension.version)
return ext.fromMarketplace(extension.name, extension.version, extension.metadata)
.pipe(rename(p => p.dirname = `${extension.name}/${p.dirname}`))
.pipe(vfs.dest('.build/builtInExtensions'))
.on('end', () => util.log(util.colors.blue('[marketplace]'), extension.name, util.colors.green('✔︎')));
.on('end', () => fancyLog(ansiColors.blue('[marketplace]'), extension.name, ansiColors.green('✔︎')));
}
function syncExtension(extension, controlState) {
switch (controlState) {
case 'disabled':
util.log(util.colors.blue('[disabled]'), util.colors.gray(extension.name));
fancyLog(ansiColors.blue('[disabled]'), ansiColors.gray(extension.name));
return es.readArray([]);
case 'marketplace':
@@ -66,15 +67,15 @@ function syncExtension(extension, controlState) {
default:
if (!fs.existsSync(controlState)) {
util.log(util.colors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but that path does not exist.`));
fancyLog(ansiColors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but that path does not exist.`));
return es.readArray([]);
} else if (!fs.existsSync(path.join(controlState, 'package.json'))) {
util.log(util.colors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but there is no 'package.json' file in that directory.`));
fancyLog(ansiColors.red(`Error: Built-in extension '${extension.name}' is configured to run from '${controlState}' but there is no 'package.json' file in that directory.`));
return es.readArray([]);
}
util.log(util.colors.blue('[local]'), `${extension.name}: ${util.colors.cyan(controlState)}`, util.colors.green('✔︎'));
fancyLog(ansiColors.blue('[local]'), `${extension.name}: ${ansiColors.cyan(controlState)}`, ansiColors.green('✔︎'));
return es.readArray([]);
}
}
@@ -93,8 +94,8 @@ function writeControlFile(control) {
}
function main() {
util.log('Synchronizing built-in extensions...');
util.log(`You can manage built-in extensions with the ${util.colors.cyan('--builtin')} flag`);
fancyLog('Synchronizing built-in extensions...');
fancyLog(`You can manage built-in extensions with the ${ansiColors.cyan('--builtin')} flag`);
const control = readControlFile();
const streams = [];
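The logging change in this file follows gulp-util's deprecation: its log and colors members are replaced by the standalone fancy-log and ansi-colors packages, with call sites otherwise unchanged. A minimal before/after sketch:

const fancyLog = require('fancy-log');
const ansiColors = require('ansi-colors');

// before: util.log(util.colors.blue('[marketplace]'), 'my-extension', util.colors.green('✔︎'));
fancyLog(ansiColors.blue('[marketplace]'), 'my-extension', ansiColors.green('✔︎'));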

View File

@@ -4,19 +4,19 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var fs = require("fs");
var path = require("path");
var vm = require("vm");
const fs = require("fs");
const path = require("path");
const vm = require("vm");
/**
* Bundle `entryPoints` given config `config`.
*/
function bundle(entryPoints, config, callback) {
var entryPointsMap = {};
entryPoints.forEach(function (module) {
const entryPointsMap = {};
entryPoints.forEach((module) => {
entryPointsMap[module.name] = module;
});
var allMentionedModulesMap = {};
entryPoints.forEach(function (module) {
const allMentionedModulesMap = {};
entryPoints.forEach((module) => {
allMentionedModulesMap[module.name] = true;
(module.include || []).forEach(function (includedModule) {
allMentionedModulesMap[includedModule] = true;
@@ -25,26 +25,30 @@ function bundle(entryPoints, config, callback) {
allMentionedModulesMap[excludedModule] = true;
});
});
var code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
var r = vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
var loaderModule = { exports: {} };
const code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
const r = vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
const loaderModule = { exports: {} };
r.call({}, require, loaderModule, loaderModule.exports);
var loader = loaderModule.exports;
const loader = loaderModule.exports;
config.isBuild = true;
config.paths = config.paths || {};
config.paths['vs/nls'] = 'out-build/vs/nls.build';
config.paths['vs/css'] = 'out-build/vs/css.build';
if (!config.paths['vs/nls']) {
config.paths['vs/nls'] = 'out-build/vs/nls.build';
}
if (!config.paths['vs/css']) {
config.paths['vs/css'] = 'out-build/vs/css.build';
}
loader.config(config);
loader(['require'], function (localRequire) {
var resolvePath = function (path) {
var r = localRequire.toUrl(path);
loader(['require'], (localRequire) => {
const resolvePath = (path) => {
const r = localRequire.toUrl(path);
if (!/\.js/.test(r)) {
return r + '.js';
}
return r;
};
for (var moduleId in entryPointsMap) {
var entryPoint = entryPointsMap[moduleId];
for (const moduleId in entryPointsMap) {
const entryPoint = entryPointsMap[moduleId];
if (entryPoint.append) {
entryPoint.append = entryPoint.append.map(resolvePath);
}
@@ -53,59 +57,59 @@ function bundle(entryPoints, config, callback) {
}
}
});
loader(Object.keys(allMentionedModulesMap), function () {
var modules = loader.getBuildInfo();
var partialResult = emitEntryPoints(modules, entryPointsMap);
var cssInlinedResources = loader('vs/css').getInlinedResources();
loader(Object.keys(allMentionedModulesMap), () => {
const modules = loader.getBuildInfo();
const partialResult = emitEntryPoints(modules, entryPointsMap);
const cssInlinedResources = loader('vs/css').getInlinedResources();
callback(null, {
files: partialResult.files,
cssInlinedResources: cssInlinedResources,
bundleData: partialResult.bundleData
});
}, function (err) { return callback(err, null); });
}, (err) => callback(err, null));
}
exports.bundle = bundle;
function emitEntryPoints(modules, entryPoints) {
var modulesMap = {};
modules.forEach(function (m) {
const modulesMap = {};
modules.forEach((m) => {
modulesMap[m.id] = m;
});
var modulesGraph = {};
modules.forEach(function (m) {
const modulesGraph = {};
modules.forEach((m) => {
modulesGraph[m.id] = m.dependencies;
});
var sortedModules = topologicalSort(modulesGraph);
var result = [];
var usedPlugins = {};
var bundleData = {
const sortedModules = topologicalSort(modulesGraph);
let result = [];
const usedPlugins = {};
const bundleData = {
graph: modulesGraph,
bundles: {}
};
Object.keys(entryPoints).forEach(function (moduleToBundle) {
var info = entryPoints[moduleToBundle];
var rootNodes = [moduleToBundle].concat(info.include || []);
var allDependencies = visit(rootNodes, modulesGraph);
var excludes = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach(function (excludeRoot) {
var allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach(function (exclude) {
Object.keys(entryPoints).forEach((moduleToBundle) => {
const info = entryPoints[moduleToBundle];
const rootNodes = [moduleToBundle].concat(info.include || []);
const allDependencies = visit(rootNodes, modulesGraph);
const excludes = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach((excludeRoot) => {
const allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach((exclude) => {
delete allDependencies[exclude];
});
});
var includedModules = sortedModules.filter(function (module) {
const includedModules = sortedModules.filter((module) => {
return allDependencies[module];
});
bundleData.bundles[moduleToBundle] = includedModules;
var res = emitEntryPoint(modulesMap, modulesGraph, moduleToBundle, includedModules, info.prepend, info.append, info.dest);
const res = emitEntryPoint(modulesMap, modulesGraph, moduleToBundle, includedModules, info.prepend || [], info.append || [], info.dest);
result = result.concat(res.files);
for (var pluginName in res.usedPlugins) {
for (const pluginName in res.usedPlugins) {
usedPlugins[pluginName] = usedPlugins[pluginName] || res.usedPlugins[pluginName];
}
});
Object.keys(usedPlugins).forEach(function (pluginName) {
var plugin = usedPlugins[pluginName];
Object.keys(usedPlugins).forEach((pluginName) => {
const plugin = usedPlugins[pluginName];
if (typeof plugin.finishBuild === 'function') {
var write = function (filename, contents) {
const write = (filename, contents) => {
result.push({
dest: filename,
sources: [{
@@ -124,16 +128,16 @@ function emitEntryPoints(modules, entryPoints) {
};
}
function extractStrings(destFiles) {
var parseDefineCall = function (moduleMatch, depsMatch) {
var module = moduleMatch.replace(/^"|"$/g, '');
var deps = depsMatch.split(',');
deps = deps.map(function (dep) {
const parseDefineCall = (moduleMatch, depsMatch) => {
const module = moduleMatch.replace(/^"|"$/g, '');
let deps = depsMatch.split(',');
deps = deps.map((dep) => {
dep = dep.trim();
dep = dep.replace(/^"|"$/g, '');
dep = dep.replace(/^'|'$/g, '');
var prefix = null;
var _path = null;
var pieces = dep.split('!');
let prefix = null;
let _path = null;
const pieces = dep.split('!');
if (pieces.length > 1) {
prefix = pieces[0] + '!';
_path = pieces[1];
@@ -143,7 +147,7 @@ function extractStrings(destFiles) {
_path = pieces[0];
}
if (/^\.\//.test(_path) || /^\.\.\//.test(_path)) {
var res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
const res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
return prefix + res;
}
return prefix + _path;
@@ -153,7 +157,7 @@ function extractStrings(destFiles) {
deps: deps
};
};
destFiles.forEach(function (destFile, index) {
destFiles.forEach((destFile) => {
if (!/\.js$/.test(destFile.dest)) {
return;
}
@@ -161,44 +165,44 @@ function extractStrings(destFiles) {
return;
}
// Do one pass to record the usage counts for each module id
var useCounts = {};
destFile.sources.forEach(function (source) {
var matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
const useCounts = {};
destFile.sources.forEach((source) => {
const matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
if (!matches) {
return;
}
var defineCall = parseDefineCall(matches[1], matches[2]);
const defineCall = parseDefineCall(matches[1], matches[2]);
useCounts[defineCall.module] = (useCounts[defineCall.module] || 0) + 1;
defineCall.deps.forEach(function (dep) {
defineCall.deps.forEach((dep) => {
useCounts[dep] = (useCounts[dep] || 0) + 1;
});
});
var sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort(function (a, b) {
const sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort((a, b) => {
return useCounts[b] - useCounts[a];
});
var replacementMap = {};
sortedByUseModules.forEach(function (module, index) {
const replacementMap = {};
sortedByUseModules.forEach((module, index) => {
replacementMap[module] = index;
});
destFile.sources.forEach(function (source) {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, function (_, moduleMatch, depsMatch) {
var defineCall = parseDefineCall(moduleMatch, depsMatch);
return "define(__m[" + replacementMap[defineCall.module] + "/*" + defineCall.module + "*/], __M([" + defineCall.deps.map(function (dep) { return replacementMap[dep] + '/*' + dep + '*/'; }).join(',') + "])";
destFile.sources.forEach((source) => {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, (_, moduleMatch, depsMatch) => {
const defineCall = parseDefineCall(moduleMatch, depsMatch);
return `define(__m[${replacementMap[defineCall.module]}/*${defineCall.module}*/], __M([${defineCall.deps.map(dep => replacementMap[dep] + '/*' + dep + '*/').join(',')}])`;
});
});
destFile.sources.unshift({
path: null,
contents: [
'(function() {',
"var __m = " + JSON.stringify(sortedByUseModules) + ";",
"var __M = function(deps) {",
" var result = [];",
" for (var i = 0, len = deps.length; i < len; i++) {",
" result[i] = __m[deps[i]];",
" }",
" return result;",
"};"
`var __m = ${JSON.stringify(sortedByUseModules)};`,
`var __M = function(deps) {`,
` var result = [];`,
` for (var i = 0, len = deps.length; i < len; i++) {`,
` result[i] = __m[deps[i]];`,
` }`,
` return result;`,
`};`
].join('\n')
});
destFile.sources.push({
@@ -210,7 +214,7 @@ function extractStrings(destFiles) {
}
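To make the __m/__M rewrite above concrete, here is a small self-contained illustration (module ids are made up): the bundle carries each module-id string once in the __m table, and __M maps index arrays back to id arrays for the loader.

var __m = ["vs/base/common/strings", "require", "exports"];
var __M = function (deps) {
	var result = [];
	for (var i = 0, len = deps.length; i < len; i++) {
		result[i] = __m[deps[i]];
	}
	return result;
};
// A rewritten call like define(__m[0], __M([1, 2]), ...) resolves its deps to:
console.log(__M([1, 2])); // [ 'require', 'exports' ]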
function removeDuplicateTSBoilerplate(destFiles) {
// Taken from typescript compiler => emitFiles
var BOILERPLATE = [
const BOILERPLATE = [
{ start: /^var __extends/, end: /^}\)\(\);$/ },
{ start: /^var __assign/, end: /^};$/ },
{ start: /^var __decorate/, end: /^};$/ },
@@ -219,14 +223,14 @@ function removeDuplicateTSBoilerplate(destFiles) {
{ start: /^var __awaiter/, end: /^};$/ },
{ start: /^var __generator/, end: /^};$/ },
];
destFiles.forEach(function (destFile) {
var SEEN_BOILERPLATE = [];
destFile.sources.forEach(function (source) {
var lines = source.contents.split(/\r\n|\n|\r/);
var newLines = [];
var IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE;
for (var i = 0; i < lines.length; i++) {
var line = lines[i];
destFiles.forEach((destFile) => {
const SEEN_BOILERPLATE = [];
destFile.sources.forEach((source) => {
const lines = source.contents.split(/\r\n|\n|\r/);
const newLines = [];
let IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (IS_REMOVING_BOILERPLATE) {
newLines.push('');
if (END_BOILERPLATE.test(line)) {
@@ -234,8 +238,8 @@ function removeDuplicateTSBoilerplate(destFiles) {
}
}
else {
for (var j = 0; j < BOILERPLATE.length; j++) {
var boilerplate = BOILERPLATE[j];
for (let j = 0; j < BOILERPLATE.length; j++) {
const boilerplate = BOILERPLATE[j];
if (boilerplate.start.test(line)) {
if (SEEN_BOILERPLATE[j]) {
IS_REMOVING_BOILERPLATE = true;
@@ -263,45 +267,45 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
if (!dest) {
dest = entryPoint + '.js';
}
var mainResult = {
const mainResult = {
sources: [],
dest: dest
}, results = [mainResult];
var usedPlugins = {};
var getLoaderPlugin = function (pluginName) {
const usedPlugins = {};
const getLoaderPlugin = (pluginName) => {
if (!usedPlugins[pluginName]) {
usedPlugins[pluginName] = modulesMap[pluginName].exports;
}
return usedPlugins[pluginName];
};
includedModules.forEach(function (c) {
var bangIndex = c.indexOf('!');
includedModules.forEach((c) => {
const bangIndex = c.indexOf('!');
if (bangIndex >= 0) {
var pluginName = c.substr(0, bangIndex);
var plugin = getLoaderPlugin(pluginName);
const pluginName = c.substr(0, bangIndex);
const plugin = getLoaderPlugin(pluginName);
mainResult.sources.push(emitPlugin(entryPoint, plugin, pluginName, c.substr(bangIndex + 1)));
return;
}
var module = modulesMap[c];
const module = modulesMap[c];
if (module.path === 'empty:') {
return;
}
var contents = readFileAndRemoveBOM(module.path);
const contents = readFileAndRemoveBOM(module.path);
if (module.shim) {
mainResult.sources.push(emitShimmedModule(c, deps[c], module.shim, module.path, contents));
}
else {
mainResult.sources.push(emitNamedModule(c, deps[c], module.defineLocation, module.path, contents));
mainResult.sources.push(emitNamedModule(c, module.defineLocation, module.path, contents));
}
});
Object.keys(usedPlugins).forEach(function (pluginName) {
var plugin = usedPlugins[pluginName];
Object.keys(usedPlugins).forEach((pluginName) => {
const plugin = usedPlugins[pluginName];
if (typeof plugin.writeFile === 'function') {
var req = (function () {
const req = (() => {
throw new Error('no-no!');
});
req.toUrl = function (something) { return something; };
var write = function (filename, contents) {
req.toUrl = something => something;
const write = (filename, contents) => {
results.push({
dest: filename,
sources: [{
@@ -313,15 +317,15 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
plugin.writeFile(pluginName, entryPoint, req, write, {});
}
});
var toIFile = function (path) {
var contents = readFileAndRemoveBOM(path);
const toIFile = (path) => {
const contents = readFileAndRemoveBOM(path);
return {
path: path,
contents: contents
};
};
var toPrepend = (prepend || []).map(toIFile);
var toAppend = (append || []).map(toIFile);
const toPrepend = (prepend || []).map(toIFile);
const toAppend = (append || []).map(toIFile);
mainResult.sources = toPrepend.concat(mainResult.sources).concat(toAppend);
return {
files: results,
@@ -329,8 +333,8 @@ function emitEntryPoint(modulesMap, deps, entryPoint, includedModules, prepend,
};
}
function readFileAndRemoveBOM(path) {
var BOM_CHAR_CODE = 65279;
var contents = fs.readFileSync(path, 'utf8');
const BOM_CHAR_CODE = 65279;
let contents = fs.readFileSync(path, 'utf8');
// Remove BOM
if (contents.charCodeAt(0) === BOM_CHAR_CODE) {
contents = contents.substring(1);
@@ -338,15 +342,15 @@ function readFileAndRemoveBOM(path) {
return contents;
}
function emitPlugin(entryPoint, plugin, pluginName, moduleName) {
var result = '';
let result = '';
if (typeof plugin.write === 'function') {
var write = (function (what) {
const write = ((what) => {
result += what;
});
write.getEntryPoint = function () {
write.getEntryPoint = () => {
return entryPoint;
};
write.asModule = function (moduleId, code) {
write.asModule = (moduleId, code) => {
code = code.replace(/^define\(/, 'define("' + moduleId + '",');
result += code;
};
@@ -357,20 +361,20 @@ function emitPlugin(entryPoint, plugin, pluginName, moduleName) {
contents: result
};
}
function emitNamedModule(moduleId, myDeps, defineCallPosition, path, contents) {
function emitNamedModule(moduleId, defineCallPosition, path, contents) {
// `defineCallPosition` is the position in code: |define()
var defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
const defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
// `parensOffset` is the position in code: define|()
var parensOffset = contents.indexOf('(', defineCallOffset);
var insertStr = '"' + moduleId + '", ';
const parensOffset = contents.indexOf('(', defineCallOffset);
const insertStr = '"' + moduleId + '", ';
return {
path: path,
contents: contents.substr(0, parensOffset + 1) + insertStr + contents.substr(parensOffset + 1)
};
}
function emitShimmedModule(moduleId, myDeps, factory, path, contents) {
var strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
var strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
const strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
const strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
return {
path: path,
contents: contents + '\n;\n' + strDefine
@@ -383,7 +387,8 @@ function positionToOffset(str, desiredLine, desiredCol) {
if (desiredLine === 1) {
return desiredCol - 1;
}
var line = 1, lastNewLineOffset = -1;
let line = 1;
let lastNewLineOffset = -1;
do {
if (desiredLine === line) {
return lastNewLineOffset + 1 + desiredCol - 1;
@@ -397,14 +402,15 @@ function positionToOffset(str, desiredLine, desiredCol) {
* Return a set of reachable nodes in `graph` starting from `rootNodes`
*/
function visit(rootNodes, graph) {
var result = {}, queue = rootNodes;
rootNodes.forEach(function (node) {
const result = {};
const queue = rootNodes;
rootNodes.forEach((node) => {
result[node] = true;
});
while (queue.length > 0) {
var el = queue.shift();
var myEdges = graph[el] || [];
myEdges.forEach(function (toNode) {
const el = queue.shift();
const myEdges = graph[el] || [];
myEdges.forEach((toNode) => {
if (!result[toNode]) {
result[toNode] = true;
queue.push(toNode);
@@ -417,11 +423,11 @@ function visit(rootNodes, graph) {
* Perform a topological sort on `graph`
*/
function topologicalSort(graph) {
var allNodes = {}, outgoingEdgeCount = {}, inverseEdges = {};
Object.keys(graph).forEach(function (fromNode) {
const allNodes = {}, outgoingEdgeCount = {}, inverseEdges = {};
Object.keys(graph).forEach((fromNode) => {
allNodes[fromNode] = true;
outgoingEdgeCount[fromNode] = graph[fromNode].length;
graph[fromNode].forEach(function (toNode) {
graph[fromNode].forEach((toNode) => {
allNodes[toNode] = true;
outgoingEdgeCount[toNode] = outgoingEdgeCount[toNode] || 0;
inverseEdges[toNode] = inverseEdges[toNode] || [];
@@ -429,8 +435,8 @@ function topologicalSort(graph) {
});
});
// https://en.wikipedia.org/wiki/Topological_sorting
var S = [], L = [];
Object.keys(allNodes).forEach(function (node) {
const S = [], L = [];
Object.keys(allNodes).forEach((node) => {
if (outgoingEdgeCount[node] === 0) {
delete outgoingEdgeCount[node];
S.push(node);
@@ -439,10 +445,10 @@ function topologicalSort(graph) {
while (S.length > 0) {
// Ensure the exact same order all the time with the same inputs
S.sort();
var n = S.shift();
const n = S.shift();
L.push(n);
var myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach(function (m) {
const myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach((m) => {
outgoingEdgeCount[m]--;
if (outgoingEdgeCount[m] === 0) {
delete outgoingEdgeCount[m];
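The sort above is Kahn's algorithm with edges pointing from a module to its dependencies, so dependency-free modules are emitted first. A compact standalone sketch with a made-up graph:

function topoSort(graph) {
	const out = {}, inverse = {};
	for (const n of Object.keys(graph)) {
		out[n] = (out[n] || 0) + graph[n].length;
		for (const d of graph[n]) {
			out[d] = out[d] || 0;
			(inverse[d] = inverse[d] || []).push(n);
		}
	}
	const S = Object.keys(out).filter(n => out[n] === 0).sort();
	const L = [];
	while (S.length > 0) {
		const n = S.shift();
		L.push(n);
		for (const m of inverse[n] || []) {
			if (--out[m] === 0) {
				S.push(m);
				S.sort(); // keep the deterministic ordering the real code guarantees
			}
		}
	}
	return L;
}
console.log(topoSort({ a: ['b'], b: ['c'], c: [] })); // [ 'c', 'b', 'a' ]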

View File

@@ -3,9 +3,9 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
import fs = require('fs');
import path = require('path');
import vm = require('vm');
import * as fs from 'fs';
import * as path from 'path';
import * as vm from 'vm';
interface IPosition {
line: number;
@@ -46,7 +46,7 @@ export interface IEntryPoint {
name: string;
include?: string[];
exclude?: string[];
prepend: string[];
prepend?: string[];
append?: string[];
dest?: string;
}
@@ -64,7 +64,7 @@ interface INodeSet {
}
export interface IFile {
path: string;
path: string | null;
contents: string;
}
@@ -97,13 +97,13 @@ export interface ILoaderConfig {
/**
* Bundle `entryPoints` given config `config`.
*/
export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callback: (err: any, result: IBundleResult) => void): void {
let entryPointsMap: IEntryPointMap = {};
export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callback: (err: any, result: IBundleResult | null) => void): void {
const entryPointsMap: IEntryPointMap = {};
entryPoints.forEach((module: IEntryPoint) => {
entryPointsMap[module.name] = module;
});
let allMentionedModulesMap: { [modules: string]: boolean; } = {};
const allMentionedModulesMap: { [modules: string]: boolean; } = {};
entryPoints.forEach((module: IEntryPoint) => {
allMentionedModulesMap[module.name] = true;
(module.include || []).forEach(function (includedModule) {
@@ -115,28 +115,32 @@ export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callba
});
var code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
var r: Function = <any>vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
var loaderModule = { exports: {} };
const code = require('fs').readFileSync(path.join(__dirname, '../../src/vs/loader.js'));
const r: Function = <any>vm.runInThisContext('(function(require, module, exports) { ' + code + '\n});');
const loaderModule = { exports: {} };
r.call({}, require, loaderModule, loaderModule.exports);
var loader: any = loaderModule.exports;
const loader: any = loaderModule.exports;
config.isBuild = true;
config.paths = config.paths || {};
config.paths['vs/nls'] = 'out-build/vs/nls.build';
config.paths['vs/css'] = 'out-build/vs/css.build';
if (!config.paths['vs/nls']) {
config.paths['vs/nls'] = 'out-build/vs/nls.build';
}
if (!config.paths['vs/css']) {
config.paths['vs/css'] = 'out-build/vs/css.build';
}
loader.config(config);
loader(['require'], (localRequire) => {
let resolvePath = (path: string) => {
let r = localRequire.toUrl(path);
loader(['require'], (localRequire: any) => {
const resolvePath = (path: string) => {
const r = localRequire.toUrl(path);
if (!/\.js/.test(r)) {
return r + '.js';
}
return r;
};
for (let moduleId in entryPointsMap) {
let entryPoint = entryPointsMap[moduleId];
for (const moduleId in entryPointsMap) {
const entryPoint = entryPointsMap[moduleId];
if (entryPoint.append) {
entryPoint.append = entryPoint.append.map(resolvePath);
}
@@ -147,76 +151,76 @@ export function bundle(entryPoints: IEntryPoint[], config: ILoaderConfig, callba
});
loader(Object.keys(allMentionedModulesMap), () => {
let modules = <IBuildModuleInfo[]>loader.getBuildInfo();
let partialResult = emitEntryPoints(modules, entryPointsMap);
let cssInlinedResources = loader('vs/css').getInlinedResources();
const modules = <IBuildModuleInfo[]>loader.getBuildInfo();
const partialResult = emitEntryPoints(modules, entryPointsMap);
const cssInlinedResources = loader('vs/css').getInlinedResources();
callback(null, {
files: partialResult.files,
cssInlinedResources: cssInlinedResources,
bundleData: partialResult.bundleData
});
}, (err) => callback(err, null));
}, (err: any) => callback(err, null));
}
function emitEntryPoints(modules: IBuildModuleInfo[], entryPoints: IEntryPointMap): IPartialBundleResult {
let modulesMap: IBuildModuleInfoMap = {};
const modulesMap: IBuildModuleInfoMap = {};
modules.forEach((m: IBuildModuleInfo) => {
modulesMap[m.id] = m;
});
let modulesGraph: IGraph = {};
const modulesGraph: IGraph = {};
modules.forEach((m: IBuildModuleInfo) => {
modulesGraph[m.id] = m.dependencies;
});
let sortedModules = topologicalSort(modulesGraph);
const sortedModules = topologicalSort(modulesGraph);
let result: IConcatFile[] = [];
let usedPlugins: IPluginMap = {};
let bundleData: IBundleData = {
const usedPlugins: IPluginMap = {};
const bundleData: IBundleData = {
graph: modulesGraph,
bundles: {}
};
Object.keys(entryPoints).forEach((moduleToBundle: string) => {
let info = entryPoints[moduleToBundle];
let rootNodes = [moduleToBundle].concat(info.include || []);
let allDependencies = visit(rootNodes, modulesGraph);
let excludes: string[] = ['require', 'exports', 'module'].concat(info.exclude || []);
const info = entryPoints[moduleToBundle];
const rootNodes = [moduleToBundle].concat(info.include || []);
const allDependencies = visit(rootNodes, modulesGraph);
const excludes: string[] = ['require', 'exports', 'module'].concat(info.exclude || []);
excludes.forEach((excludeRoot: string) => {
let allExcludes = visit([excludeRoot], modulesGraph);
const allExcludes = visit([excludeRoot], modulesGraph);
Object.keys(allExcludes).forEach((exclude: string) => {
delete allDependencies[exclude];
});
});
let includedModules = sortedModules.filter((module: string) => {
const includedModules = sortedModules.filter((module: string) => {
return allDependencies[module];
});
bundleData.bundles[moduleToBundle] = includedModules;
let res = emitEntryPoint(
const res = emitEntryPoint(
modulesMap,
modulesGraph,
moduleToBundle,
includedModules,
info.prepend,
info.append,
info.prepend || [],
info.append || [],
info.dest
);
result = result.concat(res.files);
for (let pluginName in res.usedPlugins) {
for (const pluginName in res.usedPlugins) {
usedPlugins[pluginName] = usedPlugins[pluginName] || res.usedPlugins[pluginName];
}
});
Object.keys(usedPlugins).forEach((pluginName: string) => {
let plugin = usedPlugins[pluginName];
const plugin = usedPlugins[pluginName];
if (typeof plugin.finishBuild === 'function') {
let write = (filename: string, contents: string) => {
const write = (filename: string, contents: string) => {
result.push({
dest: filename,
sources: [{
@@ -237,16 +241,16 @@ function emitEntryPoints(modules: IBuildModuleInfo[], entryPoints: IEntryPointMa
}
function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
let parseDefineCall = (moduleMatch: string, depsMatch: string) => {
let module = moduleMatch.replace(/^"|"$/g, '');
const parseDefineCall = (moduleMatch: string, depsMatch: string) => {
const module = moduleMatch.replace(/^"|"$/g, '');
let deps = depsMatch.split(',');
deps = deps.map((dep) => {
dep = dep.trim();
dep = dep.replace(/^"|"$/g, '');
dep = dep.replace(/^'|'$/g, '');
let prefix: string = null;
let _path: string = null;
let pieces = dep.split('!');
let prefix: string | null = null;
let _path: string | null = null;
const pieces = dep.split('!');
if (pieces.length > 1) {
prefix = pieces[0] + '!';
_path = pieces[1];
@@ -256,7 +260,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
}
if (/^\.\//.test(_path) || /^\.\.\//.test(_path)) {
let res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
const res = path.join(path.dirname(module), _path).replace(/\\/g, '/');
return prefix + res;
}
return prefix + _path;
@@ -267,7 +271,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
};
};
destFiles.forEach((destFile, index) => {
destFiles.forEach((destFile) => {
if (!/\.js$/.test(destFile.dest)) {
return;
}
@@ -276,33 +280,33 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
}
// Do one pass to record the usage counts for each module id
let useCounts: { [moduleId: string]: number; } = {};
const useCounts: { [moduleId: string]: number; } = {};
destFile.sources.forEach((source) => {
let matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
const matches = source.contents.match(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/);
if (!matches) {
return;
}
let defineCall = parseDefineCall(matches[1], matches[2]);
const defineCall = parseDefineCall(matches[1], matches[2]);
useCounts[defineCall.module] = (useCounts[defineCall.module] || 0) + 1;
defineCall.deps.forEach((dep) => {
useCounts[dep] = (useCounts[dep] || 0) + 1;
});
});
let sortedByUseModules = Object.keys(useCounts);
const sortedByUseModules = Object.keys(useCounts);
sortedByUseModules.sort((a, b) => {
return useCounts[b] - useCounts[a];
});
let replacementMap: { [moduleId: string]: number; } = {};
const replacementMap: { [moduleId: string]: number; } = {};
sortedByUseModules.forEach((module, index) => {
replacementMap[module] = index;
});
destFile.sources.forEach((source) => {
source.contents = source.contents.replace(/define\(("[^"]+"),\s*\[(((, )?("|')[^"']+("|'))+)\]/, (_, moduleMatch, depsMatch) => {
let defineCall = parseDefineCall(moduleMatch, depsMatch);
const defineCall = parseDefineCall(moduleMatch, depsMatch);
return `define(__m[${replacementMap[defineCall.module]}/*${defineCall.module}*/], __M([${defineCall.deps.map(dep => replacementMap[dep] + '/*' + dep + '*/').join(',')}])`;
});
});
@@ -332,7 +336,7 @@ function extractStrings(destFiles: IConcatFile[]): IConcatFile[] {
function removeDuplicateTSBoilerplate(destFiles: IConcatFile[]): IConcatFile[] {
// Taken from typescript compiler => emitFiles
let BOILERPLATE = [
const BOILERPLATE = [
{ start: /^var __extends/, end: /^}\)\(\);$/ },
{ start: /^var __assign/, end: /^};$/ },
{ start: /^var __decorate/, end: /^};$/ },
@@ -343,22 +347,22 @@ function removeDuplicateTSBoilerplate(destFiles: IConcatFile[]): IConcatFile[] {
];
destFiles.forEach((destFile) => {
let SEEN_BOILERPLATE = [];
const SEEN_BOILERPLATE: boolean[] = [];
destFile.sources.forEach((source) => {
let lines = source.contents.split(/\r\n|\n|\r/);
let newLines: string[] = [];
const lines = source.contents.split(/\r\n|\n|\r/);
const newLines: string[] = [];
let IS_REMOVING_BOILERPLATE = false, END_BOILERPLATE: RegExp;
for (let i = 0; i < lines.length; i++) {
let line = lines[i];
const line = lines[i];
if (IS_REMOVING_BOILERPLATE) {
newLines.push('');
if (END_BOILERPLATE.test(line)) {
if (END_BOILERPLATE!.test(line)) {
IS_REMOVING_BOILERPLATE = false;
}
} else {
for (let j = 0; j < BOILERPLATE.length; j++) {
let boilerplate = BOILERPLATE[j];
const boilerplate = BOILERPLATE[j];
if (boilerplate.start.test(line)) {
if (SEEN_BOILERPLATE[j]) {
IS_REMOVING_BOILERPLATE = true;
@@ -398,19 +402,19 @@ function emitEntryPoint(
includedModules: string[],
prepend: string[],
append: string[],
dest: string
dest: string | undefined
): IEmitEntryPointResult {
if (!dest) {
dest = entryPoint + '.js';
}
let mainResult: IConcatFile = {
const mainResult: IConcatFile = {
sources: [],
dest: dest
},
results: IConcatFile[] = [mainResult];
let usedPlugins: IPluginMap = {};
let getLoaderPlugin = (pluginName: string): ILoaderPlugin => {
const usedPlugins: IPluginMap = {};
const getLoaderPlugin = (pluginName: string): ILoaderPlugin => {
if (!usedPlugins[pluginName]) {
usedPlugins[pluginName] = modulesMap[pluginName].exports;
}
@@ -418,39 +422,39 @@ function emitEntryPoint(
};
includedModules.forEach((c: string) => {
let bangIndex = c.indexOf('!');
const bangIndex = c.indexOf('!');
if (bangIndex >= 0) {
let pluginName = c.substr(0, bangIndex);
let plugin = getLoaderPlugin(pluginName);
const pluginName = c.substr(0, bangIndex);
const plugin = getLoaderPlugin(pluginName);
mainResult.sources.push(emitPlugin(entryPoint, plugin, pluginName, c.substr(bangIndex + 1)));
return;
}
let module = modulesMap[c];
const module = modulesMap[c];
if (module.path === 'empty:') {
return;
}
let contents = readFileAndRemoveBOM(module.path);
const contents = readFileAndRemoveBOM(module.path);
if (module.shim) {
mainResult.sources.push(emitShimmedModule(c, deps[c], module.shim, module.path, contents));
} else {
mainResult.sources.push(emitNamedModule(c, deps[c], module.defineLocation, module.path, contents));
mainResult.sources.push(emitNamedModule(c, module.defineLocation, module.path, contents));
}
});
Object.keys(usedPlugins).forEach((pluginName: string) => {
let plugin = usedPlugins[pluginName];
const plugin = usedPlugins[pluginName];
if (typeof plugin.writeFile === 'function') {
let req: ILoaderPluginReqFunc = <any>(() => {
const req: ILoaderPluginReqFunc = <any>(() => {
throw new Error('no-no!');
});
req.toUrl = something => something;
let write = (filename: string, contents: string) => {
const write = (filename: string, contents: string) => {
results.push({
dest: filename,
sources: [{
@@ -463,16 +467,16 @@ function emitEntryPoint(
}
});
let toIFile = (path): IFile => {
let contents = readFileAndRemoveBOM(path);
const toIFile = (path: string): IFile => {
const contents = readFileAndRemoveBOM(path);
return {
path: path,
contents: contents
};
};
let toPrepend = (prepend || []).map(toIFile);
let toAppend = (append || []).map(toIFile);
const toPrepend = (prepend || []).map(toIFile);
const toAppend = (append || []).map(toIFile);
mainResult.sources = toPrepend.concat(mainResult.sources).concat(toAppend);
@@ -483,8 +487,8 @@ function emitEntryPoint(
}
function readFileAndRemoveBOM(path: string): string {
var BOM_CHAR_CODE = 65279;
var contents = fs.readFileSync(path, 'utf8');
const BOM_CHAR_CODE = 65279;
let contents = fs.readFileSync(path, 'utf8');
// Remove BOM
if (contents.charCodeAt(0) === BOM_CHAR_CODE) {
contents = contents.substring(1);
@@ -495,7 +499,7 @@ function readFileAndRemoveBOM(path: string): string {
function emitPlugin(entryPoint: string, plugin: ILoaderPlugin, pluginName: string, moduleName: string): IFile {
let result = '';
if (typeof plugin.write === 'function') {
let write: ILoaderPluginWriteFunc = <any>((what) => {
const write: ILoaderPluginWriteFunc = <any>((what: string) => {
result += what;
});
write.getEntryPoint = () => {
@@ -513,15 +517,15 @@ function emitPlugin(entryPoint: string, plugin: ILoaderPlugin, pluginName: strin
};
}
function emitNamedModule(moduleId: string, myDeps: string[], defineCallPosition: IPosition, path: string, contents: string): IFile {
function emitNamedModule(moduleId: string, defineCallPosition: IPosition, path: string, contents: string): IFile {
// `defineCallPosition` is the position in code: |define()
let defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
const defineCallOffset = positionToOffset(contents, defineCallPosition.line, defineCallPosition.col);
// `parensOffset` is the position in code: define|()
let parensOffset = contents.indexOf('(', defineCallOffset);
const parensOffset = contents.indexOf('(', defineCallOffset);
let insertStr = '"' + moduleId + '", ';
const insertStr = '"' + moduleId + '", ';
return {
path: path,
@@ -530,8 +534,8 @@ function emitNamedModule(moduleId: string, myDeps: string[], defineCallPosition:
}
function emitShimmedModule(moduleId: string, myDeps: string[], factory: string, path: string, contents: string): IFile {
let strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
let strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
const strDeps = (myDeps.length > 0 ? '"' + myDeps.join('", "') + '"' : '');
const strDefine = 'define("' + moduleId + '", [' + strDeps + '], ' + factory + ');';
return {
path: path,
contents: contents + '\n;\n' + strDefine
@@ -546,8 +550,8 @@ function positionToOffset(str: string, desiredLine: number, desiredCol: number):
return desiredCol - 1;
}
let line = 1,
lastNewLineOffset = -1;
let line = 1;
let lastNewLineOffset = -1;
do {
if (desiredLine === line) {
@@ -565,16 +569,16 @@ function positionToOffset(str: string, desiredLine: number, desiredCol: number):
* Return a set of reachable nodes in `graph` starting from `rootNodes`
*/
function visit(rootNodes: string[], graph: IGraph): INodeSet {
let result: INodeSet = {},
queue = rootNodes;
const result: INodeSet = {};
const queue = rootNodes;
rootNodes.forEach((node) => {
result[node] = true;
});
while (queue.length > 0) {
let el = queue.shift();
let myEdges = graph[el] || [];
const el = queue.shift();
const myEdges = graph[el!] || [];
myEdges.forEach((toNode) => {
if (!result[toNode]) {
result[toNode] = true;
@@ -591,7 +595,7 @@ function visit(rootNodes: string[], graph: IGraph): INodeSet {
*/
function topologicalSort(graph: IGraph): string[] {
let allNodes: INodeSet = {},
const allNodes: INodeSet = {},
outgoingEdgeCount: { [node: string]: number; } = {},
inverseEdges: IGraph = {};
@@ -609,7 +613,7 @@ function topologicalSort(graph: IGraph): string[] {
});
// https://en.wikipedia.org/wiki/Topological_sorting
let S: string[] = [],
const S: string[] = [],
L: string[] = [];
Object.keys(allNodes).forEach((node: string) => {
@@ -623,10 +627,10 @@ function topologicalSort(graph: IGraph): string[] {
// Ensure the exact same order all the time with the same inputs
S.sort();
let n: string = S.shift();
const n: string = S.shift()!;
L.push(n);
let myInverseEdges = inverseEdges[n] || [];
const myInverseEdges = inverseEdges[n] || [];
myInverseEdges.forEach((m: string) => {
outgoingEdgeCount[m]--;
if (outgoingEdgeCount[m] === 0) {

View File

@@ -4,44 +4,54 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var gulp = require("gulp");
var tsb = require("gulp-tsb");
var es = require("event-stream");
var watch = require('./watch');
var nls = require("./nls");
var util = require("./util");
var reporter_1 = require("./reporter");
var path = require("path");
var bom = require("gulp-bom");
var sourcemaps = require("gulp-sourcemaps");
var _ = require("underscore");
var monacodts = require("../monaco/api");
var fs = require("fs");
var reporter = reporter_1.createReporter();
const es = require("event-stream");
const fs = require("fs");
const gulp = require("gulp");
const bom = require("gulp-bom");
const sourcemaps = require("gulp-sourcemaps");
const tsb = require("gulp-tsb");
const path = require("path");
const _ = require("underscore");
const monacodts = require("../monaco/api");
const nls = require("./nls");
const reporter_1 = require("./reporter");
const util = require("./util");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const watch = require('./watch');
const reporter = reporter_1.createReporter();
function getTypeScriptCompilerOptions(src) {
var rootDir = path.join(__dirname, "../../" + src);
var options = require("../../" + src + "/tsconfig.json").compilerOptions;
const rootDir = path.join(__dirname, `../../${src}`);
const tsconfig = require(`../../${src}/tsconfig.json`);
let options;
if (tsconfig.extends) {
options = Object.assign({}, require(path.join(rootDir, tsconfig.extends)).compilerOptions, tsconfig.compilerOptions);
}
else {
options = tsconfig.compilerOptions;
}
options.verbose = false;
options.sourceMap = true;
if (process.env['VSCODE_NO_SOURCEMAP']) { // To be used by developers in a hurry
options.sourceMap = false;
}
options.rootDir = rootDir;
options.baseUrl = rootDir;
options.sourceRoot = util.toFileUri(rootDir);
options.newLine = /\r\n/.test(fs.readFileSync(__filename, 'utf8')) ? 'CRLF' : 'LF';
return options;
}
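The new tsconfig.extends handling is a shallow merge: the extended file's compilerOptions form the base and the local compilerOptions win key by key. An illustration with hypothetical configs:

const base = { module: 'amd', target: 'es5', strict: true }; // from tsconfig.base.json
const local = { target: 'es2017' }; // this project's compilerOptions
const merged = Object.assign({}, base, local);
console.log(merged); // { module: 'amd', target: 'es2017', strict: true }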
function createCompile(src, build, emitError) {
var opts = _.clone(getTypeScriptCompilerOptions(src));
const opts = _.clone(getTypeScriptCompilerOptions(src));
opts.inlineSources = !!build;
opts.noFilesystemLookup = true;
var ts = tsb.create(opts, null, null, function (err) { return reporter(err.toString()); });
const ts = tsb.create(opts, true, undefined, err => reporter(err.toString()));
return function (token) {
var utf8Filter = util.filter(function (data) { return /(\/|\\)test(\/|\\).*utf8/.test(data.path); });
var tsFilter = util.filter(function (data) { return /\.ts$/.test(data.path); });
var noDeclarationsFilter = util.filter(function (data) { return !(/\.d\.ts$/.test(data.path)); });
var input = es.through();
var output = input
const utf8Filter = util.filter(data => /(\/|\\)test(\/|\\).*utf8/.test(data.path));
const tsFilter = util.filter(data => /\.ts$/.test(data.path));
const noDeclarationsFilter = util.filter(data => !(/\.d\.ts$/.test(data.path)));
const input = es.through();
const output = input
.pipe(utf8Filter)
.pipe(bom())
.pipe(utf8Filter.restore)
@@ -57,91 +67,136 @@ function createCompile(src, build, emitError) {
sourceRoot: opts.sourceRoot
}))
.pipe(tsFilter.restore)
.pipe(reporter.end(emitError));
.pipe(reporter.end(!!emitError));
return es.duplex(input, output);
};
}
const typesDts = [
'node_modules/typescript/lib/*.d.ts',
'node_modules/@types/**/*.d.ts',
'!node_modules/@types/webpack/**/*',
'!node_modules/@types/uglify-js/**/*',
];
function compileTask(src, out, build) {
return function () {
var compile = createCompile(src, build, true);
var srcPipe = es.merge(gulp.src(src + "/**", { base: "" + src }), gulp.src('node_modules/typescript/lib/lib.d.ts'));
// Do not write .d.ts files to disk, as they are not needed there.
var dtsFilter = util.filter(function (data) { return !/\.d\.ts$/.test(data.path); });
const compile = createCompile(src, build, true);
const srcPipe = es.merge(gulp.src(`${src}/**`, { base: `${src}` }), gulp.src(typesDts));
let generator = new MonacoGenerator(false);
if (src === 'src') {
generator.execute();
}
return srcPipe
.pipe(generator.stream)
.pipe(compile())
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(src !== 'src' ? es.through() : monacodtsTask(out, false));
.pipe(gulp.dest(out));
};
}
exports.compileTask = compileTask;
function watchTask(out, build) {
return function () {
var compile = createCompile('src', build);
var src = es.merge(gulp.src('src/**', { base: 'src' }), gulp.src('node_modules/typescript/lib/lib.d.ts'));
var watchSrc = watch('src/**', { base: 'src' });
// Do not write .d.ts files to disk, as they are not needed there.
var dtsFilter = util.filter(function (data) { return !/\.d\.ts$/.test(data.path); });
const compile = createCompile('src', build);
const src = es.merge(gulp.src('src/**', { base: 'src' }), gulp.src(typesDts));
const watchSrc = watch('src/**', { base: 'src' });
let generator = new MonacoGenerator(true);
generator.execute();
return watchSrc
.pipe(generator.stream)
.pipe(util.incremental(compile, src, true))
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(monacodtsTask(out, true));
.pipe(gulp.dest(out));
};
}
exports.watchTask = watchTask;
function monacodtsTask(out, isWatch) {
var basePath = path.resolve(process.cwd(), out);
var neededFiles = {};
monacodts.getFilesToWatch(out).forEach(function (filePath) {
filePath = path.normalize(filePath);
neededFiles[filePath] = true;
});
var inputFiles = {};
for (var filePath in neededFiles) {
if (/\bsrc(\/|\\)vs\b/.test(filePath)) {
// This file is needed from source => simply read it now
inputFiles[filePath] = fs.readFileSync(filePath).toString();
}
}
var setInputFile = function (filePath, contents) {
if (inputFiles[filePath] === contents) {
// no change
return;
}
inputFiles[filePath] = contents;
var neededInputFilesCount = Object.keys(neededFiles).length;
var availableInputFilesCount = Object.keys(inputFiles).length;
if (neededInputFilesCount === availableInputFilesCount) {
run();
}
};
var run = function () {
var result = monacodts.run(out, inputFiles);
if (!result.isTheSame) {
if (isWatch) {
fs.writeFileSync(result.filePath, result.content);
}
else {
fs.writeFileSync(result.filePath, result.content);
resultStream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
}
};
var resultStream;
if (isWatch) {
watch('build/monaco/*').pipe(es.through(function () {
run();
}));
}
resultStream = es.through(function (data) {
var filePath = path.normalize(path.resolve(basePath, data.relative));
if (neededFiles[filePath]) {
setInputFile(filePath, data.contents.toString());
}
this.emit('data', data);
});
return resultStream;
}
const REPO_SRC_FOLDER = path.join(__dirname, '../../src');
class MonacoGenerator {
constructor(isWatch) {
this._executeSoonTimer = null;
this._isWatch = isWatch;
this.stream = es.through();
this._watchers = [];
this._watchedFiles = {};
let onWillReadFile = (moduleId, filePath) => {
if (!this._isWatch) {
return;
}
if (this._watchedFiles[filePath]) {
return;
}
this._watchedFiles[filePath] = true;
const watcher = fs.watch(filePath);
watcher.addListener('change', () => {
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
watcher.addListener('error', (err) => {
console.error(`Encountered error while watching ${filePath}.`);
console.log(err);
delete this._watchedFiles[filePath];
for (let i = 0; i < this._watchers.length; i++) {
if (this._watchers[i] === watcher) {
this._watchers.splice(i, 1);
break;
}
}
watcher.close();
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
this._watchers.push(watcher);
};
this._fsProvider = new class extends monacodts.FSProvider {
readFileSync(moduleId, filePath) {
onWillReadFile(moduleId, filePath);
return super.readFileSync(moduleId, filePath);
}
};
this._declarationResolver = new monacodts.DeclarationResolver(this._fsProvider);
if (this._isWatch) {
const recipeWatcher = fs.watch(monacodts.RECIPE_PATH);
recipeWatcher.addListener('change', () => {
this._executeSoon();
});
this._watchers.push(recipeWatcher);
}
}
_executeSoon() {
if (this._executeSoonTimer !== null) {
clearTimeout(this._executeSoonTimer);
this._executeSoonTimer = null;
}
this._executeSoonTimer = setTimeout(() => {
this._executeSoonTimer = null;
this.execute();
}, 20);
}
dispose() {
this._watchers.forEach(watcher => watcher.close());
}
_run() {
let r = monacodts.run3(this._declarationResolver);
if (!r && !this._isWatch) {
// The build must always be able to generate the monaco.d.ts
throw new Error(`monaco.d.ts generation error - Cannot continue`);
}
return r;
}
_log(message, ...rest) {
fancyLog(ansiColors.cyan('[monaco.d.ts]'), message, ...rest);
}
execute() {
const startTime = Date.now();
const result = this._run();
if (!result) {
// nothing really changed
return;
}
if (result.isTheSame) {
return;
}
fs.writeFileSync(result.filePath, result.content);
fs.writeFileSync(path.join(REPO_SRC_FOLDER, 'vs/editor/common/standalone/standaloneEnums.ts'), result.enums);
this._log(`monaco.d.ts is changed - total time took ${Date.now() - startTime} ms`);
if (!this._isWatch) {
this.stream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
}
}

View File

@@ -5,31 +5,41 @@
'use strict';
import * as gulp from 'gulp';
import * as tsb from 'gulp-tsb';
import * as es from 'event-stream';
const watch = require('./watch');
import * as nls from './nls';
import * as util from './util';
import { createReporter } from './reporter';
import * as path from 'path';
import * as fs from 'fs';
import * as gulp from 'gulp';
import * as bom from 'gulp-bom';
import * as sourcemaps from 'gulp-sourcemaps';
import * as tsb from 'gulp-tsb';
import * as path from 'path';
import * as _ from 'underscore';
import * as monacodts from '../monaco/api';
import * as fs from 'fs';
import * as nls from './nls';
import { createReporter } from './reporter';
import * as util from './util';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
const watch = require('./watch');
const reporter = createReporter();
function getTypeScriptCompilerOptions(src: string) {
const rootDir = path.join(__dirname, `../../${src}`);
const options = require(`../../${src}/tsconfig.json`).compilerOptions;
const tsconfig = require(`../../${src}/tsconfig.json`);
let options: { [key: string]: any };
if (tsconfig.extends) {
options = Object.assign({}, require(path.join(rootDir, tsconfig.extends)).compilerOptions, tsconfig.compilerOptions);
} else {
options = tsconfig.compilerOptions;
}
options.verbose = false;
options.sourceMap = true;
if (process.env['VSCODE_NO_SOURCEMAP']) { // To be used by developers in a hurry
options.sourceMap = false;
}
options.rootDir = rootDir;
options.baseUrl = rootDir;
options.sourceRoot = util.toFileUri(rootDir);
options.newLine = /\r\n/.test(fs.readFileSync(__filename, 'utf8')) ? 'CRLF' : 'LF';
return options;
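
The `extends` branch above is a single-level, shallow merge: keys in the local compilerOptions win over keys inherited from the base config. A minimal standalone sketch of that resolution (the resolveCompilerOptions name is mine, not the build's):

import * as path from 'path';

// Illustrative single-level `extends` resolution, mirroring the branch above.
function resolveCompilerOptions(tsconfigPath: string): { [key: string]: any } {
    const tsconfig = require(tsconfigPath);
    if (!tsconfig.extends) {
        return tsconfig.compilerOptions;
    }
    const base = require(path.join(path.dirname(tsconfigPath), tsconfig.extends));
    // Object.assign is shallow: the whole local value replaces the inherited one per key.
    return Object.assign({}, base.compilerOptions, tsconfig.compilerOptions);
}
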
@@ -40,7 +50,7 @@ function createCompile(src: string, build: boolean, emitError?: boolean): (token
opts.inlineSources = !!build;
opts.noFilesystemLookup = true;
const ts = tsb.create(opts, null, null, err => reporter(err.toString()));
const ts = tsb.create(opts, true, undefined, err => reporter(err.toString()));
return function (token?: util.ICancellationToken) {
@@ -65,12 +75,19 @@ function createCompile(src: string, build: boolean, emitError?: boolean): (token
sourceRoot: opts.sourceRoot
}))
.pipe(tsFilter.restore)
.pipe(reporter.end(emitError));
.pipe(reporter.end(!!emitError));
return es.duplex(input, output);
};
}
const typesDts = [
'node_modules/typescript/lib/*.d.ts',
'node_modules/@types/**/*.d.ts',
'!node_modules/@types/webpack/**/*',
'!node_modules/@types/uglify-js/**/*',
];
export function compileTask(src: string, out: string, build: boolean): () => NodeJS.ReadWriteStream {
return function () {
@@ -78,18 +95,18 @@ export function compileTask(src: string, out: string, build: boolean): () => Nod
const srcPipe = es.merge(
gulp.src(`${src}/**`, { base: `${src}` }),
gulp.src('node_modules/typescript/lib/lib.d.ts'),
gulp.src(typesDts),
);
// Do not write .d.ts files to disk, as they are not needed there.
const dtsFilter = util.filter(data => !/\.d\.ts$/.test(data.path));
let generator = new MonacoGenerator(false);
if (src === 'src') {
generator.execute();
}
return srcPipe
.pipe(generator.stream)
.pipe(compile())
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(src !== 'src' ? es.through() : monacodtsTask(out, false));
.pipe(gulp.dest(out));
};
}
@@ -100,80 +117,128 @@ export function watchTask(out: string, build: boolean): () => NodeJS.ReadWriteSt
const src = es.merge(
gulp.src('src/**', { base: 'src' }),
gulp.src('node_modules/typescript/lib/lib.d.ts'),
gulp.src(typesDts),
);
const watchSrc = watch('src/**', { base: 'src' });
// Do not write .d.ts files to disk, as they are not needed there.
const dtsFilter = util.filter(data => !/\.d\.ts$/.test(data.path));
let generator = new MonacoGenerator(true);
generator.execute();
return watchSrc
.pipe(generator.stream)
.pipe(util.incremental(compile, src, true))
.pipe(dtsFilter)
.pipe(gulp.dest(out))
.pipe(dtsFilter.restore)
.pipe(monacodtsTask(out, true));
.pipe(gulp.dest(out));
};
}
function monacodtsTask(out: string, isWatch: boolean): NodeJS.ReadWriteStream {
const REPO_SRC_FOLDER = path.join(__dirname, '../../src');
const basePath = path.resolve(process.cwd(), out);
class MonacoGenerator {
private readonly _isWatch: boolean;
public readonly stream: NodeJS.ReadWriteStream;
const neededFiles: { [file: string]: boolean; } = {};
monacodts.getFilesToWatch(out).forEach(function (filePath) {
filePath = path.normalize(filePath);
neededFiles[filePath] = true;
});
private readonly _watchers: fs.FSWatcher[];
private readonly _watchedFiles: { [filePath: string]: boolean; };
private readonly _fsProvider: monacodts.FSProvider;
private readonly _declarationResolver: monacodts.DeclarationResolver;
const inputFiles: { [file: string]: string; } = {};
for (let filePath in neededFiles) {
if (/\bsrc(\/|\\)vs\b/.test(filePath)) {
// This file is needed from source => simply read it now
inputFiles[filePath] = fs.readFileSync(filePath).toString();
constructor(isWatch: boolean) {
this._isWatch = isWatch;
this.stream = es.through();
this._watchers = [];
this._watchedFiles = {};
let onWillReadFile = (moduleId: string, filePath: string) => {
if (!this._isWatch) {
return;
}
if (this._watchedFiles[filePath]) {
return;
}
this._watchedFiles[filePath] = true;
const watcher = fs.watch(filePath);
watcher.addListener('change', () => {
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
watcher.addListener('error', (err) => {
console.error(`Encountered error while watching ${filePath}.`);
console.log(err);
delete this._watchedFiles[filePath];
for (let i = 0; i < this._watchers.length; i++) {
if (this._watchers[i] === watcher) {
this._watchers.splice(i, 1);
break;
}
}
watcher.close();
this._declarationResolver.invalidateCache(moduleId);
this._executeSoon();
});
this._watchers.push(watcher);
};
this._fsProvider = new class extends monacodts.FSProvider {
public readFileSync(moduleId: string, filePath: string): Buffer {
onWillReadFile(moduleId, filePath);
return super.readFileSync(moduleId, filePath);
}
};
this._declarationResolver = new monacodts.DeclarationResolver(this._fsProvider);
if (this._isWatch) {
const recipeWatcher = fs.watch(monacodts.RECIPE_PATH);
recipeWatcher.addListener('change', () => {
this._executeSoon();
});
this._watchers.push(recipeWatcher);
}
}
const setInputFile = (filePath: string, contents: string) => {
if (inputFiles[filePath] === contents) {
// no change
private _executeSoonTimer: NodeJS.Timer | null = null;
private _executeSoon(): void {
if (this._executeSoonTimer !== null) {
clearTimeout(this._executeSoonTimer);
this._executeSoonTimer = null;
}
this._executeSoonTimer = setTimeout(() => {
this._executeSoonTimer = null;
this.execute();
}, 20);
}
public dispose(): void {
this._watchers.forEach(watcher => watcher.close());
}
private _run(): monacodts.IMonacoDeclarationResult | null {
let r = monacodts.run3(this._declarationResolver);
if (!r && !this._isWatch) {
// The build must always be able to generate the monaco.d.ts
throw new Error(`monaco.d.ts generation error - Cannot continue`);
}
return r;
}
private _log(message: any, ...rest: any[]): void {
fancyLog(ansiColors.cyan('[monaco.d.ts]'), message, ...rest);
}
public execute(): void {
const startTime = Date.now();
const result = this._run();
if (!result) {
// nothing really changed
return;
}
inputFiles[filePath] = contents;
const neededInputFilesCount = Object.keys(neededFiles).length;
const availableInputFilesCount = Object.keys(inputFiles).length;
if (neededInputFilesCount === availableInputFilesCount) {
run();
if (result.isTheSame) {
return;
}
};
const run = () => {
const result = monacodts.run(out, inputFiles);
if (!result.isTheSame) {
if (isWatch) {
fs.writeFileSync(result.filePath, result.content);
} else {
fs.writeFileSync(result.filePath, result.content);
resultStream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
fs.writeFileSync(result.filePath, result.content);
fs.writeFileSync(path.join(REPO_SRC_FOLDER, 'vs/editor/common/standalone/standaloneEnums.ts'), result.enums);
this._log(`monaco.d.ts is changed - total time took ${Date.now() - startTime} ms`);
if (!this._isWatch) {
this.stream.emit('error', 'monaco.d.ts is no longer up to date. Please run gulp watch and commit the new file.');
}
};
let resultStream: NodeJS.ReadWriteStream;
if (isWatch) {
watch('build/monaco/*').pipe(es.through(function () {
run();
}));
}
resultStream = es.through(function (data) {
const filePath = path.normalize(path.resolve(basePath, data.relative));
if (neededFiles[filePath]) {
setInputFile(filePath, data.contents.toString());
}
this.emit('data', data);
});
return resultStream;
}
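
The watch-mode plumbing above coalesces bursts of fs.watch events: _executeSoon resets a 20 ms timer on every change, so one regeneration runs per burst. A generic sketch of that trailing-edge debounce (names are illustrative, not the build's):

// Trailing-edge debounce, as _executeSoon implements it above (illustrative).
function debounce(fn: () => void, delayMs: number): () => void {
    let timer: NodeJS.Timer | null = null;
    return () => {
        if (timer !== null) {
            clearTimeout(timer); // a new event pushes the deadline back
        }
        timer = setTimeout(() => {
            timer = null;
            fn(); // runs once, after events stop arriving for delayMs
        }, delayMs);
    };
}
// e.g. const regenerate = debounce(() => generator.execute(), 20);
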

View File

@@ -11,6 +11,7 @@ const root = path.dirname(path.dirname(__dirname));
function getElectronVersion() {
const yarnrc = fs.readFileSync(path.join(root, '.yarnrc'), 'utf8');
// @ts-ignore
const target = /^target "(.*)"$/m.exec(yarnrc)[1];
return target;
@@ -19,6 +20,7 @@ function getElectronVersion() {
module.exports.getElectronVersion = getElectronVersion;
// returns 0 if the right version of electron is in .build/electron
// @ts-ignore
if (require.main === module) {
const version = getElectronVersion();
const versionFile = path.join(root, '.build', 'electron', 'version');
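
For reference, the regex /^target "(.*)"$/m expects the repo-root .yarnrc to pin Electron roughly like this (the version shown is illustrative, not the pinned one):

// Illustrative: the .yarnrc shape that getElectronVersion parses.
const sampleYarnrc = 'target "4.2.10"\nruntime "electron"\n'; // made-up version
const target = /^target "(.*)"$/m.exec(sampleYarnrc)![1];
console.log(target); // -> 4.2.10
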

View File

@@ -4,116 +4,319 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var es = require("event-stream");
var assign = require("object-assign");
var remote = require("gulp-remote-src");
var flatmap = require('gulp-flatmap');
var vzip = require('gulp-vinyl-zip');
var filter = require('gulp-filter');
var rename = require('gulp-rename');
var util = require('gulp-util');
var buffer = require('gulp-buffer');
var json = require('gulp-json-editor');
var fs = require("fs");
var path = require("path");
var vsce = require("vsce");
var File = require("vinyl");
function fromLocal(extensionPath) {
var result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
.then(function (fileNames) {
var files = fileNames
.map(function (fileName) { return path.join(extensionPath, fileName); })
.map(function (filePath) { return new File({
const es = require("event-stream");
const fs = require("fs");
const glob = require("glob");
const gulp = require("gulp");
const path = require("path");
const File = require("vinyl");
const vsce = require("vsce");
const stats_1 = require("./stats");
const util2 = require("./util");
const remote = require("gulp-remote-src");
const vzip = require('gulp-vinyl-zip');
const filter = require("gulp-filter");
const rename = require("gulp-rename");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const buffer = require('gulp-buffer');
const json = require("gulp-json-editor");
const webpack = require('webpack');
const webpackGulp = require('webpack-stream');
const root = path.resolve(path.join(__dirname, '..', '..'));
// {{SQL CARBON EDIT}}
const _ = require("underscore");
const vfs = require("vinyl-fs");
const deps = require('../dependencies');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
const packagePath = path.join(path.dirname(root), element.name + '.vsix');
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
exports.packageBuiltInExtensions = packageBuiltInExtensions;
function packageExtensionTask(extensionName, platform, arch) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', extensionName);
}
else {
destination = path.join(destination, 'resources', 'app', 'extensions', extensionName);
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '../..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => extensionName === name);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util2.skipDirectories())
.pipe(util2.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
exports.packageExtensionTask = packageExtensionTask;
// {{SQL CARBON EDIT}} - End
function fromLocal(extensionPath, sourceMappingURLBase) {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
if (fs.existsSync(webpackFilename)) {
return fromLocalWebpack(extensionPath, sourceMappingURLBase);
}
else {
return fromLocalNormal(extensionPath);
}
}
function fromLocalWebpack(extensionPath, sourceMappingURLBase) {
const result = es.through();
const packagedDependencies = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
}
}
}
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn, packagedDependencies }).then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath)
}); });
}));
const filesStream = es.readArray(files);
// check for webpack configuration files, then invoke webpack
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = glob.sync(path.join(extensionPath, '/**/extension.webpack.config.js'), { ignore: ['**/node_modules'] });
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json files next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => () => {
const webpackDone = (err, stats) => {
fancyLog(`Bundled extension: ${ansiColors.yellow(path.join(path.basename(extensionPath), path.relative(extensionPath, webpackConfigPath)))}...`);
if (err) {
result.emit('error', err);
}
const { compilation } = stats;
if (compilation.errors.length > 0) {
result.emit('error', compilation.errors.join('\n'));
}
if (compilation.warnings.length > 0) {
result.emit('error', compilation.warnings.join('\n'));
}
};
const webpackConfig = Object.assign({}, require(webpackConfigPath), { mode: 'production' });
const relativeOutputPath = path.relative(extensionPath, webpackConfig.output.path);
return webpackGulp(webpackConfig, webpack, webpackDone)
.pipe(es.through(function (data) {
data.stat = data.stat || {};
data.base = extensionPath;
this.emit('data', data);
}))
.pipe(es.through(function (data) {
// source map handling:
// * rewrite sourceMappingURL
// * save to disk so that upload-task picks this up
if (sourceMappingURLBase) {
const contents = data.contents.toString('utf8');
data.contents = Buffer.from(contents.replace(/\n\/\/# sourceMappingURL=(.*)$/gm, function (_m, g1) {
return `\n//# sourceMappingURL=${sourceMappingURLBase}/extensions/${path.basename(extensionPath)}/${relativeOutputPath}/${g1}`;
}), 'utf8');
if (/\.js\.map$/.test(data.path)) {
if (!fs.existsSync(path.dirname(data.path))) {
fs.mkdirSync(path.dirname(data.path));
}
fs.writeFileSync(data.path, data.contents);
}
}
this.emit('data', data);
}));
});
es.merge(sequence(webpackStreams), patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
// this.emit('data', data);
// }))
.pipe(result);
}).catch(err => {
console.error(extensionPath);
console.error(packagedDependencies);
result.emit('error', err);
});
return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
}
function fromLocalNormal(extensionPath) {
const result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
.then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath)
}));
es.readArray(files).pipe(result);
})
.catch(function (err) { return result.emit('error', err); });
return result;
.catch(err => result.emit('error', err));
return result.pipe(stats_1.createStatsStream(path.basename(extensionPath)));
}
exports.fromLocal = fromLocal;
function error(err) {
var result = es.through();
setTimeout(function () { return result.emit('error', err); });
return result;
}
var baseHeaders = {
const baseHeaders = {
'X-Market-Client-Id': 'VSCode Build',
'User-Agent': 'VSCode Build',
'X-Market-User-Id': '291C1CD0-051A-4123-9B4B-30D60EF52EE2',
};
function fromMarketplace(extensionName, version) {
var filterType = 7;
var value = extensionName;
var criterium = { filterType: filterType, value: value };
var criteria = [criterium];
var pageNumber = 1;
var pageSize = 1;
var sortBy = 0;
var sortOrder = 0;
var flags = 0x1 | 0x2 | 0x80;
var assetTypes = ['Microsoft.VisualStudio.Services.VSIXPackage'];
var filters = [{ criteria: criteria, pageNumber: pageNumber, pageSize: pageSize, sortBy: sortBy, sortOrder: sortOrder }];
var body = JSON.stringify({ filters: filters, assetTypes: assetTypes, flags: flags });
var headers = assign({}, baseHeaders, {
'Content-Type': 'application/json',
'Accept': 'application/json;api-version=3.0-preview.1',
'Content-Length': body.length
});
var options = {
base: 'https://marketplace.visualstudio.com/_apis/public/gallery',
function fromMarketplace(extensionName, version, metadata) {
const [publisher, name] = extensionName.split('.');
const url = `https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${publisher}/vsextensions/${name}/${version}/vspackage`;
fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: url,
requestOptions: {
method: 'POST',
gzip: true,
headers: headers,
body: body
headers: baseHeaders
}
};
return remote('/extensionquery', options)
.pipe(flatmap(function (stream, f) {
var rawResult = f.contents.toString('utf8');
var result = JSON.parse(rawResult);
var extension = result.results[0].extensions[0];
if (!extension) {
return error("No such extension: " + extension);
}
var metadata = {
id: extension.extensionId,
publisherId: extension.publisher,
publisherDisplayName: extension.publisher.displayName
};
var extensionVersion = extension.versions.filter(function (v) { return v.version === version; })[0];
if (!extensionVersion) {
return error("No such extension version: " + extensionName + " @ " + version);
}
var asset = extensionVersion.files.filter(function (f) { return f.assetType === 'Microsoft.VisualStudio.Services.VSIXPackage'; })[0];
if (!asset) {
return error("No VSIX found for extension version: " + extensionName + " @ " + version);
}
util.log('Downloading extension:', util.colors.yellow(extensionName + "@" + version), '...');
var options = {
base: asset.source,
requestOptions: {
gzip: true,
headers: baseHeaders
}
};
return remote('', options)
.pipe(flatmap(function (stream) {
var packageJsonFilter = filter('package.json', { restore: true });
return stream
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(function (p) { return p.dirname = p.dirname.replace(/^extension\/?/, ''); }))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}));
}));
const packageJsonFilter = filter('package.json', { restore: true });
return remote('', options)
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}
exports.fromMarketplace = fromMarketplace;
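
The rewrite drops the old POST to /extensionquery and instead GETs the gallery's direct vspackage endpoint, then unzips the vsix in-stream. A hedged usage sketch (publisher, name, version, and metadata below are illustrative):

// Illustrative only; the extension id/version/metadata are made up.
fromMarketplace('ms-vscode.cpptools', '0.22.1', { id: '00000000-0000-0000-0000-000000000000' })
    .pipe(rename(p => p.dirname = `extensions/cpptools/${p.dirname}`))
    .pipe(gulp.dest('.build/extensions'));
// Resolves to a gzip'd GET of:
// https://marketplace.visualstudio.com/_apis/public/gallery/publishers/ms-vscode/vsextensions/cpptools/0.22.1/vspackage
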
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
// {{SQL CARBON EDIT}}
'integration-tests'
];
// {{SQL CARBON EDIT}}
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// these extensions are excluded from the SQLOps package and are shipped as separate vsix packages
'admin-tool-ext-win',
'agent',
'import',
'profiler',
'admin-pack',
'big-data-cluster',
'dacpac'
];
var azureExtensions = ['azurecore', 'mssql'];
const builtInExtensions = require('../builtInExtensions.json');
/**
* We're doing way too much stuff at once, with webpack et al. So much stuff
* that while downloading extensions from the marketplace, node js doesn't get enough
* stack frames to complete the download in under 2 minutes, at which point the
* marketplace server cuts off the http request. So, we sequentialize the extension tasks.
*/
function sequence(streamProviders) {
const result = es.through();
function pop() {
if (streamProviders.length === 0) {
result.emit('end');
}
else {
const fn = streamProviders.shift();
fn()
.on('end', function () { setTimeout(pop, 0); })
.pipe(result, { end: false });
}
}
pop();
return result;
}
function packageExtensionsStream(optsIn) {
const opts = optsIn || {};
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1)
.filter(({ name }) => azureExtensions.indexOf(name) === -1);
const localExtensions = () => sequence([...localExtensionDescriptions.map(extension => () => {
return fromLocal(extension.path, opts.sourceMappingURLBase)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
})]);
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map((d) => path.relative(root, d.path)).map((d) => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = () => gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util2.cleanNodeModule('account-provider-azure', ['node_modules/date-utils/doc/**', 'node_modules/adal_node/node_modules/**'], undefined))
.pipe(util2.cleanNodeModule('typescript', ['**/**'], undefined));
// Original code commented out here
// const localExtensionDependencies = () => gulp.src('extensions/node_modules/**', { base: '.' });
// const marketplaceExtensions = () => es.merge(
// ...builtInExtensions
// .filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
// .map(extension => {
// return fromMarketplace(extension.name, extension.version, extension.metadata)
// .pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
// })
// );
return sequence([localExtensions, localExtensionDependencies,])
.pipe(util2.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
// {{SQL CARBON EDIT}} - End
}
exports.packageExtensionsStream = packageExtensionsStream;

View File

@@ -4,22 +4,222 @@
*--------------------------------------------------------------------------------------------*/
import * as es from 'event-stream';
import { Stream } from 'stream';
import assign = require('object-assign');
import remote = require('gulp-remote-src');
const flatmap = require('gulp-flatmap');
const vzip = require('gulp-vinyl-zip');
const filter = require('gulp-filter');
const rename = require('gulp-rename');
const util = require('gulp-util');
const buffer = require('gulp-buffer');
const json = require('gulp-json-editor');
import * as fs from 'fs';
import * as glob from 'glob';
import * as gulp from 'gulp';
import * as path from 'path';
import * as vsce from 'vsce';
import { Stream } from 'stream';
import * as File from 'vinyl';
import * as vsce from 'vsce';
import { createStatsStream } from './stats';
import * as util2 from './util';
import remote = require('gulp-remote-src');
const vzip = require('gulp-vinyl-zip');
import filter = require('gulp-filter');
import rename = require('gulp-rename');
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
const buffer = require('gulp-buffer');
import json = require('gulp-json-editor');
const webpack = require('webpack');
const webpackGulp = require('webpack-stream');
export function fromLocal(extensionPath: string): Stream {
const root = path.resolve(path.join(__dirname, '..', '..'));
// {{SQL CARBON EDIT}}
import * as _ from 'underscore';
import * as vfs from 'vinyl-fs';
const deps = require('../dependencies');
const extensionsRoot = path.join(root, 'extensions');
const extensionsProductionDependencies = deps.getProductionDependencies(extensionsRoot);
export function packageBuiltInExtensions() {
const sqlBuiltInLocalExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) >= 0);
sqlBuiltInLocalExtensionDescriptions.forEach(element => {
const packagePath = path.join(path.dirname(root), element.name + '.vsix');
console.info('Creating vsix for ' + element.path + ' result:' + packagePath);
vsce.createVSIX({
cwd: element.path,
packagePath: packagePath,
useYarn: true
});
});
}
export function packageExtensionTask(extensionName: string, platform: string, arch: string) {
var destination = path.join(path.dirname(root), 'azuredatastudio') + (platform ? '-' + platform : '') + (arch ? '-' + arch : '');
if (platform === 'darwin') {
destination = path.join(destination, 'Azure Data Studio.app', 'Contents', 'Resources', 'app', 'extensions', extensionName);
} else {
destination = path.join(destination, 'resources', 'app', 'extensions', extensionName);
}
platform = platform || process.platform;
return () => {
const root = path.resolve(path.join(__dirname, '../..'));
const localExtensionDescriptions = glob.sync('extensions/*/package.json')
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => extensionName === name);
const localExtensions = es.merge(...localExtensionDescriptions.map(extension => {
return fromLocal(extension.path);
}));
let result = localExtensions
.pipe(util2.skipDirectories())
.pipe(util2.fixWin32DirectoryPermissions())
.pipe(filter(['**', '!LICENSE', '!LICENSES.chromium.html', '!version']));
return result.pipe(vfs.dest(destination));
};
}
// {{SQL CARBON EDIT}} - End
function fromLocal(extensionPath: string, sourceMappingURLBase?: string): Stream {
const webpackFilename = path.join(extensionPath, 'extension.webpack.config.js');
if (fs.existsSync(webpackFilename)) {
return fromLocalWebpack(extensionPath, sourceMappingURLBase);
} else {
return fromLocalNormal(extensionPath);
}
}
function fromLocalWebpack(extensionPath: string, sourceMappingURLBase: string | undefined): Stream {
const result = es.through();
const packagedDependencies: string[] = [];
const packageJsonConfig = require(path.join(extensionPath, 'package.json'));
if (packageJsonConfig.dependencies) {
const webpackRootConfig = require(path.join(extensionPath, 'extension.webpack.config.js'));
for (const key in webpackRootConfig.externals) {
if (key in packageJsonConfig.dependencies) {
packagedDependencies.push(key);
}
}
}
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn, packagedDependencies }).then(fileNames => {
const files = fileNames
.map(fileName => path.join(extensionPath, fileName))
.map(filePath => new File({
path: filePath,
stat: fs.statSync(filePath),
base: extensionPath,
contents: fs.createReadStream(filePath) as any
}));
const filesStream = es.readArray(files);
// check for webpack configuration files, then invoke webpack
// and merge its output with the files stream. also rewrite the package.json
// file to a new entry point
const webpackConfigLocations = (<string[]>glob.sync(
path.join(extensionPath, '/**/extension.webpack.config.js'),
{ ignore: ['**/node_modules'] }
));
const packageJsonFilter = filter(f => {
if (path.basename(f.path) === 'package.json') {
// only modify package.json files next to the webpack file.
// to be safe, use existsSync instead of path comparison.
return fs.existsSync(path.join(path.dirname(f.path), 'extension.webpack.config.js'));
}
return false;
}, { restore: true });
const patchFilesStream = filesStream
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json((data: any) => {
if (data.main) {
// hardcoded entry point directory!
data.main = data.main.replace('/out/', '/dist/');
}
return data;
}))
.pipe(packageJsonFilter.restore);
const webpackStreams = webpackConfigLocations.map(webpackConfigPath => () => {
const webpackDone = (err: any, stats: any) => {
fancyLog(`Bundled extension: ${ansiColors.yellow(path.join(path.basename(extensionPath), path.relative(extensionPath, webpackConfigPath)))}...`);
if (err) {
result.emit('error', err);
}
const { compilation } = stats;
if (compilation.errors.length > 0) {
result.emit('error', compilation.errors.join('\n'));
}
if (compilation.warnings.length > 0) {
result.emit('error', compilation.warnings.join('\n'));
}
};
const webpackConfig = {
...require(webpackConfigPath),
...{ mode: 'production' }
};
const relativeOutputPath = path.relative(extensionPath, webpackConfig.output.path);
return webpackGulp(webpackConfig, webpack, webpackDone)
.pipe(es.through(function (data) {
data.stat = data.stat || {};
data.base = extensionPath;
this.emit('data', data);
}))
.pipe(es.through(function (data: File) {
// source map handling:
// * rewrite sourceMappingURL
// * save to disk so that upload-task picks this up
if (sourceMappingURLBase) {
const contents = (<Buffer>data.contents).toString('utf8');
data.contents = Buffer.from(contents.replace(/\n\/\/# sourceMappingURL=(.*)$/gm, function (_m, g1) {
return `\n//# sourceMappingURL=${sourceMappingURLBase}/extensions/${path.basename(extensionPath)}/${relativeOutputPath}/${g1}`;
}), 'utf8');
if (/\.js\.map$/.test(data.path)) {
if (!fs.existsSync(path.dirname(data.path))) {
fs.mkdirSync(path.dirname(data.path));
}
fs.writeFileSync(data.path, data.contents);
}
}
this.emit('data', data);
}));
});
es.merge(sequence(webpackStreams), patchFilesStream)
// .pipe(es.through(function (data) {
// // debug
// console.log('out', data.path, data.contents.length);
// this.emit('data', data);
// }))
.pipe(result);
}).catch(err => {
console.error(extensionPath);
console.error(packagedDependencies);
result.emit('error', err);
});
return result.pipe(createStatsStream(path.basename(extensionPath)));
}
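
Note how fromLocalWebpack builds packagedDependencies: any key of the webpack config's externals that is also a runtime dependency is kept in node_modules rather than bundled. An illustrative extension.webpack.config.js (the native-module name is hypothetical):

// extension.webpack.config.js (illustrative sketch, not from this repo)
module.exports = {
    entry: './src/extension.ts',
    target: 'node',
    output: { path: __dirname + '/dist', filename: 'extension.js', libraryTarget: 'commonjs' },
    externals: {
        vscode: 'commonjs vscode', // provided by the host at runtime, never bundled
        keytar: 'commonjs keytar'  // hypothetical native dep; if it is also in package.json
                                   // dependencies, it becomes a packagedDependency
    }
};
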
function fromLocalNormal(extensionPath: string): Stream {
const result = es.through();
vsce.listFiles({ cwd: extensionPath, packageManager: vsce.PackageManager.Yarn })
@@ -37,13 +237,7 @@ export function fromLocal(extensionPath: string): Stream {
})
.catch(err => result.emit('error', err));
return result;
}
function error(err: any): Stream {
const result = es.through();
setTimeout(() => result.emit('error', err));
return result;
return result.pipe(createStatsStream(path.basename(extensionPath)));
}
const baseHeaders = {
@@ -52,82 +246,142 @@ const baseHeaders = {
'X-Market-User-Id': '291C1CD0-051A-4123-9B4B-30D60EF52EE2',
};
export function fromMarketplace(extensionName: string, version: string): Stream {
const filterType = 7;
const value = extensionName;
const criterium = { filterType, value };
const criteria = [criterium];
const pageNumber = 1;
const pageSize = 1;
const sortBy = 0;
const sortOrder = 0;
const flags = 0x1 | 0x2 | 0x80;
const assetTypes = ['Microsoft.VisualStudio.Services.VSIXPackage'];
const filters = [{ criteria, pageNumber, pageSize, sortBy, sortOrder }];
const body = JSON.stringify({ filters, assetTypes, flags });
const headers: any = assign({}, baseHeaders, {
'Content-Type': 'application/json',
'Accept': 'application/json;api-version=3.0-preview.1',
'Content-Length': body.length
});
export function fromMarketplace(extensionName: string, version: string, metadata: any): Stream {
const [publisher, name] = extensionName.split('.');
const url = `https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${publisher}/vsextensions/${name}/${version}/vspackage`;
fancyLog('Downloading extension:', ansiColors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: 'https://marketplace.visualstudio.com/_apis/public/gallery',
base: url,
requestOptions: {
method: 'POST',
gzip: true,
headers,
body: body
headers: baseHeaders
}
};
return remote('/extensionquery', options)
.pipe(flatmap((stream, f) => {
const rawResult = f.contents.toString('utf8');
const result = JSON.parse(rawResult);
const extension = result.results[0].extensions[0];
if (!extension) {
return error(`No such extension: ${extension}`);
}
const packageJsonFilter = filter('package.json', { restore: true });
const metadata = {
id: extension.extensionId,
publisherId: extension.publisher,
publisherDisplayName: extension.publisher.displayName
};
const extensionVersion = extension.versions.filter(v => v.version === version)[0];
if (!extensionVersion) {
return error(`No such extension version: ${extensionName} @ ${version}`);
}
const asset = extensionVersion.files.filter(f => f.assetType === 'Microsoft.VisualStudio.Services.VSIXPackage')[0];
if (!asset) {
return error(`No VSIX found for extension version: ${extensionName} @ ${version}`);
}
util.log('Downloading extension:', util.colors.yellow(`${extensionName}@${version}`), '...');
const options = {
base: asset.source,
requestOptions: {
gzip: true,
headers: baseHeaders
}
};
return remote('', options)
.pipe(flatmap(stream => {
const packageJsonFilter = filter('package.json', { restore: true });
return stream
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}));
}));
return remote('', options)
.pipe(vzip.src())
.pipe(filter('extension/**'))
.pipe(rename(p => p.dirname = p.dirname!.replace(/^extension\/?/, '')))
.pipe(packageJsonFilter)
.pipe(buffer())
.pipe(json({ __metadata: metadata }))
.pipe(packageJsonFilter.restore);
}
interface IPackageExtensionsOptions {
/**
* Set to undefined to package all of them.
*/
desiredExtensions?: string[];
sourceMappingURLBase?: string;
}
const excludedExtensions = [
'vscode-api-tests',
'vscode-colorize-tests',
'ms-vscode.node-debug',
'ms-vscode.node-debug2',
// {{SQL CARBON EDIT}}
'integration-tests'
];
// {{SQL CARBON EDIT}}
const sqlBuiltInExtensions = [
// Add SQL built-in extensions here.
// these extensions are excluded from the SQLOps package and are shipped as separate vsix packages
'admin-tool-ext-win',
'agent',
'import',
'profiler',
'admin-pack',
'big-data-cluster',
'dacpac'
];
var azureExtensions = ['azurecore', 'mssql'];
// {{SQL CARBON EDIT}} - End
interface IBuiltInExtension {
name: string;
version: string;
repo: string;
metadata: any;
}
const builtInExtensions: IBuiltInExtension[] = require('../builtInExtensions.json');
/**
* We're doing way too much stuff at once, with webpack et al. So much stuff
* that while downloading extensions from the marketplace, node js doesn't get enough
* stack frames to complete the download in under 2 minutes, at which point the
* marketplace server cuts off the http request. So, we sequentialize the extension tasks.
*/
function sequence(streamProviders: { (): Stream }[]): Stream {
const result = es.through();
function pop() {
if (streamProviders.length === 0) {
result.emit('end');
} else {
const fn = streamProviders.shift()!;
fn()
.on('end', function () { setTimeout(pop, 0); })
.pipe(result, { end: false });
}
}
pop();
return result;
}
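
A hedged usage sketch of sequence (the stream factories shown are illustrative): each provider starts only after the previous stream ends, everything funnels into one output with { end: false }, and setTimeout(pop, 0) yields to the event loop between streams.

// Illustrative: three marketplace downloads run strictly one at a time.
const all = sequence([
    () => fromMarketplace('pub.ext-a', '1.0.0', {}), // ids/versions made up
    () => fromMarketplace('pub.ext-b', '2.1.0', {}),
    () => fromMarketplace('pub.ext-c', '0.3.2', {}),
]);
all.pipe(gulp.dest('.build/extensions'));
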
export function packageExtensionsStream(optsIn?: IPackageExtensionsOptions): NodeJS.ReadWriteStream {
const opts = optsIn || {};
const localExtensionDescriptions = (<string[]>glob.sync('extensions/*/package.json'))
.map(manifestPath => {
const extensionPath = path.dirname(path.join(root, manifestPath));
const extensionName = path.basename(extensionPath);
return { name: extensionName, path: extensionPath };
})
.filter(({ name }) => excludedExtensions.indexOf(name) === -1)
.filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
.filter(({ name }) => builtInExtensions.every(b => b.name !== name))
// {{SQL CARBON EDIT}}
.filter(({ name }) => sqlBuiltInExtensions.indexOf(name) === -1)
.filter(({ name }) => azureExtensions.indexOf(name) === -1);
const localExtensions = () => sequence([...localExtensionDescriptions.map(extension => () => {
return fromLocal(extension.path, opts.sourceMappingURLBase)
.pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
})]);
// {{SQL CARBON EDIT}}
const extensionDepsSrc = [
..._.flatten(extensionsProductionDependencies.map((d: any) => path.relative(root, d.path)).map((d: any) => [`${d}/**`, `!${d}/**/{test,tests}/**`])),
];
const localExtensionDependencies = () => gulp.src(extensionDepsSrc, { base: '.', dot: true })
.pipe(filter(['**', '!**/package-lock.json']))
.pipe(util2.cleanNodeModule('account-provider-azure', ['node_modules/date-utils/doc/**', 'node_modules/adal_node/node_modules/**'], undefined))
.pipe(util2.cleanNodeModule('typescript', ['**/**'], undefined));
// Original code commented out here
// const localExtensionDependencies = () => gulp.src('extensions/node_modules/**', { base: '.' });
// const marketplaceExtensions = () => es.merge(
// ...builtInExtensions
// .filter(({ name }) => opts.desiredExtensions ? opts.desiredExtensions.indexOf(name) >= 0 : true)
// .map(extension => {
// return fromMarketplace(extension.name, extension.version, extension.metadata)
// .pipe(rename(p => p.dirname = `extensions/${extension.name}/${p.dirname}`));
// })
// );
return sequence([localExtensions, localExtensionDependencies, /*marketplaceExtensions*/])
.pipe(util2.setExecutableBit(['**/*.sh']))
.pipe(filter(['**', '!**/*.js.map']));
// {{SQL CARBON EDIT}} - End
}
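
A hedged invocation sketch for packageExtensionsStream (extension names are illustrative): desiredExtensions narrows the set; otherwise every non-excluded local extension is packaged.

// Illustrative: package a subset of local extensions into .build.
packageExtensionsStream({ desiredExtensions: ['my-ext-a', 'my-ext-b'] }) // names made up
    .pipe(vfs.dest('.build'));
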

View File

@@ -4,47 +4,47 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var fs = require("fs");
const path = require("path");
const fs = require("fs");
/**
* Returns the sha1 commit version of a repository or undefined in case of failure.
*/
function getVersion(repo) {
var git = path.join(repo, '.git');
var headPath = path.join(git, 'HEAD');
var head;
const git = path.join(repo, '.git');
const headPath = path.join(git, 'HEAD');
let head;
try {
head = fs.readFileSync(headPath, 'utf8').trim();
}
catch (e) {
return void 0;
return undefined;
}
if (/^[0-9a-f]{40}$/i.test(head)) {
return head;
}
var refMatch = /^ref: (.*)$/.exec(head);
const refMatch = /^ref: (.*)$/.exec(head);
if (!refMatch) {
return void 0;
return undefined;
}
var ref = refMatch[1];
var refPath = path.join(git, ref);
const ref = refMatch[1];
const refPath = path.join(git, ref);
try {
return fs.readFileSync(refPath, 'utf8').trim();
}
catch (e) {
// noop
}
var packedRefsPath = path.join(git, 'packed-refs');
var refsRaw;
const packedRefsPath = path.join(git, 'packed-refs');
let refsRaw;
try {
refsRaw = fs.readFileSync(packedRefsPath, 'utf8').trim();
}
catch (e) {
return void 0;
return undefined;
}
var refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
var refsMatch;
var refs = {};
const refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
let refsMatch;
let refs = {};
while (refsMatch = refsRegex.exec(refsRaw)) {
refs[refsMatch[2]] = refsMatch[1];
}
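
getVersion resolves the commit three ways: a detached HEAD is the sha itself; a symbolic ref is read from .git/<ref>; failing that, the ref is looked up in .git/packed-refs. A hedged usage sketch (the import path is assumed):

// Illustrative caller; returns undefined outside a git checkout.
import { getVersion } from './lib/getVersion'; // path assumed

const commit = getVersion(process.cwd());
console.log(commit ? `building commit ${commit.substr(0, 7)}` : 'no git metadata found');
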

View File

@@ -10,7 +10,7 @@ import * as fs from 'fs';
/**
* Returns the sha1 commit version of a repository or undefined in case of failure.
*/
export function getVersion(repo: string): string {
export function getVersion(repo: string): string | undefined {
const git = path.join(repo, '.git');
const headPath = path.join(git, 'HEAD');
let head: string;
@@ -18,7 +18,7 @@ export function getVersion(repo: string): string {
try {
head = fs.readFileSync(headPath, 'utf8').trim();
} catch (e) {
return void 0;
return undefined;
}
if (/^[0-9a-f]{40}$/i.test(head)) {
@@ -28,7 +28,7 @@ export function getVersion(repo: string): string {
const refMatch = /^ref: (.*)$/.exec(head);
if (!refMatch) {
return void 0;
return undefined;
}
const ref = refMatch[1];
@@ -46,11 +46,11 @@ export function getVersion(repo: string): string {
try {
refsRaw = fs.readFileSync(packedRefsPath, 'utf8').trim();
} catch (e) {
return void 0;
return undefined;
}
const refsRegex = /^([0-9a-f]{40})\s+(.+)$/gm;
let refsMatch: RegExpExecArray;
let refsMatch: RegExpExecArray | null;
let refs: { [ref: string]: string } = {};
while (refsMatch = refsRegex.exec(refsRaw)) {

File diff suppressed because it is too large

View File

@@ -27,135 +27,155 @@
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/cli",
"name": "vs/workbench/api/common",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/codeEditor",
"name": "vs/workbench/contrib/cli",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/comments",
"name": "vs/workbench/contrib/codeEditor",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/debug",
"name": "vs/workbench/contrib/codeinset",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/emmet",
"name": "vs/workbench/contrib/callHierarchy",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/execution",
"name": "vs/workbench/contrib/comments",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/extensions",
"name": "vs/workbench/contrib/debug",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/feedback",
"name": "vs/workbench/contrib/emmet",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/files",
"name": "vs/workbench/contrib/extensions",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/html",
"name": "vs/workbench/contrib/externalTerminal",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/markers",
"name": "vs/workbench/contrib/feedback",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/localizations",
"name": "vs/workbench/contrib/files",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/logs",
"name": "vs/workbench/contrib/html",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/navigation",
"name": "vs/workbench/contrib/issue",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/output",
"name": "vs/workbench/contrib/markers",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/performance",
"name": "vs/workbench/contrib/localizations",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/preferences",
"name": "vs/workbench/contrib/logs",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/quickopen",
"name": "vs/workbench/contrib/output",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/relauncher",
"name": "vs/workbench/contrib/performance",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/scm",
"name": "vs/workbench/contrib/preferences",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/search",
"name": "vs/workbench/contrib/quickopen",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/snippets",
"name": "vs/workbench/contrib/relauncher",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/surveys",
"name": "vs/workbench/contrib/scm",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/tasks",
"name": "vs/workbench/contrib/search",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/terminal",
"name": "vs/workbench/contrib/snippets",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/themes",
"name": "vs/workbench/contrib/format",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/trust",
"name": "vs/workbench/contrib/stats",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/update",
"name": "vs/workbench/contrib/surveys",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/url",
"name": "vs/workbench/contrib/tasks",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/watermark",
"name": "vs/workbench/contrib/terminal",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/webview",
"name": "vs/workbench/contrib/themes",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/welcome",
"name": "vs/workbench/contrib/trust",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/parts/outline",
"name": "vs/workbench/contrib/update",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/url",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/watermark",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/webview",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/welcome",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/contrib/outline",
"project": "vscode-workbench"
},
{
@@ -166,6 +186,10 @@
"name": "vs/workbench/services/bulkEdit",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/commands",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/configuration",
"project": "vscode-workbench"
@@ -191,13 +215,21 @@
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/jsonschemas",
"name": "vs/workbench/services/extensionManagement",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/files",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/files2",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/integrity",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/keybinding",
"project": "vscode-workbench"
@@ -210,6 +242,10 @@
"name": "vs/workbench/services/progress",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/remote",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/textfile",
"project": "vscode-workbench"
@@ -230,6 +266,10 @@
"name": "vs/workbench/services/decorations",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/label",
"project": "vscode-workbench"
},
{
"name": "vs/workbench/services/preferences",
"project": "vscode-preferences"

View File

@@ -7,25 +7,25 @@ import * as path from 'path';
import * as fs from 'fs';
import { through, readable, ThroughStream } from 'event-stream';
import File = require('vinyl');
import * as File from 'vinyl';
import * as Is from 'is';
import * as xml2js from 'xml2js';
import * as glob from 'glob';
import * as https from 'https';
import * as gulp from 'gulp';
var util = require('gulp-util');
var iconv = require('iconv-lite');
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as iconv from 'iconv-lite';
const NUMBER_OF_CONCURRENT_DOWNLOADS = 4;
function log(message: any, ...rest: any[]): void {
util.log(util.colors.green('[i18n]'), message, ...rest);
fancyLog(ansiColors.green('[i18n]'), message, ...rest);
}
export interface Language {
id: string; // language id, e.g. zh-tw, de
transifexId?: string; // language id used in transifex, e.g. zh-hant, de (optional, if not set, the id is used)
translationId?: string; // language id used in translation tools, e.g. zh-hant, de (optional, if not set, the id is used)
folderName?: string; // language specific folder name, e.g. cht, deu (optional, if not set, the id is used)
}
@@ -38,8 +38,8 @@ export interface InnoSetup {
}
export const defaultLanguages: Language[] = [
{ id: 'zh-tw', folderName: 'cht', transifexId: 'zh-hant' },
{ id: 'zh-cn', folderName: 'chs', transifexId: 'zh-hans' },
{ id: 'zh-tw', folderName: 'cht', translationId: 'zh-hant' },
{ id: 'zh-cn', folderName: 'chs', translationId: 'zh-hans' },
{ id: 'ja', folderName: 'jpn' },
{ id: 'ko', folderName: 'kor' },
{ id: 'de', folderName: 'deu' },
@@ -57,7 +57,7 @@ export const extraLanguages: Language[] = [
];
// non-built-in extensions that are on Transifex and need to be part of the language packs
const externalExtensionsWithTranslations = {
export const externalExtensionsWithTranslations = {
'vscode-chrome-debug': 'msjsdiag.debugger-for-chrome',
'vscode-node-debug': 'ms-vscode.node-debug',
'vscode-node-debug2': 'ms-vscode.node-debug2'
@@ -71,7 +71,7 @@ interface Map<V> {
interface Item {
id: string;
message: string;
comment: string;
comment?: string;
}
export interface Resource {
@@ -137,27 +137,6 @@ module PackageJsonFormat {
}
}
interface ModuleJsonFormat {
messages: string[];
keys: (string | LocalizeInfo)[];
}
module ModuleJsonFormat {
export function is(value: any): value is ModuleJsonFormat {
let candidate = value as ModuleJsonFormat;
return Is.defined(candidate)
&& Is.array(candidate.messages) && candidate.messages.every(message => Is.string(message))
&& Is.array(candidate.keys) && candidate.keys.every(key => Is.string(key) || LocalizeInfo.is(key));
}
}
interface BundledExtensionHeaderFormat {
id: string;
type: string;
hash: string;
outDir: string;
}
interface BundledExtensionFormat {
[key: string]: {
messages: string[];
@@ -165,10 +144,19 @@ interface BundledExtensionFormat {
};
}
interface I18nFormat {
version: string;
contents: {
[module: string]: {
[messageKey: string]: string;
};
};
}
export class Line {
private buffer: string[] = [];
constructor(private indent: number = 0) {
constructor(indent: number = 0) {
if (indent > 0) {
this.buffer.push(new Array(indent + 1).join(' '));
}
@@ -235,8 +223,8 @@ export class XLF {
let existingKeys = new Set<string>();
for (let i = 0; i < keys.length; i++) {
let key = keys[i];
let realKey: string;
let comment: string;
let realKey: string | undefined;
let comment: string | undefined;
if (Is.string(key)) {
realKey = key;
comment = undefined;
@@ -286,17 +274,17 @@ export class XLF {
}
static parsePseudo = function (xlfString: string): Promise<ParsedXLF[]> {
return new Promise((resolve, reject) => {
return new Promise((resolve) => {
let parser = new xml2js.Parser();
let files: { messages: Map<string>, originalFilePath: string, language: string }[] = [];
parser.parseString(xlfString, function (err, result) {
parser.parseString(xlfString, function (_err: any, result: any) {
const fileNodes: any[] = result['xliff']['file'];
fileNodes.forEach(file => {
const originalFilePath = file.$.original;
const messages: Map<string> = {};
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach(unit => {
transUnits.forEach((unit: any) => {
const key = unit.$.id;
const val = pseudify(unit.source[0]['_'].toString());
if (key && val) {
@@ -317,7 +305,7 @@ export class XLF {
let files: { messages: Map<string>, originalFilePath: string, language: string }[] = [];
parser.parseString(xlfString, function (err, result) {
parser.parseString(xlfString, function (err: any, result: any) {
if (err) {
reject(new Error(`XLF parsing error: Failed to parse XLIFF string. ${err}`));
}
@@ -340,17 +328,20 @@ export class XLF {
const transUnits = file.body[0]['trans-unit'];
if (transUnits) {
transUnits.forEach(unit => {
transUnits.forEach((unit: any) => {
const key = unit.$.id;
if (!unit.target) {
return; // No translation available
}
const val = unit.target.toString();
let val = unit.target[0];
if (typeof val !== 'string') {
val = val._;
}
if (key && val) {
messages[key] = decodeEntities(val);
} else {
reject(new Error(`XLF parsing error: XLIFF file does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
reject(new Error(`XLF parsing error: XLIFF file ${originalFilePath} does not contain full localization data. ID or target translation for one of the trans-unit nodes is not present.`));
}
});
files.push({ messages: messages, originalFilePath: originalFilePath, language: language.toLowerCase() });
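
The unit.target[0] handling above reflects xml2js defaults: a bare text node parses to a string, while a node carrying attributes parses to an object whose text sits under _ (attributes under $). An illustrative sketch of both shapes:

// Illustrative xml2js results for <target> nodes (not from the build):
const plain: any = 'Hallo';                                        // <target>Hallo</target>
const withAttrs: any = { _: 'Hallo', $: { state: 'translated' } }; // <target state="translated">Hallo</target>
for (let val of [plain, withAttrs]) {
    if (typeof val !== 'string') {
        val = val._; // unwrap the character content
    }
    console.log(val); // both print 'Hallo'
}
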
@@ -369,7 +360,7 @@ export interface ITask<T> {
interface ILimitedTaskFactory<T> {
factory: ITask<Promise<T>>;
c: (value?: T | Thenable<T>) => void;
c: (value?: T | Promise<T>) => void;
e: (error?: any) => void;
}
@@ -391,7 +382,7 @@ export class Limiter<T> {
private consume(): void {
while (this.outstandingPromises.length && this.runningPromises < this.maxDegreeOfParalellism) {
const iLimitedTask = this.outstandingPromises.shift();
const iLimitedTask = this.outstandingPromises.shift()!;
this.runningPromises++;
const promise = iLimitedTask.factory();
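As a usage sketch (the URLs and fetchText are hypothetical stand-ins for any promise-returning task), Limiter caps how many factories run concurrently while queue still hands back a per-task promise:

declare function fetchText(url: string): Promise<string>; // hypothetical task
const urls = ['https://example.com/a.xlf', 'https://example.com/b.xlf'];
const limiter = new Limiter<string>(4);
const all = Promise.all(urls.map(url => limiter.queue(() => fetchText(url)))); // at most 4 in flight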
@@ -419,8 +410,8 @@ function stripComments(content: string): string {
* Third matches block comments
* Fourth matches line comments
*/
var regexp: RegExp = /("(?:[^\\\"]*(?:\\.)?)*")|('(?:[^\\\']*(?:\\.)?)*')|(\/\*(?:\r?\n|.)*?\*\/)|(\/{2,}.*?(?:(?:\r?\n)|$))/g;
let result = content.replace(regexp, (match, m1, m2, m3, m4) => {
const regexp = /("(?:[^\\\"]*(?:\\.)?)*")|('(?:[^\\\']*(?:\\.)?)*')|(\/\*(?:\r?\n|.)*?\*\/)|(\/{2,}.*?(?:(?:\r?\n)|$))/g;
let result = content.replace(regexp, (match, _m1, _m2, m3, m4) => {
// Only one of m1, m2, m3, m4 matches
if (m3) {
// A block comment. Replace with nothing
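Because the string alternatives are tried before the comment alternatives, comment-like text inside quotes survives; a small illustration with assumed input:

// stripComments('{ "url": "http://host" /* remark */ }')
//   -> '{ "url": "http://host"  }'   (block comment dropped, the quoted // left intact)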
@@ -442,9 +433,9 @@ function stripComments(content: string): string {
}
function escapeCharacters(value: string): string {
var result: string[] = [];
for (var i = 0; i < value.length; i++) {
var ch = value.charAt(i);
const result: string[] = [];
for (let i = 0; i < value.length; i++) {
const ch = value.charAt(i);
switch (ch) {
case '\'':
result.push('\\\'');
@@ -484,7 +475,6 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
let statistics: Map<number> = Object.create(null);
let total: number = 0;
let defaultMessages: Map<Map<string>> = Object.create(null);
let modules = Object.keys(keysSection);
modules.forEach((module) => {
@@ -497,7 +487,6 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
let messageMap: Map<string> = Object.create(null);
defaultMessages[module] = messageMap;
keys.map((key, i) => {
total++;
if (typeof key === 'string') {
messageMap[key] = messages[i];
} else {
@@ -506,7 +495,11 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
});
});
let languageDirectory = path.join(__dirname, '..', '..', 'i18n');
let languageDirectory = path.join(__dirname, '..', '..', '..', 'vscode-loc', 'i18n');
if (!fs.existsSync(languageDirectory)) {
log(`No VS Code localization repository found. Looking at ${languageDirectory}`);
log(`To bundle translations please check out the vscode-loc repository as a sibling of the vscode repository.`);
}
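Given the path segments joined above and below, the expected on-disk layout appears to be a vscode-loc checkout sitting next to this repository, roughly:

// <parent>/
//   <this repo>/build/lib/...           (__dirname of the compiled script)
//   vscode-loc/i18n/
//     vscode-language-pack-de/translations/main.i18n.json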
let sortedLanguages = sortLanguages(languages);
sortedLanguages.forEach((language) => {
if (process.env['VSCODE_BUILD_VERBOSE']) {
@@ -515,31 +508,35 @@ function processCoreBundleFormat(fileHeader: string, languages: Language[], json
statistics[language.id] = 0;
let localizedModules: Map<string[]> = Object.create(null);
let languageFolderName = language.folderName || language.id;
let cwd = path.join(languageDirectory, languageFolderName, 'src');
let languageFolderName = language.translationId || language.id;
let i18nFile = path.join(languageDirectory, `vscode-language-pack-${languageFolderName}`, 'translations', 'main.i18n.json');
let allMessages: I18nFormat | undefined;
if (fs.existsSync(i18nFile)) {
let content = stripComments(fs.readFileSync(i18nFile, 'utf8'));
allMessages = JSON.parse(content);
}
modules.forEach((module) => {
let order = keysSection[module];
let i18nFile = path.join(cwd, module) + '.i18n.json';
let messages: Map<string> = null;
if (fs.existsSync(i18nFile)) {
let content = stripComments(fs.readFileSync(i18nFile, 'utf8'));
messages = JSON.parse(content);
} else {
let moduleMessage: { [messageKey: string]: string } | undefined;
if (allMessages) {
moduleMessage = allMessages.contents[module];
}
if (!moduleMessage) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log(`No localized messages found for module ${module}. Using default messages.`);
}
messages = defaultMessages[module];
statistics[language.id] = statistics[language.id] + Object.keys(messages).length;
moduleMessage = defaultMessages[module];
statistics[language.id] = statistics[language.id] + Object.keys(moduleMessage).length;
}
let localizedMessages: string[] = [];
order.forEach((keyInfo) => {
let key: string = null;
let key: string | null = null;
if (typeof keyInfo === 'string') {
key = keyInfo;
} else {
key = keyInfo.key;
}
let message: string = messages[key];
let message: string = moduleMessage![key];
if (!message) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log(`No localized message found for key ${key} in module ${module}. Using default message.`);
@@ -625,7 +622,7 @@ export function getResource(sourceFile: string): Resource {
return { name: 'vs/base', project: editorProject };
} else if (/^vs\/code/.test(sourceFile)) {
return { name: 'vs/code', project: workbenchProject };
} else if (/^vs\/workbench\/parts/.test(sourceFile)) {
} else if (/^vs\/workbench\/contrib/.test(sourceFile)) {
resource = sourceFile.split('/', 4).join('/');
return { name: resource, project: workbenchProject };
} else if (/^vs\/workbench\/services/.test(sourceFile)) {
@@ -712,7 +709,7 @@ export function createXlfFilesForExtensions(): ThroughStream {
}
return _xlf;
}
gulp.src([`./extensions/${extensionName}/package.nls.json`, `./extensions/${extensionName}/**/nls.metadata.json`]).pipe(through(function (file: File) {
gulp.src([`./extensions/${extensionName}/package.nls.json`, `./extensions/${extensionName}/**/nls.metadata.json`], { allowEmpty: true }).pipe(through(function (file: File) {
if (file.isBuffer()) {
const buffer: Buffer = file.contents as Buffer;
const basename = path.basename(file.path);
@@ -824,8 +821,8 @@ export function createXlfFilesForIsl(): ThroughStream {
}
export function pushXlfFiles(apiHostname: string, username: string, password: string): ThroughStream {
let tryGetPromises = [];
let updateCreatePromises = [];
let tryGetPromises: Array<Promise<boolean>> = [];
let updateCreatePromises: Array<Promise<boolean>> = [];
return through(function (this: ThroughStream, file: File) {
const project = path.dirname(file.relative);
@@ -890,7 +887,7 @@ function getAllResources(project: string, apiHostname: string, username: string,
export function findObsoleteResources(apiHostname: string, username: string, password: string): ThroughStream {
let resourcesByProject: Map<string[]> = Object.create(null);
resourcesByProject[extensionsProject] = [].concat(externalExtensionsWithTranslations); // clone
resourcesByProject[extensionsProject] = ([] as any[]).concat(externalExtensionsWithTranslations); // clone
return through(function (this: ThroughStream, file: File) {
const project = path.dirname(file.relative);
@@ -907,7 +904,7 @@ export function findObsoleteResources(apiHostname: string, username: string, pas
const json = JSON.parse(fs.readFileSync('./build/lib/i18n.resources.json', 'utf8'));
let i18Resources = [...json.editor, ...json.workbench].map((r: Resource) => r.project + '/' + r.name.replace(/\//g, '_'));
let extractedResources = [];
let extractedResources: string[] = [];
for (let project of [workbenchProject, editorProject]) {
for (let resource of resourcesByProject[project]) {
if (resource !== 'setup_messages') {
@@ -920,7 +917,7 @@ export function findObsoleteResources(apiHostname: string, username: string, pas
console.log(`[i18n] Missing resources in file 'build/lib/i18n.resources.json': ${JSON.stringify(extractedResources.filter(p => i18Resources.indexOf(p) === -1))}`);
}
let promises = [];
let promises: Array<Promise<void>> = [];
for (let project in resourcesByProject) {
promises.push(
getAllResources(project, apiHostname, username, password).then(resources => {
@@ -965,7 +962,7 @@ function tryGetResource(project: string, slug: string, apiHostname: string, cred
}
function createResource(project: string, slug: string, xlfFile: File, apiHostname: string, credentials: any): Promise<any> {
return new Promise((resolve, reject) => {
return new Promise((_resolve, reject) => {
const data = JSON.stringify({
'content': xlfFile.contents.toString(),
'name': slug,
@@ -1056,8 +1053,8 @@ export function pullCoreAndExtensionsXlfFiles(apiHostname: string, username: str
// extensions
let extensionsToLocalize = Object.create(null);
glob.sync('./extensions/**/*.nls.json', ).forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/*/node_modules/vscode-nls', ).forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/**/*.nls.json').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
glob.sync('./extensions/*/node_modules/vscode-nls').forEach(extension => extensionsToLocalize[extension.split('/')[2]] = true);
Object.keys(extensionsToLocalize).forEach(extension => {
_coreAndExtensionResources.push({ name: extension, project: extensionsProject });
@@ -1085,7 +1082,7 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
let expectedTranslationsCount = resources.length;
let translationsRetrieved = 0, called = false;
return readable(function (count, callback) {
return readable(function (_count: any, callback: any) {
// Mark end of stream when all resources were retrieved
if (translationsRetrieved === expectedTranslationsCount) {
return this.emit('end');
@@ -1095,7 +1092,7 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
called = true;
const stream = this;
resources.map(function (resource) {
retrieveResource(language, resource, apiHostname, credentials).then((file: File) => {
retrieveResource(language, resource, apiHostname, credentials).then((file: File | null) => {
if (file) {
stream.emit('data', file);
}
@@ -1107,13 +1104,13 @@ function pullXlfFiles(apiHostname: string, username: string, password: string, l
callback();
});
}
const limiter = new Limiter<File>(NUMBER_OF_CONCURRENT_DOWNLOADS);
const limiter = new Limiter<File | null>(NUMBER_OF_CONCURRENT_DOWNLOADS);
function retrieveResource(language: Language, resource: Resource, apiHostname, credentials): Promise<File> {
return limiter.queue(() => new Promise<File>((resolve, reject) => {
function retrieveResource(language: Language, resource: Resource, apiHostname: string, credentials: string): Promise<File | null> {
return limiter.queue(() => new Promise<File | null>((resolve, reject) => {
const slug = resource.name.replace(/\//g, '_');
const project = resource.project;
let transifexLanguageId = language.id === 'ps' ? 'en' : language.transifexId || language.id;
let transifexLanguageId = language.id === 'ps' ? 'en' : language.translationId || language.id;
const options = {
hostname: apiHostname,
path: `/api/2/project/${project}/resource/${slug}/translation/${transifexLanguageId}?file&mode=onlyreviewed`,
@@ -1212,10 +1209,10 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
let parsePromises: Promise<ParsedXLF[]>[] = [];
let mainPack: I18nPack = { version: i18nPackVersion, contents: {} };
let extensionsPacks: Map<I18nPack> = {};
let errors: any[] = [];
return through(function (this: ThroughStream, xlf: File) {
let stream = this;
let project = path.dirname(xlf.path);
let resource = path.basename(xlf.path, '.xlf');
let project = path.basename(path.dirname(xlf.relative));
let resource = path.basename(xlf.relative, '.xlf');
let contents = xlf.contents.toString();
let parsePromise = pseudo ? XLF.parsePseudo(contents) : XLF.parse(contents);
parsePromises.push(parsePromise);
@@ -1242,10 +1239,15 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
}
});
}
);
).catch(reason => {
errors.push(reason);
});
}, function () {
Promise.all(parsePromises)
.then(() => {
if (errors.length > 0) {
throw errors;
}
const translatedMainFile = createI18nFile('./main', mainPack);
resultingTranslationPaths.push({ id: 'vscode', resourceName: 'main.i18n.json' });
@@ -1264,7 +1266,9 @@ export function prepareI18nPackFiles(externalExtensions: Map<string>, resultingT
}
this.queue(null);
})
.catch(reason => { throw new Error(reason); });
.catch((reason) => {
this.emit('error', reason);
});
});
}
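With this change, parse failures are collected and re-surfaced instead of escaping as unhandled rejections, so a caller can react through the usual stream error path (sketch, argument names as in the signature above):

prepareI18nPackFiles(externalExtensions, resultingTranslationPaths, false)
	.on('error', err => { throw err; }); // any collected XLF parse failure lands here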
@@ -1285,11 +1289,15 @@ export function prepareIslFiles(language: Language, innoSetupConfig: InnoSetup):
stream.queue(translatedFile);
});
}
);
).catch(reason => {
this.emit('error', reason);
});
}, function () {
Promise.all(parsePromises)
.then(() => { this.queue(null); })
.catch(reason => { throw new Error(reason); });
.catch(reason => {
this.emit('error', reason);
});
});
}
@@ -1306,7 +1314,7 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
let firstChar = line.charAt(0);
if (firstChar === '[' || firstChar === ';') {
if (line === '; *** Inno Setup version 5.5.3+ English messages ***') {
content.push(`; *** Inno Setup version 5.5.3+ ${innoSetup.defaultInfo.name} messages ***`);
content.push(`; *** Inno Setup version 5.5.3+ ${innoSetup.defaultInfo!.name} messages ***`);
} else {
content.push(line);
}
@@ -1316,9 +1324,9 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
let translated = line;
if (key) {
if (key === 'LanguageName') {
translated = `${key}=${innoSetup.defaultInfo.name}`;
translated = `${key}=${innoSetup.defaultInfo!.name}`;
} else if (key === 'LanguageID') {
translated = `${key}=${innoSetup.defaultInfo.id}`;
translated = `${key}=${innoSetup.defaultInfo!.id}`;
} else if (key === 'LanguageCodePage') {
translated = `${key}=${innoSetup.codePage.substr(2)}`;
} else {
@@ -1339,14 +1347,14 @@ function createIslFile(originalFilePath: string, messages: Map<string>, language
return new File({
path: filePath,
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8'), innoSetup.codePage)
contents: iconv.encode(Buffer.from(content.join('\r\n'), 'utf8').toString(), innoSetup.codePage)
});
}
function encodeEntities(value: string): string {
var result: string[] = [];
for (var i = 0; i < value.length; i++) {
var ch = value[i];
let result: string[] = [];
for (let i = 0; i < value.length; i++) {
let ch = value[i];
switch (ch) {
case '<':
result.push('&lt;');

@@ -3,13 +3,12 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
var ts = require("typescript");
var lazy = require("lazy.js");
var event_stream_1 = require("event-stream");
var File = require("vinyl");
var sm = require("source-map");
var assign = require("object-assign");
var path = require("path");
const ts = require("typescript");
const lazy = require("lazy.js");
const event_stream_1 = require("event-stream");
const File = require("vinyl");
const sm = require("source-map");
const path = require("path");
var CollectStepResult;
(function (CollectStepResult) {
CollectStepResult[CollectStepResult["Yes"] = 0] = "Yes";
@@ -18,9 +17,9 @@ var CollectStepResult;
CollectStepResult[CollectStepResult["NoAndRecurse"] = 3] = "NoAndRecurse";
})(CollectStepResult || (CollectStepResult = {}));
function collect(node, fn) {
var result = [];
const result = [];
function loop(node) {
var stepResult = fn(node);
const stepResult = fn(node);
if (stepResult === CollectStepResult.Yes || stepResult === CollectStepResult.YesAndRecurse) {
result.push(node);
}
@@ -32,43 +31,45 @@ function collect(node, fn) {
return result;
}
function clone(object) {
var result = {};
for (var id in object) {
const result = {};
for (const id in object) {
result[id] = object[id];
}
return result;
}
function template(lines) {
var indent = '', wrap = '';
let indent = '', wrap = '';
if (lines.length > 1) {
indent = '\t';
wrap = '\n';
}
return "/*---------------------------------------------------------\n * Copyright (C) Microsoft Corporation. All rights reserved.\n *--------------------------------------------------------*/\ndefine([], [" + (wrap + lines.map(function (l) { return indent + l; }).join(',\n') + wrap) + "]);";
return `/*---------------------------------------------------------
* Copyright (C) Microsoft Corporation. All rights reserved.
*--------------------------------------------------------*/
define([], [${wrap + lines.map(l => indent + l).join(',\n') + wrap}]);`;
}
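For two entries, the template above therefore emits an AMD module of this shape (keys invented for illustration):

// template(["'first.key'", "'second.key'"]) produces:
// /*--------------------------------------------------------- ... ----*/
// define([], [
// 	'first.key',
// 	'second.key'
// ]);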
/**
* Returns a stream containing the patched JavaScript and source maps.
*/
function nls() {
var input = event_stream_1.through();
var output = input.pipe(event_stream_1.through(function (f) {
var _this = this;
const input = event_stream_1.through();
const output = input.pipe(event_stream_1.through(function (f) {
if (!f.sourceMap) {
return this.emit('error', new Error("File " + f.relative + " does not have sourcemaps."));
return this.emit('error', new Error(`File ${f.relative} does not have sourcemaps.`));
}
var source = f.sourceMap.sources[0];
let source = f.sourceMap.sources[0];
if (!source) {
return this.emit('error', new Error("File " + f.relative + " does not have a source in the source map."));
return this.emit('error', new Error(`File ${f.relative} does not have a source in the source map.`));
}
var root = f.sourceMap.sourceRoot;
const root = f.sourceMap.sourceRoot;
if (root) {
source = path.join(root, source);
}
var typescript = f.sourceMap.sourcesContent[0];
const typescript = f.sourceMap.sourcesContent[0];
if (!typescript) {
return this.emit('error', new Error("File " + f.relative + " does not have the original content in the source map."));
return this.emit('error', new Error(`File ${f.relative} does not have the original content in the source map.`));
}
nls.patchFiles(f, typescript).forEach(function (f) { return _this.emit('data', f); });
nls.patchFiles(f, typescript).forEach(f => this.emit('data', f));
}));
return event_stream_1.duplex(input, output);
}
@@ -76,8 +77,7 @@ function isImportNode(node) {
return node.kind === ts.SyntaxKind.ImportDeclaration || node.kind === ts.SyntaxKind.ImportEqualsDeclaration;
}
(function (nls_1) {
function fileFrom(file, contents, path) {
if (path === void 0) { path = file.path; }
function fileFrom(file, contents, path = file.path) {
return new File({
contents: Buffer.from(contents),
base: file.base,
@@ -87,29 +87,27 @@ function isImportNode(node) {
}
nls_1.fileFrom = fileFrom;
function mappedPositionFrom(source, lc) {
return { source: source, line: lc.line + 1, column: lc.character };
return { source, line: lc.line + 1, column: lc.character };
}
nls_1.mappedPositionFrom = mappedPositionFrom;
function lcFrom(position) {
return { line: position.line - 1, character: position.column };
}
nls_1.lcFrom = lcFrom;
var SingleFileServiceHost = /** @class */ (function () {
function SingleFileServiceHost(options, filename, contents) {
var _this = this;
class SingleFileServiceHost {
constructor(options, filename, contents) {
this.options = options;
this.filename = filename;
this.getCompilationSettings = function () { return _this.options; };
this.getScriptFileNames = function () { return [_this.filename]; };
this.getScriptVersion = function () { return '1'; };
this.getScriptSnapshot = function (name) { return name === _this.filename ? _this.file : _this.lib; };
this.getCurrentDirectory = function () { return ''; };
this.getDefaultLibFileName = function () { return 'lib.d.ts'; };
this.getCompilationSettings = () => this.options;
this.getScriptFileNames = () => [this.filename];
this.getScriptVersion = () => '1';
this.getScriptSnapshot = (name) => name === this.filename ? this.file : this.lib;
this.getCurrentDirectory = () => '';
this.getDefaultLibFileName = () => 'lib.d.ts';
this.file = ts.ScriptSnapshot.fromString(contents);
this.lib = ts.ScriptSnapshot.fromString('');
}
return SingleFileServiceHost;
}());
}
nls_1.SingleFileServiceHost = SingleFileServiceHost;
function isCallExpressionWithinTextSpanCollectStep(textSpan, node) {
if (!ts.textSpanContainsTextSpan({ start: node.pos, length: node.end - node.pos }, textSpan)) {
@@ -117,97 +115,96 @@ function isImportNode(node) {
}
return node.kind === ts.SyntaxKind.CallExpression ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse;
}
function analyze(contents, options) {
if (options === void 0) { options = {}; }
var filename = 'file.ts';
var serviceHost = new SingleFileServiceHost(assign(clone(options), { noResolve: true }), filename, contents);
var service = ts.createLanguageService(serviceHost);
var sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
function analyze(contents, options = {}) {
const filename = 'file.ts';
const serviceHost = new SingleFileServiceHost(Object.assign(clone(options), { noResolve: true }), filename, contents);
const service = ts.createLanguageService(serviceHost);
const sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
// all imports
var imports = lazy(collect(sourceFile, function (n) { return isImportNode(n) ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse; }));
const imports = lazy(collect(sourceFile, n => isImportNode(n) ? CollectStepResult.YesAndRecurse : CollectStepResult.NoAndRecurse));
// import nls = require('vs/nls');
var importEqualsDeclarations = imports
.filter(function (n) { return n.kind === ts.SyntaxKind.ImportEqualsDeclaration; })
.map(function (n) { return n; })
.filter(function (d) { return d.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference; })
.filter(function (d) { return d.moduleReference.expression.getText() === '\'vs/nls\''; });
const importEqualsDeclarations = imports
.filter(n => n.kind === ts.SyntaxKind.ImportEqualsDeclaration)
.map(n => n)
.filter(d => d.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference)
.filter(d => d.moduleReference.expression.getText() === '\'vs/nls\'');
// import ... from 'vs/nls';
var importDeclarations = imports
.filter(function (n) { return n.kind === ts.SyntaxKind.ImportDeclaration; })
.map(function (n) { return n; })
.filter(function (d) { return d.moduleSpecifier.kind === ts.SyntaxKind.StringLiteral; })
.filter(function (d) { return d.moduleSpecifier.getText() === '\'vs/nls\''; })
.filter(function (d) { return !!d.importClause && !!d.importClause.namedBindings; });
var nlsExpressions = importEqualsDeclarations
.map(function (d) { return d.moduleReference.expression; })
.concat(importDeclarations.map(function (d) { return d.moduleSpecifier; }))
.map(function (d) { return ({
const importDeclarations = imports
.filter(n => n.kind === ts.SyntaxKind.ImportDeclaration)
.map(n => n)
.filter(d => d.moduleSpecifier.kind === ts.SyntaxKind.StringLiteral)
.filter(d => d.moduleSpecifier.getText() === '\'vs/nls\'')
.filter(d => !!d.importClause && !!d.importClause.namedBindings);
const nlsExpressions = importEqualsDeclarations
.map(d => d.moduleReference.expression)
.concat(importDeclarations.map(d => d.moduleSpecifier))
.map(d => ({
start: ts.getLineAndCharacterOfPosition(sourceFile, d.getStart()),
end: ts.getLineAndCharacterOfPosition(sourceFile, d.getEnd())
}); });
}));
// `nls.localize(...)` calls
var nlsLocalizeCallExpressions = importDeclarations
.filter(function (d) { return d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport; })
.map(function (d) { return d.importClause.namedBindings.name; })
.concat(importEqualsDeclarations.map(function (d) { return d.name; }))
const nlsLocalizeCallExpressions = importDeclarations
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport))
.map(d => d.importClause.namedBindings.name)
.concat(importEqualsDeclarations.map(d => d.name))
// find read-only references to `nls`
.map(function (n) { return service.getReferencesAtPosition(filename, n.pos + 1); })
.map(n => service.getReferencesAtPosition(filename, n.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; })
.filter(r => !r.isWriteAccess)
// find the deepest call expressions AST nodes that contain those references
.map(function (r) { return collect(sourceFile, function (n) { return isCallExpressionWithinTextSpanCollectStep(r.textSpan, n); }); })
.map(function (a) { return lazy(a).last(); })
.filter(function (n) { return !!n; })
.map(function (n) { return n; })
.map(r => collect(sourceFile, n => isCallExpressionWithinTextSpanCollectStep(r.textSpan, n)))
.map(a => lazy(a).last())
.filter(n => !!n)
.map(n => n)
// only `localize` calls
.filter(function (n) { return n.expression.kind === ts.SyntaxKind.PropertyAccessExpression && n.expression.name.getText() === 'localize'; });
.filter(n => n.expression.kind === ts.SyntaxKind.PropertyAccessExpression && n.expression.name.getText() === 'localize');
// `localize` named imports
var allLocalizeImportDeclarations = importDeclarations
.filter(function (d) { return d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports; })
.map(function (d) { return [].concat(d.importClause.namedBindings.elements); })
const allLocalizeImportDeclarations = importDeclarations
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports))
.map(d => [].concat(d.importClause.namedBindings.elements))
.flatten();
// `localize` read-only references
var localizeReferences = allLocalizeImportDeclarations
.filter(function (d) { return d.name.getText() === 'localize'; })
.map(function (n) { return service.getReferencesAtPosition(filename, n.pos + 1); })
const localizeReferences = allLocalizeImportDeclarations
.filter(d => d.name.getText() === 'localize')
.map(n => service.getReferencesAtPosition(filename, n.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; });
.filter(r => !r.isWriteAccess);
// custom named `localize` read-only references
var namedLocalizeReferences = allLocalizeImportDeclarations
.filter(function (d) { return d.propertyName && d.propertyName.getText() === 'localize'; })
.map(function (n) { return service.getReferencesAtPosition(filename, n.name.pos + 1); })
const namedLocalizeReferences = allLocalizeImportDeclarations
.filter(d => d.propertyName && d.propertyName.getText() === 'localize')
.map(n => service.getReferencesAtPosition(filename, n.name.pos + 1))
.flatten()
.filter(function (r) { return !r.isWriteAccess; });
.filter(r => !r.isWriteAccess);
// find the deepest call expressions AST nodes that contain those references
var localizeCallExpressions = localizeReferences
const localizeCallExpressions = localizeReferences
.concat(namedLocalizeReferences)
.map(function (r) { return collect(sourceFile, function (n) { return isCallExpressionWithinTextSpanCollectStep(r.textSpan, n); }); })
.map(function (a) { return lazy(a).last(); })
.filter(function (n) { return !!n; })
.map(function (n) { return n; });
.map(r => collect(sourceFile, n => isCallExpressionWithinTextSpanCollectStep(r.textSpan, n)))
.map(a => lazy(a).last())
.filter(n => !!n)
.map(n => n);
// collect everything
var localizeCalls = nlsLocalizeCallExpressions
const localizeCalls = nlsLocalizeCallExpressions
.concat(localizeCallExpressions)
.map(function (e) { return e.arguments; })
.filter(function (a) { return a.length > 1; })
.sort(function (a, b) { return a[0].getStart() - b[0].getStart(); })
.map(function (a) { return ({
.map(e => e.arguments)
.filter(a => a.length > 1)
.sort((a, b) => a[0].getStart() - b[0].getStart())
.map(a => ({
keySpan: { start: ts.getLineAndCharacterOfPosition(sourceFile, a[0].getStart()), end: ts.getLineAndCharacterOfPosition(sourceFile, a[0].getEnd()) },
key: a[0].getText(),
valueSpan: { start: ts.getLineAndCharacterOfPosition(sourceFile, a[1].getStart()), end: ts.getLineAndCharacterOfPosition(sourceFile, a[1].getEnd()) },
value: a[1].getText()
}); });
}));
return {
localizeCalls: localizeCalls.toArray(),
nlsExpressions: nlsExpressions.toArray()
};
}
nls_1.analyze = analyze;
var TextModel = /** @class */ (function () {
function TextModel(contents) {
var regex = /\r\n|\r|\n/g;
var index = 0;
var match;
class TextModel {
constructor(contents) {
const regex = /\r\n|\r|\n/g;
let index = 0;
let match;
this.lines = [];
this.lineEndings = [];
while (match = regex.exec(contents)) {
@@ -220,85 +217,80 @@ function isImportNode(node) {
this.lineEndings.push('');
}
}
TextModel.prototype.get = function (index) {
get(index) {
return this.lines[index];
};
TextModel.prototype.set = function (index, line) {
}
set(index, line) {
this.lines[index] = line;
};
Object.defineProperty(TextModel.prototype, "lineCount", {
get: function () {
return this.lines.length;
},
enumerable: true,
configurable: true
});
}
get lineCount() {
return this.lines.length;
}
/**
* Applies patch(es) to the model.
* Multiple patches must be ordered.
* Does not support patches spanning multiple lines.
*/
TextModel.prototype.apply = function (patch) {
var startLineNumber = patch.span.start.line;
var endLineNumber = patch.span.end.line;
var startLine = this.lines[startLineNumber] || '';
var endLine = this.lines[endLineNumber] || '';
apply(patch) {
const startLineNumber = patch.span.start.line;
const endLineNumber = patch.span.end.line;
const startLine = this.lines[startLineNumber] || '';
const endLine = this.lines[endLineNumber] || '';
this.lines[startLineNumber] = [
startLine.substring(0, patch.span.start.character),
patch.content,
endLine.substring(patch.span.end.character)
].join('');
for (var i = startLineNumber + 1; i <= endLineNumber; i++) {
for (let i = startLineNumber + 1; i <= endLineNumber; i++) {
this.lines[i] = '';
}
};
TextModel.prototype.toString = function () {
}
toString() {
return lazy(this.lines).zip(this.lineEndings)
.flatten().toArray().join('');
};
return TextModel;
}());
}
}
nls_1.TextModel = TextModel;
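A minimal sketch of the patch contract (zero-based line/character offsets, single-line span; sample text invented):

const model = new nls.TextModel("localize('my.key', 'Hello')\n");
model.apply({ span: { start: { line: 0, character: 9 }, end: { line: 0, character: 17 } }, content: '0' });
model.toString(); // "localize(0, 'Hello')\n"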
function patchJavascript(patches, contents, moduleId) {
var model = new nls.TextModel(contents);
const model = new nls.TextModel(contents);
// patch the localize calls
lazy(patches).reverse().each(function (p) { return model.apply(p); });
lazy(patches).reverse().each(p => model.apply(p));
// patch the 'vs/nls' imports
var firstLine = model.get(0);
var patchedFirstLine = firstLine.replace(/(['"])vs\/nls\1/g, "$1vs/nls!" + moduleId + "$1");
const firstLine = model.get(0);
const patchedFirstLine = firstLine.replace(/(['"])vs\/nls\1/g, `$1vs/nls!${moduleId}$1`);
model.set(0, patchedFirstLine);
return model.toString();
}
nls_1.patchJavascript = patchJavascript;
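The first-line rewrite turns the plain nls dependency into its loader-plugin form, e.g. (module id invented):

// before: define(["require", "exports", "vs/nls"], function (require, exports, nls) { ...
// after:  define(["require", "exports", "vs/nls!vs/base/common/actions"], function (require, exports, nls) { ...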
function patchSourcemap(patches, rsm, smc) {
var smg = new sm.SourceMapGenerator({
const smg = new sm.SourceMapGenerator({
file: rsm.file,
sourceRoot: rsm.sourceRoot
});
patches = patches.reverse();
var currentLine = -1;
var currentLineDiff = 0;
var source = null;
smc.eachMapping(function (m) {
var patch = patches[patches.length - 1];
var original = { line: m.originalLine, column: m.originalColumn };
var generated = { line: m.generatedLine, column: m.generatedColumn };
let currentLine = -1;
let currentLineDiff = 0;
let source = null;
smc.eachMapping(m => {
const patch = patches[patches.length - 1];
const original = { line: m.originalLine, column: m.originalColumn };
const generated = { line: m.generatedLine, column: m.generatedColumn };
if (currentLine !== generated.line) {
currentLineDiff = 0;
}
currentLine = generated.line;
generated.column += currentLineDiff;
if (patch && m.generatedLine - 1 === patch.span.end.line && m.generatedColumn === patch.span.end.character) {
var originalLength = patch.span.end.character - patch.span.start.character;
var modifiedLength = patch.content.length;
var lengthDiff = modifiedLength - originalLength;
const originalLength = patch.span.end.character - patch.span.start.character;
const modifiedLength = patch.content.length;
const lengthDiff = modifiedLength - originalLength;
currentLineDiff += lengthDiff;
generated.column += lengthDiff;
patches.pop();
}
source = rsm.sourceRoot ? path.relative(rsm.sourceRoot, m.source) : m.source;
source = source.replace(/\\/g, '/');
smg.addMapping({ source: source, name: m.name, original: original, generated: generated });
smg.addMapping({ source, name: m.name, original, generated });
}, null, sm.SourceMapConsumer.GENERATED_ORDER);
if (source) {
smg.setSourceContent(source, smc.sourceContentFor(source));
@@ -307,47 +299,47 @@ function isImportNode(node) {
}
nls_1.patchSourcemap = patchSourcemap;
function patch(moduleId, typescript, javascript, sourcemap) {
var _a = analyze(typescript), localizeCalls = _a.localizeCalls, nlsExpressions = _a.nlsExpressions;
const { localizeCalls, nlsExpressions } = analyze(typescript);
if (localizeCalls.length === 0) {
return { javascript: javascript, sourcemap: sourcemap };
return { javascript, sourcemap };
}
var nlsKeys = template(localizeCalls.map(function (lc) { return lc.key; }));
var nls = template(localizeCalls.map(function (lc) { return lc.value; }));
var smc = new sm.SourceMapConsumer(sourcemap);
var positionFrom = mappedPositionFrom.bind(null, sourcemap.sources[0]);
var i = 0;
const nlsKeys = template(localizeCalls.map(lc => lc.key));
const nls = template(localizeCalls.map(lc => lc.value));
const smc = new sm.SourceMapConsumer(sourcemap);
const positionFrom = mappedPositionFrom.bind(null, sourcemap.sources[0]);
let i = 0;
// build patches
var patches = lazy(localizeCalls)
.map(function (lc) { return ([
const patches = lazy(localizeCalls)
.map(lc => ([
{ range: lc.keySpan, content: '' + (i++) },
{ range: lc.valueSpan, content: 'null' }
]); })
]))
.flatten()
.map(function (c) {
var start = lcFrom(smc.generatedPositionFor(positionFrom(c.range.start)));
var end = lcFrom(smc.generatedPositionFor(positionFrom(c.range.end)));
return { span: { start: start, end: end }, content: c.content };
.map(c => {
const start = lcFrom(smc.generatedPositionFor(positionFrom(c.range.start)));
const end = lcFrom(smc.generatedPositionFor(positionFrom(c.range.end)));
return { span: { start, end }, content: c.content };
})
.toArray();
javascript = patchJavascript(patches, javascript, moduleId);
// since imports are not within the sourcemap information,
// we must do this MacGyver style
if (nlsExpressions.length) {
javascript = javascript.replace(/^define\(.*$/m, function (line) {
return line.replace(/(['"])vs\/nls\1/g, "$1vs/nls!" + moduleId + "$1");
javascript = javascript.replace(/^define\(.*$/m, line => {
return line.replace(/(['"])vs\/nls\1/g, `$1vs/nls!${moduleId}$1`);
});
}
sourcemap = patchSourcemap(patches, sourcemap, smc);
return { javascript: javascript, sourcemap: sourcemap, nlsKeys: nlsKeys, nls: nls };
return { javascript, sourcemap, nlsKeys, nls };
}
nls_1.patch = patch;
function patchFiles(javascriptFile, typescript) {
// hack?
var moduleId = javascriptFile.relative
const moduleId = javascriptFile.relative
.replace(/\.js$/, '')
.replace(/\\/g, '/');
var _a = patch(moduleId, typescript, javascriptFile.contents.toString(), javascriptFile.sourceMap), javascript = _a.javascript, sourcemap = _a.sourcemap, nlsKeys = _a.nlsKeys, nls = _a.nls;
var result = [fileFrom(javascriptFile, javascript)];
const { javascript, sourcemap, nlsKeys, nls } = patch(moduleId, typescript, javascriptFile.contents.toString(), javascriptFile.sourceMap);
const result = [fileFrom(javascriptFile, javascript)];
result[0].sourceMap = sourcemap;
if (nlsKeys) {
result.push(fileFrom(javascriptFile, nlsKeys, javascriptFile.path.replace(/\.js$/, '.nls.keys.js')));

@@ -6,10 +6,9 @@
import * as ts from 'typescript';
import * as lazy from 'lazy.js';
import { duplex, through } from 'event-stream';
import File = require('vinyl');
import * as File from 'vinyl';
import * as sm from 'source-map';
import assign = require('object-assign');
import path = require('path');
import * as path from 'path';
declare class FileSourceMap extends File {
public sourceMap: sm.RawSourceMap;
@@ -26,7 +25,7 @@ function collect(node: ts.Node, fn: (node: ts.Node) => CollectStepResult): ts.No
const result: ts.Node[] = [];
function loop(node: ts.Node) {
var stepResult = fn(node);
const stepResult = fn(node);
if (stepResult === CollectStepResult.Yes || stepResult === CollectStepResult.YesAndRecurse) {
result.push(node);
@@ -42,8 +41,8 @@ function collect(node: ts.Node, fn: (node: ts.Node) => CollectStepResult): ts.No
}
function clone<T>(object: T): T {
var result = <T>{};
for (var id in object) {
const result = <T>{};
for (const id in object) {
result[id] = object[id];
}
return result;
@@ -67,8 +66,8 @@ define([], [${ wrap + lines.map(l => indent + l).join(',\n') + wrap}]);`;
* Returns a stream containing the patched JavaScript and source maps.
*/
function nls(): NodeJS.ReadWriteStream {
var input = through();
var output = input.pipe(through(function (f: FileSourceMap) {
const input = through();
const output = input.pipe(through(function (f: FileSourceMap) {
if (!f.sourceMap) {
return this.emit('error', new Error(`File ${f.relative} does not have sourcemaps.`));
}
@@ -83,7 +82,7 @@ function nls(): NodeJS.ReadWriteStream {
source = path.join(root, source);
}
const typescript = f.sourceMap.sourcesContent[0];
const typescript = f.sourceMap.sourcesContent![0];
if (!typescript) {
return this.emit('error', new Error(`File ${f.relative} does not have the original content in the source map.`));
}
@@ -174,7 +173,7 @@ module nls {
export function analyze(contents: string, options: ts.CompilerOptions = {}): ILocalizeAnalysisResult {
const filename = 'file.ts';
const serviceHost = new SingleFileServiceHost(assign(clone(options), { noResolve: true }), filename, contents);
const serviceHost = new SingleFileServiceHost(Object.assign(clone(options), { noResolve: true }), filename, contents);
const service = ts.createLanguageService(serviceHost);
const sourceFile = ts.createSourceFile(filename, contents, ts.ScriptTarget.ES5, true);
@@ -206,8 +205,8 @@ module nls {
// `nls.localize(...)` calls
const nlsLocalizeCallExpressions = importDeclarations
.filter(d => d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport)
.map(d => (<ts.NamespaceImport>d.importClause.namedBindings).name)
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamespaceImport))
.map(d => (<ts.NamespaceImport>d.importClause!.namedBindings).name)
.concat(importEqualsDeclarations.map(d => d.name))
// find read-only references to `nls`
@@ -226,8 +225,8 @@ module nls {
// `localize` named imports
const allLocalizeImportDeclarations = importDeclarations
.filter(d => d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports)
.map(d => [].concat((<ts.NamedImports>d.importClause.namedBindings).elements))
.filter(d => !!(d.importClause && d.importClause.namedBindings && d.importClause.namedBindings.kind === ts.SyntaxKind.NamedImports))
.map(d => ([] as any[]).concat((<ts.NamedImports>d.importClause!.namedBindings!).elements))
.flatten();
// `localize` read-only references
@@ -279,7 +278,7 @@ module nls {
constructor(contents: string) {
const regex = /\r\n|\r|\n/g;
let index = 0;
let match: RegExpExecArray;
let match: RegExpExecArray | null;
this.lines = [];
this.lineEndings = [];
@@ -360,7 +359,7 @@ module nls {
patches = patches.reverse();
let currentLine = -1;
let currentLineDiff = 0;
let source = null;
let source: string | null = null;
smc.eachMapping(m => {
const patch = patches[patches.length - 1];

@@ -4,29 +4,32 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var path = require("path");
var gulp = require("gulp");
var sourcemaps = require("gulp-sourcemaps");
var filter = require("gulp-filter");
var minifyCSS = require("gulp-cssnano");
var uglify = require("gulp-uglify");
var composer = require("gulp-uglify/composer");
var uglifyes = require("uglify-es");
var es = require("event-stream");
var concat = require("gulp-concat");
var VinylFile = require("vinyl");
var bundle = require("./bundle");
var util = require("./util");
var gulpUtil = require("gulp-util");
var flatmap = require("gulp-flatmap");
var pump = require("pump");
var REPO_ROOT_PATH = path.join(__dirname, '../..');
const es = require("event-stream");
const gulp = require("gulp");
const concat = require("gulp-concat");
const minifyCSS = require("gulp-cssnano");
const filter = require("gulp-filter");
const flatmap = require("gulp-flatmap");
const sourcemaps = require("gulp-sourcemaps");
const uglify = require("gulp-uglify");
const composer = require("gulp-uglify/composer");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const path = require("path");
const pump = require("pump");
const uglifyes = require("uglify-es");
const VinylFile = require("vinyl");
const bundle = require("./bundle");
const i18n_1 = require("./i18n");
const stats_1 = require("./stats");
const util = require("./util");
const REPO_ROOT_PATH = path.join(__dirname, '../..');
function log(prefix, message) {
gulpUtil.log(gulpUtil.colors.cyan('[' + prefix + ']'), message);
fancyLog(ansiColors.cyan('[' + prefix + ']'), message);
}
// {{SQL CARBON EDIT}}
function loaderConfig(emptyPaths) {
var result = {
const result = {
paths: {
'vs': 'out-build/vs',
'sql': 'out-build/sql',
@@ -38,26 +41,26 @@ function loaderConfig(emptyPaths) {
return result;
}
exports.loaderConfig = loaderConfig;
var IS_OUR_COPYRIGHT_REGEXP = /Copyright \(C\) Microsoft Corporation/i;
const IS_OUR_COPYRIGHT_REGEXP = /Copyright \(C\) Microsoft Corporation/i;
function loader(src, bundledFileHeader, bundleLoader) {
var sources = [
src + "/vs/loader.js"
let sources = [
`${src}/vs/loader.js`
];
if (bundleLoader) {
sources = sources.concat([
src + "/vs/css.js",
src + "/vs/nls.js"
`${src}/vs/css.js`,
`${src}/vs/nls.js`
]);
}
var isFirst = true;
let isFirst = true;
return (gulp
.src(sources, { base: "" + src })
.src(sources, { base: `${src}` })
.pipe(es.through(function (data) {
if (isFirst) {
isFirst = false;
this.emit('data', new VinylFile({
path: 'fake',
base: '',
base: undefined,
contents: Buffer.from(bundledFileHeader)
}));
this.emit('data', data);
@@ -74,12 +77,12 @@ function loader(src, bundledFileHeader, bundleLoader) {
})));
}
function toConcatStream(src, bundledFileHeader, sources, dest) {
var useSourcemaps = /\.js$/.test(dest) && !/\.nls\.js$/.test(dest);
const useSourcemaps = /\.js$/.test(dest) && !/\.nls\.js$/.test(dest);
// If a bundle ends up including in any of the sources our copyright, then
// insert a fake source at the beginning of each bundle with our copyright
var containsOurCopyright = false;
for (var i = 0, len = sources.length; i < len; i++) {
var fileContents = sources[i].contents;
let containsOurCopyright = false;
for (let i = 0, len = sources.length; i < len; i++) {
const fileContents = sources[i].contents;
if (IS_OUR_COPYRIGHT_REGEXP.test(fileContents)) {
containsOurCopyright = true;
break;
@@ -91,9 +94,9 @@ function toConcatStream(src, bundledFileHeader, sources, dest) {
contents: bundledFileHeader
});
}
var treatedSources = sources.map(function (source) {
var root = source.path ? REPO_ROOT_PATH.replace(/\\/g, '/') : '';
var base = source.path ? root + ("/" + src) : '';
const treatedSources = sources.map(function (source) {
const root = source.path ? REPO_ROOT_PATH.replace(/\\/g, '/') : '';
const base = source.path ? root + `/${src}` : undefined;
return new VinylFile({
path: source.path ? root + '/' + source.path.replace(/\\/g, '/') : 'fake',
base: base,
@@ -102,7 +105,8 @@ function toConcatStream(src, bundledFileHeader, sources, dest) {
});
return es.readArray(treatedSources)
.pipe(useSourcemaps ? util.loadSourcemaps() : es.through())
.pipe(concat(dest));
.pipe(concat(dest))
.pipe(stats_1.createStatsStream(dest));
}
function toBundleStream(src, bundledFileHeader, bundles) {
return es.merge(bundles.map(function (bundle) {
@@ -110,33 +114,32 @@ function toBundleStream(src, bundledFileHeader, bundles) {
}));
}
function optimizeTask(opts) {
var src = opts.src;
var entryPoints = opts.entryPoints;
var otherSources = opts.otherSources;
var resources = opts.resources;
var loaderConfig = opts.loaderConfig;
var bundledFileHeader = opts.header;
var bundleLoader = (typeof opts.bundleLoader === 'undefined' ? true : opts.bundleLoader);
var out = opts.out;
const src = opts.src;
const entryPoints = opts.entryPoints;
const resources = opts.resources;
const loaderConfig = opts.loaderConfig;
const bundledFileHeader = opts.header;
const bundleLoader = (typeof opts.bundleLoader === 'undefined' ? true : opts.bundleLoader);
const out = opts.out;
return function () {
var bundlesStream = es.through(); // this stream will contain the bundled files
var resourcesStream = es.through(); // this stream will contain the resources
var bundleInfoStream = es.through(); // this stream will contain bundleInfo.json
const bundlesStream = es.through(); // this stream will contain the bundled files
const resourcesStream = es.through(); // this stream will contain the resources
const bundleInfoStream = es.through(); // this stream will contain bundleInfo.json
bundle.bundle(entryPoints, loaderConfig, function (err, result) {
if (err) {
if (err || !result) {
return bundlesStream.emit('error', JSON.stringify(err));
}
toBundleStream(src, bundledFileHeader, result.files).pipe(bundlesStream);
// Remove css inlined resources
var filteredResources = resources.slice();
const filteredResources = resources.slice();
result.cssInlinedResources.forEach(function (resource) {
if (process.env['VSCODE_BUILD_VERBOSE']) {
log('optimizer', 'excluding inlined: ' + resource);
}
filteredResources.push('!' + resource);
});
gulp.src(filteredResources, { base: "" + src }).pipe(resourcesStream);
var bundleInfoArray = [];
gulp.src(filteredResources, { base: `${src}`, allowEmpty: true }).pipe(resourcesStream);
const bundleInfoArray = [];
if (opts.bundleInfo) {
bundleInfoArray.push(new VinylFile({
path: 'bundleInfo.json',
@@ -146,26 +149,17 @@ function optimizeTask(opts) {
}
es.readArray(bundleInfoArray).pipe(bundleInfoStream);
});
var otherSourcesStream = es.through();
var otherSourcesStreamArr = [];
gulp.src(otherSources, { base: "" + src })
.pipe(es.through(function (data) {
otherSourcesStreamArr.push(toConcatStream(src, bundledFileHeader, [data], data.relative));
}, function () {
if (!otherSourcesStreamArr.length) {
setTimeout(function () { otherSourcesStream.emit('end'); }, 0);
}
else {
es.merge(otherSourcesStreamArr).pipe(otherSourcesStream);
}
}));
var result = es.merge(loader(src, bundledFileHeader, bundleLoader), bundlesStream, otherSourcesStream, resourcesStream, bundleInfoStream);
const result = es.merge(loader(src, bundledFileHeader, bundleLoader), bundlesStream, resourcesStream, bundleInfoStream);
return result
.pipe(sourcemaps.write('./', {
sourceRoot: null,
sourceRoot: undefined,
addComment: true,
includeContent: true
}))
.pipe(opts.languages && opts.languages.length ? i18n_1.processNlsFiles({
fileHeader: bundledFileHeader,
languages: opts.languages
}) : es.through())
.pipe(gulp.dest(out));
};
}
@@ -175,14 +169,14 @@ exports.optimizeTask = optimizeTask;
* to have a file "context" to include our copyright only once per file.
*/
function uglifyWithCopyrights() {
var preserveComments = function (f) {
return function (node, comment) {
var text = comment.value;
var type = comment.type;
const preserveComments = (f) => {
return (_node, comment) => {
const text = comment.value;
const type = comment.type;
if (/@minifier_do_not_preserve/.test(text)) {
return false;
}
var isOurCopyright = IS_OUR_COPYRIGHT_REGEXP.test(text);
const isOurCopyright = IS_OUR_COPYRIGHT_REGEXP.test(text);
if (isOurCopyright) {
if (f.__hasOurCopyright) {
return false;
@@ -200,10 +194,10 @@ function uglifyWithCopyrights() {
return false;
};
};
var minify = composer(uglifyes);
var input = es.through();
var output = input
.pipe(flatmap(function (stream, f) {
const minify = composer(uglifyes);
const input = es.through();
const output = input
.pipe(flatmap((stream, f) => {
return stream.pipe(minify({
output: {
comments: preserveComments(f),
@@ -214,18 +208,23 @@ function uglifyWithCopyrights() {
return es.duplex(input, output);
}
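As a wiring sketch (globs hypothetical), the duplex stream drops into any gulp pipeline that should minify JavaScript while keeping a single copyright banner per file:

gulp.src('out/**/*.js')
	.pipe(uglifyWithCopyrights())
	.pipe(gulp.dest('out-min'));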
function minifyTask(src, sourceMapBaseUrl) {
var sourceMappingURL = sourceMapBaseUrl && (function (f) { return sourceMapBaseUrl + "/" + f.relative + ".map"; });
return function (cb) {
var jsFilter = filter('**/*.js', { restore: true });
var cssFilter = filter('**/*.css', { restore: true });
pump(gulp.src([src + '/**', '!' + src + '/**/*.map']), jsFilter, sourcemaps.init({ loadMaps: true }), uglifyWithCopyrights(), jsFilter.restore, cssFilter, minifyCSS({ reduceIdents: false }), cssFilter.restore, sourcemaps.write('./', {
sourceMappingURL: sourceMappingURL,
sourceRoot: null,
const sourceMappingURL = sourceMapBaseUrl ? ((f) => `${sourceMapBaseUrl}/${f.relative}.map`) : undefined;
return cb => {
const jsFilter = filter('**/*.js', { restore: true });
const cssFilter = filter('**/*.css', { restore: true });
pump(gulp.src([src + '/**', '!' + src + '/**/*.map']), jsFilter, sourcemaps.init({ loadMaps: true }), uglifyWithCopyrights(), jsFilter.restore, cssFilter, minifyCSS({ reduceIdents: false }), cssFilter.restore, sourcemaps.mapSources((sourcePath) => {
if (sourcePath === 'bootstrap-fork.js') {
return 'bootstrap-fork.orig.js';
}
return sourcePath;
}), sourcemaps.write('./', {
sourceMappingURL,
sourceRoot: undefined,
includeContent: true,
addComment: true
}), gulp.dest(src + '-min'), function (err) {
}), gulp.dest(src + '-min'), (err) => {
if (err instanceof uglify.GulpUglifyError) {
console.error("Uglify error in '" + (err.cause && err.cause.filename) + "'");
console.error(`Uglify error in '${err.cause && err.cause.filename}'`);
}
cb(err);
});
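And a hedged example of using the task factory (source-map base URL invented):

// gulp.task('minify-out', minifyTask('out', 'https://example.com/sourcemaps'));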

@@ -5,34 +5,36 @@
'use strict';
import * as path from 'path';
import * as es from 'event-stream';
import * as gulp from 'gulp';
import * as sourcemaps from 'gulp-sourcemaps';
import * as filter from 'gulp-filter';
import * as concat from 'gulp-concat';
import * as minifyCSS from 'gulp-cssnano';
import * as filter from 'gulp-filter';
import * as flatmap from 'gulp-flatmap';
import * as sourcemaps from 'gulp-sourcemaps';
import * as uglify from 'gulp-uglify';
import * as composer from 'gulp-uglify/composer';
import * as uglifyes from 'uglify-es';
import * as es from 'event-stream';
import * as concat from 'gulp-concat';
import * as VinylFile from 'vinyl';
import * as bundle from './bundle';
import * as util from './util';
import * as gulpUtil from 'gulp-util';
import * as flatmap from 'gulp-flatmap';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as path from 'path';
import * as pump from 'pump';
import * as sm from 'source-map';
import { Language } from './i18n';
import * as uglifyes from 'uglify-es';
import * as VinylFile from 'vinyl';
import * as bundle from './bundle';
import { Language, processNlsFiles } from './i18n';
import { createStatsStream } from './stats';
import * as util from './util';
const REPO_ROOT_PATH = path.join(__dirname, '../..');
function log(prefix: string, message: string): void {
gulpUtil.log(gulpUtil.colors.cyan('[' + prefix + ']'), message);
fancyLog(ansiColors.cyan('[' + prefix + ']'), message);
}
// {{SQL CARBON EDIT}}
export function loaderConfig(emptyPaths?: string[]) {
const result = {
const result: any = {
paths: {
'vs': 'out-build/vs',
'sql': 'out-build/sql',
@@ -72,7 +74,7 @@ function loader(src: string, bundledFileHeader: string, bundleLoader: boolean):
isFirst = false;
this.emit('data', new VinylFile({
path: 'fake',
base: '',
base: undefined,
contents: Buffer.from(bundledFileHeader)
}));
this.emit('data', data);
@@ -112,7 +114,7 @@ function toConcatStream(src: string, bundledFileHeader: string, sources: bundle.
const treatedSources = sources.map(function (source) {
const root = source.path ? REPO_ROOT_PATH.replace(/\\/g, '/') : '';
const base = source.path ? root + `/${src}` : '';
const base = source.path ? root + `/${src}` : undefined;
return new VinylFile({
path: source.path ? root + '/' + source.path.replace(/\\/g, '/') : 'fake',
@@ -123,10 +125,11 @@ function toConcatStream(src: string, bundledFileHeader: string, sources: bundle.
return es.readArray(treatedSources)
.pipe(useSourcemaps ? util.loadSourcemaps() : es.through())
.pipe(concat(dest));
.pipe(concat(dest))
.pipe(createStatsStream(dest));
}
function toBundleStream(src:string, bundledFileHeader: string, bundles: bundle.IConcatFile[]): NodeJS.ReadWriteStream {
function toBundleStream(src: string, bundledFileHeader: string, bundles: bundle.IConcatFile[]): NodeJS.ReadWriteStream {
return es.merge(bundles.map(function (bundle) {
return toConcatStream(src, bundledFileHeader, bundle.sources, bundle.dest);
}));
@@ -141,10 +144,6 @@ export interface IOptimizeTaskOpts {
* (for AMD files, will get bundled and get Copyright treatment)
*/
entryPoints: bundle.IEntryPoint[];
/**
* (for non-AMD files that should get Copyright treatment)
*/
otherSources: string[];
/**
* (svg, etc.)
*/
@@ -175,7 +174,6 @@ export interface IOptimizeTaskOpts {
export function optimizeTask(opts: IOptimizeTaskOpts): () => NodeJS.ReadWriteStream {
const src = opts.src;
const entryPoints = opts.entryPoints;
const otherSources = opts.otherSources;
const resources = opts.resources;
const loaderConfig = opts.loaderConfig;
const bundledFileHeader = opts.header;
@@ -188,7 +186,7 @@ export function optimizeTask(opts: IOptimizeTaskOpts): () => NodeJS.ReadWriteStr
const bundleInfoStream = es.through(); // this stream will contain bundleInfo.json
bundle.bundle(entryPoints, loaderConfig, function (err, result) {
if (err) { return bundlesStream.emit('error', JSON.stringify(err)); }
if (err || !result) { return bundlesStream.emit('error', JSON.stringify(err)); }
toBundleStream(src, bundledFileHeader, result.files).pipe(bundlesStream);
@@ -200,7 +198,7 @@ export function optimizeTask(opts: IOptimizeTaskOpts): () => NodeJS.ReadWriteStr
}
filteredResources.push('!' + resource);
});
gulp.src(filteredResources, { base: `${src}` }).pipe(resourcesStream);
gulp.src(filteredResources, { base: `${src}`, allowEmpty: true }).pipe(resourcesStream);
const bundleInfoArray: VinylFile[] = [];
if (opts.bundleInfo) {
@@ -213,34 +211,23 @@ export function optimizeTask(opts: IOptimizeTaskOpts): () => NodeJS.ReadWriteStr
es.readArray(bundleInfoArray).pipe(bundleInfoStream);
});
const otherSourcesStream = es.through();
const otherSourcesStreamArr: NodeJS.ReadWriteStream[] = [];
gulp.src(otherSources, { base: `${src}` })
.pipe(es.through(function (data) {
otherSourcesStreamArr.push(toConcatStream(src, bundledFileHeader, [data], data.relative));
}, function () {
if (!otherSourcesStreamArr.length) {
setTimeout(function () { otherSourcesStream.emit('end'); }, 0);
} else {
es.merge(otherSourcesStreamArr).pipe(otherSourcesStream);
}
}));
const result = es.merge(
loader(src, bundledFileHeader, bundleLoader),
bundlesStream,
otherSourcesStream,
resourcesStream,
bundleInfoStream
);
return result
.pipe(sourcemaps.write('./', {
sourceRoot: null,
sourceRoot: undefined,
addComment: true,
includeContent: true
}))
.pipe(opts.languages && opts.languages.length ? processNlsFiles({
fileHeader: bundledFileHeader,
languages: opts.languages
}) : es.through())
.pipe(gulp.dest(out));
};
}
@@ -254,7 +241,7 @@ declare class FileWithCopyright extends VinylFile {
*/
function uglifyWithCopyrights(): NodeJS.ReadWriteStream {
const preserveComments = (f: FileWithCopyright) => {
return (node, comment: { value: string; type: string; }) => {
return (_node: any, comment: { value: string; type: string; }) => {
const text = comment.value;
const type = comment.type;
@@ -282,7 +269,7 @@ function uglifyWithCopyrights(): NodeJS.ReadWriteStream {
};
};
const minify = composer(uglifyes);
const minify = (composer as any)(uglifyes);
const input = es.through();
const output = input
.pipe(flatmap((stream, f) => {
@@ -298,7 +285,7 @@ function uglifyWithCopyrights(): NodeJS.ReadWriteStream {
}
export function minifyTask(src: string, sourceMapBaseUrl?: string): (cb: any) => void {
const sourceMappingURL = sourceMapBaseUrl && (f => `${sourceMapBaseUrl}/${f.relative}.map`);
const sourceMappingURL = sourceMapBaseUrl ? ((f: any) => `${sourceMapBaseUrl}/${f.relative}.map`) : undefined;
return cb => {
const jsFilter = filter('**/*.js', { restore: true });
@@ -313,15 +300,22 @@ export function minifyTask(src: string, sourceMapBaseUrl?: string): (cb: any) =>
cssFilter,
minifyCSS({ reduceIdents: false }),
cssFilter.restore,
(<any>sourcemaps).mapSources((sourcePath: string) => {
if (sourcePath === 'bootstrap-fork.js') {
return 'bootstrap-fork.orig.js';
}
return sourcePath;
}),
sourcemaps.write('./', {
sourceMappingURL,
sourceRoot: null,
sourceRoot: undefined,
includeContent: true,
addComment: true
}),
} as any),
gulp.dest(src + '-min')
, (err: any) => {
if (err instanceof uglify.GulpUglifyError) {
if (err instanceof (uglify as any).GulpUglifyError) {
console.error(`Uglify error in '${err.cause && err.cause.filename}'`);
}

@@ -4,20 +4,21 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var es = require("event-stream");
var _ = require("underscore");
var util = require("gulp-util");
var fs = require("fs");
var path = require("path");
var allErrors = [];
var startTime = null;
var count = 0;
const es = require("event-stream");
const _ = require("underscore");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const fs = require("fs");
const path = require("path");
const allErrors = [];
let startTime = null;
let count = 0;
function onStart() {
if (count++ > 0) {
return;
}
startTime = new Date().getTime();
util.log("Starting " + util.colors.green('compilation') + "...");
fancyLog(`Starting ${ansiColors.green('compilation')}...`);
}
function onEnd() {
if (--count > 0) {
@@ -25,7 +26,7 @@ function onEnd() {
}
log();
}
var buildLogPath = path.join(path.dirname(path.dirname(__dirname)), '.build', 'log');
const buildLogPath = path.join(path.dirname(path.dirname(__dirname)), '.build', 'log');
try {
fs.mkdirSync(path.dirname(buildLogPath));
}
@@ -33,61 +34,52 @@ catch (err) {
// ignore
}
function log() {
var errors = _.flatten(allErrors);
var seen = new Set();
errors.map(function (err) {
const errors = _.flatten(allErrors);
const seen = new Set();
errors.map(err => {
if (!seen.has(err)) {
seen.add(err);
util.log(util.colors.red('Error') + ": " + err);
fancyLog(`${ansiColors.red('Error')}: ${err}`);
}
});
var regex = /^([^(]+)\((\d+),(\d+)\): (.*)$/;
var messages = errors
.map(function (err) { return regex.exec(err); })
.filter(function (match) { return !!match; })
.map(function (_a) {
var path = _a[1], line = _a[2], column = _a[3], message = _a[4];
return ({ path: path, line: parseInt(line), column: parseInt(column), message: message });
});
const regex = /^([^(]+)\((\d+),(\d+)\): (.*)$/;
const messages = errors
.map(err => regex.exec(err))
.filter(match => !!match)
.map(x => x)
.map(([, path, line, column, message]) => ({ path, line: parseInt(line), column: parseInt(column), message }));
try {
fs.writeFileSync(buildLogPath, JSON.stringify(messages));
}
catch (err) {
//noop
}
util.log("Finished " + util.colors.green('compilation') + " with " + errors.length + " errors after " + util.colors.magenta((new Date().getTime() - startTime) + ' ms'));
fancyLog(`Finished ${ansiColors.green('compilation')} with ${errors.length} errors after ${ansiColors.magenta((new Date().getTime() - startTime) + ' ms')}`);
}
function createReporter() {
var errors = [];
const errors = [];
allErrors.push(errors);
var ReportFunc = /** @class */ (function () {
function ReportFunc(err) {
errors.push(err);
}
ReportFunc.hasErrors = function () {
return errors.length > 0;
};
ReportFunc.end = function (emitError) {
errors.length = 0;
onStart();
return es.through(null, function () {
onEnd();
if (emitError && errors.length > 0) {
errors.__logged__ = true;
if (!errors.__logged__) {
log();
}
var err = new Error("Found " + errors.length + " errors");
err.__reporter__ = true;
this.emit('error', err);
const result = (err) => errors.push(err);
result.hasErrors = () => errors.length > 0;
result.end = (emitError) => {
errors.length = 0;
onStart();
return es.through(undefined, function () {
onEnd();
if (emitError && errors.length > 0) {
if (!errors.__logged__) {
log();
}
else {
this.emit('end');
}
});
};
return ReportFunc;
}());
return ReportFunc;
errors.__logged__ = true;
const err = new Error(`Found ${errors.length} errors`);
err.__reporter__ = true;
this.emit('error', err);
}
else {
this.emit('end');
}
});
};
return result;
}
exports.createReporter = createReporter;

View File

@@ -7,12 +7,13 @@
import * as es from 'event-stream';
import * as _ from 'underscore';
import * as util from 'gulp-util';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as fs from 'fs';
import * as path from 'path';
const allErrors: string[][] = [];
let startTime: number = null;
let startTime: number | null = null;
let count = 0;
function onStart(): void {
@@ -21,7 +22,7 @@ function onStart(): void {
}
startTime = new Date().getTime();
util.log(`Starting ${util.colors.green('compilation')}...`);
fancyLog(`Starting ${ansiColors.green('compilation')}...`);
}
function onEnd(): void {
@@ -47,7 +48,7 @@ function log(): void {
errors.map(err => {
if (!seen.has(err)) {
seen.add(err);
util.log(`${util.colors.red('Error')}: ${err}`);
fancyLog(`${ansiColors.red('Error')}: ${err}`);
}
});
@@ -55,6 +56,7 @@ function log(): void {
const messages = errors
.map(err => regex.exec(err))
.filter(match => !!match)
.map(x => x as string[])
.map(([, path, line, column, message]) => ({ path, line: parseInt(line), column: parseInt(column), message }));
try {
@@ -64,7 +66,7 @@ function log(): void {
//noop
}
util.log(`Finished ${util.colors.green('compilation')} with ${errors.length} errors after ${util.colors.magenta((new Date().getTime() - startTime) + ' ms')}`);
fancyLog(`Finished ${ansiColors.green('compilation')} with ${errors.length} errors after ${ansiColors.magenta((new Date().getTime() - startTime!) + ' ms')}`);
}
export interface IReporter {
@@ -77,38 +79,32 @@ export function createReporter(): IReporter {
const errors: string[] = [];
allErrors.push(errors);
class ReportFunc {
constructor(err: string) {
errors.push(err);
}
const result = (err: string) => errors.push(err);
static hasErrors(): boolean {
return errors.length > 0;
}
result.hasErrors = () => errors.length > 0;
static end(emitError: boolean): NodeJS.ReadWriteStream {
errors.length = 0;
onStart();
result.end = (emitError: boolean): NodeJS.ReadWriteStream => {
errors.length = 0;
onStart();
return es.through(null, function () {
onEnd();
return es.through(undefined, function () {
onEnd();
if (emitError && errors.length > 0) {
(errors as any).__logged__ = true;
if (!(errors as any).__logged__) {
log();
}
const err = new Error(`Found ${errors.length} errors`);
(err as any).__reporter__ = true;
this.emit('error', err);
} else {
this.emit('end');
if (emitError && errors.length > 0) {
if (!(errors as any).__logged__) {
log();
}
});
}
}
return <IReporter><any>ReportFunc;
(errors as any).__logged__ = true;
const err = new Error(`Found ${errors.length} errors`);
(err as any).__reporter__ = true;
this.emit('error', err);
} else {
this.emit('end');
}
});
};
return result;
}
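As a usage sketch (the pipeline below is illustrative, not part of this diff): the returned reporter is callable, to record an error, and also carries hasErrors/end, so a compile pipeline can log collected errors and optionally fail the build:

import * as es from 'event-stream';
import { createReporter } from './reporter';

const reporter = createReporter();

// Stand-in "compiler" stage that records one error per file it sees.
const compilerStage = es.through(function (file: string) {
	reporter(`${file}(1,1): example error`);
	this.emit('data', file);
});

// reporter.end(true) is the terminating stream: on flush it logs the
// collected errors and emits an Error because emitError is true.
const pipeline = compilerStage.pipe(reporter.end(true));
pipeline.on('error', err => console.error(err.message)); // "Found 2 errors"

es.readArray(['src/a.ts', 'src/b.ts']).pipe(compilerStage);

Note that end() clears the error list when it is called, so errors must be recorded while data flows through the pipeline, as above, not before it is constructed.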

View File

@@ -5,35 +5,51 @@
'use strict';
var snaps;
(function (snaps) {
var fs = require('fs');
var path = require('path');
var os = require('os');
var cp = require('child_process');
var mksnapshot = path.join(__dirname, "../../node_modules/.bin/" + (process.platform === 'win32' ? 'mksnapshot.cmd' : 'mksnapshot'));
var product = require('../../product.json');
var arch = (process.argv.join('').match(/--arch=(.*)/) || [])[1];
const fs = require('fs');
const path = require('path');
const os = require('os');
const cp = require('child_process');
const mksnapshot = path.join(__dirname, `../../node_modules/.bin/${process.platform === 'win32' ? 'mksnapshot.cmd' : 'mksnapshot'}`);
const product = require('../../product.json');
const arch = (process.argv.join('').match(/--arch=(.*)/) || [])[1];
//
var loaderFilepath;
var startupBlobFilepath;
let loaderFilepath;
let startupBlobFilepath;
switch (process.platform) {
case 'darwin':
loaderFilepath = "VSCode-darwin/" + product.nameLong + ".app/Contents/Resources/app/out/vs/loader.js";
startupBlobFilepath = "VSCode-darwin/" + product.nameLong + ".app/Contents/Frameworks/Electron Framework.framework/Resources/snapshot_blob.bin";
loaderFilepath = `VSCode-darwin/${product.nameLong}.app/Contents/Resources/app/out/vs/loader.js`;
startupBlobFilepath = `VSCode-darwin/${product.nameLong}.app/Contents/Frameworks/Electron Framework.framework/Resources/snapshot_blob.bin`;
break;
case 'win32':
case 'linux':
loaderFilepath = "VSCode-" + process.platform + "-" + arch + "/resources/app/out/vs/loader.js";
startupBlobFilepath = "VSCode-" + process.platform + "-" + arch + "/snapshot_blob.bin";
loaderFilepath = `VSCode-${process.platform}-${arch}/resources/app/out/vs/loader.js`;
startupBlobFilepath = `VSCode-${process.platform}-${arch}/snapshot_blob.bin`;
break;
default:
throw new Error('Unknown platform');
}
loaderFilepath = path.join(__dirname, '../../../', loaderFilepath);
startupBlobFilepath = path.join(__dirname, '../../../', startupBlobFilepath);
snapshotLoader(loaderFilepath, startupBlobFilepath);
function snapshotLoader(loaderFilepath, startupBlobFilepath) {
var inputFile = fs.readFileSync(loaderFilepath);
var wrappedInputFile = "\n\t\tvar Monaco_Loader_Init;\n\t\t(function() {\n\t\t\tvar doNotInitLoader = true;\n\t\t\t" + inputFile.toString() + ";\n\t\t\tMonaco_Loader_Init = function() {\n\t\t\t\tAMDLoader.init();\n\t\t\t\tCSSLoaderPlugin.init();\n\t\t\t\tNLSLoaderPlugin.init();\n\n\t\t\t\treturn { define, require };\n\t\t\t}\n\t\t})();\n\t\t";
var wrappedInputFilepath = path.join(os.tmpdir(), 'wrapped-loader.js');
const inputFile = fs.readFileSync(loaderFilepath);
const wrappedInputFile = `
var Monaco_Loader_Init;
(function() {
var doNotInitLoader = true;
${inputFile.toString()};
Monaco_Loader_Init = function() {
AMDLoader.init();
CSSLoaderPlugin.init();
NLSLoaderPlugin.init();
return { define, require };
}
})();
`;
const wrappedInputFilepath = path.join(os.tmpdir(), 'wrapped-loader.js');
console.log(wrappedInputFilepath);
fs.writeFileSync(wrappedInputFilepath, wrappedInputFile);
cp.execFileSync(mksnapshot, [wrappedInputFilepath, "--startup_blob", startupBlobFilepath]);
cp.execFileSync(mksnapshot, [wrappedInputFilepath, `--startup_blob`, startupBlobFilepath]);
}
})(snaps || (snaps = {}));

View File

@@ -30,6 +30,10 @@ namespace snaps {
case 'linux':
loaderFilepath = `VSCode-${process.platform}-${arch}/resources/app/out/vs/loader.js`;
startupBlobFilepath = `VSCode-${process.platform}-${arch}/snapshot_blob.bin`;
break;
default:
throw new Error('Unknown platform');
}
loaderFilepath = path.join(__dirname, '../../../', loaderFilepath);

View File

@@ -4,14 +4,13 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var ts = require("typescript");
var fs = require("fs");
var path = require("path");
var tss = require("./treeshaking");
var REPO_ROOT = path.join(__dirname, '../../');
var SRC_DIR = path.join(REPO_ROOT, 'src');
var OUT_EDITOR = path.join(REPO_ROOT, 'out-editor');
var dirCache = {};
const ts = require("typescript");
const fs = require("fs");
const path = require("path");
const tss = require("./treeshaking");
const REPO_ROOT = path.join(__dirname, '../../');
const SRC_DIR = path.join(REPO_ROOT, 'src');
let dirCache = {};
function writeFile(filePath, contents) {
function ensureDirs(dirPath) {
if (dirCache[dirPath]) {
@@ -28,32 +27,47 @@ function writeFile(filePath, contents) {
fs.writeFileSync(filePath, contents);
}
function extractEditor(options) {
var result = tss.shake(options);
for (var fileName in result) {
const tsConfig = JSON.parse(fs.readFileSync(path.join(options.sourcesRoot, 'tsconfig.monaco.json')).toString());
let compilerOptions;
if (tsConfig.extends) {
compilerOptions = Object.assign({}, require(path.join(options.sourcesRoot, tsConfig.extends)).compilerOptions, tsConfig.compilerOptions);
}
else {
compilerOptions = tsConfig.compilerOptions;
}
tsConfig.compilerOptions = compilerOptions;
compilerOptions.noEmit = false;
compilerOptions.noUnusedLocals = false;
compilerOptions.preserveConstEnums = false;
compilerOptions.declaration = false;
compilerOptions.moduleResolution = ts.ModuleResolutionKind.Classic;
options.compilerOptions = compilerOptions;
let result = tss.shake(options);
for (let fileName in result) {
if (result.hasOwnProperty(fileName)) {
writeFile(path.join(options.destRoot, fileName), result[fileName]);
}
}
var copied = {};
var copyFile = function (fileName) {
let copied = {};
const copyFile = (fileName) => {
if (copied[fileName]) {
return;
}
copied[fileName] = true;
var srcPath = path.join(options.sourcesRoot, fileName);
var dstPath = path.join(options.destRoot, fileName);
const srcPath = path.join(options.sourcesRoot, fileName);
const dstPath = path.join(options.destRoot, fileName);
writeFile(dstPath, fs.readFileSync(srcPath));
};
var writeOutputFile = function (fileName, contents) {
const writeOutputFile = (fileName, contents) => {
writeFile(path.join(options.destRoot, fileName), contents);
};
for (var fileName in result) {
for (let fileName in result) {
if (result.hasOwnProperty(fileName)) {
var fileContents = result[fileName];
var info = ts.preProcessFile(fileContents);
for (var i = info.importedFiles.length - 1; i >= 0; i--) {
var importedFileName = info.importedFiles[i].fileName;
var importedFilePath = void 0;
const fileContents = result[fileName];
const info = ts.preProcessFile(fileContents);
for (let i = info.importedFiles.length - 1; i >= 0; i--) {
const importedFileName = info.importedFiles[i].fileName;
let importedFilePath;
if (/^vs\/css!/.test(importedFileName)) {
importedFilePath = importedFileName.substr('vs/css!'.length) + '.css';
}
@@ -74,215 +88,187 @@ function extractEditor(options) {
}
}
}
var tsConfig = JSON.parse(fs.readFileSync(path.join(options.sourcesRoot, 'tsconfig.json')).toString());
tsConfig.compilerOptions.noUnusedLocals = false;
delete tsConfig.compilerOptions.moduleResolution;
writeOutputFile('tsconfig.json', JSON.stringify(tsConfig, null, '\t'));
[
'vs/css.build.js',
'vs/css.d.ts',
'vs/css.js',
'vs/loader.js',
'vs/monaco.d.ts',
'vs/nls.build.js',
'vs/nls.d.ts',
'vs/nls.js',
'vs/nls.mock.ts',
'typings/lib.ie11_safe_es6.d.ts',
'typings/thenable.d.ts',
'typings/es6-promise.d.ts',
'typings/require.d.ts',
].forEach(copyFile);
}
exports.extractEditor = extractEditor;
function createESMSourcesAndResources(options) {
var OUT_FOLDER = path.join(REPO_ROOT, options.outFolder);
var OUT_RESOURCES_FOLDER = path.join(REPO_ROOT, options.outResourcesFolder);
var in_queue = Object.create(null);
var queue = [];
var enqueue = function (module) {
if (in_queue[module]) {
return;
function createESMSourcesAndResources2(options) {
const SRC_FOLDER = path.join(REPO_ROOT, options.srcFolder);
const OUT_FOLDER = path.join(REPO_ROOT, options.outFolder);
const OUT_RESOURCES_FOLDER = path.join(REPO_ROOT, options.outResourcesFolder);
const getDestAbsoluteFilePath = (file) => {
let dest = options.renames[file.replace(/\\/g, '/')] || file;
if (dest === 'tsconfig.json') {
return path.join(OUT_FOLDER, `tsconfig.json`);
}
in_queue[module] = true;
queue.push(module);
if (/\.ts$/.test(dest)) {
return path.join(OUT_FOLDER, dest);
}
return path.join(OUT_RESOURCES_FOLDER, dest);
};
var seenDir = {};
var createDirectoryRecursive = function (dir) {
if (seenDir[dir]) {
return;
const allFiles = walkDirRecursive(SRC_FOLDER);
for (const file of allFiles) {
if (options.ignores.indexOf(file.replace(/\\/g, '/')) >= 0) {
continue;
}
var lastSlash = dir.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = dir.lastIndexOf('\\');
if (file === 'tsconfig.json') {
const tsConfig = JSON.parse(fs.readFileSync(path.join(SRC_FOLDER, file)).toString());
tsConfig.compilerOptions.module = 'es6';
tsConfig.compilerOptions.outDir = path.join(path.relative(OUT_FOLDER, OUT_RESOURCES_FOLDER), 'vs').replace(/\\/g, '/');
write(getDestAbsoluteFilePath(file), JSON.stringify(tsConfig, null, '\t'));
continue;
}
if (lastSlash !== -1) {
createDirectoryRecursive(dir.substring(0, lastSlash));
if (/\.d\.ts$/.test(file) || /\.css$/.test(file) || /\.js$/.test(file)) {
// Transport the files directly
write(getDestAbsoluteFilePath(file), fs.readFileSync(path.join(SRC_FOLDER, file)));
continue;
}
seenDir[dir] = true;
try {
fs.mkdirSync(dir);
}
catch (err) { }
};
seenDir[REPO_ROOT] = true;
var toggleComments = function (fileContents) {
var lines = fileContents.split(/\r\n|\r|\n/);
var mode = 0;
for (var i = 0; i < lines.length; i++) {
var line = lines[i];
if (mode === 0) {
if (/\/\/ ESM-comment-begin/.test(line)) {
mode = 1;
continue;
if (/\.ts$/.test(file)) {
// Transform the .ts file
let fileContents = fs.readFileSync(path.join(SRC_FOLDER, file)).toString();
const info = ts.preProcessFile(fileContents);
for (let i = info.importedFiles.length - 1; i >= 0; i--) {
const importedFilename = info.importedFiles[i].fileName;
const pos = info.importedFiles[i].pos;
const end = info.importedFiles[i].end;
let importedFilepath;
if (/^vs\/css!/.test(importedFilename)) {
importedFilepath = importedFilename.substr('vs/css!'.length) + '.css';
}
if (/\/\/ ESM-uncomment-begin/.test(line)) {
mode = 2;
continue;
else {
importedFilepath = importedFilename;
}
continue;
if (/(^\.\/)|(^\.\.\/)/.test(importedFilepath)) {
importedFilepath = path.join(path.dirname(file), importedFilepath);
}
let relativePath;
if (importedFilepath === path.dirname(file).replace(/\\/g, '/')) {
relativePath = '../' + path.basename(path.dirname(file));
}
else if (importedFilepath === path.dirname(path.dirname(file)).replace(/\\/g, '/')) {
relativePath = '../../' + path.basename(path.dirname(path.dirname(file)));
}
else {
relativePath = path.relative(path.dirname(file), importedFilepath);
}
relativePath = relativePath.replace(/\\/g, '/');
if (!/(^\.\/)|(^\.\.\/)/.test(relativePath)) {
relativePath = './' + relativePath;
}
fileContents = (fileContents.substring(0, pos + 1)
+ relativePath
+ fileContents.substring(end + 1));
}
if (mode === 1) {
if (/\/\/ ESM-comment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = '// ' + line;
continue;
fileContents = fileContents.replace(/import ([a-zA-Z0-9]+) = require\(('[^']+')\);/g, function (_, m1, m2) {
return `import * as ${m1} from ${m2};`;
});
write(getDestAbsoluteFilePath(file), fileContents);
continue;
}
console.log(`UNKNOWN FILE: ${file}`);
}
function walkDirRecursive(dir) {
if (dir.charAt(dir.length - 1) !== '/' && dir.charAt(dir.length - 1) !== '\\') {
dir += '/';
}
let result = [];
_walkDirRecursive(dir, result, dir.length);
return result;
}
function _walkDirRecursive(dir, result, trimPos) {
const files = fs.readdirSync(dir);
for (let i = 0; i < files.length; i++) {
const file = path.join(dir, files[i]);
if (fs.statSync(file).isDirectory()) {
_walkDirRecursive(file, result, trimPos);
}
if (mode === 2) {
if (/\/\/ ESM-uncomment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = line.replace(/^(\s*)\/\/ ?/, function (_, indent) {
return indent;
});
else {
result.push(file.substr(trimPos));
}
}
return lines.join('\n');
};
var write = function (filePath, contents) {
var absoluteFilePath;
if (/\.ts$/.test(filePath)) {
absoluteFilePath = path.join(OUT_FOLDER, filePath);
}
else {
absoluteFilePath = path.join(OUT_RESOURCES_FOLDER, filePath);
}
createDirectoryRecursive(path.dirname(absoluteFilePath));
if (/(\.ts$)|(\.js$)/.test(filePath)) {
}
function write(absoluteFilePath, contents) {
if (/(\.ts$)|(\.js$)/.test(absoluteFilePath)) {
contents = toggleComments(contents.toString());
}
fs.writeFileSync(absoluteFilePath, contents);
};
options.entryPoints.forEach(function (entryPoint) { return enqueue(entryPoint); });
while (queue.length > 0) {
var module_1 = queue.shift();
if (transportCSS(module_1, enqueue, write)) {
continue;
writeFile(absoluteFilePath, contents);
function toggleComments(fileContents) {
let lines = fileContents.split(/\r\n|\r|\n/);
let mode = 0;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (mode === 0) {
if (/\/\/ ESM-comment-begin/.test(line)) {
mode = 1;
continue;
}
if (/\/\/ ESM-uncomment-begin/.test(line)) {
mode = 2;
continue;
}
continue;
}
if (mode === 1) {
if (/\/\/ ESM-comment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = '// ' + line;
continue;
}
if (mode === 2) {
if (/\/\/ ESM-uncomment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = line.replace(/^(\s*)\/\/ ?/, function (_, indent) {
return indent;
});
}
}
return lines.join('\n');
}
if (transportResource(options, module_1, enqueue, write)) {
continue;
}
if (transportDTS(options, module_1, enqueue, write)) {
continue;
}
var filename = void 0;
if (options.redirects[module_1]) {
filename = path.join(SRC_DIR, options.redirects[module_1] + '.ts');
}
else {
filename = path.join(SRC_DIR, module_1 + '.ts');
}
var fileContents = fs.readFileSync(filename).toString();
var info = ts.preProcessFile(fileContents);
for (var i = info.importedFiles.length - 1; i >= 0; i--) {
var importedFilename = info.importedFiles[i].fileName;
var pos = info.importedFiles[i].pos;
var end = info.importedFiles[i].end;
var importedFilepath = void 0;
if (/^vs\/css!/.test(importedFilename)) {
importedFilepath = importedFilename.substr('vs/css!'.length) + '.css';
}
else {
importedFilepath = importedFilename;
}
if (/(^\.\/)|(^\.\.\/)/.test(importedFilepath)) {
importedFilepath = path.join(path.dirname(module_1), importedFilepath);
}
enqueue(importedFilepath);
var relativePath = void 0;
if (importedFilepath === path.dirname(module_1)) {
relativePath = '../' + path.basename(path.dirname(module_1));
}
else if (importedFilepath === path.dirname(path.dirname(module_1))) {
relativePath = '../../' + path.basename(path.dirname(path.dirname(module_1)));
}
else {
relativePath = path.relative(path.dirname(module_1), importedFilepath);
}
if (!/(^\.\/)|(^\.\.\/)/.test(relativePath)) {
relativePath = './' + relativePath;
}
fileContents = (fileContents.substring(0, pos + 1)
+ relativePath
+ fileContents.substring(end + 1));
}
fileContents = fileContents.replace(/import ([a-zA-Z0-9]+) = require\(('[^']+')\);/g, function (_, m1, m2) {
return "import * as " + m1 + " from " + m2 + ";";
});
fileContents = fileContents.replace(/Thenable/g, 'PromiseLike');
write(module_1 + '.ts', fileContents);
}
var esm_opts = {
"compilerOptions": {
"outDir": path.relative(path.dirname(OUT_FOLDER), OUT_RESOURCES_FOLDER),
"rootDir": "src",
"module": "es6",
"target": "es5",
"experimentalDecorators": true,
"lib": [
"dom",
"es5",
"es2015.collection",
"es2015.promise"
],
"types": []
}
};
fs.writeFileSync(path.join(path.dirname(OUT_FOLDER), 'tsconfig.json'), JSON.stringify(esm_opts, null, '\t'));
var monacodts = fs.readFileSync(path.join(SRC_DIR, 'vs/monaco.d.ts')).toString();
fs.writeFileSync(path.join(OUT_FOLDER, 'vs/monaco.d.ts'), monacodts);
}
exports.createESMSourcesAndResources = createESMSourcesAndResources;
exports.createESMSourcesAndResources2 = createESMSourcesAndResources2;
function transportCSS(module, enqueue, write) {
if (!/\.css/.test(module)) {
return false;
}
var filename = path.join(SRC_DIR, module);
var fileContents = fs.readFileSync(filename).toString();
var inlineResources = 'base64'; // see https://github.com/Microsoft/monaco-editor/issues/148
var inlineResourcesLimit = 300000; //3000; // see https://github.com/Microsoft/monaco-editor/issues/336
var newContents = _rewriteOrInlineUrls(fileContents, inlineResources === 'base64', inlineResourcesLimit);
const filename = path.join(SRC_DIR, module);
const fileContents = fs.readFileSync(filename).toString();
const inlineResources = 'base64'; // see https://github.com/Microsoft/monaco-editor/issues/148
const inlineResourcesLimit = 300000; //3000; // see https://github.com/Microsoft/monaco-editor/issues/336
const newContents = _rewriteOrInlineUrls(fileContents, inlineResources === 'base64', inlineResourcesLimit);
write(module, newContents);
return true;
function _rewriteOrInlineUrls(contents, forceBase64, inlineByteLimit) {
return _replaceURL(contents, function (url) {
var imagePath = path.join(path.dirname(module), url);
var fileContents = fs.readFileSync(path.join(SRC_DIR, imagePath));
return _replaceURL(contents, (url) => {
let imagePath = path.join(path.dirname(module), url);
let fileContents = fs.readFileSync(path.join(SRC_DIR, imagePath));
if (fileContents.length < inlineByteLimit) {
var MIME = /\.svg$/.test(url) ? 'image/svg+xml' : 'image/png';
var DATA = ';base64,' + fileContents.toString('base64');
const MIME = /\.svg$/.test(url) ? 'image/svg+xml' : 'image/png';
let DATA = ';base64,' + fileContents.toString('base64');
if (!forceBase64 && /\.svg$/.test(url)) {
// .svg => url encode as explained at https://codepen.io/tigt/post/optimizing-svgs-in-data-uris
var newText = fileContents.toString()
let newText = fileContents.toString()
.replace(/"/g, '\'')
.replace(/</g, '%3C')
.replace(/>/g, '%3E')
.replace(/&/g, '%26')
.replace(/#/g, '%23')
.replace(/\s+/g, ' ');
var encodedData = ',' + newText;
let encodedData = ',' + newText;
if (encodedData.length < DATA.length) {
DATA = encodedData;
}
@@ -295,12 +281,8 @@ function transportCSS(module, enqueue, write) {
}
function _replaceURL(contents, replacer) {
// Use ")" as the terminator as quotes are oftentimes not used at all
return contents.replace(/url\(\s*([^\)]+)\s*\)?/g, function (_) {
var matches = [];
for (var _i = 1; _i < arguments.length; _i++) {
matches[_i - 1] = arguments[_i];
}
var url = matches[0];
return contents.replace(/url\(\s*([^\)]+)\s*\)?/g, (_, ...matches) => {
let url = matches[0];
// Eliminate starting quotes (the initial whitespace is not captured)
if (url.charAt(0) === '"' || url.charAt(0) === '\'') {
url = url.substring(1);
@@ -323,27 +305,3 @@ function transportCSS(module, enqueue, write) {
return haystack.length >= needle.length && haystack.substr(0, needle.length) === needle;
}
}
function transportResource(options, module, enqueue, write) {
if (!/\.svg/.test(module)) {
return false;
}
write(module, fs.readFileSync(path.join(SRC_DIR, module)));
return true;
}
function transportDTS(options, module, enqueue, write) {
if (options.redirects[module] && fs.existsSync(path.join(SRC_DIR, options.redirects[module] + '.ts'))) {
return false;
}
if (!fs.existsSync(path.join(SRC_DIR, module + '.d.ts'))) {
return false;
}
write(module + '.d.ts', fs.readFileSync(path.join(SRC_DIR, module + '.d.ts')));
var filename;
if (options.redirects[module]) {
write(module + '.js', fs.readFileSync(path.join(SRC_DIR, options.redirects[module] + '.js')));
}
else {
write(module + '.js', fs.readFileSync(path.join(SRC_DIR, module + '.js')));
}
return true;
}

View File

@@ -10,7 +10,6 @@ import * as tss from './treeshaking';
const REPO_ROOT = path.join(__dirname, '../../');
const SRC_DIR = path.join(REPO_ROOT, 'src');
const OUT_EDITOR = path.join(REPO_ROOT, 'out-editor');
let dirCache: { [dir: string]: boolean; } = {};
@@ -32,13 +31,31 @@ function writeFile(filePath: string, contents: Buffer | string): void {
}
export function extractEditor(options: tss.ITreeShakingOptions & { destRoot: string }): void {
const tsConfig = JSON.parse(fs.readFileSync(path.join(options.sourcesRoot, 'tsconfig.monaco.json')).toString());
let compilerOptions: { [key: string]: any };
if (tsConfig.extends) {
compilerOptions = Object.assign({}, require(path.join(options.sourcesRoot, tsConfig.extends)).compilerOptions, tsConfig.compilerOptions);
} else {
compilerOptions = tsConfig.compilerOptions;
}
tsConfig.compilerOptions = compilerOptions;
compilerOptions.noEmit = false;
compilerOptions.noUnusedLocals = false;
compilerOptions.preserveConstEnums = false;
compilerOptions.declaration = false;
compilerOptions.moduleResolution = ts.ModuleResolutionKind.Classic;
options.compilerOptions = compilerOptions;
let result = tss.shake(options);
for (let fileName in result) {
if (result.hasOwnProperty(fileName)) {
writeFile(path.join(options.destRoot, fileName), result[fileName]);
}
}
let copied: { [fileName:string]: boolean; } = {};
let copied: { [fileName: string]: boolean; } = {};
const copyFile = (fileName: string) => {
if (copied[fileName]) {
return;
@@ -48,7 +65,7 @@ export function extractEditor(options: tss.ITreeShakingOptions & { destRoot: str
const dstPath = path.join(options.destRoot, fileName);
writeFile(dstPath, fs.readFileSync(srcPath));
};
const writeOutputFile = (fileName: string, contents: string) => {
const writeOutputFile = (fileName: string, contents: string | Buffer) => {
writeFile(path.join(options.destRoot, fileName), contents);
};
for (let fileName in result) {
@@ -80,8 +97,7 @@ export function extractEditor(options: tss.ITreeShakingOptions & { destRoot: str
}
}
const tsConfig = JSON.parse(fs.readFileSync(path.join(options.sourcesRoot, 'tsconfig.json')).toString());
tsConfig.compilerOptions.noUnusedLocals = false;
delete tsConfig.compilerOptions.moduleResolution;
writeOutputFile('tsconfig.json', JSON.stringify(tsConfig, null, '\t'));
[
@@ -89,203 +105,177 @@ export function extractEditor(options: tss.ITreeShakingOptions & { destRoot: str
'vs/css.d.ts',
'vs/css.js',
'vs/loader.js',
'vs/monaco.d.ts',
'vs/nls.build.js',
'vs/nls.d.ts',
'vs/nls.js',
'vs/nls.mock.ts',
'typings/lib.ie11_safe_es6.d.ts',
'typings/thenable.d.ts',
'typings/es6-promise.d.ts',
'typings/require.d.ts',
].forEach(copyFile);
}
export interface IOptions {
entryPoints: string[];
export interface IOptions2 {
srcFolder: string;
outFolder: string;
outResourcesFolder: string;
redirects: { [module: string]: string; };
ignores: string[];
renames: { [filename: string]: string; };
}
export function createESMSourcesAndResources(options: IOptions): void {
export function createESMSourcesAndResources2(options: IOptions2): void {
const SRC_FOLDER = path.join(REPO_ROOT, options.srcFolder);
const OUT_FOLDER = path.join(REPO_ROOT, options.outFolder);
const OUT_RESOURCES_FOLDER = path.join(REPO_ROOT, options.outResourcesFolder);
let in_queue: { [module: string]: boolean; } = Object.create(null);
let queue: string[] = [];
const enqueue = (module: string) => {
if (in_queue[module]) {
return;
const getDestAbsoluteFilePath = (file: string): string => {
let dest = options.renames[file.replace(/\\/g, '/')] || file;
if (dest === 'tsconfig.json') {
return path.join(OUT_FOLDER, `tsconfig.json`);
}
in_queue[module] = true;
queue.push(module);
if (/\.ts$/.test(dest)) {
return path.join(OUT_FOLDER, dest);
}
return path.join(OUT_RESOURCES_FOLDER, dest);
};
const seenDir: { [key: string]: boolean; } = {};
const createDirectoryRecursive = (dir: string) => {
if (seenDir[dir]) {
return;
}
const allFiles = walkDirRecursive(SRC_FOLDER);
for (const file of allFiles) {
let lastSlash = dir.lastIndexOf('/');
if (lastSlash === -1) {
lastSlash = dir.lastIndexOf('\\');
}
if (lastSlash !== -1) {
createDirectoryRecursive(dir.substring(0, lastSlash));
}
seenDir[dir] = true;
try { fs.mkdirSync(dir); } catch (err) { }
};
seenDir[REPO_ROOT] = true;
const toggleComments = (fileContents: string) => {
let lines = fileContents.split(/\r\n|\r|\n/);
let mode = 0;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (mode === 0) {
if (/\/\/ ESM-comment-begin/.test(line)) {
mode = 1;
continue;
}
if (/\/\/ ESM-uncomment-begin/.test(line)) {
mode = 2;
continue;
}
continue;
}
if (mode === 1) {
if (/\/\/ ESM-comment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = '// ' + line;
continue;
}
if (mode === 2) {
if (/\/\/ ESM-uncomment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = line.replace(/^(\s*)\/\/ ?/, function (_, indent) {
return indent;
});
}
}
return lines.join('\n');
};
const write = (filePath: string, contents: string | Buffer) => {
let absoluteFilePath: string;
if (/\.ts$/.test(filePath)) {
absoluteFilePath = path.join(OUT_FOLDER, filePath);
} else {
absoluteFilePath = path.join(OUT_RESOURCES_FOLDER, filePath);
}
createDirectoryRecursive(path.dirname(absoluteFilePath));
if (/(\.ts$)|(\.js$)/.test(filePath)) {
contents = toggleComments(contents.toString());
}
fs.writeFileSync(absoluteFilePath, contents);
};
options.entryPoints.forEach((entryPoint) => enqueue(entryPoint));
while (queue.length > 0) {
const module = queue.shift();
if (transportCSS(module, enqueue, write)) {
continue;
}
if (transportResource(options, module, enqueue, write)) {
continue;
}
if (transportDTS(options, module, enqueue, write)) {
if (options.ignores.indexOf(file.replace(/\\/g, '/')) >= 0) {
continue;
}
let filename: string;
if (options.redirects[module]) {
filename = path.join(SRC_DIR, options.redirects[module] + '.ts');
} else {
filename = path.join(SRC_DIR, module + '.ts');
}
let fileContents = fs.readFileSync(filename).toString();
const info = ts.preProcessFile(fileContents);
for (let i = info.importedFiles.length - 1; i >= 0; i--) {
const importedFilename = info.importedFiles[i].fileName;
const pos = info.importedFiles[i].pos;
const end = info.importedFiles[i].end;
let importedFilepath: string;
if (/^vs\/css!/.test(importedFilename)) {
importedFilepath = importedFilename.substr('vs/css!'.length) + '.css';
} else {
importedFilepath = importedFilename;
}
if (/(^\.\/)|(^\.\.\/)/.test(importedFilepath)) {
importedFilepath = path.join(path.dirname(module), importedFilepath);
}
enqueue(importedFilepath);
let relativePath: string;
if (importedFilepath === path.dirname(module)) {
relativePath = '../' + path.basename(path.dirname(module));
} else if (importedFilepath === path.dirname(path.dirname(module))) {
relativePath = '../../' + path.basename(path.dirname(path.dirname(module)));
} else {
relativePath = path.relative(path.dirname(module), importedFilepath);
}
if (!/(^\.\/)|(^\.\.\/)/.test(relativePath)) {
relativePath = './' + relativePath;
}
fileContents = (
fileContents.substring(0, pos + 1)
+ relativePath
+ fileContents.substring(end + 1)
);
if (file === 'tsconfig.json') {
const tsConfig = JSON.parse(fs.readFileSync(path.join(SRC_FOLDER, file)).toString());
tsConfig.compilerOptions.module = 'es6';
tsConfig.compilerOptions.outDir = path.join(path.relative(OUT_FOLDER, OUT_RESOURCES_FOLDER), 'vs').replace(/\\/g, '/');
write(getDestAbsoluteFilePath(file), JSON.stringify(tsConfig, null, '\t'));
continue;
}
fileContents = fileContents.replace(/import ([a-zA-Z0-9]+) = require\(('[^']+')\);/g, function (_, m1, m2) {
return `import * as ${m1} from ${m2};`;
});
fileContents = fileContents.replace(/Thenable/g, 'PromiseLike');
if (/\.d\.ts$/.test(file) || /\.css$/.test(file) || /\.js$/.test(file)) {
// Transport the files directly
write(getDestAbsoluteFilePath(file), fs.readFileSync(path.join(SRC_FOLDER, file)));
continue;
}
write(module + '.ts', fileContents);
if (/\.ts$/.test(file)) {
// Transform the .ts file
let fileContents = fs.readFileSync(path.join(SRC_FOLDER, file)).toString();
const info = ts.preProcessFile(fileContents);
for (let i = info.importedFiles.length - 1; i >= 0; i--) {
const importedFilename = info.importedFiles[i].fileName;
const pos = info.importedFiles[i].pos;
const end = info.importedFiles[i].end;
let importedFilepath: string;
if (/^vs\/css!/.test(importedFilename)) {
importedFilepath = importedFilename.substr('vs/css!'.length) + '.css';
} else {
importedFilepath = importedFilename;
}
if (/(^\.\/)|(^\.\.\/)/.test(importedFilepath)) {
importedFilepath = path.join(path.dirname(file), importedFilepath);
}
let relativePath: string;
if (importedFilepath === path.dirname(file).replace(/\\/g, '/')) {
relativePath = '../' + path.basename(path.dirname(file));
} else if (importedFilepath === path.dirname(path.dirname(file)).replace(/\\/g, '/')) {
relativePath = '../../' + path.basename(path.dirname(path.dirname(file)));
} else {
relativePath = path.relative(path.dirname(file), importedFilepath);
}
relativePath = relativePath.replace(/\\/g, '/');
if (!/(^\.\/)|(^\.\.\/)/.test(relativePath)) {
relativePath = './' + relativePath;
}
fileContents = (
fileContents.substring(0, pos + 1)
+ relativePath
+ fileContents.substring(end + 1)
);
}
fileContents = fileContents.replace(/import ([a-zA-Z0-9]+) = require\(('[^']+')\);/g, function (_, m1, m2) {
return `import * as ${m1} from ${m2};`;
});
write(getDestAbsoluteFilePath(file), fileContents);
continue;
}
console.log(`UNKNOWN FILE: ${file}`);
}
const esm_opts = {
"compilerOptions": {
"outDir": path.relative(path.dirname(OUT_FOLDER), OUT_RESOURCES_FOLDER),
"rootDir": "src",
"module": "es6",
"target": "es5",
"experimentalDecorators": true,
"lib": [
"dom",
"es5",
"es2015.collection",
"es2015.promise"
],
"types": [
]
function walkDirRecursive(dir: string): string[] {
if (dir.charAt(dir.length - 1) !== '/' && dir.charAt(dir.length - 1) !== '\\') {
dir += '/';
}
};
fs.writeFileSync(path.join(path.dirname(OUT_FOLDER), 'tsconfig.json'), JSON.stringify(esm_opts, null, '\t'));
let result: string[] = [];
_walkDirRecursive(dir, result, dir.length);
return result;
}
const monacodts = fs.readFileSync(path.join(SRC_DIR, 'vs/monaco.d.ts')).toString();
fs.writeFileSync(path.join(OUT_FOLDER, 'vs/monaco.d.ts'), monacodts);
function _walkDirRecursive(dir: string, result: string[], trimPos: number): void {
const files = fs.readdirSync(dir);
for (let i = 0; i < files.length; i++) {
const file = path.join(dir, files[i]);
if (fs.statSync(file).isDirectory()) {
_walkDirRecursive(file, result, trimPos);
} else {
result.push(file.substr(trimPos));
}
}
}
function write(absoluteFilePath: string, contents: string | Buffer): void {
if (/(\.ts$)|(\.js$)/.test(absoluteFilePath)) {
contents = toggleComments(contents.toString());
}
writeFile(absoluteFilePath, contents);
function toggleComments(fileContents: string): string {
let lines = fileContents.split(/\r\n|\r|\n/);
let mode = 0;
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (mode === 0) {
if (/\/\/ ESM-comment-begin/.test(line)) {
mode = 1;
continue;
}
if (/\/\/ ESM-uncomment-begin/.test(line)) {
mode = 2;
continue;
}
continue;
}
if (mode === 1) {
if (/\/\/ ESM-comment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = '// ' + line;
continue;
}
if (mode === 2) {
if (/\/\/ ESM-uncomment-end/.test(line)) {
mode = 0;
continue;
}
lines[i] = line.replace(/^(\s*)\/\/ ?/, function (_, indent) {
return indent;
});
}
}
return lines.join('\n');
}
}
}
function transportCSS(module: string, enqueue: (module: string) => void, write: (path: string, contents: string | Buffer) => void): boolean {
@@ -337,7 +327,7 @@ function transportCSS(module: string, enqueue: (module: string) => void, write:
function _replaceURL(contents: string, replacer: (url: string) => string): string {
// Use ")" as the terminator as quotes are oftentimes not used at all
return contents.replace(/url\(\s*([^\)]+)\s*\)?/g, (_: string, ...matches: string[]) => {
var url = matches[0];
let url = matches[0];
// Eliminate starting quotes (the initial whitespace is not captured)
if (url.charAt(0) === '"' || url.charAt(0) === '\'') {
url = url.substring(1);
@@ -363,33 +353,3 @@ function transportCSS(module: string, enqueue: (module: string) => void, write:
return haystack.length >= needle.length && haystack.substr(0, needle.length) === needle;
}
}
function transportResource(options: IOptions, module: string, enqueue: (module: string) => void, write: (path: string, contents: string | Buffer) => void): boolean {
if (!/\.svg/.test(module)) {
return false;
}
write(module, fs.readFileSync(path.join(SRC_DIR, module)));
return true;
}
function transportDTS(options: IOptions, module: string, enqueue: (module: string) => void, write: (path: string, contents: string | Buffer) => void): boolean {
if (options.redirects[module] && fs.existsSync(path.join(SRC_DIR, options.redirects[module] + '.ts'))) {
return false;
}
if (!fs.existsSync(path.join(SRC_DIR, module + '.d.ts'))) {
return false;
}
write(module + '.d.ts', fs.readFileSync(path.join(SRC_DIR, module + '.d.ts')));
let filename: string;
if (options.redirects[module]) {
write(module + '.js', fs.readFileSync(path.join(SRC_DIR, options.redirects[module] + '.js')));
} else {
write(module + '.js', fs.readFileSync(path.join(SRC_DIR, module + '.js')));
}
return true;
}
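For reference, the ESM-comment markers that toggleComments rewrites look like this in a source file (the marker lines are the real convention; the import statements are illustrative):

// ESM-comment-begin
import { AmdOnlyDep } from 'vs/base/common/amdOnly'; // active in the AMD build
// ESM-comment-end

// ESM-uncomment-begin
// import { EsmOnlyDep } from './esmOnly';
// ESM-uncomment-end

When the ESM sources are generated, toggleComments prefixes every line of the first block with '// ' (disabling the AMD-only import) and strips one leading '// ' from every line of the second block (activating the ESM-only import).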

136 build/lib/stats.js Normal file
View File

@@ -0,0 +1,136 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const es = require("event-stream");
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
const appInsights = require("applicationinsights");
class Entry {
constructor(name, totalCount, totalSize) {
this.name = name;
this.totalCount = totalCount;
this.totalSize = totalSize;
}
toString(pretty) {
if (!pretty) {
if (this.totalCount === 1) {
return `${this.name}: ${this.totalSize} bytes`;
}
else {
return `${this.name}: ${this.totalCount} files with ${this.totalSize} bytes`;
}
}
else {
if (this.totalCount === 1) {
return `Stats for '${ansiColors.grey(this.name)}': ${Math.round(this.totalSize / 1024)}KB`;
}
else {
const count = this.totalCount < 100
? ansiColors.green(this.totalCount.toString())
: ansiColors.red(this.totalCount.toString());
return `Stats for '${ansiColors.grey(this.name)}': ${count} files, ${Math.round(this.totalSize / 1024)}KB`;
}
}
}
}
const _entries = new Map();
function createStatsStream(group, log) {
const entry = new Entry(group, 0, 0);
_entries.set(entry.name, entry);
return es.through(function (data) {
const file = data;
if (typeof file.path === 'string') {
entry.totalCount += 1;
if (Buffer.isBuffer(file.contents)) {
entry.totalSize += file.contents.length;
}
else if (file.stat && typeof file.stat.size === 'number') {
entry.totalSize += file.stat.size;
}
else {
// funky file...
}
}
this.emit('data', data);
}, function () {
if (log) {
if (entry.totalCount === 1) {
fancyLog(`Stats for '${ansiColors.grey(entry.name)}': ${Math.round(entry.totalSize / 1024)}KB`);
}
else {
const count = entry.totalCount < 100
? ansiColors.green(entry.totalCount.toString())
: ansiColors.red(entry.totalCount.toString());
fancyLog(`Stats for '${ansiColors.grey(entry.name)}': ${count} files, ${Math.round(entry.totalSize / 1024)}KB`);
}
}
this.emit('end');
});
}
exports.createStatsStream = createStatsStream;
function submitAllStats(productJson, commit) {
const sorted = [];
// move entries for single files to the front
_entries.forEach(value => {
if (value.totalCount === 1) {
sorted.unshift(value);
}
else {
sorted.push(value);
}
});
// print to console
for (const entry of sorted) {
console.log(entry.toString(true));
}
// send data as telemetry event when the
// product is configured to send telemetry
if (!productJson || !productJson.aiConfig || typeof productJson.aiConfig.asimovKey !== 'string') {
return Promise.resolve(false);
}
return new Promise(resolve => {
try {
const sizes = {};
const counts = {};
for (const entry of sorted) {
sizes[entry.name] = entry.totalSize;
counts[entry.name] = entry.totalCount;
}
appInsights.setup(productJson.aiConfig.asimovKey)
.setAutoCollectConsole(false)
.setAutoCollectExceptions(false)
.setAutoCollectPerformance(false)
.setAutoCollectRequests(false)
.setAutoCollectDependencies(false)
.setAutoDependencyCorrelation(false)
.start();
appInsights.defaultClient.config.endpointUrl = 'https://vortex.data.microsoft.com/collect/v1';
/* __GDPR__
"monacoworkbench/packagemetrics" : {
"commit" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" },
"size" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" },
"count" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" }
}
*/
appInsights.defaultClient.trackEvent({
name: 'monacoworkbench/packagemetrics',
properties: { commit, size: JSON.stringify(sizes), count: JSON.stringify(counts) }
});
appInsights.defaultClient.flush({
callback: () => {
appInsights.dispose();
resolve(true);
}
});
}
catch (err) {
console.error('ERROR sending build stats as telemetry event!');
console.error(err);
resolve(false);
}
});
}
exports.submitAllStats = submitAllStats;

148 build/lib/stats.ts Normal file
View File

@@ -0,0 +1,148 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as es from 'event-stream';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
import * as File from 'vinyl';
import * as appInsights from 'applicationinsights';
class Entry {
constructor(readonly name: string, public totalCount: number, public totalSize: number) { }
toString(pretty?: boolean): string {
if (!pretty) {
if (this.totalCount === 1) {
return `${this.name}: ${this.totalSize} bytes`;
} else {
return `${this.name}: ${this.totalCount} files with ${this.totalSize} bytes`;
}
} else {
if (this.totalCount === 1) {
return `Stats for '${ansiColors.grey(this.name)}': ${Math.round(this.totalSize / 1024)}KB`;
} else {
const count = this.totalCount < 100
? ansiColors.green(this.totalCount.toString())
: ansiColors.red(this.totalCount.toString());
return `Stats for '${ansiColors.grey(this.name)}': ${count} files, ${Math.round(this.totalSize / 1024)}KB`;
}
}
}
}
const _entries = new Map<string, Entry>();
export function createStatsStream(group: string, log?: boolean): es.ThroughStream {
const entry = new Entry(group, 0, 0);
_entries.set(entry.name, entry);
return es.through(function (data) {
const file = data as File;
if (typeof file.path === 'string') {
entry.totalCount += 1;
if (Buffer.isBuffer(file.contents)) {
entry.totalSize += file.contents.length;
} else if (file.stat && typeof file.stat.size === 'number') {
entry.totalSize += file.stat.size;
} else {
// funky file...
}
}
this.emit('data', data);
}, function () {
if (log) {
if (entry.totalCount === 1) {
fancyLog(`Stats for '${ansiColors.grey(entry.name)}': ${Math.round(entry.totalSize / 1024)}KB`);
} else {
const count = entry.totalCount < 100
? ansiColors.green(entry.totalCount.toString())
: ansiColors.red(entry.totalCount.toString());
fancyLog(`Stats for '${ansiColors.grey(entry.name)}': ${count} files, ${Math.round(entry.totalSize / 1024)}KB`);
}
}
this.emit('end');
});
}
export function submitAllStats(productJson: any, commit: string): Promise<boolean> {
const sorted: Entry[] = [];
// move entries for single files to the front
_entries.forEach(value => {
if (value.totalCount === 1) {
sorted.unshift(value);
} else {
sorted.push(value);
}
});
// print to console
for (const entry of sorted) {
console.log(entry.toString(true));
}
// send data as telemetry event when the
// product is configured to send telemetry
if (!productJson || !productJson.aiConfig || typeof productJson.aiConfig.asimovKey !== 'string') {
return Promise.resolve(false);
}
return new Promise(resolve => {
try {
const sizes: any = {};
const counts: any = {};
for (const entry of sorted) {
sizes[entry.name] = entry.totalSize;
counts[entry.name] = entry.totalCount;
}
appInsights.setup(productJson.aiConfig.asimovKey)
.setAutoCollectConsole(false)
.setAutoCollectExceptions(false)
.setAutoCollectPerformance(false)
.setAutoCollectRequests(false)
.setAutoCollectDependencies(false)
.setAutoDependencyCorrelation(false)
.start();
appInsights.defaultClient.config.endpointUrl = 'https://vortex.data.microsoft.com/collect/v1';
/* __GDPR__
"monacoworkbench/packagemetrics" : {
"commit" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" },
"size" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" },
"count" : {"classification": "SystemMetaData", "purpose": "PerformanceAndHealth" }
}
*/
appInsights.defaultClient.trackEvent({
name: 'monacoworkbench/packagemetrics',
properties: { commit, size: JSON.stringify(sizes), count: JSON.stringify(counts) }
});
appInsights.defaultClient.flush({
callback: () => {
appInsights.dispose();
resolve(true);
}
});
} catch (err) {
console.error('ERROR sending build stats as telemetry event!');
console.error(err);
resolve(false);
}
});
}
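A usage sketch for the stats stream (the group name, globs, and folders are assumptions for illustration):

import * as gulp from 'gulp';
import { createStatsStream } from './stats';

// Counts files and bytes flowing through the pipe; with log=true it prints
// a summary such as "Stats for 'extensions': 42 files, 128KB" on stream end.
gulp.src('extensions/**/*')
	.pipe(createStatsStream('extensions', true))
	.pipe(gulp.dest('out-build/extensions'));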

96 build/lib/task.js Normal file
View File

@@ -0,0 +1,96 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
const fancyLog = require("fancy-log");
const ansiColors = require("ansi-colors");
function _isPromise(p) {
if (typeof p.then === 'function') {
return true;
}
return false;
}
function _renderTime(time) {
return `${Math.round(time)} ms`;
}
async function _execute(task) {
const name = task.taskName || task.displayName || `<anonymous>`;
if (!task._tasks) {
fancyLog('Starting', ansiColors.cyan(name), '...');
}
const startTime = process.hrtime();
await _doExecute(task);
const elapsedArr = process.hrtime(startTime);
const elapsedNanoseconds = (elapsedArr[0] * 1e9 + elapsedArr[1]);
if (!task._tasks) {
fancyLog(`Finished`, ansiColors.cyan(name), 'after', ansiColors.magenta(_renderTime(elapsedNanoseconds / 1e6)));
}
}
async function _doExecute(task) {
// Always invoke as if it were a callback task
return new Promise((resolve, reject) => {
if (task.length === 1) {
// this is a callback task
task((err) => {
if (err) {
return reject(err);
}
resolve();
});
return;
}
const taskResult = task();
if (typeof taskResult === 'undefined') {
// this is a sync task
resolve();
return;
}
if (_isPromise(taskResult)) {
// this is a promise returning task
taskResult.then(resolve, reject);
return;
}
// this is a stream returning task
taskResult.on('end', _ => resolve());
taskResult.on('error', err => reject(err));
});
}
function series(...tasks) {
const result = async () => {
for (let i = 0; i < tasks.length; i++) {
await _execute(tasks[i]);
}
};
result._tasks = tasks;
return result;
}
exports.series = series;
function parallel(...tasks) {
const result = async () => {
await Promise.all(tasks.map(t => _execute(t)));
};
result._tasks = tasks;
return result;
}
exports.parallel = parallel;
function define(name, task) {
if (task._tasks) {
// This is a composite task
const lastTask = task._tasks[task._tasks.length - 1];
if (lastTask._tasks || lastTask.taskName) {
// This is a composite task without a real task function
// => generate a fake task function
return define(name, series(task, () => Promise.resolve()));
}
lastTask.taskName = name;
task.displayName = name;
return task;
}
// This is a simple task
task.taskName = name;
task.displayName = name;
return task;
}
exports.define = define;

125 build/lib/task.ts Normal file
View File

@@ -0,0 +1,125 @@
/*---------------------------------------------------------------------------------------------
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
'use strict';
import * as fancyLog from 'fancy-log';
import * as ansiColors from 'ansi-colors';
export interface BaseTask {
displayName?: string;
taskName?: string;
_tasks?: Task[];
}
export interface PromiseTask extends BaseTask {
(): Promise<void>;
}
export interface StreamTask extends BaseTask {
(): NodeJS.ReadWriteStream;
}
export interface CallbackTask extends BaseTask {
(cb?: (err?: any) => void): void;
}
export type Task = PromiseTask | StreamTask | CallbackTask;
function _isPromise(p: Promise<void> | NodeJS.ReadWriteStream): p is Promise<void> {
if (typeof (<any>p).then === 'function') {
return true;
}
return false;
}
function _renderTime(time: number): string {
return `${Math.round(time)} ms`;
}
async function _execute(task: Task): Promise<void> {
const name = task.taskName || task.displayName || `<anonymous>`;
if (!task._tasks) {
fancyLog('Starting', ansiColors.cyan(name), '...');
}
const startTime = process.hrtime();
await _doExecute(task);
const elapsedArr = process.hrtime(startTime);
const elapsedNanoseconds = (elapsedArr[0] * 1e9 + elapsedArr[1]);
if (!task._tasks) {
fancyLog(`Finished`, ansiColors.cyan(name), 'after', ansiColors.magenta(_renderTime(elapsedNanoseconds / 1e6)));
}
}
async function _doExecute(task: Task): Promise<void> {
// Always invoke as if it were a callback task
return new Promise((resolve, reject) => {
if (task.length === 1) {
// this is a callback task
task((err) => {
if (err) {
return reject(err);
}
resolve();
});
return;
}
const taskResult = task();
if (typeof taskResult === 'undefined') {
// this is a sync task
resolve();
return;
}
if (_isPromise(taskResult)) {
// this is a promise returning task
taskResult.then(resolve, reject);
return;
}
// this is a stream returning task
taskResult.on('end', _ => resolve());
taskResult.on('error', err => reject(err));
});
}
export function series(...tasks: Task[]): PromiseTask {
const result = async () => {
for (let i = 0; i < tasks.length; i++) {
await _execute(tasks[i]);
}
};
result._tasks = tasks;
return result;
}
export function parallel(...tasks: Task[]): PromiseTask {
const result = async () => {
await Promise.all(tasks.map(t => _execute(t)));
};
result._tasks = tasks;
return result;
}
export function define(name: string, task: Task): Task {
if (task._tasks) {
// This is a composite task
const lastTask = task._tasks[task._tasks.length - 1];
if (lastTask._tasks || lastTask.taskName) {
// This is a composite task without a real task function
// => generate a fake task function
return define(name, series(task, () => Promise.resolve()));
}
lastTask.taskName = name;
task.displayName = name;
return task;
}
// This is a simple task
task.taskName = name;
task.displayName = name;
return task;
}
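A sketch of composing these helpers (task names and bodies are hypothetical):

import { define, series, parallel, CallbackTask, PromiseTask } from './task';

// A callback task (function length === 1, so it is invoked with a callback).
const clean: CallbackTask = (cb) => { /* delete build output */ if (cb) { cb(); } };
// Promise tasks.
const compileClient: PromiseTask = async () => { /* compile client sources */ };
const compileServer: PromiseTask = async () => { /* compile server sources */ };

// Run clean first, then both compiles concurrently; define() attaches the
// name used in the "Starting ..."/"Finished ..." log lines.
const build = define('build', series(clean, parallel(compileClient, compileServer))) as PromiseTask;

build().then(() => console.log('build finished'));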

View File

@@ -4,36 +4,36 @@
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
Object.defineProperty(exports, "__esModule", { value: true });
var assert = require("assert");
var i18n = require("../i18n");
suite('XLF Parser Tests', function () {
var sampleXlf = '<?xml version="1.0" encoding="utf-8"?><xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2"><file original="vs/base/common/keybinding" source-language="en" datatype="plaintext"><body><trans-unit id="key1"><source xml:lang="en">Key #1</source></trans-unit><trans-unit id="key2"><source xml:lang="en">Key #2 &amp;</source></trans-unit></body></file></xliff>';
var sampleTranslatedXlf = '<?xml version="1.0" encoding="utf-8"?><xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2"><file original="vs/base/common/keybinding" source-language="en" target-language="ru" datatype="plaintext"><body><trans-unit id="key1"><source xml:lang="en">Key #1</source><target>Кнопка #1</target></trans-unit><trans-unit id="key2"><source xml:lang="en">Key #2 &amp;</source><target>Кнопка #2 &amp;</target></trans-unit></body></file></xliff>';
var originalFilePath = 'vs/base/common/keybinding';
var keys = ['key1', 'key2'];
var messages = ['Key #1', 'Key #2 &'];
var translatedMessages = { key1: 'Кнопка #1', key2: 'Кнопка #2 &' };
test('Keys & messages to XLF conversion', function () {
var xlf = new i18n.XLF('vscode-workbench');
const assert = require("assert");
const i18n = require("../i18n");
suite('XLF Parser Tests', () => {
const sampleXlf = '<?xml version="1.0" encoding="utf-8"?><xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2"><file original="vs/base/common/keybinding" source-language="en" datatype="plaintext"><body><trans-unit id="key1"><source xml:lang="en">Key #1</source></trans-unit><trans-unit id="key2"><source xml:lang="en">Key #2 &amp;</source></trans-unit></body></file></xliff>';
const sampleTranslatedXlf = '<?xml version="1.0" encoding="utf-8"?><xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2"><file original="vs/base/common/keybinding" source-language="en" target-language="ru" datatype="plaintext"><body><trans-unit id="key1"><source xml:lang="en">Key #1</source><target>Кнопка #1</target></trans-unit><trans-unit id="key2"><source xml:lang="en">Key #2 &amp;</source><target>Кнопка #2 &amp;</target></trans-unit></body></file></xliff>';
const originalFilePath = 'vs/base/common/keybinding';
const keys = ['key1', 'key2'];
const messages = ['Key #1', 'Key #2 &'];
const translatedMessages = { key1: 'Кнопка #1', key2: 'Кнопка #2 &' };
test('Keys & messages to XLF conversion', () => {
const xlf = new i18n.XLF('vscode-workbench');
xlf.addFile(originalFilePath, keys, messages);
var xlfString = xlf.toString();
const xlfString = xlf.toString();
assert.strictEqual(xlfString.replace(/\s{2,}/g, ''), sampleXlf);
});
test('XLF to keys & messages conversion', function () {
test('XLF to keys & messages conversion', () => {
i18n.XLF.parse(sampleTranslatedXlf).then(function (resolvedFiles) {
assert.deepEqual(resolvedFiles[0].messages, translatedMessages);
assert.strictEqual(resolvedFiles[0].originalFilePath, originalFilePath);
});
});
test('JSON file source path to Transifex resource match', function () {
var editorProject = 'vscode-editor', workbenchProject = 'vscode-workbench';
var platform = { name: 'vs/platform', project: editorProject }, editorContrib = { name: 'vs/editor/contrib', project: editorProject }, editor = { name: 'vs/editor', project: editorProject }, base = { name: 'vs/base', project: editorProject }, code = { name: 'vs/code', project: workbenchProject }, workbenchParts = { name: 'vs/workbench/parts/html', project: workbenchProject }, workbenchServices = { name: 'vs/workbench/services/files', project: workbenchProject }, workbench = { name: 'vs/workbench', project: workbenchProject };
test('JSON file source path to Transifex resource match', () => {
const editorProject = 'vscode-editor', workbenchProject = 'vscode-workbench';
const platform = { name: 'vs/platform', project: editorProject }, editorContrib = { name: 'vs/editor/contrib', project: editorProject }, editor = { name: 'vs/editor', project: editorProject }, base = { name: 'vs/base', project: editorProject }, code = { name: 'vs/code', project: workbenchProject }, workbenchParts = { name: 'vs/workbench/contrib/html', project: workbenchProject }, workbenchServices = { name: 'vs/workbench/services/files', project: workbenchProject }, workbench = { name: 'vs/workbench', project: workbenchProject };
assert.deepEqual(i18n.getResource('vs/platform/actions/browser/menusExtensionPoint'), platform);
assert.deepEqual(i18n.getResource('vs/editor/contrib/clipboard/browser/clipboard'), editorContrib);
assert.deepEqual(i18n.getResource('vs/editor/common/modes/modesRegistry'), editor);
assert.deepEqual(i18n.getResource('vs/base/common/errorMessage'), base);
assert.deepEqual(i18n.getResource('vs/code/electron-main/window'), code);
assert.deepEqual(i18n.getResource('vs/workbench/parts/html/browser/webview'), workbenchParts);
assert.deepEqual(i18n.getResource('vs/workbench/contrib/html/browser/webview'), workbenchParts);
assert.deepEqual(i18n.getResource('vs/workbench/services/files/node/fileService'), workbenchServices);
assert.deepEqual(i18n.getResource('vs/workbench/browser/parts/panel/panelActions'), workbench);
});
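
The suite above round-trips localization data through build/lib/i18n's XLF class: keys and messages go in via addFile/toString, and the static XLF.parse recovers per-file message maps from a translated document. A minimal sketch of that flow, assuming exactly the API surface these tests exercise (not verified against the module itself):

    import * as assert from 'assert';
    import * as i18n from '../i18n';

    // Serialize keys and English messages for one source file into XLF.
    const xlf = new i18n.XLF('vscode-workbench');
    xlf.addFile('vs/base/common/keybinding', ['key1', 'key2'], ['Key #1', 'Key #2 &']);
    const serialized = xlf.toString();

    // Parsing recovers { originalFilePath, messages } per file; a translated
    // document (one carrying <target> elements, like sampleTranslatedXlf above)
    // yields target-language strings, while a source-only one may yield none.
    i18n.XLF.parse(serialized).then((resolvedFiles) => {
        assert.strictEqual(resolvedFiles[0].originalFilePath, 'vs/base/common/keybinding');
    });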


@@ -15,7 +15,7 @@ suite('XLF Parser Tests', () => {
const translatedMessages = { key1: 'Кнопка #1', key2: 'Кнопка #2 &' };
test('Keys & messages to XLF conversion', () => {
let xlf = new i18n.XLF('vscode-workbench');
const xlf = new i18n.XLF('vscode-workbench');
xlf.addFile(originalFilePath, keys, messages);
const xlfString = xlf.toString();
@@ -38,7 +38,7 @@ suite('XLF Parser Tests', () => {
editor = { name: 'vs/editor', project: editorProject },
base = { name: 'vs/base', project: editorProject },
code = { name: 'vs/code', project: workbenchProject },
workbenchParts = { name: 'vs/workbench/parts/html', project: workbenchProject },
workbenchParts = { name: 'vs/workbench/contrib/html', project: workbenchProject },
workbenchServices = { name: 'vs/workbench/services/files', project: workbenchProject },
workbench = { name: 'vs/workbench', project: workbenchProject};
@@ -47,7 +47,7 @@ suite('XLF Parser Tests', () => {
assert.deepEqual(i18n.getResource('vs/editor/common/modes/modesRegistry'), editor);
assert.deepEqual(i18n.getResource('vs/base/common/errorMessage'), base);
assert.deepEqual(i18n.getResource('vs/code/electron-main/window'), code);
assert.deepEqual(i18n.getResource('vs/workbench/parts/html/browser/webview'), workbenchParts);
assert.deepEqual(i18n.getResource('vs/workbench/contrib/html/browser/webview'), workbenchParts);
assert.deepEqual(i18n.getResource('vs/workbench/services/files/node/fileService'), workbenchServices);
assert.deepEqual(i18n.getResource('vs/workbench/browser/parts/panel/panelActions'), workbench);
});


@@ -4,18 +4,48 @@
*--------------------------------------------------------------------------------------------*/
'use strict';
Object.defineProperty(exports, "__esModule", { value: true });
var fs = require("fs");
var path = require("path");
var ts = require("typescript");
var TYPESCRIPT_LIB_FOLDER = path.dirname(require.resolve('typescript/lib/lib.d.ts'));
const fs = require("fs");
const path = require("path");
const ts = require("typescript");
const TYPESCRIPT_LIB_FOLDER = path.dirname(require.resolve('typescript/lib/lib.d.ts'));
var ShakeLevel;
(function (ShakeLevel) {
ShakeLevel[ShakeLevel["Files"] = 0] = "Files";
ShakeLevel[ShakeLevel["InnerFile"] = 1] = "InnerFile";
ShakeLevel[ShakeLevel["ClassMembers"] = 2] = "ClassMembers";
})(ShakeLevel = exports.ShakeLevel || (exports.ShakeLevel = {}));
function printDiagnostics(diagnostics) {
for (const diag of diagnostics) {
let result = '';
if (diag.file) {
result += `${diag.file.fileName}: `;
}
if (diag.file && diag.start) {
let location = diag.file.getLineAndCharacterOfPosition(diag.start);
result += `- ${location.line + 1},${location.character} - `;
}
result += JSON.stringify(diag.messageText);
console.log(result);
}
}
function shake(options) {
var languageService = createTypeScriptLanguageService(options);
const languageService = createTypeScriptLanguageService(options);
const program = languageService.getProgram();
const globalDiagnostics = program.getGlobalDiagnostics();
if (globalDiagnostics.length > 0) {
printDiagnostics(globalDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
const syntacticDiagnostics = program.getSyntacticDiagnostics();
if (syntacticDiagnostics.length > 0) {
printDiagnostics(syntacticDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
const semanticDiagnostics = program.getSemanticDiagnostics();
if (semanticDiagnostics.length > 0) {
printDiagnostics(semanticDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
markNodes(languageService, options);
return generateResult(languageService, options.shakeLevel);
}
@@ -23,93 +53,104 @@ exports.shake = shake;
//#region Discovery, LanguageService & Setup
function createTypeScriptLanguageService(options) {
// Discover referenced files
var FILES = discoverAndReadFiles(options);
const FILES = discoverAndReadFiles(options);
// Add fake usage files
options.inlineEntryPoints.forEach(function (inlineEntryPoint, index) {
FILES["inlineEntryPoint:" + index + ".ts"] = inlineEntryPoint;
options.inlineEntryPoints.forEach((inlineEntryPoint, index) => {
FILES[`inlineEntryPoint.${index}.ts`] = inlineEntryPoint;
});
// Add additional typings
options.typings.forEach((typing) => {
const filePath = path.join(options.sourcesRoot, typing);
FILES[typing] = fs.readFileSync(filePath).toString();
});
// Resolve libs
var RESOLVED_LIBS = {};
options.libs.forEach(function (filename) {
var filepath = path.join(TYPESCRIPT_LIB_FOLDER, filename);
RESOLVED_LIBS["defaultLib:" + filename] = fs.readFileSync(filepath).toString();
const RESOLVED_LIBS = {};
options.libs.forEach((filename) => {
const filepath = path.join(TYPESCRIPT_LIB_FOLDER, filename);
RESOLVED_LIBS[`defaultLib:${filename}`] = fs.readFileSync(filepath).toString();
});
var host = new TypeScriptLanguageServiceHost(RESOLVED_LIBS, FILES, options.compilerOptions);
const compilerOptions = ts.convertCompilerOptionsFromJson(options.compilerOptions, options.sourcesRoot).options;
const host = new TypeScriptLanguageServiceHost(RESOLVED_LIBS, FILES, compilerOptions);
return ts.createLanguageService(host);
}
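
The substantive change in createTypeScriptLanguageService is that the raw JSON compiler options are now passed through ts.convertCompilerOptionsFromJson before reaching the host. The reason, illustrated with the public TypeScript API (option values here are made up):

    import * as ts from 'typescript';

    // tsconfig-style JSON uses string enum names...
    const json = { module: 'amd', target: 'es5', noEmitOnError: true };

    // ...while the language service expects ts.CompilerOptions with resolved
    // enum values; convertCompilerOptionsFromJson performs that mapping and
    // reports anything it could not understand as diagnostics.
    const { options, errors } = ts.convertCompilerOptionsFromJson(json, '/src');
    if (errors.length > 0) {
        throw new Error(ts.flattenDiagnosticMessageText(errors[0].messageText, '\n'));
    }
    console.log(options.module === ts.ModuleKind.AMD); // true
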
/**
* Read imports and follow them until all files have been handled
*/
function discoverAndReadFiles(options) {
var FILES = {};
var in_queue = Object.create(null);
var queue = [];
var enqueue = function (moduleId) {
const FILES = {};
const in_queue = Object.create(null);
const queue = [];
const enqueue = (moduleId) => {
if (in_queue[moduleId]) {
return;
}
in_queue[moduleId] = true;
queue.push(moduleId);
};
options.entryPoints.forEach(function (entryPoint) { return enqueue(entryPoint); });
options.entryPoints.forEach((entryPoint) => enqueue(entryPoint));
while (queue.length > 0) {
var moduleId = queue.shift();
var dts_filename = path.join(options.sourcesRoot, moduleId + '.d.ts');
const moduleId = queue.shift();
const dts_filename = path.join(options.sourcesRoot, moduleId + '.d.ts');
if (fs.existsSync(dts_filename)) {
var dts_filecontents = fs.readFileSync(dts_filename).toString();
FILES[moduleId + '.d.ts'] = dts_filecontents;
const dts_filecontents = fs.readFileSync(dts_filename).toString();
FILES[`${moduleId}.d.ts`] = dts_filecontents;
continue;
}
var ts_filename = void 0;
const js_filename = path.join(options.sourcesRoot, moduleId + '.js');
if (fs.existsSync(js_filename)) {
// This is an import for a .js file, so ignore it...
continue;
}
let ts_filename;
if (options.redirects[moduleId]) {
ts_filename = path.join(options.sourcesRoot, options.redirects[moduleId] + '.ts');
}
else {
ts_filename = path.join(options.sourcesRoot, moduleId + '.ts');
}
var ts_filecontents = fs.readFileSync(ts_filename).toString();
var info = ts.preProcessFile(ts_filecontents);
for (var i = info.importedFiles.length - 1; i >= 0; i--) {
var importedFileName = info.importedFiles[i].fileName;
const ts_filecontents = fs.readFileSync(ts_filename).toString();
const info = ts.preProcessFile(ts_filecontents);
for (let i = info.importedFiles.length - 1; i >= 0; i--) {
const importedFileName = info.importedFiles[i].fileName;
if (options.importIgnorePattern.test(importedFileName)) {
// Ignore vs/css! imports
continue;
}
var importedModuleId = importedFileName;
let importedModuleId = importedFileName;
if (/(^\.\/)|(^\.\.\/)/.test(importedModuleId)) {
importedModuleId = path.join(path.dirname(moduleId), importedModuleId);
}
enqueue(importedModuleId);
}
FILES[moduleId + '.ts'] = ts_filecontents;
FILES[`${moduleId}.ts`] = ts_filecontents;
}
return FILES;
}
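
discoverAndReadFiles is a breadth-first walk over module imports: a visited set plus a queue, with ts.preProcessFile used to scan import specifiers without a full parse. The same pattern stripped to its core (root and entry point are hypothetical):

    import * as fs from 'fs';
    import * as path from 'path';
    import * as ts from 'typescript';

    function collectModules(root: string, entryPoint: string): Set<string> {
        const seen = new Set<string>([entryPoint]);
        const queue = [entryPoint];
        while (queue.length > 0) {
            const moduleId = queue.shift()!;
            const source = fs.readFileSync(path.join(root, moduleId + '.ts')).toString();
            // preProcessFile reads imports cheaply, with no type checking.
            for (const imported of ts.preProcessFile(source).importedFiles) {
                let id = imported.fileName;
                if (/^\.\.?\//.test(id)) {
                    id = path.join(path.dirname(moduleId), id); // resolve relative ids
                }
                if (!seen.has(id)) {
                    seen.add(id);
                    queue.push(id);
                }
            }
        }
        return seen;
    }
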
/**
* A TypeScript language service host
*/
var TypeScriptLanguageServiceHost = /** @class */ (function () {
function TypeScriptLanguageServiceHost(libs, files, compilerOptions) {
class TypeScriptLanguageServiceHost {
constructor(libs, files, compilerOptions) {
this._libs = libs;
this._files = files;
this._compilerOptions = compilerOptions;
}
// --- language service host ---------------
TypeScriptLanguageServiceHost.prototype.getCompilationSettings = function () {
getCompilationSettings() {
return this._compilerOptions;
};
TypeScriptLanguageServiceHost.prototype.getScriptFileNames = function () {
}
getScriptFileNames() {
return ([]
.concat(Object.keys(this._libs))
.concat(Object.keys(this._files)));
};
TypeScriptLanguageServiceHost.prototype.getScriptVersion = function (fileName) {
}
getScriptVersion(_fileName) {
return '1';
};
TypeScriptLanguageServiceHost.prototype.getProjectVersion = function () {
}
getProjectVersion() {
return '1';
};
TypeScriptLanguageServiceHost.prototype.getScriptSnapshot = function (fileName) {
}
getScriptSnapshot(fileName) {
if (this._files.hasOwnProperty(fileName)) {
return ts.ScriptSnapshot.fromString(this._files[fileName]);
}
@@ -119,21 +160,20 @@ var TypeScriptLanguageServiceHost = /** @class */ (function () {
else {
return ts.ScriptSnapshot.fromString('');
}
};
TypeScriptLanguageServiceHost.prototype.getScriptKind = function (fileName) {
}
getScriptKind(_fileName) {
return ts.ScriptKind.TS;
};
TypeScriptLanguageServiceHost.prototype.getCurrentDirectory = function () {
}
getCurrentDirectory() {
return '';
};
TypeScriptLanguageServiceHost.prototype.getDefaultLibFileName = function (options) {
}
getDefaultLibFileName(_options) {
return 'defaultLib:lib.d.ts';
};
TypeScriptLanguageServiceHost.prototype.isDefaultLibFileName = function (fileName) {
}
isDefaultLibFileName(fileName) {
return fileName === this.getDefaultLibFileName(this._compilerOptions);
};
return TypeScriptLanguageServiceHost;
}());
}
}
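
A host like this is all ts.createLanguageService needs; libs and sources alike are served from the in-memory maps rather than from disk. A consumption sketch, assuming the constructor shape shown above (file contents are invented):

    import * as fs from 'fs';
    import * as ts from 'typescript';

    // The key must match getDefaultLibFileName() above: 'defaultLib:lib.d.ts'.
    const libs = {
        'defaultLib:lib.d.ts':
            fs.readFileSync(require.resolve('typescript/lib/lib.d.ts')).toString()
    };
    const files = { 'main.ts': 'export const answer: number = 42;' };

    const host = new TypeScriptLanguageServiceHost(libs, files,
        { module: ts.ModuleKind.AMD, target: ts.ScriptTarget.ES5 });
    const service = ts.createLanguageService(host);
    // Requests for 'main.ts' are answered via getScriptSnapshot above.
    console.log(service.getSemanticDiagnostics('main.ts').length); // 0 expected
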
//#endregion
//#region Tree Shaking
var NodeColor;
@@ -150,7 +190,7 @@ function setColor(node, color) {
}
function nodeOrParentIsBlack(node) {
while (node) {
var color = getColor(node);
const color = getColor(node);
if (color === 2 /* Black */) {
return true;
}
@@ -162,8 +202,7 @@ function nodeOrChildIsBlack(node) {
if (getColor(node) === 2 /* Black */) {
return true;
}
for (var _i = 0, _a = node.getChildren(); _i < _a.length; _i++) {
var child = _a[_i];
for (const child of node.getChildren()) {
if (nodeOrChildIsBlack(child)) {
return true;
}
@@ -171,19 +210,22 @@ function nodeOrChildIsBlack(node) {
return false;
}
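
The marking phase is a classic tri-color scheme: nodes start White (unseen), turn Gray when referenced but not yet scanned, and Black once known reachable; nodeOrParentIsBlack and nodeOrChildIsBlack are the reachability probes over that coloring. The invariant in miniature, over a generic graph (types are illustrative, not the shaker's):

    enum Color { White, Gray, Black }

    interface GraphNode { color: Color; edges: GraphNode[]; }

    // Mark everything reachable from the roots Black; Gray is the work list.
    function mark(roots: GraphNode[]): void {
        const gray = roots.filter(n => n.color === Color.White);
        gray.forEach(n => { n.color = Color.Gray; });
        while (gray.length > 0) {
            const node = gray.shift()!;
            node.color = Color.Black;
            for (const next of node.edges) {
                if (next.color === Color.White) {
                    next.color = Color.Gray; // seen, not yet scanned
                    gray.push(next);
                }
            }
        }
    }
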
function markNodes(languageService, options) {
var program = languageService.getProgram();
const program = languageService.getProgram();
if (!program) {
throw new Error('Could not get program from language service');
}
if (options.shakeLevel === 0 /* Files */) {
// Mark all source files Black
program.getSourceFiles().forEach(function (sourceFile) {
program.getSourceFiles().forEach((sourceFile) => {
setColor(sourceFile, 2 /* Black */);
});
return;
}
var black_queue = [];
var gray_queue = [];
var sourceFilesLoaded = {};
const black_queue = [];
const gray_queue = [];
const sourceFilesLoaded = {};
function enqueueTopLevelModuleStatements(sourceFile) {
sourceFile.forEachChild(function (node) {
sourceFile.forEachChild((node) => {
if (ts.isImportDeclaration(node)) {
if (!node.importClause && ts.isStringLiteral(node.moduleSpecifier)) {
setColor(node, 2 /* Black */);
@@ -192,7 +234,7 @@ function markNodes(languageService, options) {
return;
}
if (ts.isExportDeclaration(node)) {
if (ts.isStringLiteral(node.moduleSpecifier)) {
if (node.moduleSpecifier && ts.isStringLiteral(node.moduleSpecifier)) {
setColor(node, 2 /* Black */);
enqueueImport(node, node.moduleSpecifier.text);
}
@@ -220,7 +262,7 @@ function markNodes(languageService, options) {
gray_queue.push(node);
}
function enqueue_black(node) {
var previousColor = getColor(node);
const previousColor = getColor(node);
if (previousColor === 2 /* Black */) {
return;
}
@@ -238,12 +280,12 @@ function markNodes(languageService, options) {
if (nodeOrParentIsBlack(node)) {
return;
}
var fileName = node.getSourceFile().fileName;
const fileName = node.getSourceFile().fileName;
if (/^defaultLib:/.test(fileName) || /\.d\.ts$/.test(fileName)) {
setColor(node, 2 /* Black */);
return;
}
var sourceFile = node.getSourceFile();
const sourceFile = node.getSourceFile();
if (!sourceFilesLoaded[sourceFile.fileName]) {
sourceFilesLoaded[sourceFile.fileName] = true;
enqueueTopLevelModuleStatements(sourceFile);
@@ -254,12 +296,15 @@ function markNodes(languageService, options) {
setColor(node, 2 /* Black */);
black_queue.push(node);
if (options.shakeLevel === 2 /* ClassMembers */ && (ts.isMethodDeclaration(node) || ts.isMethodSignature(node) || ts.isPropertySignature(node) || ts.isGetAccessor(node) || ts.isSetAccessor(node))) {
var references = languageService.getReferencesAtPosition(node.getSourceFile().fileName, node.name.pos + node.name.getLeadingTriviaWidth());
const references = languageService.getReferencesAtPosition(node.getSourceFile().fileName, node.name.pos + node.name.getLeadingTriviaWidth());
if (references) {
for (var i = 0, len = references.length; i < len; i++) {
var reference = references[i];
var referenceSourceFile = program.getSourceFile(reference.fileName);
var referenceNode = getTokenAtPosition(referenceSourceFile, reference.textSpan.start, false, false);
for (let i = 0, len = references.length; i < len; i++) {
const reference = references[i];
const referenceSourceFile = program.getSourceFile(reference.fileName);
if (!referenceSourceFile) {
continue;
}
const referenceNode = getTokenAtPosition(referenceSourceFile, reference.textSpan.start, false, false);
if (ts.isMethodDeclaration(referenceNode.parent)
|| ts.isPropertyDeclaration(referenceNode.parent)
|| ts.isGetAccessor(referenceNode.parent)
@@ -271,9 +316,9 @@ function markNodes(languageService, options) {
}
}
function enqueueFile(filename) {
var sourceFile = program.getSourceFile(filename);
const sourceFile = program.getSourceFile(filename);
if (!sourceFile) {
console.warn("Cannot find source file " + filename);
console.warn(`Cannot find source file ${filename}`);
return;
}
enqueue_black(sourceFile);
@@ -283,8 +328,8 @@ function markNodes(languageService, options) {
// this import should be ignored
return;
}
var nodeSourceFile = node.getSourceFile();
var fullPath;
const nodeSourceFile = node.getSourceFile();
let fullPath;
if (/(^\.\/)|(^\.\.\/)/.test(importText)) {
fullPath = path.join(path.dirname(nodeSourceFile.fileName), importText) + '.ts';
}
@@ -293,25 +338,25 @@ function markNodes(languageService, options) {
}
enqueueFile(fullPath);
}
options.entryPoints.forEach(function (moduleId) { return enqueueFile(moduleId + '.ts'); });
options.entryPoints.forEach(moduleId => enqueueFile(moduleId + '.ts'));
// Add fake usage files
options.inlineEntryPoints.forEach(function (_, index) { return enqueueFile("inlineEntryPoint:" + index + ".ts"); });
var step = 0;
var checker = program.getTypeChecker();
var _loop_1 = function () {
options.inlineEntryPoints.forEach((_, index) => enqueueFile(`inlineEntryPoint.${index}.ts`));
let step = 0;
const checker = program.getTypeChecker();
while (black_queue.length > 0 || gray_queue.length > 0) {
++step;
var node = void 0;
let node;
if (step % 100 === 0) {
console.log(step + "/" + (step + black_queue.length + gray_queue.length) + " (" + black_queue.length + ", " + gray_queue.length + ")");
console.log(`${step}/${step + black_queue.length + gray_queue.length} (${black_queue.length}, ${gray_queue.length})`);
}
if (black_queue.length === 0) {
for (var i = 0; i < gray_queue.length; i++) {
var node_1 = gray_queue[i];
var nodeParent = node_1.parent;
for (let i = 0; i < gray_queue.length; i++) {
const node = gray_queue[i];
const nodeParent = node.parent;
if ((ts.isClassDeclaration(nodeParent) || ts.isInterfaceDeclaration(nodeParent)) && nodeOrChildIsBlack(nodeParent)) {
gray_queue.splice(i, 1);
black_queue.push(node_1);
setColor(node_1, 2 /* Black */);
black_queue.push(node);
setColor(node, 2 /* Black */);
i--;
}
}
@@ -320,17 +365,18 @@ function markNodes(languageService, options) {
node = black_queue.shift();
}
else {
return "break";
// only gray nodes remaining...
break;
}
var nodeSourceFile = node.getSourceFile();
var loop = function (node) {
var _a = getRealNodeSymbol(checker, node), symbol = _a[0], symbolImportNode = _a[1];
const nodeSourceFile = node.getSourceFile();
const loop = (node) => {
const [symbol, symbolImportNode] = getRealNodeSymbol(checker, node);
if (symbolImportNode) {
setColor(symbolImportNode, 2 /* Black */);
}
if (symbol && !nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol)) {
for (var i = 0, len = symbol.declarations.length; i < len; i++) {
var declaration = symbol.declarations[i];
for (let i = 0, len = symbol.declarations.length; i < len; i++) {
const declaration = symbol.declarations[i];
if (ts.isSourceFile(declaration)) {
// Do not enqueue full source files
// (they can be the declaration of a module import)
@@ -338,9 +384,9 @@ function markNodes(languageService, options) {
}
if (options.shakeLevel === 2 /* ClassMembers */ && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration))) {
enqueue_black(declaration.name);
for (var j = 0; j < declaration.members.length; j++) {
var member = declaration.members[j];
var memberName = member.name ? member.name.getText() : null;
for (let j = 0; j < declaration.members.length; j++) {
const member = declaration.members[j];
const memberName = member.name ? member.name.getText() : null;
if (ts.isConstructorDeclaration(member)
|| ts.isConstructSignatureDeclaration(member)
|| ts.isIndexSignatureDeclaration(member)
@@ -354,8 +400,7 @@ function markNodes(languageService, options) {
}
// queue the heritage clauses
if (declaration.heritageClauses) {
for (var _i = 0, _b = declaration.heritageClauses; _i < _b.length; _i++) {
var heritageClause = _b[_i];
for (let heritageClause of declaration.heritageClauses) {
enqueue_black(heritageClause);
}
}
@@ -368,17 +413,12 @@ function markNodes(languageService, options) {
node.forEachChild(loop);
};
node.forEachChild(loop);
};
while (black_queue.length > 0 || gray_queue.length > 0) {
var state_1 = _loop_1();
if (state_1 === "break")
break;
}
}
function nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol) {
for (var i = 0, len = symbol.declarations.length; i < len; i++) {
var declaration = symbol.declarations[i];
var declarationSourceFile = declaration.getSourceFile();
for (let i = 0, len = symbol.declarations.length; i < len; i++) {
const declaration = symbol.declarations[i];
const declarationSourceFile = declaration.getSourceFile();
if (nodeSourceFile === declarationSourceFile) {
if (declaration.pos <= node.pos && node.end <= declaration.end) {
return true;
@@ -388,25 +428,28 @@ function nodeIsInItsOwnDeclaration(nodeSourceFile, node, symbol) {
return false;
}
function generateResult(languageService, shakeLevel) {
var program = languageService.getProgram();
var result = {};
var writeFile = function (filePath, contents) {
const program = languageService.getProgram();
if (!program) {
throw new Error('Could not get program from language service');
}
let result = {};
const writeFile = (filePath, contents) => {
result[filePath] = contents;
};
program.getSourceFiles().forEach(function (sourceFile) {
var fileName = sourceFile.fileName;
program.getSourceFiles().forEach((sourceFile) => {
const fileName = sourceFile.fileName;
if (/^defaultLib:/.test(fileName)) {
return;
}
var destination = fileName;
const destination = fileName;
if (/\.d\.ts$/.test(fileName)) {
if (nodeOrChildIsBlack(sourceFile)) {
writeFile(destination, sourceFile.text);
}
return;
}
var text = sourceFile.text;
var result = '';
let text = sourceFile.text;
let result = '';
function keep(node) {
result += text.substring(node.pos, node.end);
}
@@ -435,24 +478,23 @@ function generateResult(languageService, shakeLevel) {
}
}
else {
var survivingImports = [];
for (var i = 0; i < node.importClause.namedBindings.elements.length; i++) {
var importNode = node.importClause.namedBindings.elements[i];
let survivingImports = [];
for (const importNode of node.importClause.namedBindings.elements) {
if (getColor(importNode) === 2 /* Black */) {
survivingImports.push(importNode.getFullText(sourceFile));
}
}
var leadingTriviaWidth = node.getLeadingTriviaWidth();
var leadingTrivia = sourceFile.text.substr(node.pos, leadingTriviaWidth);
const leadingTriviaWidth = node.getLeadingTriviaWidth();
const leadingTrivia = sourceFile.text.substr(node.pos, leadingTriviaWidth);
if (survivingImports.length > 0) {
if (node.importClause && getColor(node.importClause) === 2 /* Black */) {
return write(leadingTrivia + "import " + node.importClause.name.text + ", {" + survivingImports.join(',') + " } from" + node.moduleSpecifier.getFullText(sourceFile) + ";");
if (node.importClause && node.importClause.name && getColor(node.importClause) === 2 /* Black */) {
return write(`${leadingTrivia}import ${node.importClause.name.text}, {${survivingImports.join(',')} } from${node.moduleSpecifier.getFullText(sourceFile)};`);
}
return write(leadingTrivia + "import {" + survivingImports.join(',') + " } from" + node.moduleSpecifier.getFullText(sourceFile) + ";");
return write(`${leadingTrivia}import {${survivingImports.join(',')} } from${node.moduleSpecifier.getFullText(sourceFile)};`);
}
else {
if (node.importClause && getColor(node.importClause) === 2 /* Black */) {
return write(leadingTrivia + "import " + node.importClause.name.text + " from" + node.moduleSpecifier.getFullText(sourceFile) + ";");
if (node.importClause && node.importClause.name && getColor(node.importClause) === 2 /* Black */) {
return write(`${leadingTrivia}import ${node.importClause.name.text} from${node.moduleSpecifier.getFullText(sourceFile)};`);
}
}
}
@@ -464,10 +506,10 @@ function generateResult(languageService, shakeLevel) {
}
}
if (shakeLevel === 2 /* ClassMembers */ && (ts.isClassDeclaration(node) || ts.isInterfaceDeclaration(node)) && nodeOrChildIsBlack(node)) {
var toWrite = node.getFullText();
for (var i = node.members.length - 1; i >= 0; i--) {
var member = node.members[i];
if (getColor(member) === 2 /* Black */) {
let toWrite = node.getFullText();
for (let i = node.members.length - 1; i >= 0; i--) {
const member = node.members[i];
if (getColor(member) === 2 /* Black */ || !member.name) {
// keep method
continue;
}
@@ -475,8 +517,8 @@ function generateResult(languageService, shakeLevel) {
// TODO: keep all members ending with `Brand`...
continue;
}
var pos = member.pos - node.pos;
var end = member.end - node.pos;
let pos = member.pos - node.pos;
let end = member.end - node.pos;
toWrite = toWrite.substring(0, pos) + toWrite.substring(end);
}
return write(toWrite);
@@ -508,68 +550,9 @@ function generateResult(languageService, shakeLevel) {
* Returns the node's symbol and the `import` node (if the symbol resolved from a different module)
*/
function getRealNodeSymbol(checker, node) {
/**
* Returns the containing object literal property declaration given a possible name node, e.g. "a" in x = { "a": 1 }
*/
/* @internal */
function getContainingObjectLiteralElement(node) {
switch (node.kind) {
case ts.SyntaxKind.StringLiteral:
case ts.SyntaxKind.NumericLiteral:
if (node.parent.kind === ts.SyntaxKind.ComputedPropertyName) {
return ts.isObjectLiteralElement(node.parent.parent) ? node.parent.parent : undefined;
}
// falls through
case ts.SyntaxKind.Identifier:
return ts.isObjectLiteralElement(node.parent) &&
(node.parent.parent.kind === ts.SyntaxKind.ObjectLiteralExpression || node.parent.parent.kind === ts.SyntaxKind.JsxAttributes) &&
node.parent.name === node ? node.parent : undefined;
}
return undefined;
}
function getPropertySymbolsFromType(type, propName) {
function getTextOfPropertyName(name) {
function isStringOrNumericLiteral(node) {
var kind = node.kind;
return kind === ts.SyntaxKind.StringLiteral
|| kind === ts.SyntaxKind.NumericLiteral;
}
switch (name.kind) {
case ts.SyntaxKind.Identifier:
return name.text;
case ts.SyntaxKind.StringLiteral:
case ts.SyntaxKind.NumericLiteral:
return name.text;
case ts.SyntaxKind.ComputedPropertyName:
return isStringOrNumericLiteral(name.expression) ? name.expression.text : undefined;
}
}
var name = getTextOfPropertyName(propName);
if (name && type) {
var result = [];
var symbol_1 = type.getProperty(name);
if (type.flags & ts.TypeFlags.Union) {
for (var _i = 0, _a = type.types; _i < _a.length; _i++) {
var t = _a[_i];
var symbol_2 = t.getProperty(name);
if (symbol_2) {
result.push(symbol_2);
}
}
return result;
}
if (symbol_1) {
result.push(symbol_1);
return result;
}
}
return undefined;
}
function getPropertySymbolsFromContextualType(typeChecker, node) {
var objectLiteral = node.parent;
var contextualType = typeChecker.getContextualType(objectLiteral);
return getPropertySymbolsFromType(contextualType, node.name);
}
const getPropertySymbolsFromContextualType = ts.getPropertySymbolsFromContextualType;
const getContainingObjectLiteralElement = ts.getContainingObjectLiteralElement;
const getNameFromPropertyName = ts.getNameFromPropertyName;
// Go to the original declaration for cases:
//
// (1) when the aliased symbol was declared in the location(parent).
@@ -597,10 +580,15 @@ function getRealNodeSymbol(checker, node) {
return [null, null];
}
}
var symbol = checker.getSymbolAtLocation(node);
var importNode = null;
const { parent } = node;
let symbol = checker.getSymbolAtLocation(node);
let importNode = null;
// If this is an alias, and the request came at the declaration location
// get the aliased symbol instead. This allows for goto def on an import e.g.
// import {A, B} from "mod";
// to jump to the implementation directly.
if (symbol && symbol.flags & ts.SymbolFlags.Alias && shouldSkipAlias(node, symbol.declarations[0])) {
var aliased = checker.getAliasedSymbol(symbol);
const aliased = checker.getAliasedSymbol(symbol);
if (aliased.declarations) {
// We should mark the import as visited
importNode = symbol.declarations[0];
@@ -627,13 +615,22 @@ function getRealNodeSymbol(checker, node) {
// pr/*destination*/op1: number
// }
// bar<Test>(({pr/*goto*/op1})=>{});
if (ts.isPropertyName(node) && ts.isBindingElement(node.parent) && ts.isObjectBindingPattern(node.parent.parent) &&
(node === (node.parent.propertyName || node.parent.name))) {
var type = checker.getTypeAtLocation(node.parent.parent);
if (type) {
var propSymbols = getPropertySymbolsFromType(type, node);
if (propSymbols) {
symbol = propSymbols[0];
if (ts.isPropertyName(node) && ts.isBindingElement(parent) && ts.isObjectBindingPattern(parent.parent) &&
(node === (parent.propertyName || parent.name))) {
const name = getNameFromPropertyName(node);
const type = checker.getTypeAtLocation(parent.parent);
if (name && type) {
if (type.isUnion()) {
const prop = type.types[0].getProperty(name);
if (prop) {
symbol = prop;
}
}
else {
const prop = type.getProperty(name);
if (prop) {
symbol = prop;
}
}
}
}
@@ -646,11 +643,14 @@ function getRealNodeSymbol(checker, node) {
// }
// function Foo(arg: Props) {}
// Foo( { pr/*1*/op1: 10, prop2: false })
var element = getContainingObjectLiteralElement(node);
if (element && checker.getContextualType(element.parent)) {
var propertySymbols = getPropertySymbolsFromContextualType(checker, element);
if (propertySymbols) {
symbol = propertySymbols[0];
const element = getContainingObjectLiteralElement(node);
if (element) {
const contextualType = element && checker.getContextualType(element.parent);
if (contextualType) {
const propertySymbols = getPropertySymbolsFromContextualType(element, checker, contextualType, /*unionSymbolOk*/ false);
if (propertySymbols) {
symbol = propertySymbols[0];
}
}
}
}
@@ -661,17 +661,16 @@ function getRealNodeSymbol(checker, node) {
}
/** Get the token whose text contains the position */
function getTokenAtPosition(sourceFile, position, allowPositionInLeadingTrivia, includeEndPosition) {
var current = sourceFile;
let current = sourceFile;
outer: while (true) {
// find the child that contains 'position'
for (var _i = 0, _a = current.getChildren(); _i < _a.length; _i++) {
var child = _a[_i];
var start = allowPositionInLeadingTrivia ? child.getFullStart() : child.getStart(sourceFile, /*includeJsDoc*/ true);
for (const child of current.getChildren()) {
const start = allowPositionInLeadingTrivia ? child.getFullStart() : child.getStart(sourceFile, /*includeJsDoc*/ true);
if (start > position) {
// If this child begins after position, then all subsequent children will as well.
break;
}
var end = child.getEnd();
const end = child.getEnd();
if (position < end || (position === end && (child.kind === ts.SyntaxKind.EndOfFileToken || includeEndPosition))) {
current = child;
continue outer;


@@ -36,10 +36,14 @@ export interface ITreeShakingOptions {
* e.g. `lib.d.ts`, `lib.es2015.collection.d.ts`
*/
libs: string[];
/**
* Other .d.ts files
*/
typings: string[];
/**
* TypeScript compiler options.
*/
compilerOptions: ts.CompilerOptions;
compilerOptions?: any;
/**
* The shake level to perform.
*/
@@ -56,8 +60,42 @@ export interface ITreeShakingResult {
[file: string]: string;
}
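
With the new typings field, a caller hands shake() raw JSON compiler options plus any extra ambient .d.ts files and gets back the shaken sources keyed by file path. A hypothetical invocation (all values invented; the field names are those of ITreeShakingOptions in this diff):

    import { shake, ShakeLevel } from './treeshaking';

    const result = shake({
        sourcesRoot: '/repo/src',
        entryPoints: ['vs/editor/editor.main'],
        inlineEntryPoints: [],
        typings: ['typings/require.d.ts'],  // extra ambient .d.ts files
        libs: ['lib.d.ts', 'lib.es2015.collection.d.ts'],
        redirects: {},
        compilerOptions: { module: 'amd', target: 'es5' }, // raw JSON; converted internally
        shakeLevel: ShakeLevel.ClassMembers,
        importIgnorePattern: /^vs\/css!/,   // matches the 'Ignore vs/css! imports' check
    });
    // result maps each surviving file path to its shaken source text.
    Object.keys(result).forEach(fileName => console.log(fileName));
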
function printDiagnostics(diagnostics: ReadonlyArray<ts.Diagnostic>): void {
for (const diag of diagnostics) {
let result = '';
if (diag.file) {
result += `${diag.file.fileName}: `;
}
if (diag.file && diag.start) {
let location = diag.file.getLineAndCharacterOfPosition(diag.start);
result += `- ${location.line + 1},${location.character} - `;
}
result += JSON.stringify(diag.messageText);
console.log(result);
}
}
export function shake(options: ITreeShakingOptions): ITreeShakingResult {
const languageService = createTypeScriptLanguageService(options);
const program = languageService.getProgram()!;
const globalDiagnostics = program.getGlobalDiagnostics();
if (globalDiagnostics.length > 0) {
printDiagnostics(globalDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
const syntacticDiagnostics = program.getSyntacticDiagnostics();
if (syntacticDiagnostics.length > 0) {
printDiagnostics(syntacticDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
const semanticDiagnostics = program.getSemanticDiagnostics();
if (semanticDiagnostics.length > 0) {
printDiagnostics(semanticDiagnostics);
throw new Error(`Compilation Errors encountered.`);
}
markNodes(languageService, options);
@@ -71,7 +109,13 @@ function createTypeScriptLanguageService(options: ITreeShakingOptions): ts.Langu
// Add fake usage files
options.inlineEntryPoints.forEach((inlineEntryPoint, index) => {
FILES[`inlineEntryPoint:${index}.ts`] = inlineEntryPoint;
FILES[`inlineEntryPoint.${index}.ts`] = inlineEntryPoint;
});
// Add additional typings
options.typings.forEach((typing) => {
const filePath = path.join(options.sourcesRoot, typing);
FILES[typing] = fs.readFileSync(filePath).toString();
});
// Resolve libs
@@ -81,7 +125,9 @@ function createTypeScriptLanguageService(options: ITreeShakingOptions): ts.Langu
RESOLVED_LIBS[`defaultLib:${filename}`] = fs.readFileSync(filepath).toString();
});
const host = new TypeScriptLanguageServiceHost(RESOLVED_LIBS, FILES, options.compilerOptions);
const compilerOptions = ts.convertCompilerOptionsFromJson(options.compilerOptions, options.sourcesRoot).options;
const host = new TypeScriptLanguageServiceHost(RESOLVED_LIBS, FILES, compilerOptions);
return ts.createLanguageService(host);
}
@@ -105,11 +151,17 @@ function discoverAndReadFiles(options: ITreeShakingOptions): IFileMap {
options.entryPoints.forEach((entryPoint) => enqueue(entryPoint));
while (queue.length > 0) {
const moduleId = queue.shift();
const moduleId = queue.shift()!;
const dts_filename = path.join(options.sourcesRoot, moduleId + '.d.ts');
if (fs.existsSync(dts_filename)) {
const dts_filecontents = fs.readFileSync(dts_filename).toString();
FILES[moduleId + '.d.ts'] = dts_filecontents;
FILES[`${moduleId}.d.ts`] = dts_filecontents;
continue;
}
const js_filename = path.join(options.sourcesRoot, moduleId + '.js');
if (fs.existsSync(js_filename)) {
// This is an import for a .js file, so ignore it...
continue;
}
@@ -136,7 +188,7 @@ function discoverAndReadFiles(options: ITreeShakingOptions): IFileMap {
enqueue(importedModuleId);
}
FILES[moduleId + '.ts'] = ts_filecontents;
FILES[`${moduleId}.ts`] = ts_filecontents;
}
return FILES;
@@ -167,12 +219,12 @@ class TypeScriptLanguageServiceHost implements ts.LanguageServiceHost {
}
getScriptFileNames(): string[] {
return (
[]
([] as string[])
.concat(Object.keys(this._libs))
.concat(Object.keys(this._files))
);
}
getScriptVersion(fileName: string): string {
getScriptVersion(_fileName: string): string {
return '1';
}
getProjectVersion(): string {
@@ -187,13 +239,13 @@ class TypeScriptLanguageServiceHost implements ts.LanguageServiceHost {
return ts.ScriptSnapshot.fromString('');
}
}
getScriptKind(fileName: string): ts.ScriptKind {
getScriptKind(_fileName: string): ts.ScriptKind {
return ts.ScriptKind.TS;
}
getCurrentDirectory(): string {
return '';
}
getDefaultLibFileName(options: ts.CompilerOptions): string {
getDefaultLibFileName(_options: ts.CompilerOptions): string {
return 'defaultLib:lib.d.ts';
}
isDefaultLibFileName(fileName: string): boolean {
@@ -240,6 +292,9 @@ function nodeOrChildIsBlack(node: ts.Node): boolean {
function markNodes(languageService: ts.LanguageService, options: ITreeShakingOptions) {
const program = languageService.getProgram();
if (!program) {
throw new Error('Could not get program from language service');
}
if (options.shakeLevel === ShakeLevel.Files) {
// Mark all source files Black
@@ -266,7 +321,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
}
if (ts.isExportDeclaration(node)) {
if (ts.isStringLiteral(node.moduleSpecifier)) {
if (node.moduleSpecifier && ts.isStringLiteral(node.moduleSpecifier)) {
setColor(node, NodeColor.Black);
enqueueImport(node, node.moduleSpecifier.text);
}
@@ -349,7 +404,11 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
if (references) {
for (let i = 0, len = references.length; i < len; i++) {
const reference = references[i];
const referenceSourceFile = program.getSourceFile(reference.fileName);
const referenceSourceFile = program!.getSourceFile(reference.fileName);
if (!referenceSourceFile) {
continue;
}
const referenceNode = getTokenAtPosition(referenceSourceFile, reference.textSpan.start, false, false);
if (
ts.isMethodDeclaration(referenceNode.parent)
@@ -365,7 +424,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
}
function enqueueFile(filename: string): void {
const sourceFile = program.getSourceFile(filename);
const sourceFile = program!.getSourceFile(filename);
if (!sourceFile) {
console.warn(`Cannot find source file ${filename}`);
return;
@@ -391,7 +450,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
options.entryPoints.forEach(moduleId => enqueueFile(moduleId + '.ts'));
// Add fake usage files
options.inlineEntryPoints.forEach((_, index) => enqueueFile(`inlineEntryPoint:${index}.ts`));
options.inlineEntryPoints.forEach((_, index) => enqueueFile(`inlineEntryPoint.${index}.ts`));
let step = 0;
@@ -401,11 +460,11 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
let node: ts.Node;
if (step % 100 === 0) {
console.log(`${step}/${step+black_queue.length+gray_queue.length} (${black_queue.length}, ${gray_queue.length})`);
console.log(`${step}/${step + black_queue.length + gray_queue.length} (${black_queue.length}, ${gray_queue.length})`);
}
if (black_queue.length === 0) {
for (let i = 0; i < gray_queue.length; i++) {
for (let i = 0; i< gray_queue.length; i++) {
const node = gray_queue[i];
const nodeParent = node.parent;
if ((ts.isClassDeclaration(nodeParent) || ts.isInterfaceDeclaration(nodeParent)) && nodeOrChildIsBlack(nodeParent)) {
@@ -418,7 +477,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
}
if (black_queue.length > 0) {
node = black_queue.shift();
node = black_queue.shift()!;
} else {
// only gray nodes remaining...
break;
@@ -441,7 +500,7 @@ function markNodes(languageService: ts.LanguageService, options: ITreeShakingOpt
}
if (options.shakeLevel === ShakeLevel.ClassMembers && (ts.isClassDeclaration(declaration) || ts.isInterfaceDeclaration(declaration))) {
enqueue_black(declaration.name);
enqueue_black(declaration.name!);
for (let j = 0; j < declaration.members.length; j++) {
const member = declaration.members[j];
@@ -493,6 +552,9 @@ function nodeIsInItsOwnDeclaration(nodeSourceFile: ts.SourceFile, node: ts.Node,
function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLevel): ITreeShakingResult {
const program = languageService.getProgram();
if (!program) {
throw new Error('Could not get program from language service');
}
let result: ITreeShakingResult = {};
const writeFile = (filePath: string, contents: string): void => {
@@ -547,8 +609,7 @@ function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLe
}
} else {
let survivingImports: string[] = [];
for (let i = 0; i < node.importClause.namedBindings.elements.length; i++) {
const importNode = node.importClause.namedBindings.elements[i];
for (const importNode of node.importClause.namedBindings.elements) {
if (getColor(importNode) === NodeColor.Black) {
survivingImports.push(importNode.getFullText(sourceFile));
}
@@ -556,12 +617,12 @@ function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLe
const leadingTriviaWidth = node.getLeadingTriviaWidth();
const leadingTrivia = sourceFile.text.substr(node.pos, leadingTriviaWidth);
if (survivingImports.length > 0) {
if (node.importClause && getColor(node.importClause) === NodeColor.Black) {
if (node.importClause && node.importClause.name && getColor(node.importClause) === NodeColor.Black) {
return write(`${leadingTrivia}import ${node.importClause.name.text}, {${survivingImports.join(',')} } from${node.moduleSpecifier.getFullText(sourceFile)};`);
}
return write(`${leadingTrivia}import {${survivingImports.join(',')} } from${node.moduleSpecifier.getFullText(sourceFile)};`);
} else {
if (node.importClause && getColor(node.importClause) === NodeColor.Black) {
if (node.importClause && node.importClause.name && getColor(node.importClause) === NodeColor.Black) {
return write(`${leadingTrivia}import ${node.importClause.name.text} from${node.moduleSpecifier.getFullText(sourceFile)};`);
}
}
@@ -577,7 +638,7 @@ function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLe
let toWrite = node.getFullText();
for (let i = node.members.length - 1; i >= 0; i--) {
const member = node.members[i];
if (getColor(member) === NodeColor.Black) {
if (getColor(member) === NodeColor.Black || !member.name) {
// keep method
continue;
}
@@ -625,74 +686,13 @@ function generateResult(languageService: ts.LanguageService, shakeLevel: ShakeLe
/**
* Returns the node's symbol and the `import` node (if the symbol resolved from a different module)
*/
function getRealNodeSymbol(checker: ts.TypeChecker, node: ts.Node): [ts.Symbol, ts.Declaration] {
/**
* Returns the containing object literal property declaration given a possible name node, e.g. "a" in x = { "a": 1 }
*/
/* @internal */
function getContainingObjectLiteralElement(node: ts.Node): ts.ObjectLiteralElement | undefined {
switch (node.kind) {
case ts.SyntaxKind.StringLiteral:
case ts.SyntaxKind.NumericLiteral:
if (node.parent.kind === ts.SyntaxKind.ComputedPropertyName) {
return ts.isObjectLiteralElement(node.parent.parent) ? node.parent.parent : undefined;
}
// falls through
case ts.SyntaxKind.Identifier:
return ts.isObjectLiteralElement(node.parent) &&
(node.parent.parent.kind === ts.SyntaxKind.ObjectLiteralExpression || node.parent.parent.kind === ts.SyntaxKind.JsxAttributes) &&
node.parent.name === node ? node.parent : undefined;
}
return undefined;
}
function getRealNodeSymbol(checker: ts.TypeChecker, node: ts.Node): [ts.Symbol | null, ts.Declaration | null] {
function getPropertySymbolsFromType(type: ts.Type, propName: ts.PropertyName) {
function getTextOfPropertyName(name: ts.PropertyName): string {
function isStringOrNumericLiteral(node: ts.Node): node is ts.StringLiteral | ts.NumericLiteral {
const kind = node.kind;
return kind === ts.SyntaxKind.StringLiteral
|| kind === ts.SyntaxKind.NumericLiteral;
}
switch (name.kind) {
case ts.SyntaxKind.Identifier:
return name.text;
case ts.SyntaxKind.StringLiteral:
case ts.SyntaxKind.NumericLiteral:
return name.text;
case ts.SyntaxKind.ComputedPropertyName:
return isStringOrNumericLiteral(name.expression) ? name.expression.text : undefined!;
}
}
const name = getTextOfPropertyName(propName);
if (name && type) {
const result: ts.Symbol[] = [];
const symbol = type.getProperty(name);
if (type.flags & ts.TypeFlags.Union) {
for (const t of (<ts.UnionType>type).types) {
const symbol = t.getProperty(name);
if (symbol) {
result.push(symbol);
}
}
return result;
}
if (symbol) {
result.push(symbol);
return result;
}
}
return undefined;
}
function getPropertySymbolsFromContextualType(typeChecker: ts.TypeChecker, node: ts.ObjectLiteralElement): ts.Symbol[] {
const objectLiteral = <ts.ObjectLiteralExpression | ts.JsxAttributes>node.parent;
const contextualType = typeChecker.getContextualType(objectLiteral)!;
return getPropertySymbolsFromType(contextualType, node.name!)!;
}
// Use some TypeScript internals to avoid code duplication
type ObjectLiteralElementWithName = ts.ObjectLiteralElement & { name: ts.PropertyName; parent: ts.ObjectLiteralExpression | ts.JsxAttributes };
const getPropertySymbolsFromContextualType: (node: ObjectLiteralElementWithName, checker: ts.TypeChecker, contextualType: ts.Type, unionSymbolOk: boolean) => ReadonlyArray<ts.Symbol> = (<any>ts).getPropertySymbolsFromContextualType;
const getContainingObjectLiteralElement: (node: ts.Node) => ObjectLiteralElementWithName | undefined = (<any>ts).getContainingObjectLiteralElement;
const getNameFromPropertyName: (name: ts.PropertyName) => string | undefined = (<any>ts).getNameFromPropertyName;
// Go to the original declaration for cases:
//
@@ -723,8 +723,14 @@ function getRealNodeSymbol(checker: ts.TypeChecker, node: ts.Node): [ts.Symbol,
}
}
const { parent } = node;
let symbol = checker.getSymbolAtLocation(node);
let importNode: ts.Declaration = null;
let importNode: ts.Declaration | null = null;
// If this is an alias, and the request came at the declaration location
// get the aliased symbol instead. This allows for goto def on an import e.g.
// import {A, B} from "mod";
// to jump to the implementation directly.
if (symbol && symbol.flags & ts.SymbolFlags.Alias && shouldSkipAlias(node, symbol.declarations[0])) {
const aliased = checker.getAliasedSymbol(symbol);
if (aliased.declarations) {
@@ -755,13 +761,21 @@ function getRealNodeSymbol(checker: ts.TypeChecker, node: ts.Node): [ts.Symbol,
// pr/*destination*/op1: number
// }
// bar<Test>(({pr/*goto*/op1})=>{});
if (ts.isPropertyName(node) && ts.isBindingElement(node.parent) && ts.isObjectBindingPattern(node.parent.parent) &&
(node === (node.parent.propertyName || node.parent.name))) {
const type = checker.getTypeAtLocation(node.parent.parent);
if (type) {
const propSymbols = getPropertySymbolsFromType(type, node);
if (propSymbols) {
symbol = propSymbols[0];
if (ts.isPropertyName(node) && ts.isBindingElement(parent) && ts.isObjectBindingPattern(parent.parent) &&
(node === (parent.propertyName || parent.name))) {
const name = getNameFromPropertyName(node);
const type = checker.getTypeAtLocation(parent.parent);
if (name && type) {
if (type.isUnion()) {
const prop = type.types[0].getProperty(name);
if (prop) {
symbol = prop;
}
} else {
const prop = type.getProperty(name);
if (prop) {
symbol = prop;
}
}
}
}
@@ -776,10 +790,13 @@ function getRealNodeSymbol(checker: ts.TypeChecker, node: ts.Node): [ts.Symbol,
// function Foo(arg: Props) {}
// Foo( { pr/*1*/op1: 10, prop2: false })
const element = getContainingObjectLiteralElement(node);
if (element && checker.getContextualType(element.parent as ts.Expression)) {
const propertySymbols = getPropertySymbolsFromContextualType(checker, element);
if (propertySymbols) {
symbol = propertySymbols[0];
if (element) {
const contextualType = element && checker.getContextualType(element.parent);
if (contextualType) {
const propertySymbols = getPropertySymbolsFromContextualType(element, checker, contextualType, /*unionSymbolOk*/ false);
if (propertySymbols) {
symbol = propertySymbols[0];
}
}
}
}


@@ -3,48 +3,30 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
var __extends = (this && this.__extends) || (function () {
var extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
return function (d, b) {
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
var path_1 = require("path");
var Lint = require("tslint");
var Rule = /** @class */ (function (_super) {
__extends(Rule, _super);
function Rule() {
return _super !== null && _super.apply(this, arguments) || this;
}
Rule.prototype.apply = function (sourceFile) {
const path_1 = require("path");
const Lint = require("tslint");
class Rule extends Lint.Rules.AbstractRule {
apply(sourceFile) {
return this.applyWithWalker(new ImportPatterns(sourceFile, this.getOptions()));
};
return Rule;
}(Lint.Rules.AbstractRule));
exports.Rule = Rule;
var ImportPatterns = /** @class */ (function (_super) {
__extends(ImportPatterns, _super);
function ImportPatterns(file, opts) {
var _this = _super.call(this, file, opts) || this;
_this.imports = Object.create(null);
return _this;
}
ImportPatterns.prototype.visitImportDeclaration = function (node) {
var path = node.moduleSpecifier.getText();
}
exports.Rule = Rule;
class ImportPatterns extends Lint.RuleWalker {
constructor(file, opts) {
super(file, opts);
this.imports = Object.create(null);
}
visitImportDeclaration(node) {
let path = node.moduleSpecifier.getText();
// remove quotes
path = path.slice(1, -1);
if (path[0] === '.') {
path = path_1.join(path_1.dirname(node.getSourceFile().fileName), path);
}
if (this.imports[path]) {
this.addFailure(this.createFailure(node.getStart(), node.getWidth(), "Duplicate imports for '" + path + "'."));
this.addFailure(this.createFailure(node.getStart(), node.getWidth(), `Duplicate imports for '${path}'.`));
}
this.imports[path] = true;
};
return ImportPatterns;
}(Lint.RuleWalker));
}
}
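
After the ES6 rewrite, each of these tslint rules is a plain class pair: a Rule that wires up a walker, and a RuleWalker subclass that visits nodes and reports failures. The skeleton, with the rule logic reduced to a stub:

    import * as ts from 'typescript';
    import * as Lint from 'tslint';

    export class Rule extends Lint.Rules.AbstractRule {
        apply(sourceFile: ts.SourceFile): Lint.RuleFailure[] {
            return this.applyWithWalker(new Walker(sourceFile, this.getOptions()));
        }
    }

    class Walker extends Lint.RuleWalker {
        visitImportDeclaration(node: ts.ImportDeclaration): void {
            // A real rule inspects node.moduleSpecifier here and calls
            // this.addFailure(this.createFailure(...)) when it objects.
            super.visitImportDeclaration(node);
        }
    }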


@@ -3,79 +3,60 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
var __extends = (this && this.__extends) || (function () {
var extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
return function (d, b) {
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
var ts = require("typescript");
var Lint = require("tslint");
var minimatch = require("minimatch");
var path_1 = require("path");
var Rule = /** @class */ (function (_super) {
__extends(Rule, _super);
function Rule() {
return _super !== null && _super.apply(this, arguments) || this;
}
Rule.prototype.apply = function (sourceFile) {
var configs = this.getOptions().ruleArguments;
for (var _i = 0, configs_1 = configs; _i < configs_1.length; _i++) {
var config = configs_1[_i];
const ts = require("typescript");
const Lint = require("tslint");
const minimatch = require("minimatch");
const path_1 = require("path");
class Rule extends Lint.Rules.AbstractRule {
apply(sourceFile) {
const configs = this.getOptions().ruleArguments;
for (const config of configs) {
if (minimatch(sourceFile.fileName, config.target)) {
return this.applyWithWalker(new ImportPatterns(sourceFile, this.getOptions(), config));
}
}
return [];
};
return Rule;
}(Lint.Rules.AbstractRule));
exports.Rule = Rule;
var ImportPatterns = /** @class */ (function (_super) {
__extends(ImportPatterns, _super);
function ImportPatterns(file, opts, _config) {
var _this = _super.call(this, file, opts) || this;
_this._config = _config;
return _this;
}
ImportPatterns.prototype.visitImportEqualsDeclaration = function (node) {
}
exports.Rule = Rule;
class ImportPatterns extends Lint.RuleWalker {
constructor(file, opts, _config) {
super(file, opts);
this._config = _config;
}
visitImportEqualsDeclaration(node) {
if (node.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference) {
this._validateImport(node.moduleReference.expression.getText(), node);
}
};
ImportPatterns.prototype.visitImportDeclaration = function (node) {
}
visitImportDeclaration(node) {
this._validateImport(node.moduleSpecifier.getText(), node);
};
ImportPatterns.prototype.visitCallExpression = function (node) {
_super.prototype.visitCallExpression.call(this, node);
}
visitCallExpression(node) {
super.visitCallExpression(node);
// import('foo') statements inside the code
if (node.expression.kind === ts.SyntaxKind.ImportKeyword) {
var path = node.arguments[0];
const [path] = node.arguments;
this._validateImport(path.getText(), node);
}
};
ImportPatterns.prototype._validateImport = function (path, node) {
}
_validateImport(path, node) {
// remove quotes
path = path.slice(1, -1);
// resolve relative paths
if (path[0] === '.') {
path = path_1.join(this.getSourceFile().fileName, path);
}
var restrictions;
let restrictions;
if (typeof this._config.restrictions === 'string') {
restrictions = [this._config.restrictions];
}
else {
restrictions = this._config.restrictions;
}
var matched = false;
for (var _i = 0, restrictions_1 = restrictions; _i < restrictions_1.length; _i++) {
var pattern = restrictions_1[_i];
let matched = false;
for (const pattern of restrictions) {
if (minimatch(path, pattern)) {
matched = true;
break;
@@ -83,8 +64,7 @@ var ImportPatterns = /** @class */ (function (_super) {
}
if (!matched) {
// None of the restrictions matched
this.addFailure(this.createFailure(node.getStart(), node.getWidth(), "Imports violates '" + restrictions.join(' or ') + "' restrictions. See https://github.com/Microsoft/vscode/wiki/Code-Organization"));
this.addFailure(this.createFailure(node.getStart(), node.getWidth(), `Imports violates '${restrictions.join(' or ')}' restrictions. See https://github.com/Microsoft/vscode/wiki/Code-Organization`));
}
};
return ImportPatterns;
}(Lint.RuleWalker));
}
}


@@ -3,36 +3,22 @@
* Copyright (c) Microsoft Corporation. All rights reserved.
* Licensed under the Source EULA. See License.txt in the project root for license information.
*--------------------------------------------------------------------------------------------*/
var __extends = (this && this.__extends) || (function () {
var extendStatics = Object.setPrototypeOf ||
({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||
function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };
return function (d, b) {
extendStatics(d, b);
function __() { this.constructor = d; }
d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());
};
})();
Object.defineProperty(exports, "__esModule", { value: true });
var ts = require("typescript");
var Lint = require("tslint");
var path_1 = require("path");
var Rule = /** @class */ (function (_super) {
__extends(Rule, _super);
function Rule() {
return _super !== null && _super.apply(this, arguments) || this;
}
Rule.prototype.apply = function (sourceFile) {
var parts = path_1.dirname(sourceFile.fileName).split(/\\|\//);
var ruleArgs = this.getOptions().ruleArguments[0];
var config;
for (var i = parts.length - 1; i >= 0; i--) {
const ts = require("typescript");
const Lint = require("tslint");
const path_1 = require("path");
class Rule extends Lint.Rules.AbstractRule {
apply(sourceFile) {
const parts = path_1.dirname(sourceFile.fileName).split(/\\|\//);
const ruleArgs = this.getOptions().ruleArguments[0];
let config;
for (let i = parts.length - 1; i >= 0; i--) {
if (ruleArgs[parts[i]]) {
config = {
allowed: new Set(ruleArgs[parts[i]]).add(parts[i]),
disallowed: new Set()
};
Object.keys(ruleArgs).forEach(function (key) {
Object.keys(ruleArgs).forEach(key => {
if (!config.allowed.has(key)) {
config.disallowed.add(key);
}
@@ -44,58 +30,54 @@ var Rule = /** @class */ (function (_super) {
return [];
}
return this.applyWithWalker(new LayeringRule(sourceFile, config, this.getOptions()));
};
return Rule;
}(Lint.Rules.AbstractRule));
exports.Rule = Rule;
var LayeringRule = /** @class */ (function (_super) {
__extends(LayeringRule, _super);
function LayeringRule(file, config, opts) {
var _this = _super.call(this, file, opts) || this;
_this._config = config;
return _this;
}
LayeringRule.prototype.visitImportEqualsDeclaration = function (node) {
}
exports.Rule = Rule;
class LayeringRule extends Lint.RuleWalker {
constructor(file, config, opts) {
super(file, opts);
this._config = config;
}
visitImportEqualsDeclaration(node) {
if (node.moduleReference.kind === ts.SyntaxKind.ExternalModuleReference) {
this._validateImport(node.moduleReference.expression.getText(), node);
}
};
LayeringRule.prototype.visitImportDeclaration = function (node) {
}
visitImportDeclaration(node) {
this._validateImport(node.moduleSpecifier.getText(), node);
};
LayeringRule.prototype.visitCallExpression = function (node) {
_super.prototype.visitCallExpression.call(this, node);
}
visitCallExpression(node) {
super.visitCallExpression(node);
// import('foo') statements inside the code
if (node.expression.kind === ts.SyntaxKind.ImportKeyword) {
var path = node.arguments[0];
const [path] = node.arguments;
this._validateImport(path.getText(), node);
}
};
LayeringRule.prototype._validateImport = function (path, node) {
}
_validateImport(path, node) {
// remove quotes
path = path.slice(1, -1);
if (path[0] === '.') {
path = path_1.join(path_1.dirname(node.getSourceFile().fileName), path);
}
var parts = path_1.dirname(path).split(/\\|\//);
for (var i = parts.length - 1; i >= 0; i--) {
var part = parts[i];
const parts = path_1.dirname(path).split(/\\|\//);
for (let i = parts.length - 1; i >= 0; i--) {
const part = parts[i];
if (this._config.allowed.has(part)) {
// GOOD - same layer
return;
}
if (this._config.disallowed.has(part)) {
// BAD - wrong layer
var message = "Bad layering. You are not allowed to access '" + part + "' from here, allowed layers are: [" + LayeringRule._print(this._config.allowed) + "]";
const message = `Bad layering. You are not allowed to access '${part}' from here, allowed layers are: [${LayeringRule._print(this._config.allowed)}]`;
this.addFailure(this.createFailure(node.getStart(), node.getWidth(), message));
return;
}
}
};
LayeringRule._print = function (set) {
var r = [];
set.forEach(function (e) { return r.push(e); });
}
static _print(set) {
const r = [];
set.forEach(e => r.push(e));
return r.join(', ');
};
return LayeringRule;
}(Lint.RuleWalker));
}
}
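
The layering rule's single rule argument is read as a map from directory name to the extra layers that directory may import from (the directory itself is always allowed); any import resolving into a directory outside that set is flagged. A hypothetical fragment of such a configuration (directory names invented; the shape is inferred from ruleArgs[parts[i]] above):

    // Shape assumed from this.getOptions().ruleArguments[0] in the walker;
    // in practice this would live under the rule's entry in tslint.json.
    const layeringArgs: { [dir: string]: string[] } = {
        common: [],                       // 'common' imports only from itself
        node: ['common'],                 // 'node' may reach into 'common'
        browser: ['common'],
        electron: ['common', 'node', 'browser'],
    };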

Some files were not shown because too many files have changed in this diff.