diff --git a/resources/xlf/en/Microsoft.sqlservernotebook.xlf b/resources/xlf/en/Microsoft.sqlservernotebook.xlf index 58d788fad8..7d07d23bdd 100644 --- a/resources/xlf/en/Microsoft.sqlservernotebook.xlf +++ b/resources/xlf/en/Microsoft.sqlservernotebook.xlf @@ -1,14 +1,14 @@ - - SQL Server Notebooks + + Notebooks to help get started with and troubleshoot SQL Server SQL Server 2019 Guide - - Notebooks to help get started with and troubleshoot SQL Server + + SQL Server Notebooks - + \ No newline at end of file diff --git a/resources/xlf/en/admin-tool-ext-win.xlf b/resources/xlf/en/admin-tool-ext-win.xlf index 5605a64893..405ad39ea5 100644 --- a/resources/xlf/en/admin-tool-ext-win.xlf +++ b/resources/xlf/en/admin-tool-ext-win.xlf @@ -1,37 +1,37 @@ - - - Database Administration Tool Extensions for Windows - - - Adds additional Windows-specific functionality to Azure Data Studio - - - Properties - - - Generate Scripts... - - - - No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand - - - Could not determine Object Explorer node from connectionContext : {0} + + Launching dialog... No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand + + No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand + No connectionProfile provided from connectionContext : {0} - - Launching dialog... + + Could not determine Object Explorer node from connectionContext : {0} Error calling SsmsMin with args '{0}' - {1} - + + + + Adds additional Windows-specific functionality to Azure Data Studio + + + Database Administration Tool Extensions for Windows + + + Generate Scripts... 
+ + + Properties + + \ No newline at end of file diff --git a/resources/xlf/en/agent.xlf b/resources/xlf/en/agent.xlf index 0676c239b9..9e3707de63 100644 --- a/resources/xlf/en/agent.xlf +++ b/resources/xlf/en/agent.xlf @@ -1,221 +1,191 @@ - - - OK + + + SQL Server event alert + + Alert update failed '{0}' + + + SQL Server performance condition alert + + + WMI event alert + + + + + Job name must be provided + + + Job creation failed '{0}' + + + Job '{0}' created successfully + + + Job update failed '{0}' + + + Job '{0}' updated successfully + + + When the job completes + + + When the job fails + + + When the job succeeds + + + + + Step update failed '{0}' + + + Job name must be provided + + + Step name must be provided + + + + + Invalid notebook path + + + Job with similar name already exists + + + Notebook name must be provided + + + Notebook creation failed '{0}' + + + Notebook '{0}' created successfully + + + Notebook update failed '{0}' + + + Notebook '{0}' updated successfully + + + Select execution database + + + Select storage database + + + Template path must be provided + + + When the notebook completes + + + When the notebook fails + + + When the notebook succeeds + + + + + Proxy '{0}' created successfully + + + Proxy update failed '{0}' + + + Proxy '{0}' updated successfully + + + Cancel - - - - Locate Database Files - - - + OK - - Cancel - - - General - - - Advanced - - - Open... - - - Parse - - - The command was successfully parsed. - - - The command failed. 
- - - The step name cannot be left blank - - - Process exit code of a successful command: - - - Step Name - - - Type - - - Run as - - - Database - - - Command - - - On success action - - - On failure action - - - Run as user - - - Retry Attempts - - - Retry Interval (minutes) - - - Log to table - - - Append output to exisiting entry in table - - - Include step output in history - - - Output File - - - Append output to existing file - - - Selected path - - - Files of type - - - File name - - - All Files (*) - - - New Job Step - - - Edit Job Step - - - Transact-SQL script (T-SQL) - - - PowerShell - - - Operating system (CmdExec) - - - Replication Distributor - - - Replication Merge - - - Replication Queue Reader - - - Replication Snapshot - - - Replication Transaction-Log Reader - - - SQL Server Analysis Services Command - - - SQL Server Analysis Services Query - - - SQL Server Integration Service Package - - - SQL Server Agent Service Account - - - Go to the next step - - - Quit the job reporting success - - - Quit the job reporting failure - - - - - Job Schedules - - - OK - - - Cancel - - - Available Schedules: - - - Name - - - ID - - - Description - - + - - Create Alert + + Additional notification message to send - - Edit Alert - - - General - - - Response - - - Options - - - Event alert definition - - - Name - - - Type - - - Enabled + + <all databases> Database name + + Delay Minutes + + + Delay Seconds + + + Enabled + Error number - - Severity + + Execute Job + + + Job Name + + + General + + + Include alert error text in e-mail + + + Include alert error text in pager + + + Message text + + + Name + + + New Job + + + New Operator + + + Notify Operators + + + E-mail + + + Operator List + + + Operator + + + Pager + + + Options Raise alert when message contains - - Message text + + Response + + + Severity 001 - Miscellaneous System Information @@ -292,204 +262,106 @@ 025 - Fatal Error - - <all databases> + + Type - - Execute Job + + Create Alert - - Job Name + + 
Edit Alert - - Notify Operators + + Event alert definition - - New Job - - - Operator List - - - Operator - - - E-mail - - - Pager - - - New Operator - - - Include alert error text in e-mail - - - Include alert error text in pager - - - Additional notification message to send - - - Delay Minutes - - - Delay Seconds - - - - - Create Operator - - - Edit Operator - - - General - - - Notifications - - - Name - - + + + Enabled - - E-mail Name + + Alert Name - - Pager E-mail Name - - - Monday - - - Tuesday - - - Wednesday - - - Thursday - - - Friday - - - Saturday - - - Sunday - - - Workday begin - - - Workday end - - - Pager on duty schedule - - - Alert list - - - Alert name - - - E-mail - - - Pager - - - - - General - - - Steps - - - Schedules + + Type Alerts - - Notifications + + Alerts list The name of the job cannot be blank. - - Name - - - Owner - Category - - Description - - - Enabled - - - Job step list - - - Step - - - Type - - - On Success - - - On Failure - - - New Step - - - Edit Step - Delete Step - - Move Step Up + + Automatically delete job - - Move Step Down + + Description - - Start step + + Edit Step - - Actions to perform when the job completes + + Edit Job Email - - Page + + Enabled Write to the Windows Application event log - - Automatically delete job + + General - - Schedules list + + Job step list + + + Move Step Down + + + Move Step Up + + + Name + + + New Step + + + New Alert + + + New Job + + + Notifications + + + Actions to perform when the job completes + + + On Failure + + + On Success + + + Owner + + + Page Pick Schedule @@ -497,132 +369,333 @@ Remove Schedule - - Alerts list + + Schedules - - New Alert + + Schedules list - - Alert Name + + Start step - - Enabled + + Step - + + Steps + + Type - - New Job + + + + Operating system (CmdExec) - - Edit Job + + Transact-SQL script (T-SQL) - - - - When the job completes + + Advanced - - When the job fails + + SQL Server Agent Service Account - - When the job succeeds + + All Files (*) - - Job name 
must be provided + + SQL Server Analysis Services Command - - Job update failed '{0}' + + SQL Server Analysis Services Query - - Job creation failed '{0}' + + Append output to exisiting entry in table - - Job '{0}' updated successfully + + Append output to existing file - - Job '{0}' created successfully + + The step name cannot be left blank - - - - Step update failed '{0}' + + Cancel - - Job name must be provided + + Command - - Step name must be provided + + Database - - - - This feature is under development. Check-out the latest insiders build if you'd like to try out the most recent changes! + + Edit Job Step - - Template updated successfully + + The command failed. - - Template update failure + + On failure action - - The notebook must be saved before being scheduled. Please save and then retry scheduling again. + + Locate Database Files - - - Add new connection + + File name - - Select a connection + + Files of type - - Please select a valid connection - - - - - Alert update failed '{0}' - - - SQL Server event alert - - - SQL Server performance condition alert - - - WMI event alert - - - - - Create Proxy - - - Edit Proxy - - + General - - Proxy name + + Include step output in history + + Log to table + + + New Job Step + + + Go to the next step + + + OK + + + Open... + + + Output File + + + Parse + + + PowerShell + + + Process exit code of a successful command: + + + Quit the job reporting failure + + + Quit the job reporting success + + + Replication Distributor + + + Replication Merge + + + Replication Queue Reader + + + Replication Snapshot + + + Replication Transaction-Log Reader + + + Retry Attempts + + + Retry Interval (minutes) + + + Run as + + + Run as user + + + Selected path + + + SQL Server Integration Service Package + + + Step Name + + + On success action + + + The command was successfully parsed. 
+ + + Type + + + + + Select Database + + + Description + + + Edit Notebook Job + + + Execution Database + + + Select a database against which notebook queries will run + + + General + + + Job Details + + + Name + + + New Notebook Job + + + Notebook Details + + + Owner + + + Pick Schedule + + + Remove Schedule + + + Schedules list + + + Storage Database + + + Select a database to store all notebook job metadata and results + + + Notebook Path + + + Select a notebook to schedule from PC + + + + + E-mail + + + Alert list + + + Alert name + + + Pager + + + E-mail Name + + + Enabled + + + General + + + Name + + + Notifications + + + Pager on duty schedule + + + Pager E-mail Name + + + Friday + + + Monday + + + Saturday + + + Sunday + + + Thursday + + + Tuesday + + + Wednesday + + + Create Operator + + + Edit Operator + + + Workday begin + + + Workday end + + + + + Available Schedules: + + + Cancel + + + Description + + + Job Schedules + + + OK + + + ID + + + Name + + + Credential name Description - - Subsystem + + General Operating system (CmdExec) - - Replication Snapshot + + PowerShell - - Replication Transaction-Log Reader + + Proxy name Replication Distributor @@ -633,125 +706,52 @@ Replication Queue Reader - - SQL Server Analysis Services Query + + Replication Snapshot + + + Replication Transaction-Log Reader SQL Server Analysis Services Command + + SQL Server Analysis Services Query + SQL Server Integration Services Package - - PowerShell + + Subsystem - - - - Proxy update failed '{0}' + + Create Proxy - - Proxy '{0}' updated successfully + + Edit Proxy - - Proxy '{0}' created successfully + + + + Add new connection - - - - New Notebook Job + + Select a connection - - Edit Notebook Job + + Please select a valid connection - - General + + Template update failure - - Notebook Details + + Template updated successfully - - Notebook Path + + The notebook must be saved before being scheduled. Please save and then retry scheduling again. 
- - Storage Database + + This feature is under development. Check-out the latest insiders build if you'd like to try out the most recent changes! - - Execution Database - - - Select Database - - - Job Details - - - Name - - - Owner - - - Schedules list - - - Pick Schedule - - - Remove Schedule - - - Description - - - Select a notebook to schedule from PC - - - Select a database to store all notebook job metadata and results - - - Select a database against which notebook queries will run - - - - - When the notebook completes - - - When the notebook fails - - - When the notebook succeeds - - - Notebook name must be provided - - - Template path must be provided - - - Invalid notebook path - - - Select storage database - - - Select execution database - - - Job with similar name already exists - - - Notebook update failed '{0}' - - - Notebook creation failed '{0}' - - - Notebook '{0}' updated successfully - - - Notebook '{0}' created successfully - - + \ No newline at end of file diff --git a/resources/xlf/en/arc.xlf b/resources/xlf/en/arc.xlf index fe1605776c..c0f79ac011 100644 --- a/resources/xlf/en/arc.xlf +++ b/resources/xlf/en/arc.xlf @@ -1,15 +1,1231 @@ + + + adding worker nodes + + + The Arc Deployment extension has been replaced by the Arc extension and has been uninstalled. + + + Azure Arc Resources + + + Available + + + Backup + + + {0} backups + + + Cancel + + + Click the new support request button to file a support request in the Azure Portal. + + + Click the troubleshoot button to open the Azure Arc {0} troubleshooting notebook. + + + Compute + + + Compute + Storage + + + there are sufficient resources available + + + Before doing so, you need to ensure + + + in your Kubernetes cluster to honor this configuration. + + + without downtime and by + + + Condition + + + Configuration + + + Configuration (per node) + + + Confirm the new password + + + Connect to Server + + + Connect + + + Connect to Existing Controller + + + Could not connect to controller {0}. 
{1} + + + Connect to SQL managed instance - Azure Arc ({0}) + + + Could not connect to SQL managed instance - Azure Arc Instance {0}. {1} + + + Connect to PostgreSQL Hyperscale - Azure Arc ({0}) + + + Could not connect to PostgreSQL Hyperscale - Azure Arc Instance {0}. {1} + + + A connection to the server is required to show and set database engine settings, which will require the PostgreSQL Extension to be installed. + + + Connection Mode + + + Connection string for {0} + {0} is the name of the type of connection string (e.g. Java) + + + Connection Strings + + + Pod containers are ready. + + + Cluster Context + + + Azure Arc Data Controller Dashboard (Preview) - {0} + + + Kube Config File Path + + + Name + + + The name to display in the tree view, this is not applied to the controller itself. + + + Controller Password + + + Controller URL + + + The Controller URL is necessary if there are multiple clusters with the same namespace - this should generally not be necessary. + + + https://<IP or hostname>:<port> + + + Controller Username + + + Coordinator + + + Coordinator Node CPU limit + + + Coordinator Node CPU request + + + Coordinator endpoint + + + Coordinator Node Memory limit (in GB) + + + Coordinator Node Memory request (in GB) + + + Coordinator Node + + + You can configure the number of CPU cores and storage size that will apply to the coordinator node. Adjust the number of CPU cores and memory settings for your server group. To reset the requests and/or limits, pass in empty value. + + + Coordinator Node Parameters + + + These server parameters of the Coordinator node can be set to custom (non-default) values. Search to find parameters. + + + {0} copied to clipboard + + + Copy {0} Connection String to clipboard + {0} is the name of the type of connection string (e.g. Java) + + + Copy {0} to clipboard + {0} is the name of the type of value being copied (e.g. 
Coordinator endpoint) + + + CPU limit + + + CPU request + + + Could not find Azure resource for {0} + + + Could not find controller registration. + + + New Instance + + + Data controller + + + Azure Arc Data Controller + + + {0} data + + + Database {0} created + + + Failed to create database {0}. {1} + + + Database name + + + Databases + + + arc-dc + + + Delete + + + Deleting instance '{0}'... + + + Description + + + Details + + + Diagnose and solve problems + + + Direct + + + Discard + + + Drop + + + Currently dropping another extension, try again once that is completed. + + + Emergency + + + Endpoint + + + Enter a non empty password or press escape to exit. + + + Enter a new password + + + Error connecting to controller. {0} + + + Error encountered while verifying password. {0} + + + Failed to install extension {0}. + + + Extension '{0}' has been installed. + + + Extension name + + + PostgreSQL provides the ability to extend the functionality of your database by using extensions. + + + Value should be either of the following: ({0}). + + + Some extensions must be loaded into PostgreSQL at startup time before they can be used. To edit, type in comma separated list of valid extensions: ({0}). + + + Extensions + + + Extensions '{0}' added + + + PostgreSQL provides the ability to extend the functionality of your database by using extensions. Extensions allow for bundling multiple related SQL objects together in a single package that can be loaded or removed from your database with a single command. After being loaded in the database, extensions can function like built-in features. + + + Extensions '{0}' dropped + + + Some extensions must be loaded into PostgreSQL at startup time before they can be used. These preloaded extensions can be viewed and edited below. + + + Learn more about PostgreSQL extensions. + + + Table of preloaded extensions. + + + Table of preloaded extensions are loading. + + + Preloaded extensions can now be viewed. 
+ + + External Endpoint + + + Failed + + + Feedback + + + An unexpected error occurred retrieving the config for '{0}'. {1} + + + An unexpected error occurred retrieving the databases for '{0}'. {1} + + + An unexpected error occurred retrieving the endpoints for '{0}'. {1} + + + An unexpected error occurred retrieving the engine settings for '{0}'. {1} + + + An unexpected error occurred retrieving the registrations for '{0}'. {1} + + + Fully qualified domain + + + Grafana Dashboard + + + Dashboard for viewing metrics + + + Indirect + + + Installing extension '{0}'... + + + Instance '{0}' deleted + + + Failed to delete instance {0}. {1} + + + Warning! Deleting an instance is permanent and cannot be undone. To delete the instance '{0}' type the name '{0}' below to proceed. + + + Failed to update instance {0}. {1} + + + Instance '{0}' updated + + + Invalid config path + + + The value '{0}' does not match the instance name. Try again or press escape to exit + + + Issues Detected + + + Kibana Dashboard + + + Dashboard for viewing logs + + + Last transition + + + Learn more about database engine settings for Azure Arc-enabled PostgreSQL Hyperscale + + + Learn more about Azure PostgreSQL Hyperscale client interfaces + + + Learn More. + + + Load extensions + + + Loading... + + + Loading cluster contexts completed + + + Error loading cluster contexts. {0} + + + {0} log + + + Error logging into controller - wrong username or password + + + Memory limit (in GB) + + + Memory request (in GB) + + + Managed instance admin + + + You can scale your Azure SQL managed instance - Azure Arc by + + + A connection is required to list the databases on this instance. + + + SQL managed instance - Azure Arc Dashboard (Preview) - {0} + + + MSSQL + + + SQL managed instance - Azure Arc + + + The {0} extension is required to view engine settings. Do you wish to install it now? 
+ + + Monitor + + + Name + + + Namespace + + + Networking + + + New Database + + + New support request + + + No + + + No extensions listed in configuration. + + + No External Endpoint has been configured so this information isn't available. + + + No instances available + + + No worker server parameters found... + + + There aren’t any known issues affecting this PostgreSQL Hyperscale instance. + + + No worker pods in this configuration. + + + node + + + Node configuration + + + nodes + + + Not Configured + + + Not Ready + + + {0} vCore + + + {0} vCores + + + • {0} ({1} issues) + + + Off + + + Offline + + + Ok + + + On + + + Online + + + Error opening dashboard. {0} + + + Open in Azure Portal + + + Overview + + + Failed to discard user input. {0} + + + Parameter Name + + + Password + + + Failed to acquire password. {0} + + + Password reset successfully + + + Failed to reset password. {0} + + + Provide Password to Controller + + + Pending + + + A connection is required to show and set database engine settings. + + + PostgreSQL Hyperscale - Azure Arc + + + Pod conditions table + + + Pod is initialized. + + + The pods listed below are experiencing issues that may affect performance or availability. + + + Pod is ready. + + + Pod is schedulable. + + + Pods Present + + + pods ready + + + Select a pod in the dropdown below for detailed health information. + + + Select a pod in the dropdown below for detailed health information + + + PostgreSQL Hyperscale server group by + + + Admin username + + + Azure Database for PostgreSQL - Azure Arc + + + You can scale your Azure Arc-enabled + + + PostgreSQL Hyperscale - Azure Arc Dashboard (Preview) - {0} + + + microsoft.azuredatastudio-postgresql + + + PGSQL + + + PostgreSQL version + + + Preloaded Extensions + + + Properties + + + RAM + + + Value is expected to be in the range {0} - {1} + + + Ready + + + Recovering + + + Recovery Pending + + + Refresh + + + Refresh failed. 
{0} + + + Refresh node to enter credentials + + + Region + + + Remember Password + + + Reset all to default + + + Reset failed. {0} + + + Reset Password + + + Reset to default + + + Resource Group + + + Resource health + + + Resource health can tell you if your resource is running as expected. + + + Restoring + + + Running + + + Save + + + scaling compute vCores and memory. + + + Search to filter items... + + + Security + + + Select from available client connection strings below. + + + Server Endpoint + + + Server group nodes + + + Server group type + + + Service endpoints + + + Service endpoints table + + + Settings + + + State + + + Status + + + storage per node + + + Subscription ID + + + Support + troubleshooting + + + Note that the resource configuration must have been uploaded to Azure first in order to open a support request. + + + Suspect + + + The passwords do not match. Confirm the password or press escape to exit. + + + Troubleshoot + + + Type + + + Unknown + + + Unload extensions + + + Editing extensions failed. {0} + + + Updated {0} + + + Updating instance '{0}'... + + + User cancelled the dialog + + + Username + + + vCores + + + Value + + + Worker + + + Worker Nodes CPU limit + + + Worker Nodes CPU request + + + Worker Nodes Memory limit (in GB) + + + Worker Nodes Memory request (in GB) + + + Worker node count + + + It is possible to scale in and out your server group by reducing or increasing the number of worker nodes. The value must be 0 or greater than 1. + + + Worker Node Parameters + + + Worker Nodes + + + You can configure the number of CPU cores and storage size that will apply to all worker nodes. Adjust the number of CPU cores and memory settings for your server group. To reset the requests and/or limits, pass in empty value. + + + Expand your server group and scale your database by adding worker nodes. + + + These server parameters of the Worker nodes can be set to custom (non-default) values. Search to find parameters. 
+ + + Value of 1 is not supported. + + + Yes + + + Select + + + The cluster context information specified by config file: {0} and cluster context: {1} is no longer valid. Error is: + {2} + Do you want to update this information? + + + Cluster Context with name: {0} not found in the Kube config file + + + Browse + + + Attempt to get isPassword for unknown variable:{0} + + + Attempt to get variable value for unknown variable:{0} + + + No 'contexts' found in the config file: {0} + + + Controller Info could not be found with name: {0} + + + No Azure Arc controllers are currently connected. Please run the command: 'Connect to Existing Azure Arc Controller' and then try again + + + No current cluster context was found in the kube config file + + + No context is marked as 'current-context' in the config file: {0} + + + No name field was found in a cluster context in the config file: {0} + + + Password could not be retrieved for controller: {0} and user did not provide a password. Please retry later. + + - + + I accept {0} and {1}. + + + Azure Arc-enabled PostgreSQL Hyperscale terms and conditions + + + Azure Arc enabled Managed Instance provides SQL Server access and feature compatibility that can be deployed on the infrastructure of your choice. {0} + + + Learn more + + + Learn more about Azure Arc enabled Managed Instance + + + Azure SQL managed instance - Azure Arc terms and conditions + + + Azure account + + + Azure location + + + Azure resource group + + + Azure information + + + Azure subscription + + Azure Arc + + Confirm password + + + Target Azure Arc Controller + + + Cores Limit + + + Cores Request + + + Confirm password + + + Data controller login + + + Password + + + Administrator account + + + I accept {0} and {1}. + + + Config profile + + + Loading config profiles + + + Loading config profiles complete + + + Choose the config profile + + + What is your target existing Kubernetes cluster environment? 
+ + + Azure Configuration + + + Controller Configuration + + + Review your configuration + + + Provide a namespace, name and storage class for your Azure Arc data controller. This name will be used to identify your Arc instance for remote management and monitoring. + + + Data controller details + + + Infrastructure + + + Cluster context + + + Location + + + Data controller name + + + Name must consist of lower case alphanumeric characters, '-' or '.', start/end with an alphanumeric character and be 253 characters or less in length. + + + Data controller namespace + + + Namespace must consist of lower case alphanumeric characters or '-', start/end with an alphanumeric character, and be 63 characters or fewer in length. + + + Create Azure Arc data controller + + + Select the subscription to manage deployed resources and costs. Use resource groups like folders to organize and manage all your resources. + + + Azure details + + + Read more + + + Select from existing Kubernetes clusters + + + by Microsoft + + + Azure Arc data controller + + + Terms of use + + + Privacy policy + + + | + + + Azure + + + Cluster context + + + Controller + + + Data controller infrastructure + + + Data controller name + + + Data controller namespace + + + Estimated cost per month + + + Free + + + Kube config file path + + + Kubernetes + + + Location + + + Config profile + + + Resource group + + + Subscription + + + Terms + + + By clicking 'Script to notebook', I (a) agree to the legal terms and privacy statement(s) associated with the Marketplace offering(s) listed above; (b) authorize Microsoft to bill my current payment method for the fees associated with the offering(s), with the same billing frequency as my Azure subscription; and (c) agree that Microsoft may share my contact, usage and transactional information with the provider(s) of the offering(s) for support, billing and other transactional activities. Microsoft does not provide rights for third-party offerings. 
For additional details see {0}. + + + Azure Marketplace Terms + + + Username + Support for Azure Arc - + Azure Arc + + Memory Limit + + + Memory Request + + + Manage + + + Password + + + The maximum number of CPU cores for the Postgres instance that can be used on the coordinator node. Fractional cores are supported. + + + CPU limit + + + The minimum number of CPU cores that must be available on the coordinator node to schedule the service. Fractional cores are supported. + + + CPU request + + + The memory limit of the Postgres instance on the coordinator node in GB. + + + Memory limit (GB) + + + The memory request of the Postgres instance on the coordinator node in GB. + + + Memory request (GB) + + + Engine Version + + + A comma-separated list of the Postgres extensions that should be loaded on startup. Please refer to the postgres documentation for supported values. + + + Extensions + + + Server group name + + + Server group name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 11 characters or fewer in length. + + + Port + + + The size of the storage volume to be used for backups in GB. + + + Volume Size GB (Backups) + + + The size of the storage volume to be used for data in GB. + + + Volume Size GB (Data) + + + The size of the storage volume to be used for logs in GB. + + + Volume Size GB (Logs) + + + The maximum number of CPU cores for the Postgres instance that can be used per node. Fractional cores are supported. + + + CPU limit (cores per node) + + + The minimum number of CPU cores that must be available per node to schedule the service. Fractional cores are supported. + + + CPU request (cores per node) + + + The number of worker nodes to provision in a sharded cluster, or zero (the default) for single-node Postgres. + + + Number of workers + + + The memory limit of the Postgres instance per node in GB. 
+ + + Memory limit (GB per node) + + + The memory request of the Postgres instance per node in GB. + + + Memory request (GB per node) + + + Coordinator Node Compute Configuration + + + Worker Nodes Compute Configuration + + + General settings + + + Storage settings + + + The storage class to be used for backup persistent volumes + + + The storage class to be used for data persistent volumes + + + The storage class to be used for logs persistent volumes + + + Provide Azure enabled PostgreSQL Hyperscale server group parameters + + + Deploy an Azure Arc-enabled PostgreSQL Hyperscale server group (Preview) + + + SQL Connection information + + + The cores limit of the managed instance as an integer. + + + The request for cores of the managed instance as an integer. + + + Instance name + + + SQL Instance settings + + + Instance name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 13 characters or fewer in length. + + + sa username is disabled, please choose another username + + + The limit of the capacity of the managed instance as an integer. + + + The request for the capacity of the managed instance as an integer amount of memory in GBs. + + + The number of SQL Managed Instance replicas that will be deployed in your Kubernetes cluster for high availability purposes + + + Replicas + + + The storage class to be used for data (.mdf) + + + The storage class to be used for all data and logs persistent volumes for all data controller pods that require them. + + + The storage class to be used for logs (/var/log) + + + Username + + + Provide Azure SQL managed instance parameters + + + Deploy Azure SQL managed instance - Azure Arc (preview) + + + Storage Class (Backups) + + + Storage Class (Data) + + + Storage Class + + + Storage Class (Logs) + Azure Arc Controllers @@ -20,191 +1236,44 @@ Loading controllers... 
- - Create New Azure Arc Controller - Connect to Existing Azure Arc Controller - - Remove Controller - - - Refresh + + Create New Azure Arc Controller Edit Connection - - Manage + + Refresh - - Azure Arc data controller (preview) + + Remove Controller - - Creates an Azure Arc data controller - - - Create Azure Arc data controller - - - What is your target existing Kubernetes cluster environment? - - - Select from existing Kubernetes clusters - - - Cluster context - - - Choose the config profile - - - Config profile - - - Loading config profiles - - - Loading config profiles complete - - - Azure Configuration - - - Controller Configuration - - - Azure details - - - Select the subscription to manage deployed resources and costs. Use resource groups like folders to organize and manage all your resources. - - - Data controller details - - - Provide a namespace, name and storage class for your Azure Arc data controller. This name will be used to identify your Arc instance for remote management and monitoring. - - - Data controller namespace - - - Namespace must consist of lower case alphanumeric characters or '-', start/end with an alphanumeric character, and be 63 characters or fewer in length. - - - Data controller name - - - Name must consist of lower case alphanumeric characters, '-' or '.', start/end with an alphanumeric character and be 253 characters or less in length. 
- - - Location - - - Infrastructure - - - Administrator account - - - Data controller login - - - Password - - - Confirm password - - - Review your configuration - - - Azure Arc data controller - - - Estimated cost per month - - - by Microsoft - - - Free - - - Terms of use - - - | - - - Privacy policy - - - Terms - - - By clicking 'Script to notebook', I (a) agree to the legal terms and privacy statement(s) associated with the Marketplace offering(s) listed above; (b) authorize Microsoft to bill my current payment method for the fees associated with the offering(s), with the same billing frequency as my Azure subscription; and (c) agree that Microsoft may share my contact, usage and transactional information with the provider(s) of the offering(s) for support, billing and other transactional activities. Microsoft does not provide rights for third-party offerings. For additional details see {0}. - - - Azure Marketplace Terms - - - Kubernetes - - - Kube config file path - - - Cluster context - - - Config profile - - - Username - - - Azure - - - Subscription - - - Resource group - - - Data controller name - - - Data controller namespace - - - Data controller infrastructure - - - Controller - - - Location - - - I accept {0} and {1}. 
- - - Read more - - - Microsoft Privacy Statement - - - Script to notebook + + Cores limit must be greater than or equal to requested cores Deploy - - Azure SQL managed instance - Azure Arc (preview) + + Script to notebook + + + Memory limit must be greater than or equal to requested memory + + + Microsoft Privacy Statement + + + Requested cores must be less than or equal to cores limit + + + Requested memory must be less than or equal to memory limit + + + Deploy PostgreSQL Hyperscale server groups into an Azure Arc environment PostgreSQL Hyperscale server groups - Azure Arc (preview) @@ -212,233 +1281,14 @@ Managed SQL Instance service for app developers in a customer-managed environment - - Deploy PostgreSQL Hyperscale server groups into an Azure Arc environment + + Azure SQL managed instance - Azure Arc (preview) - - Target Azure Arc Controller + + Creates an Azure Arc data controller - - Deploy Azure SQL managed instance - Azure Arc (preview) - - - Provide Azure SQL managed instance parameters - - - SQL Connection information - - - SQL Instance settings - - - Azure information - - - Instance name - - - Username - - - sa username is disabled, please choose another username - - - Instance name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 13 characters or fewer in length. - - - Storage Class - - - The storage class to be used for all data and logs persistent volumes for all data controller pods that require them. 
- - - Replicas - - - The number of SQL Managed Instance replicas that will be deployed in your Kubernetes cluster for high availability purposes - - - Storage Class (Data) - - - The storage class to be used for data (.mdf) - - - The storage class to be used for data persistent volumes - - - Storage Class (Logs) - - - The storage class to be used for logs (/var/log) - - - The storage class to be used for logs persistent volumes - - - Storage Class (Backups) - - - Cores Limit - - - The cores limit of the managed instance as an integer. - - - Cores Request - - - The request for cores of the managed instance as an integer. - - - Memory Limit - - - The limit of the capacity of the managed instance as an integer. - - - Memory Request - - - The request for the capacity of the managed instance as an integer amount of memory in GBs. - - - The storage class to be used for backup persistent volumes - - - Password - - - Confirm password - - - Azure account - - - Azure subscription - - - Azure resource group - - - Azure location - - - Deploy an Azure Arc-enabled PostgreSQL Hyperscale server group (Preview) - - - Provide Azure enabled PostgreSQL Hyperscale server group parameters - - - General settings - - - Worker Nodes Compute Configuration - - - Coordinator Node Compute Configuration - - - Storage settings - - - Server group name - - - Server group name must consist of lower case alphanumeric characters or '-', start with a letter, end with an alphanumeric character, and be 11 characters or fewer in length. - - - Number of workers - - - The number of worker nodes to provision in a sharded cluster, or zero (the default) for single-node Postgres. - - - Port - - - Engine Version - - - Extensions - - - A comma-separated list of the Postgres extensions that should be loaded on startup. Please refer to the postgres documentation for supported values. - - - Volume Size GB (Data) - - - The size of the storage volume to be used for data in GB. 
- - - Volume Size GB (Logs) - - - The size of the storage volume to be used for logs in GB. - - - Volume Size GB (Backups) - - - The size of the storage volume to be used for backups in GB. - - - CPU request (cores per node) - - - The minimum number of CPU cores that must be available per node to schedule the service. Fractional cores are supported. - - - CPU limit (cores per node) - - - The maximum number of CPU cores for the Postgres instance that can be used per node. Fractional cores are supported. - - - Memory request (GB per node) - - - The memory request of the Postgres instance per node in GB. - - - Memory limit (GB per node) - - - The memory limit of the Postgres instance per node in GB. - - - CPU request - - - The minimum number of CPU cores that must be available on the coordinator node to schedule the service. Fractional cores are supported. - - - CPU limit - - - The maximum number of CPU cores for the Postgres instance that can be used on the coordinator node. Fractional cores are supported. - - - Memory request (GB) - - - The memory request of the Postgres instance on the coordinator node in GB. - - - Memory limit (GB) - - - The memory limit of the Postgres instance on the coordinator node in GB. - - - I accept {0} and {1}. - - - Azure SQL managed instance - Azure Arc terms and conditions - - - Azure Arc-enabled PostgreSQL Hyperscale terms and conditions + + Azure Arc data controller (preview) Value must be an integer @@ -446,855 +1296,5 @@ Value of 1 is not supported. - - Requested cores must be less than or equal to cores limit - - - Cores limit must be greater than or equal to requested cores - - - Requested memory must be less than or equal to memory limit - - - Memory limit must be greater than or equal to requested memory - - - Azure Arc enabled Managed Instance provides SQL Server access and feature compatibility that can be deployed on the infrastructure of your choice. 
{0} - - - Learn more - - - Learn more about Azure Arc enabled Managed Instance - - - - - The Arc Deployment extension has been replaced by the Arc extension and has been uninstalled. - - - Azure Arc Data Controller Dashboard (Preview) - {0} - - - SQL managed instance - Azure Arc Dashboard (Preview) - {0} - - - PostgreSQL Hyperscale - Azure Arc Dashboard (Preview) - {0} - - - Azure Arc Data Controller - - - PostgreSQL Hyperscale - Azure Arc - - - SQL managed instance - Azure Arc - - - Overview - - - Connection Strings - - - Preloaded Extensions - - - Networking - - - Properties - - - Settings - - - Security - - - Compute + Storage - - - Coordinator Node Parameters - - - Worker Node Parameters - - - Compute - - - Backup - - - New support request - - - Diagnose and solve problems - - - Support + troubleshooting - - - Resource health - - - Parameter Name - - - Value - - - New Instance - - - Delete - - - Learn More. - - - Drop - - - Save - - - Discard - - - Reset Password - - - Load extensions - - - Unload extensions - - - No extensions listed in configuration. - - - Open in Azure Portal - - - Resource Group - - - Region - - - Subscription ID - - - State - - - Connection Mode - - - Namespace - - - External Endpoint - - - Name - - - Type - - - Status - - - Managed instance admin - - - Extension name - - - PostgreSQL provides the ability to extend the functionality of your database by using extensions. Extensions allow for bundling multiple related SQL objects together in a single package that can be loaded or removed from your database with a single command. After being loaded in the database, extensions can function like built-in features. - - - Some extensions must be loaded into PostgreSQL at startup time before they can be used. These preloaded extensions can be viewed and edited below. - - - Some extensions must be loaded into PostgreSQL at startup time before they can be used. To edit, type in comma separated list of valid extensions: ({0}). 
- - - Value should be either of the following: ({0}). - - - Learn more about PostgreSQL extensions. - - - Table of preloaded extensions are loading. - - - Table of preloaded extensions. - - - Preloaded extensions can now be viewed. - - - Extensions - - - PostgreSQL provides the ability to extend the functionality of your database by using extensions. - - - Data controller - - - Kibana Dashboard - - - Grafana Dashboard - - - Dashboard for viewing logs - - - Dashboard for viewing metrics - - - Service endpoints - - - Service endpoints table - - - Databases - - - Endpoint - - - Description - - - Yes - - - No - - - Feedback - - - Select from available client connection strings below. - - - adding worker nodes - - - Expand your server group and scale your database by adding worker nodes. - - - You can configure the number of CPU cores and storage size that will apply to all worker nodes. Adjust the number of CPU cores and memory settings for your server group. To reset the requests and/or limits, pass in empty value. - - - You can configure the number of CPU cores and storage size that will apply to the coordinator node. Adjust the number of CPU cores and memory settings for your server group. To reset the requests and/or limits, pass in empty value. - - - It is possible to scale in and out your server group by reducing or increasing the number of worker nodes. The value must be 0 or greater than 1. - - - Value of 1 is not supported. - - - vCores - - - RAM - - - Refresh - - - Reset all to default - - - Reset to default - - - Troubleshoot - - - Click the new support request button to file a support request in the Azure Portal. - - - Note that the resource configuration must have been uploaded to Azure first in order to open a support request. - - - Running - - - Ready - - - Not Ready - - - Pending - - - Failed - - - Unknown - - - Direct - - - Indirect - - - Loading... 
- - - Refresh node to enter credentials - - - No instances available - - - Connect to Server - - - Connect to Existing Controller - - - Connect to SQL managed instance - Azure Arc ({0}) - - - Connect to PostgreSQL Hyperscale - Azure Arc ({0}) - - - Provide Password to Controller - - - Controller URL - - - https://<IP or hostname>:<port> - - - The Controller URL is necessary if there are multiple clusters with the same namespace - this should generally not be necessary. - - - Server Endpoint - - - Name - - - The name to display in the tree view, this is not applied to the controller itself. - - - Kube Config File Path - - - Cluster Context - - - arc-dc - - - PGSQL - - - MSSQL - - - Controller Username - - - Controller Password - - - Username - - - Password - - - Remember Password - - - Connect - - - Cancel - - - Ok - - - On - - - Off - - - Not Configured - - - Online - - - Offline - - - Restoring - - - Recovering - - - Recovery Pending - - - Suspect - - - Emergency - - - Coordinator endpoint - - - Admin username - - - Node configuration - - - PostgreSQL version - - - Server group type - - - Server group nodes - - - Fully qualified domain - - - Azure Database for PostgreSQL - Azure Arc - - - Coordinator - - - Worker - - - Monitor - - - Available - - - Issues Detected - - - New Database - - - Database name - - - Enter a new password - - - Confirm the new password - - - Learn more about Azure PostgreSQL Hyperscale client interfaces - - - These server parameters of the Coordinator node can be set to custom (non-default) values. Search to find parameters. - - - These server parameters of the Worker nodes can be set to custom (non-default) values. Search to find parameters. - - - Learn more about database engine settings for Azure Arc-enabled PostgreSQL Hyperscale - - - No worker server parameters found... - - - Search to filter items... - - - scaling compute vCores and memory. 
- - - You can scale your Azure Arc-enabled - - - You can scale your Azure SQL managed instance - Azure Arc by - - - PostgreSQL Hyperscale server group by - - - without downtime and by - - - Before doing so, you need to ensure - - - there are sufficient resources available - - - Resource health can tell you if your resource is running as expected. - - - in your Kubernetes cluster to honor this configuration. - - - node - - - nodes - - - Worker Nodes - - - Coordinator Node - - - storage per node - - - Worker node count - - - Configuration (per node) - - - Configuration - - - CPU limit - - - Worker Nodes CPU limit - - - Coordinator Node CPU limit - - - CPU request - - - Worker Nodes CPU request - - - Coordinator Node CPU request - - - Memory limit (in GB) - - - Worker Nodes Memory limit (in GB) - - - Coordinator Node Memory limit (in GB) - - - Memory request (in GB) - - - Worker Nodes Memory request (in GB) - - - Coordinator Node Memory request (in GB) - - - Azure Arc Resources - - - Enter a non empty password or press escape to exit. - - - The passwords do not match. Confirm the password or press escape to exit. - - - Password reset successfully - - - Condition - - - Details - - - Last transition - - - No External Endpoint has been configured so this information isn't available. - - - No worker pods in this configuration. - - - pods ready - - - Pods Present - - - Select a pod in the dropdown below for detailed health information. - - - Select a pod in the dropdown below for detailed health information - - - Pod conditions table - - - A connection to the server is required to show and set database engine settings, which will require the PostgreSQL Extension to be installed. - - - microsoft.azuredatastudio-postgresql - - - Pod is initialized. - - - Pod is ready. - - - There aren’t any known issues affecting this PostgreSQL Hyperscale instance. - - - The pods listed below are experiencing issues that may affect performance or availability. 
- - - Pod containers are ready. - - - Pod is schedulable. - - - Loading cluster contexts completed - - - Value is expected to be in the range {0} - {1} - - - Database {0} created - - - Deleting instance '{0}'... - - - Installing extension '{0}'... - - - Extension '{0}' has been installed. - - - Updating instance '{0}'... - - - Instance '{0}' deleted - - - Instance '{0}' updated - - - Extensions '{0}' dropped - - - Extensions '{0}' added - - - {0} copied to clipboard - - - Click the troubleshoot button to open the Azure Arc {0} troubleshooting notebook. - - - {0} data - - - {0} log - - - {0} backups - - - {0} vCore - - - {0} vCores - - - Updated {0} - - - Connection string for {0} - {0} is the name of the type of connection string (e.g. Java) - - - Copy {0} Connection String to clipboard - {0} is the name of the type of connection string (e.g. Java) - - - Copy {0} to clipboard - {0} is the name of the type of value being copied (e.g. Coordinator endpoint) - - - A connection is required to show and set database engine settings. - - - A connection is required to list the databases on this instance. - - - Could not find controller registration. - - - Currently dropping another extension, try again once that is completed. - - - Editing extensions failed. {0} - - - Refresh failed. {0} - - - Reset failed. {0} - - - Error opening dashboard. {0} - - - Failed to delete instance {0}. {1} - - - Failed to update instance {0}. {1} - - - Failed to discard user input. {0} - - - Failed to create database {0}. {1} - - - Could not connect to controller {0}. {1} - - - Could not connect to SQL managed instance - Azure Arc Instance {0}. {1} - - - Could not connect to PostgreSQL Hyperscale - Azure Arc Instance {0}. {1} - - - The {0} extension is required to view engine settings. Do you wish to install it now? - - - Failed to install extension {0}. - - - An unexpected error occurred retrieving the config for '{0}'. {1} - - - An unexpected error occurred retrieving the endpoints for '{0}'. 
{1} - - - An unexpected error occurred retrieving the registrations for '{0}'. {1} - - - An unexpected error occurred retrieving the databases for '{0}'. {1} - - - An unexpected error occurred retrieving the engine settings for '{0}'. {1} - - - • {0} ({1} issues) - - - Warning! Deleting an instance is permanent and cannot be undone. To delete the instance '{0}' type the name '{0}' below to proceed. - - - The value '{0}' does not match the instance name. Try again or press escape to exit - - - Could not find Azure resource for {0} - - - Failed to reset password. {0} - - - Error connecting to controller. {0} - - - Failed to acquire password. {0} - - - Error logging into controller - wrong username or password - - - Error encountered while verifying password. {0} - - - No Azure Arc controllers are currently connected. Please run the command: 'Connect to Existing Azure Arc Controller' and then try again - - - Attempt to get variable value for unknown variable:{0} - - - Attempt to get isPassword for unknown variable:{0} - - - Controller Info could not be found with name: {0} - - - Password could not be retrieved for controller: {0} and user did not provide a password. Please retry later. - - - Cluster Context with name: {0} not found in the Kube config file - - - No current cluster context was found in the kube config file - - - Browse - - - Select - - - No 'contexts' found in the config file: {0} - - - No context is marked as 'current-context' in the config file: {0} - - - No name field was found in a cluster context in the config file: {0} - - - User cancelled the dialog - - - The cluster context information specified by config file: {0} and cluster context: {1} is no longer valid. Error is: - {2} - Do you want to update this information? - - - Invalid config path - - - Error loading cluster contexts. 
{0} - - + \ No newline at end of file diff --git a/resources/xlf/en/asde-deployment.xlf b/resources/xlf/en/asde-deployment.xlf index d1ff6321b9..4bbdb0f33d 100644 --- a/resources/xlf/en/asde-deployment.xlf +++ b/resources/xlf/en/asde-deployment.xlf @@ -1,47 +1,44 @@ - - Azure SQL Edge Deployment Extension + + Azure information - - Provides a notebook-based experience to deploy Azure SQL Edge + + Location - - Container name + + Resource group - - Azure SQL Edge instance (sa) password + + Subscription id + + + Device ID + + + Device IP Address + + + Will be used to connect to the Azure SQL Edge instance after deployment + + + Target condition + + + Learn more about target condition Confirm password - - Azure SQL Edge Port + + Container name - - Microsoft Privacy Statement + + Image tag - - Azure SQL Edge - - - Azure SQL Edge (Preview) is an optimized relational database engine geared for IoT and IoT Edge deployments. - - - Deployment target - - - Local container instance - - - Remote container instance - - - Deploy Azure SQL Edge container instance on local machine - - - Docker settings + + Password Registry @@ -49,26 +46,26 @@ Repository - - Image tag + + Docker settings + + + Azure SQL Edge instance (sa) password + + + Azure SQL Edge Port Username - - Password - I accept {0} and {1}. Microsoft Azure SQL Edge License Agreement - - Deploy Azure SQL Edge container instance on remote machine - - - Target machine information + + Password Name or IP address @@ -76,26 +73,74 @@ Username - - Password + + Provides a notebook-based experience to deploy Azure SQL Edge + + + Azure SQL Edge Deployment Extension + + + IoT Hub name + + + Microsoft Privacy Statement + + + SQL Server Package files + + + Package file + + + Path of the SQL Server package file(dacpac, bacpac) or compressed package file. + + + Target machine information + + + Azure SQL Edge (Preview) is an optimized relational database engine geared for IoT and IoT Edge deployments. 
+ + + Azure SQL Edge New Azure IoT Hub and VM (password authentication) + + Multiple devices of an Azure IoT Hub + + + Deploy Azure SQL Edge to multiple Azure IoT devices + + + Existing device of an Azure IoT Hub + + + Deploy Azure SQL Edge to an existing device + New Azure IoT Hub and VM (ssh public key authentication) Deploy Azure SQL Edge to a new Azure VM via IoT hub - - Subscription id + + Local container instance - - Resource group + + Deploy Azure SQL Edge container instance on local machine - - Location + + Remote container instance + + + Deploy Azure SQL Edge container instance on remote machine + + + Deployment target + + + Azure SQL Edge information VM admin username @@ -109,53 +154,8 @@ VM password must be 12 to 123 characters in length and consists of upper case characters, lower case characters, numbers and special characters. - - Package file - - - Path of the SQL Server package file(dacpac, bacpac) or compressed package file. - - - Azure information - - - Azure SQL Edge information - - - SQL Server Package files - - - Existing device of an Azure IoT Hub - - - Deploy Azure SQL Edge to an existing device - - - IoT Hub name - - - Device ID - - - Device IP Address - - - Will be used to connect to the Azure SQL Edge instance after deployment - - - Multiple devices of an Azure IoT Hub - - - Deploy Azure SQL Edge to multiple Azure IoT devices - - - Target condition - - - Learn more about target condition - SSH public key - + \ No newline at end of file diff --git a/resources/xlf/en/azurecore.xlf b/resources/xlf/en/azurecore.xlf index 749005d162..02d579b74a 100644 --- a/resources/xlf/en/azurecore.xlf +++ b/resources/xlf/en/azurecore.xlf @@ -1,120 +1,304 @@ - - - Azure (Core) + + + No access token returned from Microsoft OAuth - - Browse and work with Azure resources + + Error when adding your account to the cache. 
- + + Error when getting your account from the cache + + + Error when parsing your account from the cache + + + Error when removing your account from the cache. + + + Microsoft Account + + + Microsoft Corp + + + Something failed with the authentication, or your tokens have been deleted from the system. Please try adding your account to Azure Data Studio again. + + + The user had no unique identifier within AAD + + + Token retrieval failed with an error. Open developer tools to view the error + + + Specified tenant with ID '{0}' not found. + + + Unidentified error with azure authentication + + + Work or school account + + + Your tenant '{0} ({1})' requires you to re-authenticate again to access {2} resources. Press Open to start the authentication process. + + + Cancel + + + Ignore Tenant + + + Open + + + + + Azure Auth Code Grant + + + Server could not start. This could be a permissions error or an incompatibility on your system. You can try enabling device code authentication from settings. + + + Authentication failed due to a nonce mismatch, please close Azure Data Studio and try again. + + + Authentication failed due to a state mismatch, please close ADS and try again. + + + + + Add {0} account + + + Azure Device Code + + + Error encountered when trying to check for login results + + + Timed out when waiting for device code login. + + + + + No Azure auth method available. You must enable the auth methods in ADS configuration. + + + No Azure auth method selected. You must select what method of authentication you want to use. + + + A call was made to azdata.accounts.getSecurityToken, this method is deprecated and will be removed in future releases. Please use getAccountSecurityToken instead. 
+ + + + + Failed to clear token cache + + + Token cache successfully cleared + + + + + Azure (China) + + + Azure (Germany) + + Azure - - Azure Resource Configuration + + Azure (US Government) - - The resource filter, each element is an account id, a subscription id and name separated by a slash + + Azure (US National) - - Azure + + + + You must select an Azure account for this feature to work. - - Azure: Refresh All Accounts + + You must enable preview features in order to use Azure Cloud Shell. - - Refresh + + You must select a tenant for this feature to work. - - Azure: Sign In + + You are not currently signed into any Azure accounts, Please sign in and then try again. - - Select Subscriptions + + A tenant is required for this feature. Your Azure subscription seems to have no tenants. - - Start Cloud Shell + + Select an Azure account - - Connect + + Sign in - - Add to Servers + + Starting cloud shell… - - Azure (Preview) + + + + Failed to get subscriptions for account {0}. Please refresh the account. - - The list of tenant IDs to ignore when querying azure resources. Each element is a tenant id. + + + + Log Analytics workspace - - Clear Azure Account Token Cache + + + + SQL database - - Open in Azure Portal + + + + SQL server - - Azure Account Configuration + + + + Azure Data Explorer Cluster - - Should Azure public cloud integration be enabled + + + + PostgreSQL Hyperscale – Azure Arc - - Should US Government Azure cloud (Fairfax) integration be enabled + + + + Azure Database for PostgreSQL server - - Should US National Azure cloud integration be enabled + + + + Azure SQL DB managed instance - - Should Azure China integration be enabled + + + + SQL managed instance – Azure Arc - - Should Azure Germany integration be enabled + + + + No Resources found - - Azure Authentication Method + + + + Failed to get subscriptions for account {0} (tenant '{1}'). 
{2} - - Code Grant Method + + + + Azure Cloud Shell (Preview) {0} ({1}) - - Device Code Method + + OK - - Disable system keychain integration. Credentials will be stored in a flat file in the user's home directory. + + Open Azure Shell - - Should Personally Identifiable Information (PII) be logged in the console view locally + + If you have not launched Azure Cloud Shell from this account before, please visit https://shell.azure.com/ to get started. Once you are set up, you can use AzureCloud Shell directly in Azure Data Studio. - + + Select Bash or PowerShell for Azure Cloud Shell + + + Shell closed. + + + + You must pick a shell type + + + + + Sign in to Azure... + + + + + No Subscriptions found. + + + Unable to access subscription {0} ({1}). Please [refresh the account](command:azure.resource.signin) to try again. {2} + + + + + Failed to load some Azure accounts. {0} + + + Loading ... + + + Show Azure accounts + + + + + Requests from this account have been throttled. To retry, please select a smaller number of subscriptions. + + + {0} ({1}/{2} subscriptions) + {0} is the display name of the azure account +{1} is the number of selected subscriptions in this account +{2} is the number of total subscriptions in this account + + + {0} - Loading... + + + An error occurred while loading Azure resources: {0} + + + + + No Resources found. + + + + + Loading ... 
+ + - - Error: {0} + + Error fetching locations for account {0} ({1}) subscription {2} ({3}) tenant {4} : {5} Error fetching resource groups for account {0} ({1}) subscription {2} ({3}) tenant {4} : {5} - - Error fetching locations for account {0} ({1}) subscription {2} ({3}) tenant {4} : {5} - - - Invalid query + + Error fetching subscriptions for account {0} : {1} Error fetching subscriptions for account {0} tenant {1} : {2} - - Error fetching subscriptions for account {0} : {1} + + Invalid query - + + Error: {0} + + - - Azure Accounts + + Azure Resources (Preview) - - Modifying this setting requires reloading the window for all changes to take effect. - - - Reload + + Unable to open link, missing required values Australia Central @@ -128,6 +312,15 @@ Australia Southeast + + Azure Arc-enabled PostgreSQL Hyperscale + + + Data Service - Azure Arc + + + SQL managed instance - Azure Arc + Brazil South @@ -161,6 +354,9 @@ East US 2 EUAP + + Azure Accounts + France Central @@ -173,6 +369,12 @@ Germany West Central + + Invalid account + + + Invalid tenant for subscription + Japan East @@ -185,6 +387,12 @@ Korea South + + Location + + + Name + North Central US @@ -197,6 +405,21 @@ Norway West + + Azure Database for PostgreSQL server + + + Reload + + + Modifying this setting requires reloading the window for all changes to take effect. 
+ + + Resource group + + + Resource type + South Africa North @@ -212,12 +435,30 @@ South India + + SQL database + + + SQL managed instance + + + SQL server + + + SQL Server - Azure Arc + + + Subscription + Switzerland North Switzerland West + + Type Icon + UAE Central @@ -230,6 +471,9 @@ UK West + + Unable to get token for tenant {0} + West Central US @@ -245,335 +489,91 @@ West US 2 - - Name + + + + Clear Azure Account Token Cache - - Resource type + + Browse and work with Azure resources - - Resource group + + Azure (Core) - - Location + + Open in Azure Portal - - Subscription + + The resource filter, each element is an account id, a subscription id and name separated by a slash - - Type Icon + + Azure Resource Configuration - - SQL server + + Add to Servers - - SQL database + + Connect - - Azure Database for PostgreSQL server - - - SQL managed instance - - - SQL managed instance - Azure Arc - - - Data Service - Azure Arc - - - SQL Server - Azure Arc - - - Azure Arc-enabled PostgreSQL Hyperscale - - - Unable to open link, missing required values - - - Azure Resources (Preview) - - - Invalid account - - - Invalid tenant for subscription - - - Unable to get token for tenant {0} - - - - - Failed to get subscriptions for account {0}. Please refresh the account. - - - - - Unidentified error with azure authentication - - - Specified tenant with ID '{0}' not found. - - - Something failed with the authentication, or your tokens have been deleted from the system. Please try adding your account to Azure Data Studio again. - - - Token retrieval failed with an error. Open developer tools to view the error - - - No access token returned from Microsoft OAuth - - - The user had no unique identifier within AAD - - - Work or school account - - - Error when adding your account to the cache. 
- - - Error when getting your account from the cache - - - Error when parsing your account from the cache - - - Open - - - Cancel - - - Ignore Tenant - - - Your tenant '{0} ({1})' requires you to re-authenticate again to access {2} resources. Press Open to start the authentication process. - - - Microsoft Corp - - - Microsoft Account - - - Error when removing your account from the cache. - - - - - Unable to access subscription {0} ({1}). Please [refresh the account](command:azure.resource.signin) to try again. {2} - - - No Subscriptions found. - - - - - No Resources found - - - - - {0} - Loading... - - - {0} ({1}/{2} subscriptions) - {0} is the display name of the azure account -{1} is the number of selected subscriptions in this account -{2} is the number of total subscriptions in this account - - - Requests from this account have been throttled. To retry, please select a smaller number of subscriptions. - - - An error occurred while loading Azure resources: {0} - - - - - Sign in to Azure... - - - - - Token cache successfully cleared - - - Failed to clear token cache - - - - + Azure - - Azure (US Government) + + Refresh - - Azure (US National) + + Azure: Refresh All Accounts - - Azure (Germany) + + Select Subscriptions - - Azure (China) + + Azure: Sign In - - - - A call was made to azdata.accounts.getSecurityToken, this method is deprecated and will be removed in future releases. Please use getAccountSecurityToken instead. + + Start Cloud Shell - - No Azure auth method selected. You must select what method of authentication you want to use. + + Azure (Preview) - - No Azure auth method available. You must enable the auth methods in ADS configuration. + + The list of tenant IDs to ignore when querying azure resources. Each element is a tenant id. - - - - Server could not start. This could be a permissions error or an incompatibility on your system. You can try enabling device code authentication from settings. 
+ + Azure - - Authentication failed due to a nonce mismatch, please close Azure Data Studio and try again. + + Azure Account Configuration - - Authentication failed due to a state mismatch, please close ADS and try again. + + Azure Authentication Method - - Azure Auth Code Grant + + Code Grant Method - - - - Add {0} account + + Device Code Method - - Timed out when waiting for device code login. + + Should Azure China integration be enabled - - Error encountered when trying to check for login results + + Should Azure Germany integration be enabled - - Azure Device Code + + Should Azure public cloud integration be enabled - - - - SQL server + + Should US Government Azure cloud (Fairfax) integration be enabled - - - - SQL database + + Should US National Azure cloud integration be enabled - - - - Failed to get subscriptions for account {0} (tenant '{1}'). {2} + + Disable system keychain integration. Credentials will be stored in a flat file in the user's home directory. - - - - You must enable preview features in order to use Azure Cloud Shell. + + Should Personally Identifiable Information (PII) be logged in the console view locally - - Sign in - - - You are not currently signed into any Azure accounts, Please sign in and then try again. - - - Select an Azure account - - - You must select an Azure account for this feature to work. - - - A tenant is required for this feature. Your Azure subscription seems to have no tenants. - - - Starting cloud shell… - - - You must select a tenant for this feature to work. - - - - - No Resources found. - - - - - Loading ... - - - - - Azure SQL DB managed instance - - - - - Azure Data Explorer Cluster - - - - - Log Analytics workspace - - - - - Azure Database for PostgreSQL server - - - - - If you have not launched Azure Cloud Shell from this account before, please visit https://shell.azure.com/ to get started. Once you are set up, you can use AzureCloud Shell directly in Azure Data Studio. 
- - - Open Azure Shell - - - OK - - - Select Bash or PowerShell for Azure Cloud Shell - - - You must pick a shell type - - - Azure Cloud Shell (Preview) {0} ({1}) - - - Shell closed. - - - - - - SQL managed instance – Azure Arc - - - - - PostgreSQL Hyperscale – Azure Arc - - - - - Loading ... - - - Show Azure accounts - - - Failed to load some Azure accounts. {0} - - + \ No newline at end of file diff --git a/resources/xlf/en/azurehybridtoolkit.xlf b/resources/xlf/en/azurehybridtoolkit.xlf index 85402d6453..35206d435a 100644 --- a/resources/xlf/en/azurehybridtoolkit.xlf +++ b/resources/xlf/en/azurehybridtoolkit.xlf @@ -1,20 +1,20 @@ - - Azure SQL Hybrid Cloud Toolkit + + Jupyter Books Opens up Azure SQL Hybrid Cloud Toolkit Jupyter Book - - Open Azure SQL Hybrid Cloud Toolkit Jupyter Book + + Azure SQL Hybrid Cloud Toolkit Azure SQL Hybrid Cloud Toolkit - - Jupyter Books + + Open Azure SQL Hybrid Cloud Toolkit Jupyter Book - + \ No newline at end of file diff --git a/resources/xlf/en/big-data-cluster.xlf b/resources/xlf/en/big-data-cluster.xlf index efd4e2f364..4cb616e50e 100644 --- a/resources/xlf/en/big-data-cluster.xlf +++ b/resources/xlf/en/big-data-cluster.xlf @@ -1,370 +1,46 @@ - - - Support for managing SQL Server Big Data Clusters + + + Error deleting mount - - SQL Server Big Data Clusters + + Error retrieving BDC status from {0} - - Connect to Existing Controller + + Error retrieving cluster config from {0} - - Create New Controller + + Error retrieving endpoints from {0} - - Remove Controller + + Error creating mount - - Refresh + + Error refreshing mount - - Manage + + Error getting mount status - - Mount HDFS + + Error during authentication - - Refresh Mount + + You do not have permission to log into this cluster using Windows Authentication - - Delete Mount + + This cluster does not support Windows authentication - - Big Data Cluster - - - No SQL Big Data Cluster controllers registered. 
[Learn More](https://docs.microsoft.com/sql/big-data-cluster/big-data-cluster-overview) -[Connect Controller](command:bigDataClusters.command.connectController) - - - Loading controllers... - - - Ignore SSL verification errors against SQL Server Big Data Cluster endpoints such as HDFS, Spark, and Controller if true - - - SQL Server Big Data Cluster - - - SQL Server Big Data Cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes - - - Version - - - SQL Server 2019 - - - Deployment target - - - New Azure Kubernetes Service Cluster - - - Existing Azure Kubernetes Service Cluster - - - Existing Kubernetes Cluster (kubeadm) - - - Existing Azure Red Hat OpenShift cluster - - - Existing OpenShift cluster - - - SQL Server Big Data Cluster settings - - - Cluster name - - - Controller username - - - Password - - - Confirm password - - - Azure settings - - - Subscription id - - - Use my default Azure subscription - - - Resource group name - - - Region - - - AKS cluster name - - - VM size - - - VM count - - - Storage class name - - - Capacity for data (GB) - - - Capacity for logs (GB) - - - I accept {0}, {1} and {2}. 
- - - Microsoft Privacy Statement - - - azdata License Terms - - - SQL Server License Terms - - - - - Creating - - - Waiting - - - Ready - - - Deleting - - - Deleted - - - Applying Upgrade - - - Upgrading - - - Applying Managed Upgrade - - - Managed Upgrading - - - Rollback - - - Rollback In Progress - - - Rollback Complete - - - Error - - - Creating Secrets - - - Waiting For Secrets - - - Creating Groups - - - Waiting For Groups - - - Creating Resources - - - Waiting For Resources - - - Creating Kerberos Delegation Setup - - - Waiting For Kerberos Delegation Setup - - - Waiting For Deletion - - - Waiting For Upgrade - - - Upgrade Paused - - - Running - - - Application Proxy - - - Cluster Management Service - - - Gateway to access HDFS files, Spark - - - Management Proxy - - - Management Proxy - - - SQL Server Master Instance Front-End - - - Metrics Dashboard - - - Log Search Dashboard - - - Spark Diagnostics and Monitoring Dashboard - - - Spark Jobs Management and Monitoring Dashboard - - - HDFS File System Proxy - - - Proxy for running Spark statements, jobs, applications - - - SQL Server - - - HDFS - - - Spark - - - Control - - - Gateway - - - App - - - Healthy - - - Unhealthy - - - Unexpected error retrieving BDC Endpoints: {0} - - + - - Status Icon - - - Instance - - - State - - - View - - - N/A - - - Health Status Details - - - Metrics and Logs - - - Health Status - - - Node Metrics - - - SQL Metrics - - - Logs - - - View Node Metrics {0} - - - View SQL Metrics {0} - - - View Kibana Logs {0} - - - Last Updated : {0} - - - Basic - - - Windows Authentication + + Add Add New Controller - - URL - - - Username - - - Password - - - Remember Password - - - Cluster Management URL - - - Authentication type - - - Cluster Connection - - - Add - - - Cancel - - - OK - - - Refresh - - - Troubleshoot + + Basic Big Data Cluster overview @@ -375,92 +51,29 @@ Cluster Overview - - Service Endpoints - Cluster Properties Cluster State - - Service Name - - - Service + + Copy 
Endpoint - - Endpoint '{0}' copied to clipboard + + Health Status - - Copy + + Health Status Details - - View Details + + Instance - - View Error Details - - - Connect to Controller - - - Mount Configuration - - - Mounting HDFS folder on path {0} - - - Refreshing HDFS Mount on path {0} - - - Deleting HDFS Mount on path {0} - - - Mount creation has started - - - Refresh mount request submitted - - - Delete mount request submitted - - - Mounting HDFS folder is complete - - - Mounting is likely to complete, check back later to verify - - - Mount HDFS Folder - - - HDFS Path - - - Path to a new (non-existing) directory which you want to associate with the mount - - - Remote URI - - - The URI to the remote data source. Example for ADLS: abfs://fs@saccount.dfs.core.windows.net/ - - - Credentials - - - Mount credentials for authentication to remote data source for reads - - - Refresh Mount - - - Delete Mount + + Last Updated : {0} Loading cluster state completed @@ -468,89 +81,476 @@ Loading health status completed - - Username is required + + Logs - - Password is required - - - Unexpected error retrieving BDC Endpoints: {0} + + Metrics and Logs The dashboard requires a connection. Please click retry to enter your credentials. 
+ + Node Metrics + + + N/A + + + Refresh + + + Service + + + Service Endpoints + + + Service Name + + + SQL Metrics + + + State + + + Status Icon + + + Troubleshoot + Unexpected error occurred: {0} + + View + + + View Details + + + View Error Details + + + View Kibana Logs {0} + + + View Node Metrics {0} + + + View SQL Metrics {0} + + + Cancel + + + Cluster Management URL + + + Connect to Controller + + + Endpoint '{0}' copied to clipboard + + + Delete Mount + + + Deleting HDFS Mount on path {0} + + + Delete mount request submitted + + + Unexpected error retrieving BDC Endpoints: {0} + + + Password is required + + + Username is required + + + Cluster Connection + + + Windows Authentication + + + Mount credentials for authentication to remote data source for reads + + + Credentials + + + Mount HDFS Folder + + + Bad formatting of credentials at {0} + + + Unknown error occurred during the mount process + Login to controller failed Login to controller failed: {0} - - Bad formatting of credentials at {0} + + Path to a new (non-existing) directory which you want to associate with the mount + + + HDFS Path + + + Mount Configuration + + + The URI to the remote data source. 
Example for ADLS: abfs://fs@saccount.dfs.core.windows.net/ + + + Remote URI + + + Mounting HDFS folder is complete Error mounting folder: {0} - - Unknown error occurred during the mount process + + Mounting is likely to complete, check back later to verify - - - - This cluster does not support Windows authentication + + Mounting HDFS folder on path {0} - - Error during authentication + + Mount creation has started - - You do not have permission to log into this cluster using Windows Authentication + + OK - - Error retrieving cluster config from {0} + + Password - - Error retrieving endpoints from {0} + + Refresh Mount - - Error retrieving BDC status from {0} + + Refreshing HDFS Mount on path {0} - - Error creating mount + + Refresh mount request submitted - - Error getting mount status + + Remember Password - - Error refreshing mount + + Authentication type - - Error deleting mount + + URL - - - - Controller endpoint information was not found + + Username - - Big Data Cluster Dashboard - - - - Yes - - - No - - - Are you sure you want to remove '{0}'? 
- - + Unexpected error loading saved controllers: {0} - + + + + Healthy + + + Unhealthy + + + Application Proxy + + + Cluster Management Service + + + Gateway to access HDFS files, Spark + + + Metrics Dashboard + + + Log Search Dashboard + + + Proxy for running Spark statements, jobs, applications + + + Management Proxy + + + Management Proxy + + + Spark Jobs Management and Monitoring Dashboard + + + SQL Server Master Instance Front-End + + + HDFS File System Proxy + + + Spark Diagnostics and Monitoring Dashboard + + + Unexpected error retrieving BDC Endpoints: {0} + + + App + + + Control + + + Gateway + + + HDFS + + + Spark + + + SQL Server + + + Applying Upgrade + + + Applying Managed Upgrade + + + Creating + + + Creating Groups + + + Creating Kerberos Delegation Setup + + + Creating Resources + + + Creating Secrets + + + Deleted + + + Deleting + + + Error + + + Managed Upgrading + + + Ready + + + Rollback + + + Rollback Complete + + + Rollback In Progress + + + Running + + + Upgrade Paused + + + Upgrading + + + Waiting + + + Waiting For Deletion + + + Waiting For Groups + + + Waiting For Kerberos Delegation Setup + + + Waiting For Resources + + + Waiting For Secrets + + + Waiting For Upgrade + + + + + Big Data Cluster Dashboard - + + + Controller endpoint information was not found + + + Are you sure you want to remove '{0}'? + + + No + + + Yes + + + + + SQL Server 2019 + + + I accept {0}, {1} and {2}. 
+ + + azdata License Terms + + + SQL Server License Terms + + + AKS cluster name + + + Region + + + Resource group name + + + Azure settings + + + Subscription id + + + Use my default Azure subscription + + + VM count + + + VM size + + + Cluster name + + + SQL Server Big Data Cluster settings + + + Confirm password + + + Controller username + + + Capacity for data (GB) + + + Deployment target + + + Existing Azure Kubernetes Service Cluster + + + Existing Azure Red Hat OpenShift cluster + + + Existing Kubernetes Cluster (kubeadm) + + + Existing OpenShift cluster + + + New Azure Kubernetes Service Cluster + + + Capacity for logs (GB) + + + Password + + + Storage class name + + + Big Data Cluster + + + Ignore SSL verification errors against SQL Server Big Data Cluster endpoints such as HDFS, Spark, and Controller if true + + + No SQL Big Data Cluster controllers registered. [Learn More](https://docs.microsoft.com/sql/big-data-cluster/big-data-cluster-overview) +[Connect Controller](command:bigDataClusters.command.connectController) + + + Loading controllers... 
+ + + Connect to Existing Controller + + + Create New Controller + + + Delete Mount + + + Manage + + + Mount HDFS + + + Refresh + + + Refresh Mount + + + Remove Controller + + + Support for managing SQL Server Big Data Clusters + + + Microsoft Privacy Statement + + + SQL Server Big Data Cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes + + + SQL Server Big Data Cluster + + + SQL Server Big Data Clusters + + + Version + + \ No newline at end of file diff --git a/resources/xlf/en/cms.xlf b/resources/xlf/en/cms.xlf index 2fd79104e1..c170c3fec0 100644 --- a/resources/xlf/en/cms.xlf +++ b/resources/xlf/en/cms.xlf @@ -1,50 +1,288 @@ + + + Add Server Group + + + Cancel + + + OK + + + Server Group Description + + + Server Group Name + + + Are you sure you want to delete + + + Are you sure you want to delete + + + Azure SQL Servers cannot be used as Central Management Servers + + + Central Management Server Group already has a Registered Server with the name {0} + + + {0} already has a Server Group with the name {1} + + + No + + + Yes + + + + + Add Central Management Server... + + + + + No resources found + + + + + Unexpected error occurred while loading saved servers {0} + + + Loading ... + + + + + You cannot add a shared registered server with the same name as the Configuration Server + + - - SQL Server Central Management Servers + + Edition - - Support for managing SQL Server Central Management Servers + + Compatibility Level - - Central Management Servers + + Owner - - Microsoft SQL Server + + Pricing Tier - - Central Management Servers + + Type - - Refresh - - - Refresh Server Group - - - Delete - - - New Server Registration... - - - Delete - - - New Server Group... - - - Add Central Management Server - - - Delete + + Version MSSQL configuration - - Should BIT columns be displayed as numbers (1 or 0)? 
If false, BIT columns will be displayed as 'true' or 'false' + + Declares the application workload type when connecting to a server + + + Application intent + + + The name of the application + + + Application name + + + When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider + + + Asynchronous processing + + + Attach DB filename + + + The name of the primary file, including the full path name, of an attachable database + + + Attached DB file name + + + Azure Active Directory - Universal with MFA support + + + Windows Authentication + + + SQL Login + + + Specifies the method of authenticating with SQL Server + + + Authentication type + + + Default column encryption setting for all the commands on the connection + + + Column encryption + + + Number of attempts to restore connection + + + Connect retry count + + + Delay between attempts to restore connection + + + Connect retry interval + + + The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error + + + Connect timeout + + + Custom name of the connection + + + Name (optional) + + + When true, indicates the connection should be from the SQL server context. 
Available only when running in the SQL Server process + + + Context connection + + + The SQL Server language record name + + + Current language + + + When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed + + + Encrypt + + + The name or network address of the instance of SQL Server that acts as a failover partner + + + Failover partner + + + The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed + + + Load balance timeout + + + The maximum number of connections allowed in the pool + + + Max pool size + + + The minimum number of connections allowed in the pool + + + Min pool size + + + Multi subnet failover + + + When true, multiple result sets can be returned and read from one connection + + + Multiple active result sets + + + Size in bytes of the network packets used to communicate with an instance of SQL Server + + + Packet size + + + Indicates the password to be used when connecting to the data source + + + Password + + + When false, security-sensitive information, such as the password, is not returned as part of the connection + + + Persist security info + + + When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool + + + Pooling + + + Port + + + Used by SQL Server in Replication + + + Replication + + + Description of the SQL Server instance + + + Server Description + + + Name of the SQL Server instance + + + Server + + + When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate + + + Trust server certificate + + + Indicates which server type system then provider will expose through the DataReader + + + Type system version + + + Indicates the user ID to be used when connecting to the data source + + + User name + + + The name of the workstation connecting to SQL Server + + + 
Workstation Id + + + Microsoft SQL Server + + + Support for managing SQL Server Central Management Servers + + + SQL Server Central Management Servers Should column definitions be aligned? @@ -64,20 +302,56 @@ [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown - - [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information + + Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up. Number of minutes to retain log files for backend services. Default is 1 week. - - Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up. + + Microsoft SQL Server + + + Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false' + + + New Server Registration... + + + New Server Group... + + + Delete + + + Delete + + + Delete + + + Central Management Servers + + + Refresh + + + Refresh Server Group + + + Add Central Management Server + + + Central Management Servers + + + [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. 
For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information [Optional] Do not show unsupported platform warnings - - Recovery Model + + Compatibility Level Last Database Backup @@ -85,17 +359,11 @@ Last Log Backup - - Compatibility Level - Owner - - Version - - - Edition + + Recovery Model Computer Name @@ -103,279 +371,11 @@ OS Version - + Edition - - Pricing Tier - - - Compatibility Level - - - Owner - - + Version - - Type - - - Microsoft SQL Server - - - Name (optional) - - - Custom name of the connection - - - Server - - - Name of the SQL Server instance - - - Server Description - - - Description of the SQL Server instance - - - Authentication type - - - Specifies the method of authenticating with SQL Server - - - SQL Login - - - Windows Authentication - - - Azure Active Directory - Universal with MFA support - - - User name - - - Indicates the user ID to be used when connecting to the data source - - - Password - - - Indicates the password to be used when connecting to the data source - - - Application intent - - - Declares the application workload type when connecting to a server - - - Asynchronous processing - - - When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider - - - Connect timeout - - - The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error - - - Current language - - - The SQL Server language record name - - - Column encryption - - - Default column encryption setting for all the commands on the connection - - - Encrypt - - - When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed - - - Persist 
security info - - - When false, security-sensitive information, such as the password, is not returned as part of the connection - - - Trust server certificate - - - When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate - - - Attached DB file name - - - The name of the primary file, including the full path name, of an attachable database - - - Context connection - - - When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process - - - Port - - - Connect retry count - - - Number of attempts to restore connection - - - Connect retry interval - - - Delay between attempts to restore connection - - - Application name - - - The name of the application - - - Workstation Id - - - The name of the workstation connecting to SQL Server - - - Pooling - - - When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool - - - Max pool size - - - The maximum number of connections allowed in the pool - - - Min pool size - - - The minimum number of connections allowed in the pool - - - Load balance timeout - - - The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed - - - Replication - - - Used by SQL Server in Replication - - - Attach DB filename - - - Failover partner - - - The name or network address of the instance of SQL Server that acts as a failover partner - - - Multi subnet failover - - - Multiple active result sets - - - When true, multiple result sets can be returned and read from one connection - - - Packet size - - - Size in bytes of the network packets used to communicate with an instance of SQL Server - - - Type system version - - - Indicates which server type system then provider will expose through the DataReader - - - - - No resources found - - - - - Add Central Management Server... 
- - - - - Unexpected error occurred while loading saved servers {0} - - - Loading ... - - - - - Central Management Server Group already has a Registered Server with the name {0} - - - Azure SQL Servers cannot be used as Central Management Servers - - - Are you sure you want to delete - - - Yes - - - No - - - Add Server Group - - - OK - - - Cancel - - - Server Group Name - - - Server Group Description - - - {0} already has a Server Group with the name {1} - - - Are you sure you want to delete - - - - - You cannot add a shared registered server with the same name as the Configuration Server - - + \ No newline at end of file diff --git a/resources/xlf/en/dacpac.xlf b/resources/xlf/en/dacpac.xlf index 56abce7e29..dc0cfb0002 100644 --- a/resources/xlf/en/dacpac.xlf +++ b/resources/xlf/en/dacpac.xlf @@ -1,199 +1,199 @@ - - - Dacpac - - - Data-tier Application Wizard - - - DacFx - - - Full path to folder where .DACPAC and .BACPAC files are saved by default - - - - Target Server - - - Source Server - - - Source Database - - - Target Database - - - File Location - - - Select file - - - Summary of settings - - - Version - - - Setting - - - Value - Database Name - - Open + + Deploy - - Upgrade Existing Database + + Select Deploy Dacpac Settings + + + Deploy a data-tier application .dacpac file to an instance of SQL Server [Deploy Dacpac] + + + Review the deploy plan + + + Export + + + Select Export Bacpac Settings + + + Export the schema and data from a database to the logical .bacpac file format [Export Bacpac] + + + Extract + + + Select Extract Dacpac Settings + + + Extract a data-tier application from an instance of SQL Server to a .dacpac file [Extract Dacpac] + + + Generate Script + + + Import + + + Select Import Bacpac Settings + + + Create a database from a .bacpac file [Import Bacpac] New Database - - {0} of the deploy actions listed may result in data loss. Please ensure you have a backup or snapshot available in the event of an issue with the deployment. 
- Proceed despite possible data loss - - No data loss will occur from the listed deploy actions. + + Select an Operation + + + Source Database + + + Source Server + + + Summary + + + Target Database + + + Target Server + + + Upgrade Existing Database + + + Version (use x.x.x.x where x is a number) + + + Open + + + Data Loss The deploy actions may result in data loss. Please ensure you have a backup or snapshot available in the event of an issue with the deployment. + + {0} of the deploy actions listed may result in data loss. Please ensure you have a backup or snapshot available in the event of an issue with the deployment. + + + Operations that may result in data loss are marked with a warning sign + + + A database with the same name already exists on the instance of SQL Server + + + default + + + Generating deploy plan failed '{0}' + + + Deploy plan operations + + + File Location + + + File name cannot end with a period + + + Generating deploy script failed '{0}' + + + Invalid file characters + + + No data loss will occur from the listed deploy actions. + + + Name of object that will be affected by deployment + + + Object + Operation + + {0} operation failed '{1}' + Operation(Create, Alter, Delete) that will occur during deployment + + Reserved file name. Choose another name and try again + + + This file name is reserved for use by Windows. Choose another name and try again + + + Save + + + You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete. 
+ + + Select file + + + Setting + + + Summary of settings + + + File name is over 255 characters + + + File name cannot end with a whitespace + Type Type of object that will be affected by deployment - - Object - - - Name of object that will be affected by deployment - - - Data Loss - - - Operations that may result in data loss are marked with a warning sign - - - Save - - - Version (use x.x.x.x where x is a number) - - - Deploy a data-tier application .dacpac file to an instance of SQL Server [Deploy Dacpac] - - - Extract a data-tier application from an instance of SQL Server to a .dacpac file [Extract Dacpac] - - - Create a database from a .bacpac file [Import Bacpac] - - - Export the schema and data from a database to the logical .bacpac file format [Export Bacpac] - - - Data-tier Application Wizard - - - Select an Operation - - - Select Deploy Dacpac Settings - - - Review the deploy plan - - - Summary - - - Select Extract Dacpac Settings - - - Select Import Bacpac Settings - - - Select Export Bacpac Settings - - - Deploy - - - Extract - - - Import - - - Export - - - Generate Script - - - You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete. - - - default - - - Deploy plan operations - - - A database with the same name already exists on the instance of SQL Server - Undefined name - - File name cannot end with a period + + Value + + + Version File name cannot be whitespace - - Invalid file characters + + Data-tier Application Wizard - - This file name is reserved for use by Windows. Choose another name and try again + + + + DacFx - - Reserved file name. 
Choose another name and try again + + Full path to folder where .DACPAC and .BACPAC files are saved by default - - File name cannot end with a whitespace + + Dacpac - - File name is over 255 characters + + Data-tier Application Wizard - - Generating deploy plan failed '{0}' - - - Generating deploy script failed '{0}' - - - {0} operation failed '{1}' - - + \ No newline at end of file diff --git a/resources/xlf/en/data-workspace.xlf b/resources/xlf/en/data-workspace.xlf index e14e5bec54..035d662149 100644 --- a/resources/xlf/en/data-workspace.xlf +++ b/resources/xlf/en/data-workspace.xlf @@ -1,35 +1,175 @@ - - - Data workspace + + + All Project Types - - Data workspace + + Select + + + No provider was found for project type with id: '{0}' + + + No provider was found for the following projects: {0} + + + Failed to load the project provider extension '{0}'. Error message: {1} + + + Local + + + Refresh + + + Create new project + + + Type + + + Browse + + + Browse... + + + The selected clone path '{0}' does not exist or is not a directory. + + + Create + + + Enter Project Name + + + The selected {0} file '{1}' does not exist or is not a file. + + + Enter remote git repository URL + + + Git repository URL + + + Select location to clone repository locally + + + Local clone path + + + Location + + + Name cannot be empty + + + OK + + + Open + + + Open Existing Project + + + Project + + + Project '{0}' is already opened. + + + There is already a directory named '{0}' in the selected location: '{1}'. + + + Directory '{0}' already exists in the selected location, please choose another + + + Select project file + + + Select location to create project + + + Location + + + Enter project name + + + Name + + + The selected project location '{0}' does not exist or is not a directory. + + + Some projects failed to load. 
To view more details, [open the developer console](command:workbench.action.toggleDevTools) + + + Remote git repository + + + Azure Data Studio needs to be restarted for the project to be created and added to the workspace, do this now? + + + Select + + + Select Project Location + + + Select Project Type + + + Target Platform + + + Select workspace ({0}) file + + + File '{0}' doesn't exist + + + Error during git clone. View git output for more details + + + Cloning git repository '{0}'... + + + Prior {0} for the current project will appear here, please run to see the results. + + + List of opened projects should not be undefined after refresh from disk. + + + Project name is null + + + + + Close Workspace Projects + + Data workspace + + + Data workspace + Projects + + Manage + New - - Refresh - - - Close Workspace - - - Remove Project - - - Create new or open existing to get started. -[Create new](command:projects.new) -[Open existing](command:projects.openExisting) -To learn more about projects [read our docs](https://aka.ms/azuredatastudio-projects). + + Open existing No projects open in current workspace. @@ -38,8 +178,11 @@ To learn more about projects [read our docs](https://aka.ms/azuredatastudio-proj To learn more about projects [read our docs](https://aka.ms/azuredatastudio-projects). - - Open existing + + Create new or open existing to get started. +[Create new](command:projects.new) +[Open existing](command:projects.openExisting) +To learn more about projects [read our docs](https://aka.ms/azuredatastudio-projects). Full path to folder where new projects are saved by default. @@ -47,154 +190,11 @@ To learn more about projects [read our docs](https://aka.ms/azuredatastudio-proj Always show information message when the current workspace folders contain projects that have not been added to the workspace's projects. - - Manage - - - - - Failed to load the project provider extension '{0}'. 
Error message: {1} - - - No provider was found for the following projects: {0} - - - Select - - - All Project Types - - - No provider was found for project type with id: '{0}' - - - Azure Data Studio needs to be restarted for the project to be created and added to the workspace, do this now? - - - Some projects failed to load. To view more details, [open the developer console](command:workbench.action.toggleDevTools) - - - File '{0}' doesn't exist - - - Project name is null - - - Prior {0} for the current project will appear here, please run to see the results. - - - Cloning git repository '{0}'... - - - Error during git clone. View git output for more details - - - List of opened projects should not be undefined after refresh from disk. - - - OK - - - Browse - - - Browse... - - - Open - - - Create - - - Select - - - Create new project - - - Type - - - Name - - - Enter project name - - - Enter Project Name - - - Location - - - Select location to create project - - - The selected project location '{0}' does not exist or is not a directory. - - - There is already a directory named '{0}' in the selected location: '{1}'. - - - Directory '{0}' already exists in the selected location, please choose another - - - Select Project Type - - - Select Project Location - - - Name cannot be empty - - - Target Platform - - - Open Existing Project - - - The selected {0} file '{1}' does not exist or is not a file. - - - The selected clone path '{0}' does not exist or is not a directory. - - - Project - - - Location - - - Select project file - - - Select workspace ({0}) file - - - Project '{0}' is already opened. 
- - - Local - - - Remote git repository - - - Git repository URL - - - Enter remote git repository URL - - - Local clone path - - - Select location to clone repository locally - - + Refresh - + + Remove Project + + \ No newline at end of file diff --git a/resources/xlf/en/import.xlf b/resources/xlf/en/import.xlf index 088ad45037..c4274a64f0 100644 --- a/resources/xlf/en/import.xlf +++ b/resources/xlf/en/import.xlf @@ -1,138 +1,50 @@ - - - Flat File Import configuration - - - [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown - - - - - {0} Started - - - Starting {0} - - - Failed to start {0}: {1} - - - Installing {0} to {1} - - - Installing {0} Service - - - Installed {0} - - - Downloading {0} - - - ({0} KB) - - - Downloading {0} - - - Done downloading {0} - - - Extracted {0} ({1}/{2}) - - - - Give Feedback - - - service component could not start - - - Server the database is in - - - Database the table is created in - Invalid file location. Please try a different input file + + Allow Nulls + Browse - - Open - - - Location of the file to be imported - - - New table name - - - Table schema - - - Import Data - - - Next - Column Name Data Type - - Primary Key - - - Allow Nulls - - - This operation analyzed the input file structure to generate the preview below for up to the first 50 rows. - - - This operation was unsuccessful. Please try a different input file. - - - Refresh - - - Import information - - - Import status - - - Server name + + Database the table is created in Database name - - Table name - - - Table schema - File to be imported - - ✔ You have successfully inserted the data into a table. + + Location of the file to be imported - - Please connect to a server before using this wizard. 
+ + Import Data - - SQL Server Import extension does not support this type of connection + + Import information - - Import flat file wizard + + Import new file + + + Import status + + + Next + + + Open Specify Input File @@ -146,8 +58,96 @@ Summary - - Import new file + + Primary Key - + + This operation analyzed the input file structure to generate the preview below for up to the first 50 rows. + + + This operation was unsuccessful. Please try a different input file. + + + Refresh + + + Table schema + + + Server the database is in + + + Server name + + + ✔ You have successfully inserted the data into a table. + + + Table name + + + Table schema + + + New table name + + + Import flat file wizard + + + Please connect to a server before using this wizard. + + + SQL Server Import extension does not support this type of connection + + + Give Feedback + + + service component could not start + + + + + Downloading {0} + + + Done downloading {0} + + + ({0} KB) + + + Downloading {0} + + + Extracted {0} ({1}/{2}) + + + Failed to start {0}: {1} + + + Installing {0} Service + + + Installing {0} to {1} + + + Installed {0} + + + {0} Started + + + Starting {0} + + + + + Flat File Import configuration + + + [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown + + \ No newline at end of file diff --git a/resources/xlf/en/kusto.xlf b/resources/xlf/en/kusto.xlf index a88c46ceae..edbb9e9ab8 100644 --- a/resources/xlf/en/kusto.xlf +++ b/resources/xlf/en/kusto.xlf @@ -1,26 +1,107 @@ + + + Account does not exist. + + + The configured Azure account for {0} does not have sufficient permissions for Azure Key Vault to access a column master key for Always Encrypted. + + + Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account is available. Please add a linked Azure account and retry the query. 
+ + + + + Done installing {0} + + + Downloading {0} + + + ({0} KB) + + + Downloading {0} + + + Failed to start {0} + + + Installed {0} + + + Installing {0} to {1} + + + Installing {0} + + + {0} Started + + + Starting {0} + + + + + Unsupported platform + + + Notebooks + + + Only .ipynb Notebooks are supported + + + + + Cancel operation? + + + Cancel + + + Search Server Names + + + $(sync~spin) {0}... + + + + + Error notifying of node change: {0} + + + Root + + + Session for node {0} does not exist + + + + + {0} component exited unexpectedly. Please restart Azure Data Studio. + + + View Known Issues + + - - New Notebook - - - Open Notebook - Database Name Size (MB) - - Status + + % of Cluster data capacity used Total Machines in the cluster - - % of Cluster data capacity used + + Status Name @@ -28,17 +109,89 @@ Size (MB) - - Name - - - Type - KUSTO configuration - - Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false' + + The name of the application + + + Application name + + + Azure Active Directory - Universal with MFA support + + + No Authentication + + + User Authentication + + + Specifies the method of authenticating with Kusto Server + + + Authentication type + + + Number of attempts to restore connection + + + Connect retry count + + + Delay between attempts to restore connection + + + Connect retry interval + + + The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error + + + Connect timeout + + + Custom name of the connection + + + Name (optional) + + + The name of the initial catalog or database in the data source + + + Database + + + The name or network address of the instance of Kusto Server that acts as a failover partner + + + Failover partner + + + Indicates the password to be used when connecting to the data source + + + Password + + + Kusto cluster name + + + Cluster + + + Indicates the user ID to be used when connecting 
to the data source + + + User name + + + The name of the workstation connecting to Kusto Server + + + Workstation Id Should column definitions be aligned? @@ -52,179 +205,26 @@ [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown - - [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information - Azure Data Explorer (Kusto) - - Name (optional) + + Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false' - - Custom name of the connection + + [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. 
Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information - - Cluster + + New Notebook - - Kusto cluster name + + Open Notebook - - Database + + Type - - The name of the initial catalog or database in the data source + + Name - - Authentication type - - - Specifies the method of authenticating with Kusto Server - - - Azure Active Directory - Universal with MFA support - - - No Authentication - - - User Authentication - - - User name - - - Indicates the user ID to be used when connecting to the data source - - - Password - - - Indicates the password to be used when connecting to the data source - - - Connect timeout - - - The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error - - - Connect retry count - - - Number of attempts to restore connection - - - Connect retry interval - - - Delay between attempts to restore connection - - - Application name - - - The name of the application - - - Workstation Id - - - The name of the workstation connecting to Kusto Server - - - Failover partner - - - The name or network address of the instance of Kusto Server that acts as a failover partner - - - - - View Known Issues - - - {0} component exited unexpectedly. Please restart Azure Data Studio. - - - - - Unsupported platform - - - Notebooks - - - Only .ipynb Notebooks are supported - - - - - Session for node {0} does not exist - - - Error notifying of node change: {0} - - - Root - - - - - $(sync~spin) {0}... - - - Cancel - - - Cancel operation? - - - Search Server Names - - - - - {0} Started - - - Starting {0} - - - Failed to start {0} - - - Installing {0} to {1} - - - Installing {0} - - - Installed {0} - - - Downloading {0} - - - ({0} KB) - - - Downloading {0} - - - Done installing {0} - - - - - Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account is available. 
Please add a linked Azure account and retry the query. - - - Account does not exist. - - - The configured Azure account for {0} does not have sufficient permissions for Azure Key Vault to access a column master key for Always Encrypted. - - + \ No newline at end of file diff --git a/resources/xlf/en/machine-learning.xlf b/resources/xlf/en/machine-learning.xlf index 9180e580c5..ae78c91ed4 100644 --- a/resources/xlf/en/machine-learning.xlf +++ b/resources/xlf/en/machine-learning.xlf @@ -1,643 +1,230 @@ - - - Machine Learning - - - Machine Learning - - - Tasks - - - Documents - - - Configurations - - - Endpoints - - - Manage packages in database - - - Manage external languages - - - Make prediction - - - Manage models - - - Import model - - - Machine Learning Configurations - - - Local path to a preexisting Python installation used by Machine Learning. - - - Enable Python package management in database. - - - Enable R package management in database. - - - Local path to a preexisting R installation used by Machine Learning. - - - Install Machine Learning Dependencies - - - Enable External script - - - - Yes - - - No - - - Package management is not supported for the server. Make sure you have Python or R installed. - - - The extension failed to load because of it's dependency to Notebook extension. Please check the output log for Notebook extension to get more details - - - '{0}' is required for package management. Please make sure it is installed and set up correctly. - - - Failed to complete task '{0}'. Error: {1} - - - Cannot find Python executable '{0}'. Please make sure Python is installed and configured correctly - - - Cannot find R executable '{0}'. Please make sure R is installed and configured correctly - - - Verifying package management dependencies - - - Verifying model management dependencies - - - No Result returned - - - The required packages are not installed - - - External script is required for package management. Are you sure you want to enable that. 
- - - Failed to enable External script. - - - External script configuration is required for this action. - - - Are you sure you want to install required packages? - - - The following Python packages are required to install: {0} - - - The following R packages are required to install: {0} - - - Are you sure you want to delete model '{0}? - - - Installing required packages ... - - - Required packages are already installed. - - - Failed to get installed python packages. Error: {0} - - - No connection selected - - - Notebook extension is not loaded - - - MSSQL extension is not loaded - - - Machine Learning Services Enabled - - - Failed to modify Machine Learning Services configurations - - - Enable - - - Disable - - - Config - - - Enabled - - - Action - - - External Execute Script - - - Python - - - R - - - Error while downloading - - - Invalid model id. model url: {0} - - - Model doesn't have any artifact. model url: {0} - - - Downloading - - - Python executable is not configured - - - R executable is not configured - - - Installing dependencies ... - - - Could not find the specified resource - - - Latest - - - Package info request failed with error: {0} {1} - Error: {0} - - Not supported event args - - - Installed - - - Installed - - - Platform - - - Delete - - - Edit - - - Install - - - Cancel - - - Close - - - OK - - - Save - - - Name - - - Add new - - - File Browser - - - Languages - - - Target - - - localhost - - - Language extension path - - - Language extension location - - - Extension file Name - - - Environment variables - - - Parameters - - - Selected Path - - - Failed to install language - - - Failed to update language - - - Failed to update the model - - - No models found - - - Select table - - - Select Database - - - No models found - - - Select another Azure ML workspace - - - Select another database or table - - - Database - - - Select a database to store the new model. 
- - - Select an existing table that conforms the model schema or create a new one to store the imported model. - - - Table - - - Select a model table to view the list of existing / imported models. - - - Select a database where existing / imported models are stored. - - - Existing table - - - New table - - - Name - - - File - - - Description - - - Date created - - - Date imported - - - Framework - - - Framework version - - - Version - - - ... - - - Azure account - - - Azure sign in or refresh account - - - Source database - - - Select the database containing the dataset to apply the prediction. - - - Source table - - - Select the table containing the dataset to apply the prediction. - - - Model Input mapping - - - Model output - - - Source columns - - - Type - - - Display name - - - Model input - - - Select column... - - - Select database with models - - - Select tables with models - - - Select database - - - Select table - - - Name - - - Azure subscription - - - Resource group - - - Azure ML workspace - - - Filter - - - Models - - - Azure models - - - Local models - - - Source location - - - Select model source type - - - ‘File Upload’ is selected. This allows you to import a model file from your local machine into a model database in this SQL instance. Click ‘Next’ to continue.​ - - - ‘Azure Machine Learning’ is selected. This allows you to import models stored in Azure Machine Learning workspaces in a model database in this SQL instance. Click ‘Next’ to continue.​​ - - - ‘File Upload’ is selected. This allows you to upload a model file from your local machine. Click ‘Next’ to continue.​​ - - - ‘Imported Models’ is selected. This allows you to choose from models stored in a model table in your database. Click ‘Next’ to continue.​ - - - ‘Azure Machine Learning’ is selected. This allows you to choose from models stored in Azure Machine Learning workspaces. 
Click ‘Next’ to continue.​ - - - Select or enter the location to import the models to - - - Map source data to model - - - Enter model details - - - Source files - - - File paths of the models to import - - - ONNX runtime is not supported in current server - - - Models - - - Import - - - Predict - - - Import models - - - View and import models - - - Machine Learning models can be stored in one or more databases and tables. Select the model database and table to view the models within them. - - - The models are stored in one or more databases and tables. Select the model database and table to view models in them. - - - Learn more. - - - Import or view models - - - Edit model - - - Import or view machine learning models stored in database - - - Make predictions - - - Generate a predicted value or scores using a managed model - - - Create notebook - - - Run experiments and create models in a notebook - - - Model registered successfully - - - Model updated successfully - - - Model failed to register - - - File upload - - - Upload model file - - - Azure Machine Learning - - - Import from Azure Machine Learning - - - Select imported model - - - Imported models - - - Downloading Model from Azure - - - Invalid Azure resource - - - Invalid model to register - - - Invalid model to predict - - - Please select valid source table and model parameters - - - Please select a valid model - - - Please select a valid table - - - Click to review warning details - - - Differences in data type - - - The data type of the source table column does not match the required input field’s type. - - - The data type of output column does not match the output field’s type. - - - Model name is required. - - - Please select at least one model to import. - - - Failed to update the model - - - Table meets requirements! - - - Select models table - - - Invalid table structure! - - - Failed to register the model: {0} ,file: {1} - - - Invalid table for importing models. 
database name: {0} ,table name: {1} - - - Table schema is not supported for model import. Database name: {0}, table name: {1}. - - - Failed to load model parameters' - - - unsupported - - - Machine Learning - Machine Learning for SQL databases Useful links + + Machine Learning + Video tutorials - - Show more + + Database - - Show less + + Select a database to store the new model. + + + Edit + + + Existing table + + + Cancel + + + Languages + + + Close + + + localhost + + + OK + + + Save + + + Target + + + Delete + + + Environment variables + + + Language extension location + + + Extension file Name + + + Language extension path + + + File Browser + + + Install + + + Failed to install language + + + Installed + + + Installed + + + Name + + + Platform + + + Add new + + + Parameters + + + Selected Path + + + Failed to update language Learn more - - SQL machine learning documentation + + Showing {0} model(s) - - Machine Learning extension in Azure Data Studio + + Cannot find Python executable '{0}'. Please make sure Python is installed and configured correctly - - Learn how to use Machine Learning extension in Azure Data Studio, to manage packages, make predictions, and import models. + + Cannot find R executable '{0}'. Please make sure R is installed and configured correctly - - Learn how to use machine learning in SQL Server and SQL on Azure, to run Python and R scripts on relational data. + + Action - - SQL Server Machine Learning Services (Python and R) + + Enabled - - Get started with Machine Learning Services on SQL Server and how to install it on Windows and Linux. + + Config - - Machine Learning Services in Azure SQL Managed Instance + + Failed to modify Machine Learning Services configurations - - Get started with Machine Learning Services in Azure SQL Managed Instances. + + External script is required for package management. Are you sure you want to enable that. + + + Are you sure you want to install required packages? 
+      
+      
+        Disable
+      
+      
+        Error while downloading
+      
+      
+        Downloading
+      
+      
+        Enable
+      
+      
+        Failed to enable External script.
+      
+      
+        Machine Learning Services Enabled
+      
+      
+        External Execute Script
+      
+      
+        External script configuration is required for this action.
+      
+      
+        Package info request failed with error: {0} {1}
+      
+      
+        The following Python packages need to be installed: {0}
+      
+      
+        The following R packages need to be installed: {0}
+      
+      
+        Failed to get installed Python packages. Error: {0}
+      
+      
+        Installing required packages...
+      
+      
+        Required packages are already installed.
+      
+      
+        Verifying model management dependencies
+      
+      
+        Verifying package management dependencies
+      
+      
+        Installing dependencies...
+      
+      
+        Invalid model ID. Model URL: {0}
+      
+      
+        Latest
+      
+      
+        Package management is not supported for the server. Make sure you have Python or R installed.
+      
+      
+        MSSQL extension is not loaded
+      
+      
+        Model doesn't have any artifacts. Model URL: {0}
+      
+      
+        No result returned
+      
+      
+        Notebook extension is not loaded
+      
+      
+        No connection selected
+      
+      
+        Python executable is not configured
+      
+      
+        Python
+      
+      
+        R executable is not configured
+      
+      
+        R
+      
+      
+        The required packages are not installed
+      
+      
+        Could not find the specified resource
+      
+      
+        Failed to complete task '{0}'. Error: {1}
+      
+      
+        '{0}' is required for package management. Please make sure it is installed and set up correctly.
+      
 
       Install the Microsoft ODBC driver for SQL Server
@@ -645,14 +232,427 @@
       This document explains how to install the Microsoft ODBC Driver for SQL Server.
+      
+      
+        Select a database where existing / imported models are stored.
+      
+      
+        Select a model table to view the list of existing / imported models.
+      
+      
+        Import models
+      
+      
+        Azure account
+      
+      
+        Resource group
+      
+      
+        Filter
+      
+      
+        Import from Azure Machine Learning
+      
+      
+        Azure Machine Learning
+      
+      
+        ‘Azure Machine Learning’ is selected. 
This allows you to import models stored in Azure Machine Learning workspaces in a model database in this SQL instance. Click ‘Next’ to continue.​​ + + + ‘Azure Machine Learning’ is selected. This allows you to choose from models stored in Azure Machine Learning workspaces. Click ‘Next’ to continue.​ + + + Azure ML workspace + + + Models + + + Select another Azure ML workspace + + + No models found + + + Azure models + + + Azure sign in or refresh account + + + Azure subscription + + + ... + + + The data type of the source table column does not match the required input field’s type. + + + Differences in data type + + + Click to review warning details + + + Map source data to model + + + Are you sure you want to delete model '{0}? + + + Run experiments and create models in a notebook + + + Create notebook + + + Date created + + + Models + + + Description + + + Downloading Model from Azure + + + Edit model + + + File + + + Framework + + + Framework version + + + Import or view machine learning models stored in database + + + Import + + + Failed to register the model: {0} ,file: {1} + + + Import or view models + + + Date imported + + + ‘Imported Models’ is selected. This allows you to choose from models stored in a model table in your database. Click ‘Next’ to continue.​ + + + Select imported model + + + Invalid Azure resource + + + Invalid table for importing models. database name: {0} ,table name: {1} + + + Table schema is not supported for model import. Database name: {0}, table name: {1}. + + + Please select a valid table + + + Please select valid source table and model parameters + + + Invalid model to predict + + + Invalid model to register + + + Please select a valid model + + + Learn more. + + + Failed to load model parameters' + + + Upload model file + + + File upload + + + ‘File Upload’ is selected. This allows you to import a model file from your local machine into a model database in this SQL instance. 
Click ‘Next’ to continue.​ + + + ‘File Upload’ is selected. This allows you to upload a model file from your local machine. Click ‘Next’ to continue.​​ + + + Local models + + + Generate a predicted value or scores using a managed model + + + Make predictions + + + Enter model details + + + Model failed to register + + + Select or enter the location to import the models to + + + Source files + + + File paths of the models to import + + + Model name is required. + + + Model registered successfully + + + Table meets requirements! + + + Invalid table structure! + + + Select model source type + + + Source location + + + Failed to update the model + + + Model updated successfully + + + Select another database or table + + + No models found + + + Please select at least one model to import. + + + Name + + + ONNX runtime is not supported in current server + + + The data type of output column does not match the output field’s type. + + + Predict + + + Imported models + + + Select Database + + + Select database with models + + + Select table + + + Select tables with models + + + Select models table + + + unsupported + + + Failed to update the model + + + Version + + + The models are stored in one or more databases and tables. Select the model database and table to view models in them. + + + Machine Learning models can be stored in one or more databases and tables. Select the model database and table to view the models within them. + + + View and import models + + + No + + + Yes + + + New table + + + Not supported event args + + + The extension failed to load because of it's dependency to Notebook extension. Please check the output log for Notebook extension to get more details Get started with machine learning in Azure SQL Database Edge - - Showing {0} model(s) + + Machine learning and AI with ONNX in SQL Database Edge Preview - + + Source database + + + Select the database containing the dataset to apply the prediction. 
+      
+      
+        Source columns
+      
+      
+        Source table
+      
+      
+        Select the table containing the dataset to apply the prediction.
+      
+      
+        Type
+      
+      
+        Display name
+      
+      
+        Model Input mapping
+      
+      
+        Model input
+      
+      
+        Model output
+      
+      
+        Name
+      
+      
+        Select column...
+      
+      
+        Select database
+      
+      
+        Select table
+      
+      
+        Show less
+      
+      
+        Show more
+      
+      
+        Learn how to use machine learning in SQL Server and SQL on Azure to run Python and R scripts on relational data.
+      
+      
+        SQL machine learning documentation
+      
+      
+        Learn how to use the Machine Learning extension in Azure Data Studio to manage packages, make predictions, and import models.
+      
+      
+        Machine Learning extension in Azure Data Studio
+      
+      
+        Get started with Machine Learning Services on SQL Server and learn how to install it on Windows and Linux.
+      
+      
+        SQL Server Machine Learning Services (Python and R)
+      
+      
+        Get started with Machine Learning Services in Azure SQL Managed Instances.
+      
+      
+        Machine Learning Services in Azure SQL Managed Instance
+      
+      
+        Table
+      
+      
+        Select an existing table that conforms to the model schema or create a new one to store the imported model.
+      
+    
+    
+      
+      
+        Machine Learning
+      
+      
+        Machine Learning
+      
+      
+        Install Machine Learning Dependencies
+      
+      
+        Enable External script
+      
+      
+        Import model
+      
+      
+        Manage external languages
+      
+      
+        Manage models
+      
+      
+        Manage packages in database
+      
+      
+        Make prediction
+      
+      
+        Machine Learning Configurations
+      
+      
+        Enable Python package management in database.
+      
+      
+        Enable R package management in database.
+      
+      
+        Local path to a preexisting Python installation used by Machine Learning.
+      
+      
+        Local path to a preexisting R installation used by Machine Learning. 
+ + + Configurations + + + Documents + + + Endpoints + + + Tasks + + \ No newline at end of file diff --git a/resources/xlf/en/mssql.xlf b/resources/xlf/en/mssql.xlf index 97a9bcd1f6..0722dde4d7 100644 --- a/resources/xlf/en/mssql.xlf +++ b/resources/xlf/en/mssql.xlf @@ -1,12 +1,676 @@ + + + Copy + + + Application Proxy + + + Cluster Management Service + + + Gateway to access HDFS files, Spark + + + Metrics Dashboard + + + Log Search Dashboard + + + Proxy for running Spark statements, jobs, applications + + + Management Proxy + + + Management Proxy + + + Spark Jobs Management and Monitoring Dashboard + + + SQL Server Master Instance Front-End + + + HDFS File System Proxy + + + Spark Diagnostics and Monitoring Dashboard + + + Metrics Dashboard + + + Log Search Dashboard + + + Spark Jobs Management and Monitoring Dashboard + + + Spark Diagnostics and Monitoring Dashboard + + + + + Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account was selected. Please retry the query and select a linked Azure account when prompted. + + + Please select a linked Azure account: + + + The configured Azure account for {0} does not have sufficient permissions for Azure Key Vault to access a column master key for Always Encrypted. + + + Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account is available. Please add a linked Azure account and retry the query. + + + + + Error applying permission changes: {0} + + + Applying permission changes to '{0}'. + + + Applying permission changes recursively under '{0}' + + + Permission changes applied successfully. + + + + + Bad Request + + + Unauthorized + + + Forbidden + + + Not Found + + + Internal Server Error + + + Invalid Data Structure + + + Unable to create WebHDFS client due to missing options: ${0} + + + '${0}' is undefined. 
+      
+      
+        Unexpected Redirect
+      
+      
+        Unknown Error
+      
+    
+    
+      
+      
+        Node command called without any node passed
+      
+      
+        Access
+      
+      
+        Add
+      
+      
+        Add User or Group
+      
+      
+        Apply
+      
+      
+        Apply Recursively
+      
+      
+        Default
+      
+      
+        Default User and Groups
+      
+      
+        Delete
+      
+      
+        Enter name
+      
+      
+        Unexpected error occurred while applying changes: {0}
+      
+      
+        Everyone else
+      
+      
+        Execute
+      
+      
+        Group
+      
+      
+        Group
+      
+      
+        Inherit Defaults
+      
+      
+        Location:
+      
+      
+        Manage Access
+      
+      
+        Named Users and Groups
+      
+      
+        Owner
+      
+      
+        - Owner
+      
+      
+        - Owning Group
+      
+      
+        Permissions
+      
+      
+        Read
+      
+      
+        Sticky Bit
+      
+      
+        User
+      
+      
+        User or Group Icon
+      
+      
+        Write
+      
+      
+        Please connect to the Spark cluster before viewing {0} history.
+      
+      
+        Failed to get the application ID. {0}
+      
+      
+        Local file will be uploaded to HDFS.
+      
+      
+        Local file {0} does not exist.
+      
+      
+        No SQL Server Big Data Cluster found.
+      
+      
+        Submitting job {0}...
+      
+      
+        Uploading local file {0} to HDFS folder: {1}
+      
+      
+        Spark History URL: {0}
+      
+      
+        .......................... Submit Spark Job End ............................
+      
+      
+        Spark job submission failed. {0}
+      
+      
+        The Spark job has been submitted.
+      
+      
+        Upload file to cluster failed. {0}
+      
+      
+        Upload file to cluster succeeded!
+      
+      
+        YarnUI URL: {0}
+      
+    
+    
+      
+      
+        This sample code loads the file into a data frame and shows the first 10 results.
+      
+      
+        An error occurred converting the SQL document to a Notebook. Error: {0}
+      
+      
+        An error occurred converting the Notebook document to SQL. Error: {0}
+      
+      
+        Could not find the controller endpoint for this instance
+      
+      
+        Notebooks
+      
+      
+        Only .ipynb Notebooks are supported
+      
+    
+    
+      
+      
+        Stream operation canceled by the user
+      
+    
+    
+      
+      
+        Cancel operation?
+      
+      
+        Cancel
+      
+      
+        Search Server Names
+      
+      
+        $(sync~spin) {0}...
+      
+    
+    
+      
+      
+        Some missing properties in connectionInfo.options: {0}
+      
+      
+        ConnectionInfo.options is undefined.
+      
+      
+        ConnectionInfo is undefined.
+      
+    
+    
+      
+      
+        NOTICE: This file has been truncated at {0} for preview.
+      
+      
+        The file has been truncated at {0} for preview. 
+ + + + + All Files + + + Error on copying path: {0} + + + Error on deleting files: {0} + + + Enter directory name + + + Upload + + + Creating directory + + + An unexpected error occurred while opening the Manage Access dialog: {0} + + + Error on making directory: {0} + + + Operation was canceled + + + Are you sure you want to delete this file? + + + Are you sure you want to delete this folder and its contents? + + + Error on previewing file: {0} + + + Generating preview + + + Save operation was canceled + + + Error on saving file: {0} + + + Saving HDFS Files + + + Upload operation was canceled + + + Error uploading files: {0} + + + Uploading files to HDFS + + + + + Cannot delete a connection. Only subfolders and files can be deleted. + + + Error: {0} + + + + + HDFS + + + Error notifying of node change: {0} + + + Please provide the password to connect to HDFS: + + + Please provide the username to connect to HDFS: + + + Root + + + Session for node {0} does not exist + + + + + No + + + Yes + + + + + The selected server does not belong to a SQL Server Big Data Cluster + + + Select other SQL Server + + + Error Get File Path: {0} + + + No SQL Server is selected. + + + Please select SQL Server with Big Data Cluster. + + + + + ADVANCED + + + Reference Files + + + Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;) + + + Reference Jars + + + Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;) + + + Reference py Files + + + Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;) + + + Configuration Values + + + List of name value pairs containing Spark configuration values. Encoded as JSON dictionary. Example: '{"name":"value", "name2":"value2"}'. + + + Driver Cores + + + Amount of CPU cores to allocate to the driver. 
+ + + Driver Memory + + + Amount of memory to allocate to the driver. Specify units as part of value. Example 512M or 2G. + + + Executor Cores + + + Number of CPU cores to allocate to the executor. + + + Executor Count + + + Number of instances of the executor to run. + + + Executor Memory + + + Amount of memory to allocate to the executor. Specify units as part of value. Example 512M or 2G. + + + Queue Name + + + Name of the Spark queue to execute the session in. + + + + + Arguments + + + Command line arguments used in your main class; multiple arguments should be separated by spaces. + + + Path to a .jar or .py file + + + GENERAL + + + The specified HDFS file does not exist. + + + {0} does not exist in the cluster, or an exception was thrown. + + + Job Name + + + Enter a name ... + + + The selected local file will be uploaded to HDFS: {0} + + + Main Class + + + JAR/py File + + + Property JAR/py File is not specified. + + + Property Job Name is not specified. + + + Property Main Class is not specified. + + + Error locating the file: {0} + + + Spark Cluster + + + Select + + + + + Cancel + + + Submit + + + New Job + + + Parameters for SparkJobSubmissionDialog are invalid + + + .......................... Submit Spark Job Start .......................... + + + {0} Spark Job Submission: + + + + + Get Application Id timed out. {0}[Log] {1} + + + livyBatchId is invalid. + + + Property Path is not specified. + + + Parameters for SparkJobSubmissionModel are invalid + + + Property localFilePath or hdfsFolderPath is not specified. + + + submissionArgs is invalid. + + + + + No Spark job batch id is returned from response.{0}[Error] {1} + + + No log is returned within response.{0}[Error] {1} + + + + + Error: {0}.
+ + + Please provide the password to connect to the BDC Controller + + + {0}Please provide the username to connect to the BDC Controller: + + + Username and password are required + + + + + Done installing {0} + + + Downloading {0} + + + ({0} KB) + + + Downloading {0} + + + Extracted {0} ({1}/{2}) + + + Failed to start {0} + + + Installed {0} + + + Installing {0} to {1} + + + Installing {0} + + + {0} Started + + + Starting {0} + + + + + {0} component exited unexpectedly. Please restart Azure Data Studio. + + + View Known Issues + + + + Edition + + + Compatibility Level + + + Owner + + + Pricing Tier + + + Type + + + Version + + + Last backup + + + Name + + + Size (MB) + + + Status + + + Enable/disable default JSON formatter (requires restart) + Associate schemas to JSON files in the current project - - A URL to a schema or a relative path to a schema in the current directory - An array of file patterns to match against when resolving JSON files to schemas. @@ -16,81 +680,221 @@ The schema definition for the given URL. The schema only needs to be provided to avoid accesses to the schema URL. 
- - Enable/disable default JSON formatter (requires restart) + + A URL to a schema or a relative path to a schema in the current directory - - Upload files + + MSSQL configuration - - New directory + + Declares the application workload type when connecting to a server - - Delete + + Application intent - - Preview + + The name of the application - - Save + + Application name - - Copy Path + + When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider - - Manage Access + + Asynchronous processing - - New Notebook + + Attach DB filename - - Open Notebook + + The name of the primary file, including the full path name, of an attachable database - - Tasks and information about your SQL Server Big Data Cluster + + Attached DB file name - - SQL Server Big Data Cluster + + Azure Active Directory - Universal with MFA support - - Submit Spark Job + + Windows Authentication - - New Spark Job + + SQL Login - - View Spark History + + Specifies the method of authenticating with SQL Server - - View Yarn History + + Authentication type - - Tasks + + Enables or disables Always Encrypted for the connection - - Install Packages + + Always Encrypted - - Configure Python for Notebooks + + Number of attempts to restore connection - - Cluster -Dashboard + + Connect retry count - - Search: Servers + + Delay between attempts to restore connection - - Search: Clear Search Server Results + + Connect retry interval - - Service Endpoints + + The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error - - Notebooks + + Connect timeout - - Show Log File + + Custom name of the connection + + + Name (optional) + + + When true, indicates the connection should be from the SQL server context. 
Available only when running in the SQL Server process + + + Context connection + + + The SQL Server language record name + + + Current language + + + The name of the initial catalog or database in the data source + + + Database + + + Azure Attestation + + + Host Guardian Service + + + Specifies a protocol for attesting a server-side enclave used with Always Encrypted with secure enclaves + + + Attestation Protocol + + + Specifies an endpoint for attesting a server-side enclave used with Always Encrypted with secure enclaves + + + Enclave Attestation URL + + + When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed + + + Encrypt + + + The name or network address of the instance of SQL Server that acts as a failover partner + + + Failover partner + + + The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed + + + Load balance timeout + + + The maximum number of connections allowed in the pool + + + Max pool size + + + The minimum number of connections allowed in the pool + + + Min pool size + + + Multi subnet failover + + + When true, multiple result sets can be returned and read from one connection + + + Multiple active result sets + + + Size in bytes of the network packets used to communicate with an instance of SQL Server + + + Packet size + + + Indicates the password to be used when connecting to the data source + + + Password + + + When false, security-sensitive information, such as the password, is not returned as part of the connection + + + Persist security info + + + When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool + + + Pooling + + + Port + + + Used by SQL Server in Replication + + + Replication + + + Name of the SQL Server instance + + + Server + + + When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and
server without validating the server certificate + + + Trust server certificate + + + Indicates which server type system the provider will expose through the DataReader + + + Type system version + + + Indicates the user ID to be used when connecting to the data source + + + User name + + + The name of the workstation connecting to SQL Server + + + Workstation Id Disabled @@ -104,15 +908,6 @@ Dashboard Export SQL as Notebook - - MSSQL configuration - - - Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false' - - - Number of XML characters to store after running a query - Should column definitions be aligned? @@ -128,42 +923,78 @@ Dashboard Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines - - [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown - - - [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information - - - Number of minutes to retain log files for backend services. Default is 1 week. - - - Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up. 
- - - Should IntelliSense be enabled + + [Optional] Do not show unsupported platform warnings Should IntelliSense error checking be enabled - - Should IntelliSense suggestions be enabled + + Should IntelliSense be enabled Should IntelliSense quick info be enabled + + Should IntelliSense suggestions be enabled + Should IntelliSense suggestions be lowercase - - Maximum number of rows to return before the server stops processing your query. + + [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown - - Maximum size of text and ntext data returned from a SELECT statement + + Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up. + + + Number of minutes to retain log files for backend services. Default is 1 week. + + + Microsoft SQL Server + + + Enable Parameterization for Always Encrypted + + + Enable SET ANSI_DEFAULTS + + + Enable SET ANSI_NULL_DFLT_ON + + + Enable SET ANSI_NULLS + + + Enable SET ANSI_PADDING + + + Enable SET ANSI_WARNINGS + + + Enable SET ARITHABORT option + + + Enable SET CURSOR_CLOSE_ON_COMMIT + + + Enable SET DEADLOCK_PRIORITY option + + + Should BIT columns be displayed as numbers (1 or 0)? 
If false, BIT columns will be displayed as 'true' or 'false' An execution time-out of 0 indicates an unlimited wait (no time-out) + + Enable SET IMPLICIT_TRANSACTIONS + + + Enable SET LOCK TIMEOUT option (in milliseconds) + + + Number of XML characters to store after running a query + Enable SET NOCOUNT option @@ -173,62 +1004,65 @@ Dashboard Enable SET PARSEONLY option - - Enable SET ARITHABORT option - - - Enable SET STATISTICS TIME option - - - Enable SET STATISTICS IO option - - - Enable SET XACT_ABORT ON option - - - Enable SET TRANSACTION ISOLATION LEVEL option - - - Enable SET DEADLOCK_PRIORITY option - - - Enable SET LOCK TIMEOUT option (in milliseconds) - Enable SET QUERY_GOVERNOR_COST_LIMIT - - Enable SET ANSI_DEFAULTS - Enable SET QUOTED_IDENTIFIER - - Enable SET ANSI_NULL_DFLT_ON + + Maximum number of rows to return before the server stops processing your query. - - Enable SET IMPLICIT_TRANSACTIONS + + Enable SET STATISTICS IO option - - Enable SET CURSOR_CLOSE_ON_COMMIT + + Enable SET STATISTICS TIME option - - Enable SET ANSI_PADDING + + Maximum size of text and ntext data returned from a SELECT statement - - Enable SET ANSI_WARNINGS + + Enable SET TRANSACTION ISOLATION LEVEL option - - Enable SET ANSI_NULLS + + Enable SET XACT_ABORT ON option - - Enable Parameterization for Always Encrypted + + [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the log entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs.
Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information - - [Optional] Do not show unsupported platform warnings + + Copy Path - - Recovery Model + + Delete + + + Manage Access + + + New directory + + + Preview + + + Save + + + Upload files + + + New Notebook + + + Open Notebook + + + Name + + + Compatibility Level Last Database Backup @@ -236,17 +1070,11 @@ Dashboard Last Log Backup - - Compatibility Level - Owner - - Version - - - Edition + + Recovery Model Computer Name @@ -254,885 +1082,57 @@ Dashboard OS Version - + Edition - - Pricing Tier - - - Compatibility Level - - - Owner - - + Version - - Type + + Tasks and information about your SQL Server Big Data Cluster - - Microsoft SQL Server + + SQL Server Big Data Cluster - - Name (optional) - - - Custom name of the connection - - - Server - - - Name of the SQL Server instance - - - Database - - - The name of the initial catalog or database int the data source - - - Authentication type - - - Specifies the method of authenticating with SQL Server - - - SQL Login - - - Windows Authentication - - - Azure Active Directory - Universal with MFA support - - - User name - - - Indicates the user ID to be used when connecting to the data source - - - Password - - - Indicates the password to be used when connecting to the data source - - - Application intent - - - Declares the application workload type when connecting to a server - - - Asynchronous processing - - - When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider - - - Connect timeout - - - The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error - - - Current language - - - The SQL Server language record name - - - Always Encrypted - - - Enables or disables Always Encrypted for the connection - - - Attestation Protocol - - - Specifies a protocol for attesting a server-side enclave used with Always 
Encrypted with secure enclaves - - - Azure Attestation - - - Host Guardian Service - - - Enclave Attestation URL - - - Specifies an endpoint for attesting a server-side enclave used with Always Encrypted with secure enclaves - - - Encrypt - - - When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed - - - Persist security info - - - When false, security-sensitive information, such as the password, is not returned as part of the connection - - - Trust server certificate - - - When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate - - - Attached DB file name - - - The name of the primary file, including the full path name, of an attachable database - - - Context connection - - - When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process - - - Port - - - Connect retry count - - - Number of attempts to restore connection - - - Connect retry interval - - - Delay between attempts to restore connection - - - Application name - - - The name of the application - - - Workstation Id - - - The name of the workstation connecting to SQL Server - - - Pooling - - - When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool - - - Max pool size - - - The maximum number of connections allowed in the pool - - - Min pool size - - - The minimum number of connections allowed in the pool - - - Load balance timeout - - - The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed - - - Replication - - - Used by SQL Server in Replication - - - Attach DB filename - - - Failover partner - - - The name or network address of the instance of SQL Server that acts as a failover partner - - - Multi subnet failover - - - Multiple active result sets 
- - - When true, multiple result sets can be returned and read from one connection - - - Packet size - - - Size in bytes of the network packets used to communicate with an instance of SQL Server - - - Type system version - - - Indicates which server type system the provider will expose through the DataReader - - - Name - - - Status - - - Size (MB) - - - Last backup - - - Name - - - - - Node Command called without any node passed - - - Manage Access - - - Location : - - - Permissions - - - - Owner - - - Owner - - - Group - - - - Owning Group - - - Everyone else - - - User - - - Group - - - Access - - - Default - - - Delete - - - Sticky Bit - - - Inherit Defaults - - - Read - - - Write - - - Execute - - - Add User or Group - - - Enter name - - - Add - - - Named Users and Groups - - - Default User and Groups - - - User or Group Icon - - - Apply - - - Apply Recursively - - - Unexpected error occurred while applying changes : {0} - - - Local file will be uploaded to HDFS. - - - .......................... Submit Spark Job End ............................ - - - Uploading file from local {0} to HDFS folder: {1} - - - Upload file to cluster Succeeded! - - - Upload file to cluster Failed. {0} - - - Submitting job {0} ... - - - The Spark Job has been submitted. - - - Spark Job Submission Failed. {0} - - - YarnUI Url: {0} - - - Spark History Url: {0} - - - Get Application Id Failed. {0} - - - Local file {0} does not existed. - - - No SQL Server Big Data Cluster found. - - - Please connect to the Spark cluster before View {0} History. - - - - - NOTICE: This file has been truncated at {0} for preview. - - - The file has been truncated at {0} for preview. - - - - - $(sync~spin) {0}... - - - Cancel - - - Cancel operation? 
- - - Search Server Names - - - - - No Spark job batch id is returned from response.{0}[Error] {1} - - - No log is returned within response.{0}[Error] {1} - - - - - {0}Please provide the username to connect to the BDC Controller: - - - Please provide the password to connect to the BDC Controller - - - Error: {0}. - - - Username and password are required - - - - - All Files - - - Upload - - - Uploading files to HDFS - - - Upload operation was canceled - - - Error uploading files: {0} - - - Creating directory - - - Operation was canceled - - - Error on making directory: {0} - - - Enter directory name - - - Error on deleting files: {0} - - - Are you sure you want to delete this folder and its contents? - - - Are you sure you want to delete this file? - - - Saving HDFS Files - - - Save operation was canceled - - - Error on saving file: {0} - - - Generating preview - - - Error on previewing file: {0} - - - Error on copying path: {0} - - - An unexpected error occurred while opening the Manage Access dialog: {0} - - - - - Invalid Data Structure - - - Unable to create WebHDFS client due to missing options: ${0} - - - '${0}' is undefined. - - - Bad Request - - - Unauthorized - - - Forbidden - - - Not Found - - - Internal Server Error - - - Unknown Error - - - Unexpected Redirect - - - - - ConnectionInfo is undefined. - - - ConnectionInfo.options is undefined. - - - Some missing properties in connectionInfo.options: {0} - - - - - View Known Issues - - - {0} component exited unexpectedly. Please restart Azure Data Studio. - - - - - This sample code loads the file into a data frame and shows the first 10 results. - - - An error occurred converting the SQL document to a Notebook. Error : {0} - - - An error occurred converting the Notebook document to SQL. 
Error : {0} - - + Notebooks - - Only .ipynb Notebooks are supported + + Search: Clear Search Server Results - - Could not find the controller endpoint for this instance + + Configure Python for Notebooks - - - - Applying permission changes recursively under '{0}' + + Service Endpoints - - Permission changes applied successfully. + + Install Packages - - Applying permission changes to '{0}'. + + New Spark Job - - Error applying permission changes: {0} + + Cluster +Dashboard - - - - Yes + + View Spark History - - No + + View Yarn History - - - - Select other SQL Server + + Search: Servers - - Please select SQL Server with Big Data Cluster. + + Show Log File - - No SQL Server is selected. + + Submit Spark Job - - The selected server does not belong to a SQL Server Big Data Cluster + + Tasks - - Error Get File Path: {0} - - - - - Parameters for SparkJobSubmissionDialog is illegal - - - New Job - - - Cancel - - - Submit - - - {0} Spark Job Submission: - - - .......................... Submit Spark Job Start .......................... - - - - - Parameters for SparkJobSubmissionModel is illegal - - - submissionArgs is invalid. - - - livyBatchId is invalid. - - - Get Application Id time out. {0}[Log] {1} - - - Property localFilePath or hdfsFolderPath is not specified. - - - Property Path is not specified. - - - - - GENERAL - - - Enter a name ... - - - Job Name - - - Spark Cluster - - - Path to a .jar or .py file - - - The selected local file will be uploaded to HDFS: {0} - - - JAR/py File - - - Main Class - - - Arguments - - - Command line arguments used in your main class, multiple arguments should be split by space. - - - Property Job Name is not specified. - - - Property JAR/py File is not specified. - - - Property Main Class is not specified. - - - {0} does not exist in Cluster or exception thrown. - - - The specified HDFS file does not exist. 
- - - Select - - - Error in locating the file due to Error: {0} - - - - - ADVANCED - - - Reference Jars - - - Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;) - - - Reference py Files - - - Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;) - - - Reference Files - - - Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;) - - - Driver Memory - - - Amount of memory to allocate to the driver. Specify units as part of value. Example 512M or 2G. - - - Driver Cores - - - Amount of CPU cores to allocate to the driver. - - - Executor Memory - - - Amount of memory to allocate to the executor. Specify units as part of value. Example 512M or 2G. - - - Executor Cores - - - Amount of CPU cores to allocate to the executor. - - - Executor Count - - - Number of instances of the executor to run. - - - Queue Name - - - Name of the Spark queue to execute the session in. - - - Configuration Values - - - List of name value pairs containing Spark configuration values. Encoded as JSON dictionary. Example: '{"name":"value", "name2":"value2"}'. - - - - - Please provide the username to connect to HDFS: - - - Please provide the password to connect to HDFS: - - - Session for node {0} does not exist - - - Error notifying of node change: {0} - - - HDFS - - - Root - - - - - Error: {0} - - - Cannot delete a connection. Only subfolders and files can be deleted. 
- - - - - Stream operation canceled by the user - - - - - Metrics Dashboard - - - Log Search Dashboard - - - Spark Jobs Management and Monitoring Dashboard - - - Spark Diagnostics and Monitoring Dashboard - - - Copy - - - Application Proxy - - - Cluster Management Service - - - Gateway to access HDFS files, Spark - - - Management Proxy - - - Management Proxy - - - SQL Server Master Instance Front-End - - - Metrics Dashboard - - - Log Search Dashboard - - - Spark Diagnostics and Monitoring Dashboard - - - Spark Jobs Management and Monitoring Dashboard - - - HDFS File System Proxy - - - Proxy for running Spark statements, jobs, applications - - - - - {0} Started - - - Starting {0} - - - Failed to start {0} - - - Installing {0} to {1} - - - Installing {0} - - - Installed {0} - - - Downloading {0} - - - ({0} KB) - - - Downloading {0} - - - Done installing {0} - - - Extracted {0} ({1}/{2}) - - - - - Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account is available. Please add a linked Azure account and retry the query. - - - Please select a linked Azure account: - - - Azure Data Studio needs to contact Azure Key Vault to access a column master key for Always Encrypted, but no linked Azure account was selected. Please retry the query and select a linked Azure account when prompted. - - - The configured Azure account for {0} does not have sufficient permissions for Azure Key Vault to access a column master key for Always Encrypted. - - + \ No newline at end of file diff --git a/resources/xlf/en/notebook.xlf b/resources/xlf/en/notebook.xlf index 1dd88d192e..6a2c9141f8 100644 --- a/resources/xlf/en/notebook.xlf +++ b/resources/xlf/en/notebook.xlf @@ -1,186 +1,5 @@ - - - Notebook Core Extensions - - - Defines the Data-procotol based Notebook contribution and many Notebook commands and contributions. - - - Notebook configuration - - - Local path to python installation used by Notebooks. 
- - - Local path to a preexisting python installation used by Notebooks. - - - Do not show prompt to update Python. - - - The amount of time (in minutes) to wait before shutting down a server after all notebooks are closed. (Enter 0 to not shutdown) - - - Override editor default settings in the Notebook editor. Settings include background color, current line color and border - - - Maximum number of rows returned per table in the Notebook editor - - - Notebooks contained in these books will automatically be trusted. - - - Maximum depth of subdirectories to search for Books (Enter 0 for infinite) - - - Collapse Book items at root level in the Notebooks Viewlet - - - Download timeout in milliseconds for GitHub books - - - Notebooks that are pinned by the user for the current workspace - - - Allow Jupyter server to run as root user - - - New Notebook - - - Open Notebook - - - Analyze in Notebook - - - Run Cell - - - Clear Cell Result - - - Run Cells - - - Add Code Cell - - - Add Text Cell - - - Add Cell - - - Analyze in Notebook - - - New Notebook - - - Open Notebook - - - Set context for Notebook - - - Set kernel for Notebook - - - Extra kernels - - - IDs of the extra kernels to enable - - - Configuration options for Jupyter kernels. This is automatically managed and not recommended to be manually edited. - - - Reinstall Notebook dependencies - - - Configure Python for Notebooks - - - Manage Packages - - - SQL Server 2019 Guide - - - Jupyter Books - - - Save Jupyter Book - - - Trust Jupyter Book - - - Search Jupyter Book - - - Notebooks - - - Provided Jupyter Books - - - Pinned notebooks - - - Get localized SQL Server 2019 guide - - - Open Jupyter Book - - - Close Jupyter Book - - - Close Notebook - - - Remove Notebook - - - Add Notebook - - - Add Markdown File - - - Reveal in Books - - - Create Jupyter Book - - - Open Notebooks in Folder - - - Add Remote Jupyter Book - - - Pin Notebook - - - Unpin Notebook - - - Move to ... - - - - - ... 
Ensuring {0} exists - - - Process exited with error code: {0}. StdErr Output: {1} - - localhost @@ -188,106 +7,61 @@ Could not find the specified package - + - - Yes - - - No - - - This sample code loads the file into a data frame and shows the first 10 results. - - - Spark kernels require a connection to a SQL Server Big Data Cluster master instance. - - - Non-MSSQL providers are not supported for spark kernels. - - - All Files - - - Select Folder - - - Select Jupyter Book - - - Folder already exists. Are you sure you want to delete and replace this folder? - - - Open Notebook - - - Open Markdown - - - Open External Link - - - Jupyter Book is now trusted in the workspace. - - - Jupyter Book is already trusted in this workspace. - - - Jupyter Book is no longer trusted in this workspace - - - Jupyter Book is already untrusted in this workspace. - - - Jupyter Book {0} is now pinned in the workspace. - - - Jupyter Book {0} is no longer pinned in this workspace - - - Failed to find a Table of Contents file in the specified Jupyter Book. - - - No Jupyter Books are currently selected in the viewlet. - - - Select Jupyter Book Section - - - Add to this level - - - Missing file : {0} from {1} + + Error: {0} has an incorrect toc.yml file Invalid toc file - - Error: {0} has an incorrect toc.yml file + + Add + + + Add Remote Jupyter Book + + + All Files + + + Jupyter Book + + + Jupyter Books are used to organize Notebooks. + + + Failed to find a Table of Contents file in the specified Jupyter Book. + + + No Jupyter Books are currently available on the provided link + + + Browse + + + Close + + + Close Jupyter Book {0} failed: {1} Configuration file missing - - Open Jupyter Book {0} failed: {1} + + File already exists. Are you sure you want to overwrite this file? - - Failed to read Jupyter Book {0}: {1} + + Folder already exists. Are you sure you want to delete and replace this folder? 
- - Open notebook {0} failed: {1} + + Content folder - - Open markdown {0} failed: {1} + + Content folder (Optional) - - Open untitled notebook {0} as untitled failed: {1} - - - Open link {0} failed: {1} - - - Close Jupyter Book {0} failed: {1} + + Create File {0} already exists in the destination folder {1} @@ -296,62 +70,86 @@ Error while editing Jupyter Book {0}: {1} - - Error while selecting a Jupyter Book or a section to edit: {0} + + File Extension - - Failed to find section {0} in {1}. + + File Name - - URL - - - Repository URL - - - Location - - - Add Remote Jupyter Book - - - GitHub - - - Shared File - - - Releases - - - Jupyter Book - - - Version - - - Language - - - No Jupyter Books are currently available on the provided link - - - The url provided is not a Github release url - - - Search - - - Add - - - Close + + Http Request failed with error: {0} {1} - - - Remote Jupyter Book download is in progress + + Add to this level + + + Select Jupyter Book + + + Select Jupyter Book Section + + + Select Folder + + + Language + + + Learn more. + + + Location + + + Missing file : {0} from {1} + + + Jupyter Book is already trusted in this workspace. + + + Jupyter Book is already untrusted in this workspace. + + + Jupyter Books not Found + + + Jupyter Book {0} is now pinned in the workspace. + + + Jupyter Book is now trusted in the workspace. + + + Jupyter Book {0} is no longer pinned in this workspace + + + Jupyter Book is no longer trusted in this workspace + + + Content folder path does not exist + + + Error while trying to access: {0} + + + Downloading to {0} + + + File {0} already exists in the destination folder + + + Save location path is not valid. 
+ + + No + + + Releases not Found + + + Error while creating remote Jupyter Book directory Remote Jupyter Book download is complete @@ -359,407 +157,115 @@ Error while downloading remote Jupyter Book + + Remote Jupyter Book download is in progress + Error while decompressing remote Jupyter Book - - Error while creating remote Jupyter Book directory - - - Downloading Remote Jupyter Book - Resource not Found - - Jupyter Books not Found - - - Releases not Found - - - The selected Jupyter Book is not valid - - - Http Request failed with error: {0} {1} - - - Downloading to {0} - - - New Jupyter Book (Preview) - - - Jupyter Books are used to organize Notebooks. - - - Learn more. - - - Content folder - - - Browse - - - Create - - - Name - - - Save location - - - Content folder (Optional) - - - Content folder path does not exist + + This sample code loads the file into a data frame and shows the first 10 results. Save location path does not exist. - - Error while trying to access: {0} + + Downloading Remote Jupyter Book - - New Notebook (Preview) + + The selected Jupyter Book is not valid + + + Yes + + + Name + + + New Jupyter Book (Preview) New Markdown (Preview) - - File Extension + + New Notebook (Preview) - - File already exists. Are you sure you want to overwrite this file? + + Spark kernels require a connection to a SQL Server Big Data Cluster master instance. + + + No Jupyter Books are currently selected in the viewlet. + + + GitHub + + + Shared File + + + Open Jupyter Book {0} failed: {1} + + + Open External Link + + + Open link {0} failed: {1} + + + Open Markdown + + + Open markdown {0} failed: {1} + + + Open Notebook + + + Open notebook {0} failed: {1} + + + Open untitled notebook {0} as untitled failed: {1} + + + Non-MSSQL providers are not supported for spark kernels. + + + Failed to read Jupyter Book {0}: {1} + + + Releases + + + Repository URL + + + Save location + + + Search + + + Failed to find section {0} in {1}. 
+ + + Error while selecting a Jupyter Book or a section to edit: {0} Title - - File Name + + URL - - Save location path is not valid. + + The url provided is not a Github release url - - File {0} already exists in the destination folder - - - - - Notebook dependencies installation is in progress - - - Python download is complete - - - Error while downloading python setup - - - Downloading python package - - - Unpacking python package - - - Error while creating python installation directory - - - Error while unpacking python bundle - - - Installing Notebook dependencies - - - Installing Notebook dependencies, see Tasks view for more information - - - Notebook dependencies installation is complete - - - Cannot overwrite an existing Python installation while python is running. Please close any active notebooks before proceeding. - - - Another Python installation is currently in progress. Waiting for it to complete. - - - Active Python notebook sessions will be shutdown in order to update. Would you like to proceed now? - - - Python {0} is now available in Azure Data Studio. The current Python version (3.6.6) will be out of support in December 2021. Would you like to update to Python {0} now? - - - Python {0} will be installed and will replace Python 3.6.6. Some packages may no longer be compatible with the new version or may need to be reinstalled. A notebook will be created to help you reinstall all pip packages. Would you like to continue with the update now? - - - Installing Notebook dependencies failed with error: {0} - - - Downloading local python for platform: {0} to {1} - - - Encountered an error when trying to retrieve list of installed packages: {0} - - - Encountered an error when getting Python user path: {0} - - - Yes - - - No - - - Don't Ask Again - - - - - Install - - - The specified install location is invalid. - - - No Python installation was found at the specified location. 
- - - Configure Python to run {0} kernel - - - Configure Python to run kernels - - - Configure Python Runtime - - - Install Dependencies - - - Python installation was declined. - - - - - Code - - - Text - - - What type of cell do you want to add? - - - - - Notebooks - - - Only .ipynb Notebooks are supported - - - Are you sure you want to reinstall? - - - - - Browse - - - Select - - - The {0} kernel requires a Python runtime to be configured and dependencies to be installed. - - - Notebook kernels require a Python runtime to be configured and dependencies to be installed. - - - Installation Type - - - Python Install Location - - - Python runtime configured! - - - {0} (Python {1}) - - - No supported Python versions found. - - - {0} (Default) - - - New Python installation - - - Use existing Python installation - - - {0} (Custom) - - - - - Name - - - Existing Version - - - Required Version - - - Kernel - - - Install required kernel dependencies - - - Could not retrieve packages for kernel {0} - - - - - Shutdown of Notebook server failed: {0} - - - - - Error stopping Notebook Server: {0} - - - Notebook process exited prematurely with error code: {0}. StdErr Output: {1} - - - Error sent from Jupyter: {0} - - - ... Jupyter is running at {0} - - - ... Starting Notebook server - - - - - A notebook path is required - - - - - Cannot start a session, the manager is not yet initialized - - - Could not find Knox gateway endpoint - - - {0}Please provide the username to connect to the BDC Controller: - - - Please provide the password to connect to the BDC Controller - - - Error: {0}. - - - A connection to the cluster controller is required to run Spark jobs - - - - - Manage Packages - - - Close - - - - - Installed - - - Name - - + Version - - Delete - - - Uninstall selected packages - - - Package Type - - - Location - - - {0} {1} packages found - - - Are you sure you want to uninstall the specified packages? 
- - - Uninstalling {0} - - - Completed uninstall for {0} - - - Failed to uninstall {0}. Error: {1} - - - - - N/A - - - Search {0} packages - - - Add new - - - Search - - - Install - - - Package Name - - - Package Version - - - Package Summary - - - Could not find any valid versions for the specified package - - - Installing {0} {1} - - - Completed install for {0} {1} - - - Failed to install {0} {1}. Error: {2} - - - - - Package info request failed with error: {0} {1} - - + This sample code loads the file into a data frame and shows the first 10 results. @@ -770,22 +276,516 @@ Notebooks - + + + + ... Ensuring {0} exists + + + Process exited with error code: {0}. StdErr Output: {1} + + + + + Browse + + + The {0} kernel requires a Python runtime to be configured and dependencies to be installed. + + + Notebook kernels require a Python runtime to be configured and dependencies to be installed. + + + Use existing Python installation + + + Installation Type + + + Python Install Location + + + New Python installation + + + Python runtime configured! + + + Select + + + {0} (Custom) + + + {0} (Default) + + + {0} (Python {1}) + + + No supported Python versions found. + + + + + The specified install location is invalid. + + + Install + + + Configure Python Runtime + + + Install Dependencies + + + Python installation was declined. + + + No Python installation was found at the specified location. + + + Configure Python to run {0} kernel + + + Configure Python to run kernels + + + + + Existing Version + + + Kernel + + + Name + + + Install required kernel dependencies + + + Required Version + + + Could not retrieve packages for kernel {0} + + + + + Add new + + + Completed install for {0} {1} + + + Failed to install {0} {1}. 
Error: {2} + + + Installing {0} {1} + + + Install + + + N/A + + + Could not find any valid versions for the specified package + + + Package Name + + + Package Summary + + + Package Version + + + Search {0} packages + + + Search + + + + + Completed uninstall for {0} + + + Failed to uninstall {0}. Error: {1} + + + Uninstalling {0} + + + Are you sure you want to uninstall the specified packages? + + + Delete + + + Installed + + + Location + + + Version + + + {0} {1} packages found + + + Package Type + + + Name + + + Uninstall selected packages + + + + + Close + + + Manage Packages + + + + + Code + + + What type of cell do you want to add? + + + Text + + + + + Are you sure you want to reinstall? + + + Notebooks + + + Only .ipynb Notebooks are supported + + + + + A notebook path is required + + + + + Don't Ask Again + + + Installing Notebook dependencies failed with error: {0} + + + Downloading local python for platform: {0} to {1} + + + Encountered an error when getting Python user path: {0} + + + Notebook dependencies installation is complete + + + Notebook dependencies installation is in progress + + + Installing Notebook dependencies, see Tasks view for more information + + + Encountered an error when trying to retrieve list of installed packages: {0} + + + Error while creating python installation directory + + + Python download is complete + + + Error while downloading python setup + + + Downloading python package + + + Cannot overwrite an existing Python installation while python is running. Please close any active notebooks before proceeding. + + + Error while unpacking python bundle + + + Unpacking python package + + + Python {0} is now available in Azure Data Studio. The current Python version (3.6.6) will be out of support in December 2021. Would you like to update to Python {0} now? + + + Python {0} will be installed and will replace Python 3.6.6. Some packages may no longer be compatible with the new version or may need to be reinstalled. 
A notebook will be created to help you reinstall all pip packages. Would you like to continue with the update now? + + + Active Python notebook sessions will be shut down in order to update. Would you like to proceed now? + + + Installing Notebook dependencies + + + Another Python installation is currently in progress. Waiting for it to complete. + + + No + + + Yes + + + + + Shutdown of Notebook server failed: {0} + + + + + Error: {0}. + + + A connection to the cluster controller is required to run Spark jobs + + + Cannot start a session, the manager is not yet initialized + + + Could not find Knox gateway endpoint + + + Please provide the password to connect to the BDC Controller + + + {0}Please provide the username to connect to the BDC Controller: + + + + + Package info request failed with error: {0} {1} + + + + + Error sent from Jupyter: {0} + + + ... Starting Notebook server + + + ... Jupyter is running at {0} + + + Notebook process exited prematurely with error code: {0}. StdErr Output: {1} + + + Error stopping Notebook Server: {0} + + + + Download and open '{0}'? + + + File open request failed with error: {0} {1} + + + Could not find the specified file + Action {0} is not supported for this handler Cannot open link {0} as only HTTP, HTTPS, and File links are supported + + + + Jupyter Books + + IDs of the extra kernels to enable + + Extra kernels + + Configuration options for Jupyter kernels. This is automatically managed and not recommended to be manually edited. + + Defines the Data-protocol based Notebook contribution and many Notebook commands and contributions. 
+ + + Notebook Core Extensions + + + Allow Jupyter server to run as root user + + + Analyze in Notebook + + + Collapse Book items at root level in the Notebooks Viewlet + + + Add Cell + + + Add Code Cell + + + Add Text Cell + + + Clear Cell Result + + + New Notebook + + + Open Notebook + + + Run Cell + + + Run Cells + + + Notebook configuration + + + Do not show prompt to update Python. + + + The amount of time (in minutes) to wait before shutting down a server after all notebooks are closed. (Enter 0 to not shutdown) + + + Maximum depth of subdirectories to search for Books (Enter 0 for infinite) + + + Maximum number of rows returned per table in the Notebook editor + + + Override editor default settings in the Notebook editor. Settings include background color, current line color and border + + + Notebooks that are pinned by the user for the current workspace + + + Local path to python installation used by Notebooks. + + + Download timeout in milliseconds for GitHub books + + + Notebooks contained in these books will automatically be trusted. + + + Local path to a preexisting python installation used by Notebooks. + + + Pinned notebooks + + + Get localized SQL Server 2019 guide + + + Provided Jupyter Books + + + SQL Server 2019 Guide + + + Notebooks + + + Add Markdown File + + + Add Notebook + + + Analyze in Notebook + + + Close Jupyter Book + + + Close Notebook + + + Configure Python for Notebooks + + + Create Jupyter Book + + + Set context for Notebook + + + Set kernel for Notebook + + + Manage Packages + + + Move to ... 
+ + + New Notebook + + + Open Jupyter Book + + + Open Notebook + + + Open Notebooks in Folder + + + Add Remote Jupyter Book + + + Pin Notebook + + + Reinstall Notebook dependencies + + + Remove Notebook + + + Reveal in Books + + + Save Jupyter Book + + + Search Jupyter Book + + + Trust Jupyter Book + + + Unpin Notebook + + \ No newline at end of file diff --git a/resources/xlf/en/profiler.xlf b/resources/xlf/en/profiler.xlf index e370769c10..4452a7dc5d 100644 --- a/resources/xlf/en/profiler.xlf +++ b/resources/xlf/en/profiler.xlf @@ -7,26 +7,26 @@ Start - - Start New Profiler Session - - - Invalid templates list, cannot open dialog + + Failed to create a session Invalid dialog owner, cannot open dialog + + Enter session name: + Invalid provider type, cannot open dialog Select session template: - - Enter session name: + + Invalid templates list, cannot open dialog - - Failed to create a session + + Start New Profiler Session - + \ No newline at end of file diff --git a/resources/xlf/en/query-history.xlf b/resources/xlf/en/query-history.xlf index 1671386beb..41de59825f 100644 --- a/resources/xlf/en/query-history.xlf +++ b/resources/xlf/en/query-history.xlf @@ -1,17 +1,17 @@ - - Query History + + Clear All History View and run previously executed queries - - Clear All History + + Query History Toggle Query History Capture - + \ No newline at end of file diff --git a/resources/xlf/en/resource-deployment.xlf b/resources/xlf/en/resource-deployment.xlf index ef0887eb79..141c1f8181 100644 --- a/resources/xlf/en/resource-deployment.xlf +++ b/resources/xlf/en/resource-deployment.xlf @@ -1,262 +1,74 @@ - - - SQL Server Deployment extension for Azure Data Studio - - - Provides a notebook-based experience to deploy Microsoft SQL Server - - - New Deployment… - - - Deployment - - - SQL Server container image - - - Run SQL Server container image with docker - - - Version - - - SQL Server 2017 - - - SQL Server 2019 - - - Deploy SQL Server 2017 container images - - - Deploy SQL 
Server 2019 container images - - - Container name - - - SQL Server password - - - Confirm password - - - Port - - - SQL Server on Windows - - - Run SQL Server on Windows, select a version to get started. - - - Microsoft Privacy Statement - - - Deployment configuration - - - Location of the azdata package used for the install command - - - SQL Server on Azure Virtual Machine - - - Create SQL virtual machines on Azure. Best for migrations and applications requiring OS-level access. - - - Deploy Azure SQL virtual machine - - - Script to notebook - - - I accept {0}, {1} and {2}. - - - Azure SQL VM License Terms - - - azdata License Terms - - - Azure information - - - Azure locations - - - VM information - - - Image - - - VM image SKU - - - Publisher - - - Virtual machine name - - - Size - - - Storage account - - - Storage account name - - - Storage account SKU type - - - Administrator account - - - Username - - - Password - - - Confirm password - - - Summary - - - Azure SQL Database - - - Create a SQL database, database server, or elastic pool in Azure. - - - Create in Azure portal - - - Select - - - Resource Type - - - Single Database - - - Elastic Pool - - - Database Server - - - I accept {0}, {1} and {2}. - - - Azure SQL DB License Terms - - - azdata License Terms - - - Azure SQL managed instance - - - Create a SQL Managed Instance in either Azure or a customer-managed environment - - - Open in Portal - - - Resource Type - - - I accept {0} and {1}. - - - Azure SQL MI License Terms - - - Azure SQL Managed Instance provides full SQL Server access and feature compatibility for migrating SQL Servers to Azure, or developing new applications. {0}. - - - Learn More - - + + Unknown field type: "{0}" + + + Deployment cannot continue. Azure Data CLI license terms were declined.You can either Accept EULA to continue or Cancel this operation + + + Deployment cannot continue. Azure Data CLI license terms have not yet been accepted. 
Please accept the EULA to enable the features that require Azure Data CLI. + Azure Account + + Azure Location + + Resource Group + Subscription (selected subset) Change the currently selected subscriptions through the 'Select Subscriptions' action on an account listed in the 'Azure' tree view of the 'Connections' viewlet + + No Refresh + + New resource group name Create a new resource group + + Sign in… + + Yes + + Select Realm + + Required tool '{0}' [ {1} ] is being installed now. + + Accept EULA & Select + + Browse + + Attempt to get isPassword for unknown variable:{0} Attempt to get variable value for unknown variable:{0} + + No cluster context information found + + Kube config file path FieldInfo.options was not defined for field type: {0} @@ -264,20 +76,53 @@ FieldInfo.options must be an object if it is not an array + + Options Source with id:{0} is already defined + + No Options Source defined for id: {0} + When FieldInfo.options is an object it must have 'optionsType' property When optionsType is not {0} then it must be {1} + + The task "{0}" has failed. + + Description + + An error occurred opening the output notebook. {1}{2}. 
+ + + Install tools + + + Options + + + Required Version + + + Status + + + The task "{0}" failed and no output Notebook was generated. + + + Tool + + + Version + + + View error detail + + + Discovered Path or Additional Information The '{0}' extension is required to deploy this resource, do you want to install it now? @@ -288,47 +133,23 @@ Installing extension '{0}'... + + Required tools + Unknown extension '{0}' - - Select the deployment options - Filter resources... - - Categories - - - There are some errors on this page, click 'Show Details' to view the errors. - - - Script - - - Run - - - View error detail - - - An error occurred opening the output notebook. {1}{2}. - - - The task "{0}" has failed. - - - The task "{0}" failed and no output Notebook was generated. + + SQL Server All - - On-premises - - - SQL Server + + Cloud Hybrid @@ -336,1071 +157,493 @@ PostgreSQL - - Cloud + + On-premises - - Description + + Categories - - Tool + + Select the deployment options - - Status + + Run - - Version + + Script - - Required Version + + There are some errors on this page, click 'Show Details' to view the errors. - - Discovered Path or Additional Information + + Value Provider with id:{0} is already defined - - Required tools + + No Value Provider defined for id: {0} - - Install tools - - - Options - - - Required tool '{0}' [ {1} ] is being installed now. - - - - - An error ocurred while loading or parsing the config file:{0}, error is:{1} - - - Path: {0} is not a file, please select a valid kube config file. - - - File: {0} not found. Please select a kube config file. - - - Unexpected error fetching accounts: {0} - - - Unexpected error fetching available kubectl storage classes : {0} - - - Unexpected error fetching subscriptions for account {0}: {1} - - - The selected account '{0}' is no longer available. Click sign in to add it again or select a different account. - - - - Error Details: {0}. - - - The access token for selected account '{0}' is no longer valid. 
Please click the sign in button and refresh the account or select a different account. - - - Unexpected error fetching resource groups for subscription {0}: {1} - - - {0} doesn't meet the password complexity requirement. For more information: https://docs.microsoft.com/sql/relational-databases/security/password-policy - - - {0} doesn't match the confirmation password - - - - - Deploy Azure SQL VM - - - Script to Notebook - - - Please fill out the required fields marked with red asterisks. - - - Azure settings - - - Azure Account - - - Subscription - - - Resource Group - - - Region - - - Virtual machine settings - - - Virtual machine name - - - Administrator account username - - - Administrator account password - - - Confirm password - - - Image - - - Image SKU - - - Image Version - - - Size - - - Click here to learn more about pricing and supported VM sizes - - - Networking - - - Configure network settings - - - New virtual network - - - Virtual Network - - - New subnet - - - Subnet - - - Public IP - - - New public ip - - - Enable Remote Desktop (RDP) inbound port (3389) - - - SQL Servers settings - - - SQL connectivity - - - Port - - - Enable SQL authentication - - - Username - - - Password - - - Confirm password - - - - - Save config files - - - Script to Notebook - - - Save config files - - - Config files saved to {0} - - - Deploy SQL Server 2019 Big Data Cluster on a new AKS cluster - - - Deploy SQL Server 2019 Big Data Cluster on an existing AKS cluster - - - Deploy SQL Server 2019 Big Data Cluster on an existing kubeadm cluster - - - Deploy SQL Server 2019 Big Data Cluster on an existing Azure Red Hat OpenShift cluster - - - Deploy SQL Server 2019 Big Data Cluster on an existing OpenShift cluster - - - - - Not Installed - - - Installed - - - Installing… - - - Error - - - Failed - - - • brew is needed for deployment of the tools and needs to be pre-installed before necessary tools can be deployed - - - • curl is needed for installation and needs to be 
pre-installed before necessary tools can be deployed - - - Could not find 'Location' in the output: - - - output: - - - Error installing tool '{0}' [ {1} ].{2}Error: {3}{2}See output channel '{4}' for more details - - - Error installing tool. See output channel '{0}' for more details - - - Installation commands completed but version of tool '{0}' could not be detected so our installation attempt has failed. Detection Error: {1}{2}Cleaning up previous installations would help. - - - Failed to detect version post installation. See output channel '{0}' for more details - - - A possibly way to uninstall is using this command:{0} >{1} - - - {0}See output channel '{1}' for more details - - - Cannot install tool:{0}::{1} as installation commands are unknown for your OS distribution, Please install {0} manually before proceeding - - - Search Paths for tool '{0}': {1} - - - Error retrieving version information. See output channel '{0}' for more details - - - Error retrieving version information.{0}Invalid output received, get version command output: '{1}' - - - - - Deploy Azure SQL DB - - - Script to Notebook - - - Please fill out the required fields marked with red asterisks. - - - Azure SQL Database - Azure account settings - - - Azure account settings - - - Azure account - - - Subscription - - - Server - - - Resource group - - - Database settings - - - Firewall rule name - - - SQL database name - - - Database collation - - - Collation for database - - - Enter IP addresses in IPv4 format. - - - Min IP address in firewall IP range - - - Max IP address in firewall IP range - - - Min IP address - - - Max IP address - - - Create a firewall rule for your local client IP in order to connect to your database through Azure Data Studio after creation is completed. 
- - - Create a firewall rule - - - - - Runs commands against Kubernetes clusters - - - kubectl - - - Unable to parse the kubectl version command output: "{0}" - - - updating your brew repository for kubectl installation … - - - installing kubectl … - - - updating repository information … - - - getting packages needed for kubectl installation … - - - downloading and installing the signing key for kubectl … - - - adding the kubectl repository information … - - - installing kubectl … - - - deleting previously downloaded kubectl.exe if one exists … - - - downloading and installing the latest kubectl.exe … - - - deleting previously downloaded kubectl if one exists … - - - downloading the latest kubectl release … - - - making kubectl executable … - - - cleaning up any previously backed up version in the install location if they exist … - - - backing up any existing kubectl in the install location … - - - moving kubectl into the install location in the PATH … - - - - - Open Notebook - - - OK - - - Notebook type - - + The resource type: {0} is not defined - + The notebook {0} does not exist - + - - Deployments - >>> {0} … errored out: {1} >>> Ignoring error in execution and continuing tool deployment - - stdout: - stderr: + + stdout: + >>> {0} … exited with code: {1} >>> {0} … exited with signal: {1} - + + Deployments + + Download failed, status code: {0}, message: {1} - - - - Service scale settings (Instances) + + + + Manages Azure resources - - Service storage settings (GB per Instance) + + Azure CLI - - Features + + adding the azure-cli repository information … - - Yes + + getting packages needed for azure-cli installation … - - No + + updating repository information before installing azure-cli … - - Deployment configuration profile + + updating repository information again for azure-cli … - - Select the target configuration profile + + deleting previously downloaded azurecli.msi if one exists … - - Note: The settings of the deployment profile can be customized in later 
steps. + + displaying the installation log … - - Loading profiles + + downloading and installing the signing key for azure-cli … - - Loading profiles completed + + downloading azurecli.msi and installing azure-cli … - - Deployment configuration profile + + installing azure-cli … - - Failed to load the deployment profiles: {0} + + download and invoking script to install azure-cli … - - SQL Server Master + + updating your brew repository for azure-cli installation … - - Compute + + + + adding the azdata repository information … - - Data + + getting packages needed for azdata installation … - - HDFS + Spark + + updating repository information … - - Service + + deleting previously downloaded Azdata.msi if one exists … - - Data + + displaying the installation log … - - Logs + + downloading and installing the signing key for azdata … - - Storage type + + downloading Azdata.msi and installing azdata-cli … - - Basic authentication + + installing azdata … - - Active Directory authentication + + tapping into the brew repository for azdata-cli … - - High Availability + + updating the brew repository for azdata-cli installation … - - Feature + + Azure Data command line interface - - Please select a deployment profile. 
+ + Azure Data CLI - - - + + + + Packages and runs applications in isolated containers + + + docker + + + + + Runs commands against Kubernetes clusters + + + kubectl + + + adding the kubectl repository information … + + + getting packages needed for kubectl installation … + + + updating repository information … + + + backing up any existing kubectl in the install location … + + + cleaning up any previously backed up version in the install location if they exist … + + + deleting previously downloaded kubectl if one exists … + + + deleting previously downloaded kubectl.exe if one exists … + + + downloading and installing the signing key for kubectl … + + + downloading and installing the latest kubectl.exe … + + + downloading the latest kubectl release … + + + installing kubectl … + + + installing kubectl … + + + making kubectl executable … + + + moving kubectl into the install location in the PATH … + + + updating your brew repository for kubectl installation … + + + Unable to parse the kubectl version command output: "{0}" + + + + + Error retrieving version information.{0}Invalid output received, get version command output: '{1}' + + + Error retrieving version information. See output channel '{0}' for more details + + + • brew is needed for deployment of the tools and needs to be pre-installed before necessary tools can be deployed + + + • curl is needed for installation and needs to be pre-installed before necessary tools can be deployed + + + Error + + + Failed + + + Installed + + + Installing… + + + Not Installed + + + Error installing tool '{0}' [ {1} ].{2}Error: {3}{2}See output channel '{4}' for more details + + + Error installing tool. See output channel '{0}' for more details + + + Failed to detect version post installation. See output channel '{0}' for more details + + + Installation commands completed but version of tool '{0}' could not be detected so our installation attempt has failed. Detection Error: {1}{2}Cleaning up previous installations would help. 
+ + + A possibly way to uninstall is using this command:{0} >{1} + + + {0}See output channel '{1}' for more details + + + Search Paths for tool '{0}': {1} + + + Could not find 'Location' in the output: + + + output: + + + Cannot install tool:{0}::{1} as installation commands are unknown for your OS distribution, Please install {0} manually before proceeding + + + + + Azure account + + + Server + + + Azure SQL Database - Azure account settings + + + Azure account settings + + + Subscription + + + Database collation + + + Collation for database + + + SQL database name + + + Database settings + + + Max IP address in firewall IP range + + + Max IP address + + + Create a firewall rule for your local client IP in order to connect to your database through Azure Data Studio after creation is completed. + + + Firewall rule name + + + Create a firewall rule + + + Enter IP addresses in IPv4 format. + + Please fill out the required fields marked with red asterisks. - - Azure settings + + Deploy Azure SQL DB - - Configure the settings to create an Azure Kubernetes Service cluster - - - Subscription id - - - Use my default Azure subscription - - - The default subscription will be used if you leave this field blank. - - - View available Azure subscriptions - - - New resource group name - - - Location - - - View available Azure locations - - - AKS cluster name - - - VM count - - - VM size - - - View available VM sizes - - - - - The cluster name must consist only of alphanumeric lowercase characters or '-' and must start and end with an alphanumeric character. - - - Cluster settings - - - Configure the SQL Server Big Data Cluster settings - - - Cluster name - - - Admin username - - - This username will be used for controller and SQL Server. Username for the gateway will be root. - - - Password - - - This password can be used to access the controller, SQL Server and gateway. 
- - - Confirm password - - - Authentication mode - - - Basic - - - Active Directory - - - Docker settings - - - Registry - - - Repository - - - Image tag - - - Username - - - Password - - - Active Directory settings - - - Organizational unit - - - Distinguished name for the organizational unit. For example: OU=bdc,DC=contoso,DC=com. - - - Domain controller FQDNs - - - Use comma to separate the values. - - - Fully qualified domain names for the domain controller. For example: DC1.CONTOSO.COM. Use comma to separate multiple FQDNs. - - - Domain DNS IP addresses - - - Use comma to separate the values. - - - Domain DNS servers' IP Addresses. Use comma to separate multiple IP addresses. - - - Domain DNS name - - - If not provided, the domain DNS name will be used as the default value. - - - Cluster admin group - - - The Active Directory group for cluster admin. - - - Cluster users - - - Use comma to separate the values. - - - The Active Directory users/groups with cluster users role. Use comma to separate multiple users/groups. - - - Service account username - - - Domain service account for Big Data Cluster - - - Service account password - - - App owners - - - Use comma to separate the values. - - - The Active Directory users or groups with app owners role. Use comma to separate multiple users/groups. - - - App readers - - - Use comma to separate the values. - - - The Active Directory users or groups of app readers. Use comma as separator them if there are multiple users/groups. - - - Subdomain - - - A unique DNS subdomain to use for this SQL Server Big Data Cluster. If not provided, the cluster name will be used as the default value. - - - Account prefix - - - A unique prefix for AD accounts SQL Server Big Data Cluster will generate. If not provided, the subdomain name will be used as the default value. If a subdomain is not provided, the cluster name will be used as the default value. 
- - - Password - - - - - Service settings - - - Scale settings - - - SQL Server master instances - - - Compute pool instances - - - Data pool instances - - - Spark pool instances - - - Storage pool (HDFS) instances - - - Include Spark in storage pool - - - DNS name - - - Port - - - Controller - - - Controller DNS name - - - Controller port - - - SQL Server Master - - - SQL Server Master DNS name - - - SQL Server Master port - - - Gateway - - - Gateway DNS name - - - Gateway port - - - Management proxy - - - Management proxy DNS name - - - Management proxy port - - - Application proxy - - - Application proxy DNS name - - - Application proxy port - - - Readable secondary - - - Readable secondary DNS name - - - Readable secondary port - - - Endpoint settings - - - Use controller settings - - - By default Controller storage settings will be applied to other services as well, you can expand the advanced storage settings to configure storage for other services. - - - Controller's data storage class - - - Controller's data storage claim size - - - Controller's logs storage class - - - Controller's logs storage claim size - - - Storage pool (HDFS) - - - Storage pool's data storage class - - - Storage pool's data storage claim size - - - Storage pool's logs storage class - - - Storage pool's logs storage claim size - - - Data pool - - - Data pool's data storage class - - - Data pool's data storage claim size - - - Data pool's logs storage class - - - Data pool's logs storage claim size - - - SQL Server master's data storage class - - - SQL Server master's data storage claim size - - - SQL Server master's logs storage class - - - SQL Server master's logs storage claim size - - - Service name - - - Storage class for data - - - Claim size for data (GB) - - - Storage class for logs - - - Claim size for logs (GB) - - - Storage settings - - - Storage settings - - - Invalid Spark configuration, you must check the 'Include Spark' checkbox or set the 'Spark pool instances' to at 
least 1.
 - - - - - Summary
 - - - Deployment target
 - - - Kube config
 - - - Cluster context
 - - - Cluster settings
 - - - Deployment profile
 - - - Cluster name
 - - - Controller username
 - - - Authentication mode
 - - - Active Directory
 - - - Basic
 - - - Organizational unit
 - - - Domain controller FQDNs
 - - - Domain DNS IP addresses
 - - - Domain DNS name
 - - - Cluster admin group
 - - - Cluster users
 - - - App owners
 - - - App readers
 - - - Subdomain
 - - - Account prefix
 - - - Service account username
 - - - Azure settings
 - - - Subscription id
 - - - Default Azure Subscription
 - - + Resource group
 - - Location + + Script to Notebook
 - - AKS cluster name + + Min IP address in firewall IP range
 - - VM size + + Min IP address
 - - VM count + + + + No servers found in current subscription.
+Select a different subscription containing at least one server
 - - Scale settings + + No servers found
 - - SQL Server master instances + + Sign in to an Azure account first
 - - Compute pool instances + + + + Collation name must be between 1 and 100 characters long.
 - - Data pool instances + + Collation name cannot contain only numbers.
 - - Spark pool instances + + Collation name cannot contain special characters \/""[]:|<>+=;,?*@&, .
 - - Storage pool (HDFS) instances + + Firewall name must be between 1 and 100 characters long.
 - - (Spark included) + + Firewall name cannot contain only numbers.
 - - Service + + Firewall name cannot contain special characters \/""[]:|<>+=;,?*@&, .
 - - Storage class for data + + Upper case letters are not allowed for firewall name
 - - Claim size for data (GB) + + Max IP address is invalid
 - - Storage class for logs + + Min IP address is invalid
 - - Claim size for logs (GB) + + Database name must be unique in the current server.
 - - Storage settings + + Database name must be between 1 and 100 characters long.
 - - Controller + + Database name cannot contain only numbers. 
- - Storage pool (HDFS) + + Database name cannot contain special characters \/""[]:|<>+=;,?*@&, .
 - - Data + + + + Azure Account
 - - SQL Server Master + + Region
 - - SQL Server Master + + Azure settings
 - - Gateway + + Subscription
 - - Application proxy + + Please fill out the required fields marked with red asterisks.
 - - Management proxy + + New subnet
 - - Readable secondary + + New virtual network
 - - Endpoint settings + + Configure network settings
 - - - - Target cluster context + + Networking
 - - Select the kube config file and then select a cluster context from the list + + New public IP
 - - Please select a cluster context. + + Deploy Azure SQL VM
 - - Kube config file path + + Public IP
 - - Browse + + Resource Group
 - - Cluster Contexts + + Script to Notebook
 - - No cluster information is found in the config file or an error ocurred while loading the config file + + Confirm password
 - - Select + + Password
 - - Failed to load the config file + + Username
 - + + SQL connectivity
 + + + Enable SQL authentication
 + + + Port
 + + + SQL Server settings
 + + + Subnet
 + + + Virtual Network
 + + + Confirm password
 + + + Administrator account password
 + + + Administrator account username
 + + + Image
 + + + Image Version
 + + + Virtual machine name
 + + + Enable Remote Desktop (RDP) inbound port (3389)
 + + + Size
 + + + Image SKU
 + + + Virtual machine settings
 + + + Click here to learn more about pricing and supported VM sizes
 + + Password must be between 12 and 123 characters long. @@ -1408,10 +651,74 @@ Password must have 3 of the following: 1 lower case character, 1 upper case character, 1 number, and 1 special character. 
- + + + + Create a new public IP
 + + + Enter name for new public IP
 + + + Create a new subnet
 + + + Enter name for new subnet
 + + + Create a new virtual network
 + + + Enter name for new virtual network
 + + + Public IP name must be between 1 and 80 characters long
 + + + Subnet name must be between 1 and 80 characters long
 + + + Virtual Network name must be between 2 and 64 characters long
 + + + + + Local (inside VM only)
 + + + Private (within Virtual Network)
 + + + Public (Internet)
 + + + Password and confirm password must match.
 + + + Username must be between 2 and 128 characters long.
 + + + Username cannot contain special characters \/""[]:|<>+=;,?* .
 + + - - Virtual machine name must be between 1 and 15 characters long. + + Password and confirm password must match.
 + + + Username must be between 1 and 20 characters long.
 + + + Username must not include reserved words.
 + + + Username cannot contain special characters \/""[]:|<>+=;,?*@& .
 + + + Username cannot end with a period
 + + + Virtual machine name must be unique in the current resource group. Virtual machine name cannot contain only numbers. @@ -1422,286 +729,979 @@ Virtual machine name cannot contain special characters \/""[]:|<>+=;,?*@&, .
 - - Virtual machine name must be unique in the current resource group.
 - - - Username must be between 1 and 20 characters long.
 - - - Username cannot end with period
 - - - Username cannot contain special characters \/""[]:|<>+=;,?*@& .
 - - - Username must not include reserved words.
 - - - Password and confirm password must match. + + Virtual machine name must be between 1 and 15 characters long. Select a valid virtual machine size. 
- - - - Enter name for new virtual network + + + + Deploy SQL Server 2019 Big Data Cluster on an existing AKS cluster - - Enter name for new subnet + + Deploy SQL Server 2019 Big Data Cluster on an existing Azure Red Hat OpenShift cluster - - Enter name for new public IP + + Deploy SQL Server 2019 Big Data Cluster on an existing kubeadm cluster - - Virtual Network name must be between 2 and 64 characters long + + Deploy SQL Server 2019 Big Data Cluster on an existing OpenShift cluster - - Create a new virtual network + + Deploy SQL Server 2019 Big Data Cluster on a new AKS cluster - - Subnet name must be between 1 and 80 characters long + + Config files saved to {0} - - Create a new sub network + + Save config files - - Public IP name must be between 1 and 80 characters long + + Script to Notebook - - Create a new new public Ip + + Save config files - - - - Private (within Virtual Network) + + + + AKS cluster name - - Local (inside VM only) + + View available Azure locations - - Public (Internet) + + Configure the settings to create an Azure Kubernetes Service cluster - - Username must be between 2 and 128 characters long. + + Azure settings - - Username cannot contain special characters \/""[]:|<>+=;,?* . + + Location - - Password and confirm password must match. + + Please fill out the required fields marked with red asterisks. - + + New resource group name + + + The default subscription will be used if you leave this field blank. + + + Subscription id + + + View available Azure subscriptions + + + Use my default Azure subscription + + + VM count + + + VM size + + + View available VM sizes + + + + + Account prefix + + + A unique prefix for AD accounts SQL Server Big Data Cluster will generate. If not provided, the subdomain name will be used as the default value. If a subdomain is not provided, the cluster name will be used as the default value. 
+ + + Active Directory settings
 + + + Password
 + + + This password can be used to access the controller, SQL Server and gateway.
 + + + Password
 + + + Admin username
 + + + This username will be used for controller and SQL Server. Username for the gateway will be root.
 + + + App owners
 + + + The Active Directory users or groups with app owners role. Use comma to separate multiple users/groups.
 + + + Use comma to separate the values.
 + + + App readers
 + + + The Active Directory users or groups of app readers. Use comma to separate them if there are multiple users/groups.
 + + + Use comma to separate the values.
 + + + Authentication mode
 + + + Active Directory
 + + + Basic
 + + + Cluster admin group
 + + + The Active Directory group for cluster admin.
 + + + Cluster name
 + + + The cluster name must consist only of alphanumeric lowercase characters or '-' and must start and end with an alphanumeric character.
 + + + Configure the SQL Server Big Data Cluster settings
 + + + Cluster settings
 + + + Cluster users
 + + + The Active Directory users/groups with cluster users role. Use comma to separate multiple users/groups.
 + + + Use comma to separate the values.
 + + + Confirm password
 + + + Image tag
 + + + Password
 + + + Registry
 + + + Repository
 + + + Docker settings
 + + + Username
 + + + Fully qualified domain names for the domain controller. For example: DC1.CONTOSO.COM. Use comma to separate multiple FQDNs.
 + + + Domain controller FQDNs
 + + + Use comma to separate the values.
 + + + Domain DNS IP addresses
 + + + Domain DNS servers' IP addresses. Use comma to separate multiple IP addresses.
 + + + Use comma to separate the values.
 + + + Domain DNS name
 + + + Service account password
 + + + Service account username
 + + + Domain service account for Big Data Cluster
 + + + Organizational unit
 + + + Distinguished name for the organizational unit. For example: OU=bdc,DC=contoso,DC=com.
 + + + If not provided, the domain DNS name will be used as the default value. 
+ + + Subdomain + + + A unique DNS subdomain to use for this SQL Server Big Data Cluster. If not provided, the cluster name will be used as the default value. + + + + + Note: The settings of the deployment profile can be customized in later steps. + + + Please select a deployment profile. + + + Service + + + Storage type + + + Active Directory authentication + + + Basic authentication + + + Compute + + + Data + + + Data + + + Features + + + Feature + + + High Availability + + + HDFS + Spark + + + Failed to load the deployment profiles: {0} + + + Loading profiles + + + Loading profiles completed + + + Logs + + + SQL Server Master + + + No + + + Deployment configuration profile + + + Service scale settings (Instances) + + + Service storage settings (GB per Instance) + + + Select the target configuration profile + + + Deployment configuration profile + + + Yes + + + + + By default Controller storage settings will be applied to other services as well, you can expand the advanced storage settings to configure storage for other services. + + + Application proxy DNS name + + + Application proxy port + + + Application proxy + + + Compute pool instances + + + Controller DNS name + + + Controller port + + + Controller + + + DNS name + + + Claim size for data (GB) + + + Data pool + + + Data pool instances + + + Storage class for data + + + Endpoint settings + + + Gateway DNS name + + + Gateway port + + + Gateway + + + Include Spark in storage pool + + + Storage class for logs + + + Claim size for logs (GB) + + + SQL Server Master DNS name + + + SQL Server Master port + + + SQL Server master instances + + + SQL Server Master + + + Port + + + Readable secondary DNS name + + + Readable secondary port + + + Readable secondary + + + Service name + + + Management proxy DNS name + + + Management proxy port + + + Management proxy + + + Service settings + + + Invalid Spark configuration, you must check the 'Include Spark' checkbox or set the 'Spark pool instances' to at least 1. 
+ + + Spark pool instances + + + Storage pool (HDFS) + + + Storage pool (HDFS) instances + + + Storage settings + + + Storage settings + + + Controller's data storage claim size + + + Controller's data storage class + + + Controller's logs storage claim size + + + Controller's logs storage class + + + Data pool's data storage claim size + + + Data pool's data storage class + + + Data pool's logs storage claim size + + + Data pool's logs storage class + + + Scale settings + + + SQL Server master's data storage claim size + + + SQL Server master's data storage class + + + SQL Server master's logs storage claim size + + + SQL Server master's logs storage class + + + Use controller settings + + + Storage pool's data storage claim size + + + Storage pool's data storage class + + + Storage pool's logs storage claim size + + + Storage pool's logs storage class + + + + + Account prefix + + + AKS cluster name + + + App owners + + + App readers + + + Application proxy + + + Authentication mode + + + Active Directory + + + Basic + + + Azure settings + + + Cluster admin group + + + Cluster context + + + Cluster name + + + Cluster settings + + + Cluster users + + + Compute pool instances + + + Controller + + + Controller username + + + Claim size for data (GB) + + + Data pool instances + + + Storage class for data + + + Data + + + Default Azure Subscription + + + Deployment profile + + + Deployment target + + + Domain controller FQDNs + + + Domain DNS IP addresses + + + Domain DNS name + + + Service account username + + + Endpoint settings + + + Gateway + + + Kube config + + + Location + + + Storage class for logs + + + Claim size for logs (GB) + + + SQL Server master instances + + + SQL Server Master + + + Organizational unit + + + Readable secondary + + + Resource group + + + Scale settings + + + Service + + + Management proxy + + + Spark pool instances + + + SQL Server Master + + + Storage pool (HDFS) + + + Storage pool (HDFS) instances + + + Storage settings + + + Subdomain 
+ + + Subscription id
 + + + VM count
 + + + VM size
 + + + (Spark included)
 + + + Summary
 + + + + + Please select a cluster context.
 + + + Failed to load the config file
 + + + Select the kube config file and then select a cluster context from the list
 + + + Target cluster context
 + + + Browse
 + + + Cluster Contexts
 + + + No cluster information is found in the config file or an error occurred while loading the config file
 + + + Kube config file path
 + + + Select
 + + + + + OK
 + + + Open Notebook
 + + + Notebook type
 + + + + + + Error Details: {0}.
 + + + The selected account '{0}' is no longer available. Click sign in to add it again or select a different account.
 + + + The access token for selected account '{0}' is no longer valid. Please click the sign in button and refresh the account or select a different account.
 + + + Unexpected error fetching accounts: {0}
 + + + Unexpected error fetching resource groups for subscription {0}: {1}
 + + + Unexpected error fetching subscriptions for account {0}: {1}
 + + + File: {0} not found. Please select a kube config file.
 + + + Path: {0} is not a file, please select a valid kube config file.
 + + + An error occurred while loading or parsing the config file: {0}, error is: {1}
 + + + {0} doesn't meet the password complexity requirement. For more information: https://docs.microsoft.com/sql/relational-databases/security/password-policy
 + + + {0} doesn't match the confirmation password
 + + + Unexpected error fetching available kubectl storage classes: {0}
 + + Review your configuration 
- - - Database name cannot contain special characters \/""[]:|<>+=;,?*@&, . - - - Database name must be unique in the current server. - - - Collation name cannot contain only numbers. - - - Collation name must be between 1 and 100 characters long. - - - Collation name cannot contain special characters \/""[]:|<>+=;,?*@&, . - - - - - Sign in to an Azure account first - - - No servers found - - - No servers found in current subscription. -Select a different subscription containing at least one server - - - - - Deployment pre-requisites - - - Some tools were still not discovered. Please make sure that they are installed, running and discoverable - - - To proceed, you must accept the terms of the End User License Agreement(EULA) - - - Loading required tools information completed - - - Loading required tools information - - - Accept terms of use - - - '{0}' [ {1} ] does not meet the minimum version requirement, please uninstall it and restart Azure Data Studio. - - - All required tools are installed now. - - - Following tools: {0} were still not discovered. Please make sure that they are installed, running and discoverable - - - '{0}' was not discovered and automated installation is not currently supported. Install '{0}' manually or ensure it is started and discoverable. Once done please restart Azure Data Studio. See [{1}] . - - - You will need to restart Azure Data Studio if the tools are installed manually to pick up the change. You may find additional details in 'Deployments' and 'Azure Data CLI' output channels - - - Tool: {0} is not installed, you can click the "{1}" button to install it. - - - Tools: {0} are not installed, you can click the "{1}" button to install them. 
- - - No tools required - - + Download and launch installer, URL: {0} - - Downloading from: {0} - Successfully downloaded: {0} + + Downloading from: {0} + Launching: {0} Successfully launched: {0} - - - - Packages and runs applications in isolated containers - - - docker - - - - - Manages Azure resources - - - Azure CLI - - - deleting previously downloaded azurecli.msi if one exists … - - - downloading azurecli.msi and installing azure-cli … - - - displaying the installation log … - - - updating your brew repository for azure-cli installation … - - - installing azure-cli … - - - updating repository information before installing azure-cli … - - - getting packages needed for azure-cli installation … - - - downloading and installing the signing key for azure-cli … - - - adding the azure-cli repository information … - - - updating repository information again for azure-cli … - - - download and invoking script to install azure-cli … - - - - - Azure Data command line interface - - - Azure Data CLI - - - deleting previously downloaded Azdata.msi if one exists … - - - downloading Azdata.msi and installing azdata-cli … - - - displaying the installation log … - - - tapping into the brew repository for azdata-cli … - - - updating the brew repository for azdata-cli installation … - - - installing azdata … - - - updating repository information … - - - getting packages needed for azdata installation … - - - downloading and installing the signing key for azdata … - - - adding the azdata repository information … - - + Deployment options - + + + + To proceed, you must accept the terms of the End User License Agreement(EULA) + + + Some tools were still not discovered. Please make sure that they are installed, running and discoverable + + + Tools: {0} are not installed, you can click the "{1}" button to install them. + + + Tool: {0} is not installed, you can click the "{1}" button to install it. + + + All required tools are installed now. 
+ + + No tools required + + + Following tools: {0} were still not discovered. Please make sure that they are installed, running and discoverable + + + '{0}' [ {1} ] does not meet the minimum version requirement, please uninstall it and restart Azure Data Studio. + + + '{0}' was not discovered and automated installation is not currently supported. Install '{0}' manually or ensure it is started and discoverable. Once done please restart Azure Data Studio. See [{1}] . + + + You will need to restart Azure Data Studio if the tools are installed manually to pick up the change. You may find additional details in 'Deployments' and 'Azure Data CLI' output channels + + + Loading required tools information + + + Loading required tools information completed + + + Deployment pre-requisites + + + Accept terms of use + + + + + Location of the azdata package used for the install command + + + I accept {0} and {1}. + + + Azure SQL MI License Terms + + + Create a SQL Managed Instance in either Azure or a customer-managed environment + + + Azure SQL managed instance + + + Azure SQL Managed Instance provides full SQL Server access and feature compatibility for migrating SQL Servers to Azure, or developing new applications. {0}. + + + Learn More + + + Open in Portal + + + Resource Type + + + I accept {0}, {1} and {2}. + + + azdata License Terms + + + Azure SQL DB License Terms + + + Create a SQL database, database server, or elastic pool in Azure. + + + Azure SQL Database + + + Select + + + Create in Azure portal + + + I accept {0}, {1} and {2}. + + + azdata License Terms + + + Azure SQL VM License Terms + + + Azure information + + + Azure locations + + + Script to notebook + + + Deploy Azure SQL virtual machine + + + Create SQL virtual machines on Azure. Best for migrations and applications requiring OS-level access. 
+ + + SQL Server on Azure Virtual Machine + + + Image + + + VM image SKU + + + Confirm password + + + Password + + + Publisher + + + Storage account name + + + Storage account + + + Storage account SKU type + + + Username + + + Administrator account + + + VM information + + + Summary + + + Virtual machine name + + + Size + + + Deployment + + + New Deployment… + + + Deployment configuration + + + Confirm password + + + Container name + + + Deploy SQL Server 2017 container images + + + Deploy SQL Server 2019 container images + + + SQL Server password + + + Port + + + Provides a notebook-based experience to deploy Microsoft SQL Server + + + SQL Server Deployment extension for Azure Data Studio + + + Microsoft Privacy Statement + + + Resource Type + + + Run SQL Server container image with docker + + + SQL Server container image + + + Run SQL Server on Windows, select a version to get started. + + + SQL Server on Windows + + + SQL Server 2017 + + + SQL Server 2019 + + + Database Server + + + Elastic Pool + + + Single Database + + + Version + + \ No newline at end of file diff --git a/resources/xlf/en/schema-compare.xlf b/resources/xlf/en/schema-compare.xlf index 717b3e18e1..044f7c80b3 100644 --- a/resources/xlf/en/schema-compare.xlf +++ b/resources/xlf/en/schema-compare.xlf @@ -1,466 +1,17 @@ - - - SQL Server Schema Compare - - - SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs. - - - Schema Compare - - - - OK - - - Cancel - - - Source - - - Target - - - File - - - Data-tier Application File (.dacpac) - - - Database - - - Type - - - Server - - - Database - - - Schema Compare - - - A different source schema has been selected. Compare to see the comparison? - - - A different target schema has been selected. Compare to see the comparison? - - - Different source and target schemas have been selected. Compare to see the comparison? 
- - - Yes - - - No - - - Source file - - - Target file - - - Source Database - - - Target Database - - - Source Server - - - Target Server - - - default - - - Open - - - Select source file - - - Select target file - - - Reset - - - Options have changed. Recompare to see the comparison? - - - Schema Compare Options - - - General Options - - - Include Object Types - - - Compare Details - - - Are you sure you want to update the target? - - - Press Compare to refresh the comparison. - - - Generate script to deploy changes to target - - - No changes to script - - - Apply changes to target - - - No changes to apply - - - Please note that include/exclude operations can take a moment to calculate affected dependencies - - - Delete - - - Change - - - Add - - - Comparison between Source and Target - - - Initializing Comparison. This might take a moment. - - - To compare two schemas, first select a source schema and target schema, then press Compare. - - - No schema differences were found. - - - Type - - - Source Name - - - Include - - - Action - - - Target Name - - - Generate script is enabled when the target is a database - - - Apply is enabled when the target is a database - - - Cannot exclude {0}. Included dependents exist, such as {1} - - - Cannot include {0}. Excluded dependents exist, such as {1} - - - Cannot exclude {0}. Included dependents exist - - - Cannot include {0}. Excluded dependents exist - - - Compare - - - Stop - - - Generate script - - - Options - - - Apply - - - Switch direction - - - Switch source and target - - - Select Source - - - Select Target - - - Open .scmp file - - - Load source, target, and options saved in an .scmp file - - - Save .scmp file - - - Save source and target, options, and excluded elements - - - Save - - - Do you want to connect to {0}? 
- - - Select connection - - - Ignore Table Options - - - Ignore Semicolon Between Statements - - - Ignore Route Lifetime - - - Ignore Role Membership - - - Ignore Quoted Identifiers - - - Ignore Permissions - - - Ignore Partition Schemes - - - Ignore Object Placement On Partition Scheme - - - Ignore Not For Replication - - - Ignore Login Sids - - - Ignore Lock Hints On Indexes - - - Ignore Keyword Casing - - - Ignore Index Padding - - - Ignore Index Options - - - Ignore Increment - - - Ignore Identity Seed - - - Ignore User Settings Objects - - - Ignore Full Text Catalog FilePath - - - Ignore Whitespace - - - Ignore With Nocheck On ForeignKeys - - - Verify Collation Compatibility - - - Unmodifiable Object Warnings - - - Treat Verification Errors As Warnings - - - Script Refresh Module - - - Script New Constraint Validation - - - Script File Size - - - Script Deploy StateChecks - - - Script Database Options - - - Script Database Compatibility - - - Script Database Collation - - - Run Deployment Plan Executors - - - Register DataTier Application - - - Populate Files On File Groups - - - No Alter Statements To Change Clr Types - - - Include Transactional Scripts - - - Include Composite Objects - - - Allow Unsafe Row Level Security Data Movement - - - Ignore With No check On Check Constraints - - - Ignore Fill Factor - - - Ignore File Size - - - Ignore Filegroup Placement - - - Do Not Alter Replicated Objects - - - Do Not Alter Change Data Capture Objects - - - Disable And Reenable Ddl Triggers - - - Deploy Database In Single User Mode - - - Create New Database - - - Compare Using Target Collation - - - Comment Out Set Var Declarations - - - Block When Drift Detected - - - Block On Possible Data Loss - - - Backup Database Before Changes - - - Allow Incompatible Platform + + Aggregates Allow Drop Blocking Assemblies - - Drop Constraints Not In Source + + Allow Incompatible Platform - - Drop Dml Triggers Not In Source - - - Drop Extended Properties Not In Source - - - 
Drop Indexes Not In Source
 - - - Ignore File And Log File Path
 - - - Ignore Extended Properties
 - - - Ignore Dml Trigger State
 - - - Ignore Dml Trigger Order
 - - - Ignore Default Schema
 - - - Ignore Ddl Trigger State
 - - - Ignore Ddl Trigger Order
 - - - Ignore Cryptographic Provider FilePath
 - - - Verify Deployment
 - - - Ignore Comments
 - - - Ignore Column Collation
 - - - Ignore Authorizer
 - - - Ignore AnsiNulls
 - - - Generate SmartDefaults
 - - - Drop Statistics Not In Source
 - - - Drop Role Members Not In Source
 - - - Drop Permissions Not In Source
 - - - Drop Objects Not In Source
 - - - Ignore Column Order
 - - - Aggregates + + Allow Unsafe Row Level Security Data Movement Application Roles @@ -474,33 +25,348 @@ Asymmetric Keys
 + + Audits
 + + + Backup Database Before Changes
 + + + Block On Possible Data Loss
 + + + Block When Drift Detected
 + Broker Priorities Certificates
 + + Clr User Defined Types
 + Column Encryption Keys Column Master Keys
 + + Comment Out Set Var Declarations
 + + + Compare Using Target Collation
 + Contracts
 + + Create New Database
 + + + Credentials
 + + + Cryptographic Providers
 + + + Database Audit Specifications
 + + + Database Encryption Keys
 + Database Options Database Roles
 + + Database Scoped Credentials
 + Database Triggers Defaults
 + + Deploy Database In Single User Mode
 + + + This property is used by SqlClr deployment to cause any blocking assemblies to be dropped as part of the deployment plan. By default, any blocking/referencing assemblies will block an assembly update if the referencing assembly needs to be dropped.
 + + + Specifies whether to attempt the action despite incompatible SQL Server platforms.
 + + + Do not block data motion on a table which has Row Level Security if this property is set to true. Default is false.
 + + + Backs up the database before deploying any changes.
 + + + Specifies that the publish episode should be terminated if there is a possibility of data loss resulting from the publish operation. 
+ + + Specifies whether to block updating a database whose schema no longer matches its registration or is unregistered. + + + Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE. + + + This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used. + + + Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database. + + + If true, the database is set to Single User Mode before deploying. + + + Specifies whether Data Definition Language (DDL) triggers are disabled at the beginning of the publish process and re-enabled at the end of the publish action. + + + If true, Change Data Capture objects are not altered. + + + Specifies whether objects that are replicated are identified during verification. + + + Specifies whether constraints that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. + + + Specifies whether DML triggers that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. + + + Specifies whether extended properties that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. + + + Specifies whether indexes that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. 
+ + + Specifies whether objects that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. This value takes precedence over DropExtendedProperties.
 + + + Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
 + + + Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
 + + + Specifies whether statistics that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database.
 + + + Automatically provides a default value when updating a table that contains data with a column that does not allow null values.
 + + + Specifies whether differences in the ANSI NULLS setting should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in the Authorizer should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in the column collations should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in table column order should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in the comments should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in the file path for the cryptographic provider should be ignored or updated when you publish to a database.
 + + + Specifies whether differences in the order of Data Definition Language (DDL) triggers should be ignored or updated when you publish to a database or server.
 + + + Specifies whether differences in the enabled or disabled state of Data Definition Language (DDL) triggers should be ignored or updated when you publish to a database. 
+ + + Specifies whether differences in the default schema should be ignored or updated when you publish to a database. + + + Specifies whether differences in the order of Data Manipulation Language (DML) triggers should be ignored or updated when you publish to a database. + + + Specifies whether differences in the enabled or disabled state of DML triggers should be ignored or updated when you publish to a database. + + + Specifies whether extended properties should be ignored. + + + Specifies whether differences in the paths for files and log files should be ignored or updated when you publish to a database. + + + Specifies whether differences in the file sizes should be ignored or whether a warning should be issued when you publish to a database. + + + Specifies whether differences in the placement of objects in FILEGROUPs should be ignored or updated when you publish to a database. + + + Specifies whether differences in the fill factor for index storage should be ignored or whether a warning should be issued when you publish to a database. + + + Specifies whether differences in the file path for the full-text catalog should be ignored or whether a warning should be issued when you publish to a database. + + + Specifies whether differences in the seed for an identity column should be ignored or updated when you publish updates to a database. + + + Specifies whether differences in the increment for an identity column should be ignored or updated when you publish to a database. + + + Specifies whether differences in the index options should be ignored or updated when you publish to a database. + + + Specifies whether differences in the index padding should be ignored or updated when you publish to a database. + + + Specifies whether differences in the casing of keywords should be ignored or updated when you publish to a database. + + + Specifies whether differences in the lock hints on indexes should be ignored or updated when you publish to a database. 
+ + + Specifies whether differences in the security identification number (SID) should be ignored or updated when you publish to a database. + + + Specifies whether the not for replication settings should be ignored or updated when you publish to a database. + + + Specifies whether an object's placement on a partition scheme should be ignored or updated when you publish to a database. + + + Specifies whether differences in partition schemes and functions should be ignored or updated when you publish to a database. + + + Specifies whether permissions should be ignored. + + + Specifies whether differences in the quoted identifiers setting should be ignored or updated when you publish to a database. + + + Specifies whether differences in the role membership of logins should be ignored or updated when you publish to a database. + + + Specifies whether differences in the amount of time that SQL Server retains the route in the routing table should be ignored or updated when you publish to a database. + + + Specifies whether differences in the semi-colons between T-SQL statements will be ignored or updated when you publish to a database. + + + Specifies whether differences in the table options will be ignored or updated when you publish to a database. + + + Specifies whether differences in the user settings objects will be ignored or updated when you publish to a database. + + + Specifies whether differences in white space will be ignored or updated when you publish to a database. + + + Specifies whether differences in the value of the WITH NOCHECK clause for check constraints will be ignored or updated when you publish to a database. + + + Specifies whether differences in the value of the WITH NOCHECK clause for foreign keys will be ignored or updated when you publish to a database. + + + Include all composite elements as part of a single publish operation. + + + Specifies whether transactional statements should be used where possible when you publish to a database. 
+ + + Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement + + + Specifies whether a new file is also created when a new FileGroup is created in the target database. + + + Specifies whether the schema is registered with the database server. + + + Specifies whether DeploymentPlanExecutor contributors should be run when other operations are executed. + + + Specifies whether differences in the database collation should be ignored or updated when you publish to a database. + + + Specifies whether differences in the database compatibility should be ignored or updated when you publish to a database. + + + Specifies whether target database properties should be set or updated as part of the publish action. + + + Specifies whether statements are generated in the publish script to verify that the database name and server name match the names specified in the database project. + + + Controls whether size is specified when adding a file to a filegroup. + + + At the end of publish all of the constraints will be verified as one set, avoiding data errors caused by a check or foreign key constraint in the middle of publish. If set to False, your constraints will be published without checking the corresponding data. + + + Include refresh statements at the end of the publish script. + + + Specifies whether errors encountered during publish verification should be treated as warnings. The check is performed against the generated deployment plan before the plan is executed against your target database. Plan verification detects problems such as the loss of target-only objects (such as indexes) that must be dropped to make a change. Verification will also detect situations where dependencies (such as a table or view) exist because of a reference to a composite project, but do not exist in the target database. 
You might choose to do this to get a complete list of all issues, instead of having the publish action stop on the first error. + + + Specifies whether warnings should be generated when differences are found in objects that cannot be modified, for example, if the file size or file paths were different for a file. + + + Specifies whether collation compatibility is verified. + + + Specifies whether checks should be performed before publishing that will stop the publish action if issues are present that might block successful publishing. For example, your publish action might stop if you have foreign keys on the target database that do not exist in the database project, and that will cause errors when you publish. + + + Disable And Reenable Ddl Triggers + + + Do Not Alter Change Data Capture Objects + + + Do Not Alter Replicated Objects + + + Drop Constraints Not In Source + + + Drop Dml Triggers Not In Source + + + Drop Extended Properties Not In Source + + + Drop Indexes Not In Source + + + Drop Objects Not In Source + + + Drop Permissions Not In Source + + + Drop Role Members Not In Source + + + Drop Statistics Not In Source + + + Endpoints + + + Error Messages + + + Event Notifications + + + Event Sessions + Extended Properties @@ -510,33 +376,174 @@ External File Formats - - External Streams - External Streaming Jobs + + External Streams + External Tables + + File Tables + Filegroups Files - - File Tables - Full Text Catalogs Full Text Stoplists + + General Options + + + Generate SmartDefaults + + + Ignore AnsiNulls + + + Ignore Authorizer + + + Ignore Column Collation + + + Ignore Column Order + + + Ignore Comments + + + Ignore Cryptographic Provider FilePath + + + Ignore Ddl Trigger Order + + + Ignore Ddl Trigger State + + + Ignore Default Schema + + + Ignore Dml Trigger Order + + + Ignore Dml Trigger State + + + Ignore Extended Properties + + + Ignore File And Log File Path + + + Ignore File Size + + + Ignore Filegroup Placement + + + Ignore Fill Factor + + + 
Ignore Full Text Catalog FilePath + + + Ignore Identity Seed + + + Ignore Increment + + + Ignore Index Options + + + Ignore Index Padding + + + Ignore Keyword Casing + + + Ignore Lock Hints On Indexes + + + Ignore Login Sids + + + Ignore Not For Replication + + + Ignore Object Placement On Partition Scheme + + + Ignore Partition Schemes + + + Ignore Permissions + + + Ignore Quoted Identifiers + + + Ignore Role Membership + + + Ignore Route Lifetime + + + Ignore Semicolon Between Statements + + + Ignore Table Options + + + Ignore User Settings Objects + + + Ignore Whitespace + + + Ignore With No check On Check Constraints + + + Ignore With Nocheck On ForeignKeys + + + Include Composite Objects + + + Include Transactional Scripts + + + Linked Server Logins + + + Linked Servers + + + Logins + + + Master Keys + Message Types + + No Alter Statements To Change Clr Types + + + Include Object Types + Partition Functions @@ -546,21 +553,57 @@ Permissions + + Populate Files On File Groups + Queues + + Register DataTier Application + Remote Service Bindings Role Membership + + Routes + Rules + + Run Deployment Plan Executors + Scalar Valued Functions + + Schema Compare Options + + + Script Database Collation + + + Script Database Compatibility + + + Script Database Options + + + Script Deploy StateChecks + + + Script File Size + + + Script New Constraint Validation + + + Script Refresh Module + Search Property Lists @@ -570,6 +613,18 @@ Sequences + + Server Audit Specifications + + + Server Role Membership + + + Server Roles + + + Server Triggers + Services @@ -585,11 +640,17 @@ Synonyms + + Table Valued Functions + Tables - - Table Valued Functions + + Treat Verification Errors As Warnings + + + Unmodifiable Object Warnings User Defined Data Types @@ -597,320 +658,259 @@ User Defined Table Types - - Clr User Defined Types - Users + + Verify Collation Compatibility + + + Verify Deployment + Views Xml Schema Collections - - Audits + + Reset - - Credentials + + Are you sure you 
want to update the target? - - Cryptographic Providers + + Compare Details - - Database Audit Specifications + + Do you want to connect to {0}? - - Database Encryption Keys + + Press Compare to refresh the comparison. - - Database Scoped Credentials + + Action - - Endpoints + + Add - - Error Messages + + Apply is enabled when the target is a database - - Event Notifications + + Apply changes to target - - Event Sessions + + No changes to apply - - Linked Server Logins - - - Linked Servers - - - Logins - - - Master Keys - - - Routes - - - Server Audit Specifications - - - Server Role Membership - - - Server Roles - - - Server Triggers - - - Specifies whether differences in the table options will be ignored or updated when you publish to a database. - - - Specifies whether differences in the semi-colons between T-SQL statements will be ignored or updated when you publish to a database. - - - Specifies whether differences in the amount of time that SQL Server retains the route in the routing table should be ignored or updated when you publish to a database. - - - Specifies whether differences in the role membership of logins should be ignored or updated when you publish to a database. - - - Specifies whether differences in the quoted identifiers setting should be ignored or updated when you publish to a database. - - - Specifies whether permissions should be ignored. - - - Specifies whether differences in partition schemes and functions should be ignored or updated when you publish to a database. - - - Specifies whether an object's placement on a partition scheme should be ignored or updated when you publish to a database. - - - Specifies whether the not for replication settings should be ignored or updated when you publish to a database. - - - Specifies whether differences in the security identification number (SID) should be ignored or updated when you publish to a database. 
- - - Specifies whether differences in the lock hints on indexes should be ignored or updated when you publish to a database. - - - Specifies whether differences in the casing of keywords should be ignored or updated when you publish to a database. - - - Specifies whether differences in the index padding should be ignored or updated when you publish to a database. - - - Specifies whether differences in the index options should be ignored or updated when you publish to a database. - - - Specifies whether differences in the increment for an identity column should be ignored or updated when you publish to a database. - - - Specifies whether differences in the seed for an identity column should be ignored or updated when you publish updates to a database. - - - Specifies whether differences in the user settings objects will be ignored or updated when you publish to a database. - - - Specifies whether differences in the file path for the full-text catalog should be ignored or whether a warning should be issued when you publish to a database. - - - Specifies whether differences in white space will be ignored or updated when you publish to a database. - - - Specifies whether differences in the value of the WITH NOCHECK clause for foreign keys will be ignored or updated when you publish to a database. - - - Specifies whether collation compatibility is verified. - - - Specifies whether warnings should be generated when differences are found in objects that cannot be modified, for example, if the file size or file paths were different for a file. - - - Specifies whether errors encountered during publish verification should be treated as warnings. The check is performed against the generated deployment plan before the plan is executed against your target database. Plan verification detects problems such as the loss of target-only objects (such as indexes) that must be dropped to make a change. 
Verification will also detect situations where dependencies (such as a table or view) exist because of a reference to a composite project, but do not exist in the target database. You might choose to do this to get a complete list of all issues, instead of having the publish action stop on the first error. - - - Include refresh statements at the end of the publish script. - - - At the end of publish all of the constraints will be verified as one set, avoiding data errors caused by a check or foreign key constraint in the middle of publish. If set to False, your constraints will be published without checking the corresponding data. - - - Controls whether size is specified when adding a file to a filegroup. - - - Specifies whether statements are generated in the publish script to verify that the database name and server name match the names specified in the database project. - - - Specifies whether target database properties should be set or updated as part of the publish action. - - - Specifies whether differences in the database compatibility should be ignored or updated when you publish to a database. - - - Specifies whether differences in the database collation should be ignored or updated when you publish to a database. - - - Specifies whether DeploymentPlanExecutor contributors should be run when other operations are executed. - - - Specifies whether the schema is registered with the database server. - - - Specifies whether a new file is also created when a new FileGroup is created in the target database. - - - Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement - - - Specifies whether transactional statements should be used where possible when you publish to a database. - - - Include all composite elements as part of a single publish operation. - - - Do not block data motion on a table which has Row Level Security if this property is set to true. Default is false. 
- - - Specifies whether differences in the value of the WITH NOCHECK clause for check constraints will be ignored or updated when you publish to a database. - - - Specifies whether differences in the fill factor for index storage should be ignored or whether a warning should be issued when you publish to a database. - - - Specifies whether differences in the file sizes should be ignored or whether a warning should be issued when you publish to a database. - - - Specifies whether differences in the placement of objects in FILEGROUPs should be ignored or updated when you publish to a database. - - - Specifies whether objects that are replicated are identified during verification. - - - If true, Change Data Capture objects are not altered. - - - Specifies whether Data Definition Language (DDL) triggers are disabled at the beginning of the publish process and re-enabled at the end of the publish action. - - - If true, the database is set to Single User Mode before deploying. - - - Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database. - - - This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used. - - - Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE. - - - Specifies whether to block updating a database whose schema no longer matches its registration or is unregistered. - - - Specifies that the publish episode should be terminated if there is a possibility of data loss resulting from the publish operation. - - - Backs up the database before deploying any changes.
- - - Specifies whether to attempt the action despite incompatible SQL Server platforms. - - - This property is used by SqlClr deployment to cause any blocking assemblies to be dropped as part of the deployment plan. By default, any blocking/referencing assemblies will block an assembly update if the referencing assembly needs to be dropped. - - - Specifies whether constraints that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. - - - Specifies whether DML triggers that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. - - - Specifies whether extended properties that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. - - - Specifies whether indexes that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. - - - Specifies whether differences in the paths for files and log files should be ignored or updated when you publish to a database. - - - Specifies whether extended properties should be ignored. - - - Specifies whether differences in the enabled or disabled state of DML triggers should be ignored or updated when you publish to a database. - - - Specifies whether differences in the order of Data Manipulation Language (DML) triggers should be ignored or updated when you publish to a database. - - - Specifies whether differences in the default schema should be ignored or updated when you publish to a database. - - - Specifies whether differences in the enabled or disabled state of Data Definition Language (DDL) triggers should be ignored or updated when you publish to a database. - - - Specifies whether differences in the order of Data Definition Language (DDL) triggers should be ignored or updated when you publish to a database or server. 
- - - Specifies whether differences in the file path for the cryptographic provider should be ignored or updated when you publish to a database. - - - Specifies whether checks should be performed before publishing that will stop the publish action if issues are present that might block successful publishing. For example, your publish action might stop if you have foreign keys on the target database that do not exist in the database project, and that will cause errors when you publish. - - - Specifies whether differences in the comments should be ignored or updated when you publish to a database. - - - Specifies whether differences in the column collations should be ignored or updated when you publish to a database. - - - Specifies whether differences in the Authorizer should be ignored or updated when you publish to a database. - - - Specifies whether differences in the ANSI NULLS setting should be ignored or updated when you publish to a database. - - - Automatically provides a default value when updating a table that contains data with a column that does not allow null values. - - - Specifies whether statistics that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. - - - Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database. - - - Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database. - - - Specifies whether objects that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. This value takes precedence over DropExtendedProperties. - - - Specifies whether differences in table column order should be ignored or updated when you publish to a database.
- - - Schema Compare failed: {0} - - - Save scmp failed: '{0}' + + Stop Cancel schema compare failed: '{0}' + + Cannot exclude {0}. Included dependents exist + + + Cannot exclude {0}. Included dependents exist, such as {1} + + + Cannot include {0}. Excluded dependents exist + + + Cannot include {0}. Excluded dependents exist, such as {1} + + + Change + + + Compare + + + Schema Compare failed: {0} + + + Data-tier Application File (.dacpac) + + + Database + + + Delete + + + Schema Compare + + + Comparison between Source and Target + + + Generate script + + + Generate script is enabled when the target is a database + + + Generate script to deploy changes to target + Generate script failed: '{0}' - - Schema Compare Apply failed '{0}' + + No changes to script + + + Include + + + Please note that include/exclude operations can take a moment to calculate affected dependencies + + + No schema differences were found. + + + Open + + + Open .scmp file + + + Load source, target, and options saved in an .scmp file Open scmp failed: '{0}' - + + Options + + + Type + + + Save + + + Save .scmp file + + + Save source and target, options, and excluded elements + + + Save scmp failed: '{0}' + + + Select connection + + + Select source file + + + Select target file + + + Select Source + + + Source Name + + + To compare two schemas, first select a source schema and target schema, then press Compare. + + + Switch source and target + + + Switch direction + + + Select Target + + + Target Name + + + Type + + + Apply + + + Schema Compare Apply failed '{0}' + + + Initializing Comparison. This might take a moment. + + + No + + + Source + + + Target + + + Yes + + + Cancel + + + Database + + + default + + + A different source schema has been selected. Compare to see the comparison? + + + Different source and target schemas have been selected. Compare to see the comparison? + + + A different target schema has been selected. Compare to see the comparison? 
+ + + File + + + OK + + + Server + + + Source Database + + + Source Server + + + Source file + + + Target Database + + + Target Server + + + Target file + + + Options have changed. Recompare to see the comparison? + + + + + SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs. + + + SQL Server Schema Compare + + + Schema Compare + + \ No newline at end of file diff --git a/resources/xlf/en/server-report.xlf b/resources/xlf/en/server-report.xlf index 0e9fa43763..2e5c57c013 100644 --- a/resources/xlf/en/server-report.xlf +++ b/resources/xlf/en/server-report.xlf @@ -4,14 +4,14 @@ XEvents operation failed. - - XEvents sessions started for PageContention and ObjectContention. - XEvents sessions not supported. + + XEvents sessions started for PageContention and ObjectContention. + XEvents sessions PageContention and ObjectContention removed. - + \ No newline at end of file diff --git a/resources/xlf/en/sql-assessment.xlf b/resources/xlf/en/sql-assessment.xlf index 0b450cf023..382d1be15c 100644 --- a/resources/xlf/en/sql-assessment.xlf +++ b/resources/xlf/en/sql-assessment.xlf @@ -1,61 +1,82 @@ - - - SQL Assessment + + + Target Type - - SQL Assessment for Azure Data Studio (Preview) provides a mechanism to evaluate the configuration of SQL Server for best practices. + + Click here - - SQL Assessment + + Recommendation - + + Rule Description + + + to learn more. 
+ + + Database + + + Server + + - - Info - API Version - - Default Ruleset - - - SQL Server Instance Details - - - Version - - - Edition - - - Instance Name - - - OS Version - - - Target - - - Severity - - - Message - Check ID + + Help Link + + + Message + + + Severity + Tags + + Target + + + Instance Name + Learn More - - SQL Assessment Report + + OS Version + + + Default Ruleset + + + Info + + + SQL Server Instance Details + + + Edition + + + Version + + + Error + + + Information + + + Warning Results for database @@ -63,57 +84,22 @@ Results for server - - Error - - - Warning - - - Information - - - Help Link - {0}: {1} item(s) - - - - Database + + SQL Assessment Report - - Server - - - Target Type - - - Click here - - - to learn more. - - - Rule Description - - - Recommendation - - + - - Invoke assessment + + Cancel - - View applicable rules + + Open - - Invoke assessment for {0} - - - View applicable rules for {0} + + Report has been saved. Do you want to open it? Export as script @@ -121,34 +107,37 @@ Create HTML Report - - Report has been saved. Do you want to open it? - - - Open - - - Cancel + + View all rules and learn more on GitHub View all on GitHub - - View all rules and learn more on GitHub + + View applicable rules for {0} + + + View applicable rules + + + Invoke assessment for {0} + + + Invoke assessment SQL Assessment Information + + SQL Assessment Information copied + Copy OK - - SQL Assessment Information copied - - + << Back @@ -162,11 +151,22 @@ Error - - Warning - Information - + + Warning + + + + + SQL Assessment + + + SQL Assessment for Azure Data Studio (Preview) provides a mechanism to evaluate the configuration of SQL Server for best practices. 
+ + + SQL Assessment + + \ No newline at end of file diff --git a/resources/xlf/en/sql-database-projects.xlf b/resources/xlf/en/sql-database-projects.xlf index d675d523a2..d366b4b332 100644 --- a/resources/xlf/en/sql-database-projects.xlf +++ b/resources/xlf/en/sql-database-projects.xlf @@ -1,163 +1,11 @@ - - - Projects - - - Database Projects - - - Design and publish SQL database schemas - - - New Database Project - - - Open Database Project - - - Close Database Project - - - Build - - - Publish - - - Deploy - - - Create Project From Database - - - Properties - - - Schema Compare - - - Delete - - - Exclude from project - - - Validate External Streaming Job - - - Add Script - - - Add Pre-Deployment Script - - - Add Post-Deployment Script - - - Add Table - - - Add View - - - Add Stored Procedure - - - Add External Streaming Job - - - Add Item... - - - Add Folder - - - Add Database Reference - - - Open Containing Folder - - - Edit .sqlproj File - - - Change Target Platform - - - Database Projects - - - Full path to .NET Core SDK on the machine. - - - Whether to prompt the user to install .NET Core when not detected. - - - No database projects currently open. 
-[New Project](command:sqlDatabaseProjects.new) -[Open Project](command:sqlDatabaseProjects.open) -[Create Project From Database](command:sqlDatabaseProjects.importDatabase) - - - Add SQL Binding - - - - SQL Database - - - Develop and publish schemas for SQL databases starting from an empty project - - - SQL Edge - - - Start with the core pieces to develop and publish schemas for SQL Edge - - - Add Item - - - Schema Compare - - - Build - - - Publish - - - Change Target Platform - - - Status - - - Time - - - Date - - - Target Platform - - - Target Server - - - Target Database - Build History - - Publish History - - - Success + + Date Failed @@ -165,164 +13,71 @@ In progress - - hr + + Publish History - - min + + Status - - sec + + Success - - msec + + Target Database + + + Target Platform + + + Target Server + + + Time + + + Add reference + + + Add database reference + + + Add Item + + + Add Package + + + Would you like to update Azure Function local.settings.json with the new connection string? at - - Data Sources + + Browse folder - - Database References + + Browse... - - SQL connection string + + Browse for profile - - Yes + + Build - - No - - - No (default) - - - Ok - - - Select - - - dacpac Files - - - Publish Settings File - - - File - - - Flat - - - Object Type - - - Schema - - - Schema/Object Type - - - DatabaseProject - - - Location - - - Would you like to reload your database project? - - - New {0} name: - - - Are you sure you want to delete {0}? - - - Are you sure you want to delete {0} and all of its contents? - - - Are you sure you want to delete the reference to {0}? - - - Current target platform: {0}. Select new target platform - - - Target platform of the project {0} is now {1} - - - Publish project - - - Publish + + Schema compare could not start because build failed Cancel - - Generate Script + + Cannot resolve path {0} - - Database + + A reference to project '{0}' cannot be added. 
Adding this project as a reference would cause a circular dependency + + Change Target Platform Choose action @@ -330,180 +85,80 @@ Choose SQLCMD variables to modify - - Enter new value for variable '{0}' - - Reset all variables - - <Create New> - - Enter new database name - - {0} (new) - Name is the name of a new database being created - - Select database - - Done - - Name must not be empty - - Select where to deploy the project to - - Deploy to existing server - - Deploy to docker container - - Enter port number or press enter to use the default value - - Enter connection string environment variable name - - Enter connection string template - - Enter password or press enter to use the generated password - - Port must be a number - - Value cannot be empty - - Enter a template for SQL connection string - - Would you like to update Azure Function local.settings.json with the new connection string? - - Enter environment variable for SQL connection string - - Deploying SQL Db Project Locally - - Database project deployed successfully + + Circular reference from project {0} to project {1} Cleaning existing deployments... - - Creating deployment settings ... - - Building and running the docker container ... - - Docker container is not running - - Failed to run the docker container Connecting to SQL Server on Docker - - Failed to open a connection to the deployed database - - Failed to complete task '{0}'. Error: {1} - - Failed to deploy project. Check output pane for more details.
{0} - - - Failed to update app setting '{0}' - - - Updating app setting: '{0}' - Connection failed error: '{0}' - - Docker created id: '{0}' + + Connections - - Docker logs: '{0}' + + Connection string setting name - - Waiting for {0} seconds before another attempt for operation '{1}' + + Connection string setting specified in "local.settings.json" - - Running operation '{2}' Attempt {0} of {1} + + Create New - - Operation '{0}' completed successfully. Result: {1} + + Create - - Operation '{0}' failed. Re-trying... Current Result: {1}. Error: '{2}' + + Create project from database - - Operation '{0}' failed. Re-trying... Error: '{1}' + + Settings - - Add database reference + + Creating deployment settings ... - - Add reference + + Target platform of the project {0} is now {1} - - Type + + Dacpac file location is required for adding a reference to a database - - Project + + dacpac Files - - System database + + Dacpac references need to be located on the same drive as the project file. The project file is located at {0} Data-tier application (.dacpac) - - Select .dacpac + + Data Source - - Same database + + Data source - - Different database, same server + + Data sources - - Different database, different server + + Data Sources + + + Database location is required for adding a reference to a database Database name - - Database variable + + Database - - Server name - - - Server variable - - - Suppress errors caused by unresolved references in the referenced project - - - Example Usage - - - Enter a database name for this system database + + Database name is required for adding a reference to a different database A database name is required. The database variable is optional. @@ -514,252 +169,459 @@ Database project - - Dacpac references need to be located on the same drive as the project file. 
The project file is located at {0} - - - Reference type - - - Create project from database - - - Create - - - Source database - - - Target project - - - Settings - - - Name - - - Enter project name - - - Select location to create project - - - Browse folder - - - Select folder structure - - - Folder structure - - - Browse... - - - Select project location - - - The selected project location '{0}' does not exist or is not a directory. - - - There is already a directory named '{0}' in the selected location: '{1}'. - - - Multiple .sqlproj files selected; please select only one. - - - No .sqlproj file selected; please select one. - - - No {0} found - - - Missing 'version' entry in {0} - - - Unrecognized version: - - - Unknown data source type: - - - Invalid SQL connection string - - - Target information for extract is required to create database project. - - - Schema compare extension installation is required to run schema compare - - - Schema compare could not start because build failed - - - The targets, references, and system database references need to be updated to build this project. If the project is created in SSDT, it will continue to work in both tools. Do you want to update the project? - - - The system database references need to be updated to build this project. If the project is created in SSDT, it will continue to work in both tools. Do you want to update the project? 
+ + A reference to this database already exists in this project Database reference type is required for adding a reference to a database - - System database selection is required for adding a reference to a system database - - - Dacpac file location is required for adding a reference to a database - - - Database location is required for adding a reference to a database - - - Database name is required for adding a reference to a different database - - - Invalid DSP in .sqlproj file - - - Invalid database reference in .sqlproj file + + Database References Database selection is required to create a project from a database - - A reference to this database already exists in this project + + Database variable - - Items with absolute path outside project folder are not supported. Please make sure the paths in the project file are relative to project folder. + + There is already a directory named '{0}' in the selected location: '{1}'. - - Cannot access parent of provided tree item + + The selected project location '{0}' does not exist or is not a directory. - - To successfully build, update the project to have one pre-deployment script and/or one post-deployment script + + default - - Cannot access provided database project. Only valid, open database projects can be reloaded. - - - Validation of external streaming job passed. - - - Project '{0}' is already opened. - - - A project named {0} already exists in {1}. - - - File {0} doesn't exist - - - File or directory '{0}' doesn't exist - - - Cannot resolve path {0} - - - A file with the name '{0}' already exists on disk at this location. Please choose another name. - - - A folder with the name '{0}' already exists on disk at this location. Please choose another name. - - - A folder with the name '{0}' already exists on disk at this location. Please choose another location. 
Invalid input: {0}

Invalid value specified for the property '{0}' in .sqlproj file

Unable to construct connection: {0}

Circular reference from project {0} to project {1}

Error finding build files location: {0}

Build failed. Check output pane for more details. {0}

Unable to establish project context. Command invoked from unexpected location: {0}

Unable to locate '{0}' target: '{1}'

Unable to find {1} with path '{0}'

A {0} script already exists. The new script will not be included in build.

The variable name '{0}' is not valid.

A reference to project '{0}' cannot be added. Adding this project as a reference would cause a circular dependency

Unable to find SQLCMD variable '{0}'

Unable to find database reference {0}

Specified GUID is invalid: {0}

Invalid target platform: {0}. Supported target platforms: {1}

DatabaseProject

Delete

Are you sure you want to delete {0}?

Are you sure you want to delete {0} and all of its contents?

Are you sure you want to delete the reference to {0}?

Failed to update app setting '{0}'

Updating app setting: '{0}'

Deploying SQL Db Project Locally

Failed to deploy project. Check output pane for more details. {0}

Failed to open a connection to the deployed database

Database project deployed successfully

A {0} script already exists. The new script will not be included in build. 
+ + + Deploy to docker container + + + Deploy to existing server + + + Different database, different server + + + Different database, same server + + + Docker created id: '{0}' + + + Failed to run the docker container + + + Docker container is not running + + + Docker logs: '{0}' + + + Done + + + Don't use profile + + + Start with the core pieces to develop and publish schemas for SQL Edge + + + SQL Edge + + + Develop and publish schemas for SQL databases starting from an empty project + + + SQL Database + + + Enter a template for SQL connection string + + + Enter connection string environment variable name + + + Enter environment variable for SQL connection string + + + Enter connection string template + + + Enter new database name + + + Enter new value for variable '{0}' + + + Enter password or press enter to use the generated password + + + Enter port number or press enter to use the default value + + + Enter a database name for this system database + + + Error finding build files location: {0} + + + Example Usage + Exclude - - file - - - folder - - - Folder - - - Script - - - Table - - - View - - - Stored Procedure - - - Data Source - - - File Format - External Stream External Streaming Job - - Script.PreDeployment + + Validation of external streaming job passed. + + + Target information for extract is required to create database project. + + + File + + + A file with the name '{0}' already exists on disk at this location. Please choose another name. + + + File Format + + + file + + + File or directory '{0}' doesn't exist + + + Flat + + + A folder with the name '{0}' already exists on disk at this location. Please choose another name. + + + A folder with the name '{0}' already exists on disk at this location. Please choose another location. 
Folder

folder

Folder structure

Generate Script

hr

Input

Invalid DSP in .sqlproj file

Invalid database reference in .sqlproj file

Specified GUID is invalid: {0}

Invalid input: {0}

Cannot access provided database project. Only valid, open database projects can be reloaded.

Invalid value specified for the property '{0}' in .sqlproj file

Invalid SQL connection string

Invalid target platform: {0}. Supported target platforms: {1}

Load profile...

Location

min

Missing 'version' entry in {0}

msec

Multiple .sqlproj files selected; please select only one.

Name must not be empty

New

New {0} name:

No Azure functions in the current active file

No Azure functions projects found in the workspace

No {0} found

No data sources in this project

File {0} doesn't exist

No .sqlproj file selected; please select one.

No

No (default)

The variable name '{0}' is not valid.

Object Type

Ok

Output

Items with absolute path outside project folder are not supported. Please make sure the paths in the project file are relative to project folder.

Cannot access parent of provided tree item

Port must be a number

Script.PostDeployment

Script.PreDeployment

To successfully build, update the project to have one pre-deployment script and/or one post-deployment script

Profile

Error loading the publish profile. {0}

Build failed. Check output pane for more details. {0}

A project named {0} already exists in {1}.

Project '{0}' is already opened.

Project

Select location to create project

Name

Enter project name

Publish

Publish

Publish project

Publish Settings File

Type

Reference type

Would you like to reload your database project? 
+ + + Reload values from project + + + Reset all variables + + + Operation '{0}' failed. Re-trying... Current Result: {1}. Error: '{2}' + + + Operation '{0}' failed. Re-trying... Error: '{1}' + + + Running operation '{2}' Attempt {0} of {1} + + + Operation '{0}' completed successfully. Result: {1} + + + Waiting for {0} seconds before another attempt for operation '{1}' + + + Building and running the docker container ... + + + Same database + + + Schema + + + Schema Compare + + + Schema compare extension installation is required to run schema compare + + + Schema/Object Type + + + Script + + + sec + + + Select an Azure function in the current file to add SQL binding to + + + Select type of binding + + + Select connection + + + Select .dacpac + + + Select database + + + Select where to deploy the project to + + + Select folder structure + + + Select Profile + + + Select publish profile to load + + + Select project location + + + Select + + + Current target platform: {0}. Select new target platform + + + Server + + + Server name + + + Server variable + + + Source database + + + SQLCMD Variables + + + Value + + + Name + + + SQL connection string + + + Install + The .NET Core SDK cannot be located. Project build will not work. Please install .NET Core SDK version 3.1 or update the .NET Core SDK location in settings if already installed. 
@@ -769,58 +631,201 @@ Update Location - - Install - Don't Ask Again Database Projects - - Input - - - Output - - - Select type of binding - - - Select an Azure function in the current file to add SQL binding to - SQL object to query SQL table to upsert into - - Connection string setting name + + Stored Procedure - - Connection string setting specified in "local.settings.json" + + Suppress errors caused by unresolved references in the referenced project - - No Azure functions in the current active file + + System database - + + System database selection is required for adding a reference to a system database + + + Table + + + Connection + + + Target project + + + Failed to complete task '{0}'. Error: {1} + + + Unable to construct connection: {0} + + + Unable to find {1} with path '{0}' + + + Unable to find database reference {0} + + + Unable to find SQLCMD variable '{0}' + + + Unable to locate '{0}' target: '{1}' + + + Unable to establish project context. Command invoked from unexpected location: {0} + + + Unknown data source type: + + + Unrecognized version: + + + The system database references need to be updated to build this project. If the project is created in SSDT, it will continue to work in both tools. Do you want to update the project? + + + The targets, references, and system database references need to be updated to build this project. If the project is created in SSDT, it will continue to work in both tools. Do you want to update the project? 
+ + + Value cannot be empty + + + View + + + Yes + + >>> {0} … errored out: {1} + + stderr: + + + stdout: + >>> {0} … exited with code: {1} >>> {0} … exited with signal: {1} - - stdout: + + + + Database Projects - - stderr: + + Add Database Reference - + + Add SQL Binding + + + Build + + + Change Target Platform + + + Close Database Project + + + Create Project From Database + + + Delete + + + Deploy + + + Design and publish SQL database schemas + + + Database Projects + + + Edit .sqlproj File + + + Exclude from project + + + Whether to prompt the user to install .NET Core when not detected. + + + Full path to .NET Core SDK on the machine. + + + New Database Project + + + Add External Streaming Job + + + Add Folder + + + Add Item... + + + Add Post-Deployment Script + + + Add Pre-Deployment Script + + + Add Script + + + Add Stored Procedure + + + Add Table + + + Add View + + + Open Database Project + + + Open Containing Folder + + + Properties + + + Publish + + + Schema Compare + + + Validate External Streaming Job + + + No database projects currently open. 
+[New Project](command:sqlDatabaseProjects.new) +[Open Project](command:sqlDatabaseProjects.open) +[Create Project From Database](command:sqlDatabaseProjects.importDatabase) + + + Projects + + \ No newline at end of file diff --git a/resources/xlf/en/sql-migration.xlf b/resources/xlf/en/sql-migration.xlf index b1e06ec6e8..bc2408c8de 100644 --- a/resources/xlf/en/sql-migration.xlf +++ b/resources/xlf/en/sql-migration.xlf @@ -1,61 +1,38 @@ - - - Azure SQL Migration - - - Azure SQL migration description - - - Open Azure SQL migration notebooks - - - Azure SQL Migration - - - Migration Tasks - - - Azure SQL Migration - - - Migrate to Azure SQL - - - Feedback - - - New support request - - - Migration Context Menu - - - Complete cutover - - - Database details - - - Azure SQL Target details - - - Database Migration Service details - - - Copy migration details - - - Cancel migration - - + + An error occurred while accessing the selected account '{0}'. Select 'Link account' and refresh the account, or select a different account. Error '{1}' + + + The access token for selected account '{0}' is no longer valid. Select 'Link account' and refresh the account, or select a different account. + Migrate '{0}' to Azure SQL - - This is a blocking issue that will prevent the database migration from succeeding. + + Are you sure you want to cancel this migration? + + + Select the databases that you want to migrate to Azure SQL. + + + Active backup files + + + Active backup files (1 item) + + + Active backup files ({0} items) + + + All fields are required. + + + Apply + + + Assessment results for '{0}' Assessment in progress @@ -65,365 +42,14 @@ This may take some time. - - Azure SQL target - - - Based on the assessment results, all {0} of your databases in an online state can be migrated to Azure SQL. - - - An error occurred while assessing the server '{0}'. 
Refresh assessment

Choose your Azure SQL target

Subscription name for your Azure SQL target

Azure region for your Azure SQL target

Resource group for your Azure SQL target

Your Azure SQL target resource name

Azure SQL Managed Instance (PaaS)

SQL Server on Azure Virtual Machine (IaaS)

Select your target Azure subscription and your target Azure SQL Managed Instance

Select your target Azure subscription and your target SQL Server on Azure Virtual Machine.

To migrate to Azure SQL Managed Instance (PaaS), view assessment results and select one or more databases.

To migrate to SQL Server on Azure Virtual Machine (IaaS), view assessment results and select one or more databases.

View/Select

{0} of {1} databases selected.

To continue, select a target database.

Select the databases to migrate.

We have completed the assessment of your SQL Server instance '{0}'.

Assessment results for '{0}'

{0} out of {1} databases can be migrated

Databases that are not ready for migration to Azure SQL Managed Instance can be migrated to SQL Server on Azure Virtual Machines.

Databases

Assessment results

SQL Server instance

This is a blocking issue that will prevent the database migration from succeeding.

Azure account

Select an Azure account linked to Azure Data Studio, or link one now.

Add a linked account and then try again.

Link account

{0} account linked

{0} accounts linked

Azure AD tenant

The access token for selected account '{0}' is no longer valid. Select 'Link account' and refresh the account, or select a different account.

An error occurred while accessing the selected account '{0}'. Select 'Link account' and refresh the account, or select a different account. Error '{1}'

Database backup

Select the location of the database backups to use during migration. 
- - - My database backups are on a network share - - - My database backups are in an Azure Storage Blob Container - - - Network share details - - - Network share path for your database backups. The migration process will automatically retrieve valid backup files from this network share. - - - Windows user account with read access to the network share location. - - - Provide the network share location where the backups are stored, and the user credentials used to access the share. - - - Enter target database name for the selected source databases. - - - Network share location where the backups are stored - - - Ensure that the service account running the source SQL Server instance has read privileges on the network share. - - - Windows user account with read access to the network share location - - - Password - - - Enter password. - - - Storage account details - - - Provide the Azure Storage account where the backups will be uploaded to. - - - Select a unique name for this target database - - - Database '{0}' already exists on the target managed instance '{1}'. - - - Azure Storage Blob Container details - - - Provide the Azure Storage Blob Container that contains the backups. - - - Enter target database name and select resource group, storage account and container for the selected source databases. - - - Select the subscription that contains the storage account. - - - Migration mode - - - To migrate to the Azure SQL target, choose a migration mode based on your downtime requirements. - - - Online migration - - - Application downtime is limited to cutover at the end of migration. - - - Offline migration - - - Application downtime will start when the migration starts. - - - \\Servername.domainname.com\Backupfolder - - - Domain\username - - - No subscription found. - - - No location found. - - - No storage account found. - - - No file shares found. - - - No blob containers found. - - - No blob files found. - - - To continue, select a valid subscription. 
To continue, select a valid location.

To continue, select a valid storage account.

To continue, select a valid resource group for source database '{0}'.

To continue, select a valid storage account for source database '{0}'.

To continue, select a valid blob container for source database '{0}'.

To continue, select a valid last backup file for source database '{0}'.

Invalid network share location format. Example: {0}

Invalid user account format. Example: {0}

Enter a valid name for the target database.

Provide a unique container for each target database. Databases affected:

Enter the Windows Authentication credentials used to connect to SQL Server instance {0}. These credentials will be used to connect to the SQL Server instance and identify valid backup files.

Enter the SQL Authentication credentials used to connect to SQL Server instance {0}. These credentials will be used to connect to the SQL Server instance and identify valid backup files.

Select a resource group value first.

Select a storage account value first.

Select a blob container value first.

Azure Database Migration Service

Azure Database Migration Service orchestrates database migration activities and tracks their progress. You can select an existing Database Migration Service as an Azure SQL target if you have created one previously, or create a new one below.

No Database Migration Service found. Create a new one.

Create new

Select a valid Database Migration Service.

Select a Database Migration Service that is connected to a node.

Authentication keys

Azure Database Migration Service "{0}" details:

Any existing Azure Database Migration Service in the Azure portal does not appear in Azure Data Studio. Any Database Migration Service created in Azure Data Studio will not be visible in the Azure portal yet. 
Database Migration Service authentication keys

Create Azure Database Migration Service

Subscription name for your Azure Database Migration Service.

Azure region for your Azure Database Migration Service. This should be the same region as your target Azure SQL.

Resource group for your Azure Database Migration Service.

Azure Database Migration Service name.

Azure SQL target selected as default.

Enter the information below to add a new Azure Database Migration Service.

Loading Migration Services

Set up integration runtime

Azure Database Migration Service leverages Azure Data Factory's self-hosted integration runtime to upload backups from the on-premises network file share to Azure.

Follow the instructions below to set up the self-hosted integration runtime.

Step 1: {0}

Download and install integration runtime

Step 2: Use this key to register your integration runtime

Step 3: Click the 'Test connection' button to check the connection between Azure Database Migration Service and integration runtime

Connection status

Key 1

Key 2

Key 1 copied

Key 2 copied

Refresh key 1

Refresh key 2

Copy key 1

Copy key 2

Authentication key

@@ -431,230 +57,599 @@ This may take some time.

Authentication key '{0}' has been refreshed.

Authentication type

Azure Database Migration Service is not registered. Azure Database Migration Service '{0}' needs to be registered with self-hosted integration runtime on any node.

Azure Database Migration Service '{0}' is connected to self-hosted integration runtime running on the node - {1}

No resource groups found.

To continue, select a valid resource group.

Enter a valid name for the Migration Service.

No Migration Services found. To continue, create a new one. 
- - - An error occurred while refreshing the migration service creation status. - - - Azure SQL Managed Instance - - - No managed instance found. - - - No virtual machine found. - - - A resource group is a container that holds related resources for an Azure solution. - - - OK - - - (new) {0} - - - Test connection - - - Successfully created a Database Migration Service. - - - Failed to provision a Database Migration Service. Wait a few minutes and then try again. - - - Apply - - - Creating resource group - - - Resource group created - - - Name of new resource group - - - Comparison of the actual amount of data read from the source and the actual amount of data uploaded to the target. - - - Data movement throughput achieved during the migration of your database backups to Azure. This is the rate of data transfer, calculated by data read divided by duration of backups migration to Azure. - - - Learn more - - - Learn more about things you need before starting a migration. - - - Subscription - - - Storage account - - - Resource group - - - Name - - - Location - - - Refresh - - - Create - - - Cancel - - - Type - - - User account - - - View all - - - Target + + Authentication keys Azure SQL - - Close - - - Data uploaded / size - - - Copy throughput (MBPS) - - - New support request - - - Impact - - - All fields are required. 
- - - Summary - - + Azure SQL Managed Instance - - SQL Server on Azure Virtual Machine + + SQL Server on Azure Virtual Machines - - Databases for migration - - - Azure storage subscription - - - Azure storage - - - Network share - - - Blob container - - - Last backup file - - - Blob container resource group - - - Blob container storage account - - - Source databases - - - Mode - - - Backup location + + Target type Azure Storage account to upload backups - - Self-hosted integration runtime node + + Azure AD tenant - - Database to be migrated + + Backup location - - {0} database + + Backup start time + + + A SQL Managed Instance migration cutover to the Business Critical service tier can take significantly longer than General Purpose because three secondary replicas have to be seeded for Always On High Availability group. The duration of the operation depends on the size of the data. Seeding speed in 90% of cases is 220 GB/hour or higher. + + + Blob container resource group + + + Last backup file + + + Select a blob container value first. + + + Blob container storage account + + + Blob container + + + Select a resource group value first. + + + Azure Storage Blob Container details + + + Provide the Azure Storage Blob Container that contains the backups. + + + Select the subscription that contains the storage account. + + + Enter target database name and select resource group, storage account and container for the selected source databases. + + + Select a storage account value first. + + + {0} out of {1} databases can be migrated + + + Cancel + + + An error occurred while canceling the migration. + + + Cancel migration + + + Migration is not in progress and cannot be canceled. + + + Migration is not in progress and cannot be cutover. + + + The cutover process cannot start until all the migrations are done. To return the latest file status, refresh your browser window. 
Close

Complete cutover

Completing cutover without restoring all the backups may result in data loss.

I confirm there are no additional log backups to provide and want to complete cutover.

Connection status

Copy key 1

Copy key 2

Copy migration details

Copy throughput (MBPS)

Data movement throughput achieved during the migration of your database backups to Azure. This is the rate of data transfer, calculated by data read divided by duration of backups migration to Azure.

{0} databases

{0} database

Create

Create new

Creating resource group

Migration cutover

Completing cutover

Perform the following steps before you complete cutover.

Cutover in progress for database '{0}'

Last file restored: {0}

Log backups pending restore: {0}

An error occurred while refreshing the migration status.

1. Stop all incoming transactions to the source database.

2. Create a final transaction log differential or backup and store it in the Azure Storage Blob Container.

2. Create a final transaction log backup and store it on the network share.

3. Verify that all backups have been restored on the target database. The "Log backups pending restore" value should be zero.

3. Verify that all log backups have been restored on the target database. The "Log backups pending restore" value should be zero.

Migration mode

Determine the migration readiness of your SQL Server instances, identify a recommended Azure SQL target, and complete the migration of your SQL Server instance to Azure SQL Managed Instance or SQL Server on Azure Virtual Machines. 
Help articles and video links

Assessment rules used to determine the feasibility of migrating your SQL Server instance to Azure SQL Managed Instance.

Assessment rules for Azure SQL Managed Instance

Migrate to Azure SQL

Migrate a SQL Server instance to Azure SQL.

Azure SQL Migration

Comparison of the actual amount of data read from the source and the actual amount of data uploaded to the target.

Data uploaded / size

Database

Database '{0}' already exists on the target managed instance '{1}'.

Databases for migration

To migrate to the Azure SQL target, choose a migration mode based on your downtime requirements.

Migration mode

Application downtime will start when the migration starts.

Offline migration

Application downtime is limited to cutover at the end of migration.

Online migration

Database Migration Service

Database Migration Service authentication keys

Successfully created a Database Migration Service.

Database migration status

Select the location of the database backups to use during migration.

Database backup

Database to be migrated

Databases ({0}/{1})

{0}/{1} Databases selected

Databases

{0} day

{0} days

Description

Details copied

Any existing Azure Database Migration Service in the Azure portal does not appear in Azure Data Studio. Any Database Migration Service created in Azure Data Studio will not be visible in the Azure portal yet.

Failed to provision a Database Migration Service. Wait a few minutes and then try again.

Duration

If results were expected, verify the connection to the SQL Server instance. 
+ + + No backup files + + + Enter the credentials for the source SQL Server instance. These credentials will be used while migrating databases to Azure SQL. + + + Migrations failed + + + Feedback on the migration experience + + + File name + + + Finish time + + + First LSN + + + Full backup files + + + We have completed the assessment of your SQL Server instance '{0}'. + + + {0} hr + + + {0} hrs + + + Impact + + + Name: {0} + + + Type: {0} + + + Impacted objects + + + Inline migration + + + {0} databases have warnings + + + {0} database has warnings + + + Instance + + + To continue, select a valid blob container for source database '{0}'. + + + To continue, select a valid last backup file for source database '{0}'. + + + To continue, select a valid resource group for source database '{0}'. + + + To continue, select a valid storage account for source database '{0}'. + + + To continue, select a valid location. + + + Select a valid Database Migration Service. + + + Select a Database Migration Service that is connected to a node. + + + Invalid network share location format. Example: {0} + + + To continue, select a valid resource group. + + + Enter a valid name for the Migration Service. + + + To continue, select a valid storage account. + + + To continue, select a valid subscription. + + + Enter a valid name for the target database. + + + Invalid user account format. Example: {0} + + + Azure Database Migration Service orchestrates database migration activities and tracks their progress. You can select an existing Database Migration Service as an Azure SQL target if you have created one previously, or create a new one below. + + + No Database Migration Service found. Create a new one. 
+ + + Azure Database Migration Service + + + Step 1: {0} + + + Step 2: Use this key to register your integration runtime + + + Step 3: Click on 'Test connection' button to check the connection between Azure Database Migration Service and integration runtime + + + Issues + + + Issues ({0}) + + + Issue details + + + Key 1 copied + + + Key 1 + + + Key 2 copied + + + Key 2 + + + Last LSN + + + Last applied backup files + + + Last applied backup files taken on + + + Last applied LSN + + + Last backup + + + Last scan completed: {0} + + + Learn more + + + Learn more about things you need before starting a migration. + + + Location + + + Azure SQL Managed Instance + + + Migrations completed + + + Database migrations in progress + + + Migrations not started + + + Migration status + + + Migration status filter + + + {0} min + + + {0} mins + + + Mode + + + More info + + + Name + + + Names: + + + Name of new resource group + + + My database backups are in an Azure Storage Blob Container + + + My database backups are on a network share + + + Network share + + + Storage account details + + + Provide the Azure Storage account where the backups will be uploaded to. + + + Network share details + + + Provide the network share location where the backups are stored, and the user credentials used to access the share. + + + Network share path for your database backups. The migration process will automatically retrieve valid backup files from this network share. + + + Network share location where the backups are stored + + + Password + + + Enter password. + + + \\Servername.domainname.com\Backupfolder + + + Enter target database name for the selected source databases. + + + Windows user account with read access to the network share location. + + + Windows user account with read access to the network share location + + + (new) {0} + + + New support request + + + No + + + No blob containers found. + + + No blob files found. + + + No file shares found. 
+ + + No issues found for migrating to SQL Server on Azure SQL Managed Instance. + + + No issues found for migrating to SQL Server on Azure Virtual Machine. + + + No location found. + + + No managed instance found. + + + No pending backups. Click refresh to check current status. + + + No storage account found. + + + No subscription found. + + + No virtual machine found. + + + Failed to open the migration notebook. + + + Object details + + + Offline + + + OFFLINE + + + OK + + + Online + + + Download and install integration runtime + + + User account Azure account details @@ -665,308 +660,86 @@ This may take some time. Backup location details - - Database migrations in progress + + Things you need before starting a migration: - - Migrations failed + + Provide a unique container for each target database. Databases affected: - - Migrations completed + + Select the operation you'd like to perform. - - Completing cutover + + Recommendation - - Migrations not started + + Refresh - - Show status + + Refresh assessment - - {0} database has warnings + + Refresh key 1 - - {0} databases have warnings + + Refresh key 2 - - Feedback on the migration experience + + A resource group is a container that holds related resources for an Azure solution. - - Migration cutover + + No resource groups found. - - Complete cutover + + Resource group - - Source database name + + Resource group created - - Source server - - - Source version - - - Target database name - - - Target server - - - Target version - - - Migration status - - - Migration status filter - - - Full backup files - - - Last applied LSN - - - Last applied backup files - - - Last applied backup files taken on - - - Active backup files - - - An error occurred while refreshing the migration status. - - - An error occurred while canceling the migration. - - - Status - - - Backup start time - - - First LSN - - - Last LSN - - - The cutover process cannot start until all the migrations are done. 
To return the latest file status, refresh your browser window. - - - Azure SQL Managed Instance - - - SQL Server on Azure Virtual Machines - - - Cancel migration - - - Active backup files (1 item) - - - Active backup files ({0} items) - - - Copy migration details - - - Details copied - - - Are you sure you want to cancel this migration? - - - Yes - - - No - - - No backup files - - - If results were expected, verify the connection to the SQL Server instance. - - - Completing cutover without restoring all the backups may result in data loss. - - - A SQL Managed Instance migration cutover to the Business Critical service tier can take significantly longer than General Purpose because three secondary replicas have to be seeded for the Always On availability group. The duration of the operation depends on the size of the data. Seeding speed in 90% of cases is 220 GB/hour or higher. - - - Perform the following steps before you complete cutover. - - - 1. Stop all incoming transactions to the source database. - - - 2. Create a final transaction log backup and store it on the network share. - - - 2. Create a final transaction log differential or backup and store it in the Azure Storage Blob Container. - - - 3. Verify that all log backups have been restored on the target database. The "Log backups pending restore" value should be zero. - - - 3. Verify that all backups have been restored on the target database. The "Log backups pending restore" value should be zero. - - - Last file restored: {0} - - - Last scan completed: {0} - - - Log backups pending restore: {0} - - - I confirm there are no additional log backups to provide and want to complete cutover. - - - Cutover in progress for database '{0}' - - - Migration is not in progress and cannot be canceled. - - - Migration is not in progress and cannot be cut over. - - - File name - - - Size - - - No pending backups. Click refresh to check current status. 
- - - Status: All - - - Status: Ongoing - - - Status: Completing - - - Status: Succeeded - - - Status: Failed + + Search Search for migrations - - Online - - - Offline - - - Database - - - Database Migration Service - - - Duration - - - Target type - - - SQL Managed Instance - - - SQL Virtual Machine - - - Target name - - - Migration mode - - - Start time - - - Finish time - - - {0} ( - - - {0} - - - In progress - - - Succeeded - - - Creating - - - Completing - - - Canceling - - - Failed - - - {0} Warning) - - - {0} Warnings) - - - {0} Error) - - - {0} Errors) - - - {0} hrs - - - {0} hr - - - {0} days - - - {0} day - - - {0} mins - - - {0} min - {0} sec - - Azure Database Migration Service + + Select your target Azure subscription and your target Azure SQL Managed Instance - - Close + + Select your target Azure subscription and your target SQL Server on Azure Virtual Machine. - - Self-hosted integration runtime node + + Please select 1 or more databases to assess for migration + + + Select the databases to migrate. + + + Click the SQL Server instance or any of the databases on the left to view its details. + + + To continue, select a target database. + + + Server + + + Ensure that the service account running the source SQL Server instance has read privileges on the network share. + + + Azure Database Migration Service leverages Azure Data Factory's self-hosted integration runtime to upload backups from an on-premises network file share to Azure. + + + Follow the instructions below to set up the self-hosted integration runtime. + + + Set up integration runtime + + + Loading Migration Services Authentication keys @@ -974,127 +747,354 @@ This may take some time. Authentication keys used to connect to the self-hosted integration runtime node + + Close + + + Azure Database Migration Service + + + Self-hosted integration runtime node + -- unavailable -- + + Azure Database Migration Service "{0}" details: + + + No Migration Services found. 
To continue, create a new one. + + + Azure Database Migration Service is not registered. Azure Database Migration Service '{0}' needs to be registered with self-hosted integration runtime on any node. + + + Azure Database Migration Service '{0}' is connected to self-hosted integration runtime running on the node - {1} + + + An error occurred while refreshing the migration service creation status. + + + Enter the information below to add a new Azure Database Migration Service. + + + Create Azure Database Migration Service + + + Azure region for your Azure Database Migration Service. This should be the same region as your target Azure SQL. + + + Azure Database Migration Service name. + + + Resource group for your Azure Database Migration Service. + + + Subscription name for your Azure Database Migration Service. + + + Azure SQL target selected as default. + + + Self-hosted integration runtime node + + + Show status + + + Size (MB) + + + Size + + + Azure region for your Azure SQL target + + + Azure SQL Managed Instance (PaaS) + + + To migrate to Azure SQL Managed Instance (PaaS), view assessment results and select one or more databases. + + + To migrate to SQL Server on Azure Virtual Machine (IaaS), view assessment results and select one or more databases. + + + Your Azure SQL target resource name + + + Resource group for your Azure SQL target + + + Subscription name for your Azure SQL target + + + SQL Server on Azure Virtual Machine (IaaS) + Source configuration Source credentials - - Enter the credentials for the source SQL Server instance. These credentials will be used while migrating databases to Azure SQL. + + Source database name - - Server + + Source databases - - Username + + Enter the SQL Authentication credentials used to connect to SQL Server instance {0}. ​These credentials will be used to connect to the SQL Server instance and identify valid backup files. 
- - Size (MB) + + Enter the Windows Authentication credentials used to connect to SQL Server instance {0}. These credentials will be used to connect to the SQL Server instance and identify valid backup files. - - Last backup + + Source server - - Databases for migration + + Source version - - Select the databases that you want to migrate to Azure SQL. + + SQL migration assessment - - OFFLINE + + SQL Managed Instance - - Please select 1 or more databases to assess for migration + + SQL Server instance - - Issues + + SQL Virtual Machine - - Search + + Start time - - Instance + + Status - - Warnings + + Canceling - - Impacted objects + + Completing - - Object details + + Creating - - Assessment results + + Status: All - - Type: + + Status: Completing - - Names: + + Status: Failed - - Description + + Status: Ongoing - - Recommendation + + Status: Succeeded - - More info + + {0} Errors) - - Target platform + + {0} - - Warnings details + + {0} Error) - - Issue details + + {0} ( - - Click on SQL Server instance or any of the databases on the left to view its details. + + Failed - - No issues found for migrating to SQL Server on Azure Virtual Machine. - - - No issues found for migrating to SQL Server on Azure SQL Managed Instance. 
- - - Type: {0} - - - Name: {0} - - - Databases ({0}/{1}) - - - {0}/{1} Databases selected - - - Issues ({0}) - - - Warnings ({0}) - - - Authentication type + + In progress Refresh - - - - Starting migration for database {0} to {1} - {2} + + Succeeded + + {0} Warnings) + + + {0} Warning) + + + Storage account + + + Subscription + + + Azure storage + + + Azure storage subscription + + + Databases for migration + + + Azure SQL Managed Instance + + + Summary + + + SQL Server on Azure Virtual Machine + + + Target + + + Target name + + + Target database name + + + Target platform + + + Target server + + + Target version + + + Test connection + + + Type + + + Type: + + + Select a unique name for this target database + + + Username + + + View all + + + View/Select + + + Warnings + + + Warnings ({0}) + + + Warnings details + + + Domain\username + + + Link account + + + {0} accounts linked + + + {0} account linked + + + Select an Azure account linked to Azure Data Studio, or link one now. + + + Add a linked account and then try again. + + + Azure account + + + Based on the assessment results, all {0} of your databases in an online state can be migrated to Azure SQL. + + + An error occurred while assessing the server '{0}'. + + + Choose your Azure SQL target + + + Azure SQL target + + + Yes + + + {0} of {1} databases selected. 
+ + + An error occurred while starting the migration: '{0}' - + + Starting migration for database {0} to {1} - {2} + + + + + Cancel migration + + + Complete cutover + + + Copy migration details + + + Database details + + + Azure SQL migration description + + + Azure SQL Migration + + + Azure SQL Migration + + + Migration Context Menu + + + Migration Tasks + + + Azure SQL Migration + + + Open Azure SQL migration notebooks + + + New support request + + + Feedback + + + Migrate to Azure SQL + + + Database Migration Service details + + + Azure SQL Target details + + \ No newline at end of file diff --git a/resources/xlf/en/sql.xlf b/resources/xlf/en/sql.xlf index c7f4896937..6829566c86 100644 --- a/resources/xlf/en/sql.xlf +++ b/resources/xlf/en/sql.xlf @@ -1,13 +1,13 @@ - - Loading - Loading completed - + + Loading + + Hide text labels @@ -15,12 +15,12 @@ Show text labels - + Close - + Hide properties @@ -28,45 +28,49 @@ Show Properties - + Error: {0} - - Warning: {0} - Info: {0} - + + Warning: {0} + + no data available - + Select/Deselect All - + + + Cancel + + + Clear + + + OK + Show Filter - - Select All + + Filter Options Search - - {0} Results - This tells the user how many items are shown in the list. Currently not visible, but read by screen readers. - - - {0} Selected - This tells the user how many items are selected in the list + + Select All Sort Ascending @@ -74,49 +78,99 @@ Sort Descending - - OK + + {0} Selected + This tells the user how many items are selected in the list - - Clear + + {0} Results + This tells the user how many items are shown in the list. Currently not visible, but read by screen readers. - - Cancel - - - Filter Options - - + Loading - + Loading Error... - + Toggle More - + + + Your extension policy does not allow installing extensions. Please change your extension policy and try again. + + + Completed installing {0} extension from VSIX. Please reload Azure Data Studio to enable it. 
+ + + Please reload Azure Data Studio to complete reinstalling the extension {0}. + + + In order to use Azure Data Studio in {0}, Azure Data Studio needs to restart. + Azure Data Studio + + The connection dialog's browse tree context menu + + + The dashboard toolbar action menu + + + The dataexplorer view container title action menu + + + The dataexplorer item context menu + + + The data grid item context menu + Enable automatic update checks. Azure Data Studio will check for updates automatically and periodically. + + Please reload Azure Data Studio to enable this extension locally. + + + Please reload Azure Data Studio to enable this extension in {0}. + Enable to download and install new Azure Data Studio Versions in the background on Windows - - Show Release Notes after an update. The Release Notes are opened in a new web browser window. + + Sets the security policy for downloading extensions. - - The dashboard toolbar action menu + + Unable to install extension '{0}' as it is not compatible with Azure Data Studio '{1}'. + + + Installing extension {0} is completed. Please reload Azure Data Studio to enable it. + + + Controls the memory available to Azure Data Studio after restart when trying to open large files. Same effect as specifying `--max-memory=NEWSIZE` on the command line. + + + &&New Notebook + && denotes a mnemonic + + + New &&Query + && denotes a mnemonic + + + Install Extension from VSIX Package + && denotes a mnemonic + + + New Query The notebook cell title menu @@ -127,29 +181,14 @@ The notebook toolbar menu - - The dataexplorer view container title action menu - - - The dataexplorer item context menu - The object explorer item context menu - - The connection dialog's browse tree context menu + + Please reload Azure Data Studio to disable this extension. - - The data grid item context menu - - - Sets the security policy for downloading extensions. - - - Your extension policy does not allow installing extensions. 
Please change your extension policy and try again. - - - Completed installing {0} extension from VSIX. Please reload Azure Data Studio to enable it. + + Please reload Azure Data Studio to enable this extension. Please reload Azure Data Studio to complete the uninstallation of this extension. @@ -157,67 +196,28 @@ Please reload Azure Data Studio to enable the updated extension. - - Please reload Azure Data Studio to enable this extension locally. - - - Please reload Azure Data Studio to enable this extension. - - - Please reload Azure Data Studio to disable this extension. - - - Please reload Azure Data Studio to complete the uninstallation of the extension {0}. - - - Please reload Azure Data Studio to enable this extension in {0}. - - - Installing extension {0} is completed. Please reload Azure Data Studio to enable it. - - - Please reload Azure Data Studio to complete reinstalling the extension {0}. - Marketplace The scenario type for extension recommendations must be provided. - - Unable to install extension '{0}' as it is not compatible with Azure Data Studio '{1}'. + + Show Release Notes after an update. The Release Notes are opened in a new web browser window. - - New Query - - - New &&Query - && denotes a mnemonic - - - &&New Notebook - && denotes a mnemonic - - - Controls the memory available to Azure Data Studio after restart when trying to open large files. Same effect as specifying `--max-memory=NEWSIZE` on the command line. + + Please reload Azure Data Studio to complete the uninstallation of the extension {0}. Would you like to change Azure Data Studio's UI language to {0} and restart? - - In order to use Azure Data Studio in {0}, Azure Data Studio needs to restart. + + New Notebook New SQL File - - New Notebook - - - Install Extension from VSIX Package - && denotes a mnemonic - - + Must be an option from the list @@ -225,51 +225,51 @@ Select Box - + - - Add an account - - - Remove account - - - Are you sure you want to remove '{0}'? 
- - - Yes + + There is no account to refresh No - - Failed to remove account + + Yes + + + Add an account Apply Filters + + Are you sure you want to remove '{0}'? + Reenter your credentials - - There is no account to refresh + + Remove account - + + Failed to remove account + + Copying images is not supported - + A server group with the same name already exists. - + Widget used in the dashboards - + Widget used in the dashboards @@ -280,33 +280,42 @@ Widget used in the dashboards - + - - Saving results into different format disabled for this data provider. - Cannot serialize data as no provider has been registered + + Saving results into different format disabled for this data provider. + Serialization failed with an unknown error - + - - The border color of tiles - - - The tile box shadow color - The button dropdown background hover color The button dropdown box shadow color - - The extension pack header text shadowcolor + + Callout dialog body background. + + + Callout dialog exterior borders to provide contrast against notebook UI. + + + Callout dialog foreground. + + + Callout dialog header and footer background. + + + Callout dialog interior borders used for separating elements. + + + Callout dialog box shadow color. The top color for the extension pack gradient @@ -314,51 +323,48 @@ The bottom color for the extension pack gradient + + The extension pack header text shadowcolor + + + The background color for the banner image gradient + The top color for the banner image gradient The bottom color for the banner image gradient - - The background color for the banner image gradient + + InfoBox: The background color when the notification type is error. - - Notebook: Main toolbar icons + + InfoBox: The background color when the notification type is information. - - Notebook: Main toolbar select box border + + InfoBox: The background color when the notification type is success. 
- - Notebook: Main toolbar select box background + + InfoBox: The background color when the notification type is warning. - - Notebook: Main toolbar bottom border and separator + + Info button background color. - - Notebook: Main toolbar dropdown arrow + + Info button border color. + + + Info button foreground color. + + + Info button hover background color. Notebook: Main toolbar custom buttonMenu dropdown arrow - - Notebook: Markdown toolbar background - - - Notebook: Markdown toolbar icons - - - Notebook: Markdown toolbar bottom border - Notebook: Active cell border - - Notebook: Markdown editor background - - - Notebook: Border between Markdown editor and preview - Notebook: Code editor background @@ -368,88 +374,79 @@ Notebook: Code editor line numbers - - Notebook: Code editor toolbar icons - Notebook: Code editor toolbar background Notebook: Code editor toolbar right border + + Notebook: Code editor toolbar icons + + + Notebook: Main toolbar dropdown arrow + + + Notebook: Markdown editor background + Tag background color. Tag foreground color. + + Notebook: Main toolbar icons + + + Notebook: Main toolbar bottom border and separator + + + Notebook: Main toolbar select box background + + + Notebook: Main toolbar select box border + + + Notebook: Border between Markdown editor and preview + + + Notebook: Markdown toolbar background + + + Notebook: Markdown toolbar bottom border + + + Notebook: Markdown toolbar icons + Color of the other search matches. The color must not be opaque so as not to hide underlying decorations. Color of the range limiting the search. The color must not be opaque so as not to hide underlying decorations. - - InfoBox: The background color when the notification type is information. + + The border color of tiles - - InfoBox: The background color when the notification type is warning. + + The tile box shadow color - - InfoBox: The background color when the notification type is error. 
- - - InfoBox: The background color when the notification type is success. - - - Info button foreground color. - - - Info button background color. - - - Info button border color. - - - Info button hover background color. - - - Callout dialog foreground. - - - Callout dialog interior borders used for separating elements. - - - Callout dialog exterior borders to provide contrast against notebook UI. - - - Callout dialog header and footer background. - - - Callout dialog body background. - - - Callout dialog box shadow color. - - + - - Table header background color + + SQL Agent table cell background color. - - Table header foreground color + + SQL Agent table cell border color. - - List/Table background color for the selected and focus item when the list/table is active + + SQL Agent heading background color. - - Color of the outline of a cell. + + SQL Agent Table background color. - - Disabled Input box background. - - - Disabled Input box foreground. + + SQL Agent table hover background color. Button outline color when focused. @@ -457,61 +454,64 @@ Disabled checkbox foreground. - - SQL Agent Table background color. + + Disabled Input box background. - - SQL Agent table cell background color. + + Disabled Input box foreground. - - SQL Agent table hover background color. - - - SQL Agent heading background color. - - - SQL Agent table cell border color. + + List/Table background color for the selected and focus item when the list/table is active Results messages error color. - - - - Some of the loaded extensions are using obsolete APIs, please find the detailed information in the Console tab of Developer Tools window + + Color of the outline of a cell. + + Table header background color + + + Table header foreground color + + + Don't Show Again - - - - F5 shortcut key requires a code cell to be selected. Please select a code cell to run. 
+ + Some of the loaded extensions are using obsolete APIs, please find the detailed information in the Console tab of Developer Tools window + + Clear result requires a code cell to be selected. Please select a code cell to run. - - - - Unknown component type. Must use ModelBuilder to create objects + + F5 shortcut key requires a code cell to be selected. Please select a code cell to run. + + The index {0} is invalid. + + Unknown component type. Must use ModelBuilder to create objects + Unkown component configuration, must use ModelBuilder to create a configuration object - + - - Done + + Tabs are not initialized Cancel - - Generate script + + Done Next @@ -519,93 +519,93 @@ Previous - - Tabs are not initialized + + Generate script - + No tree view with id '{0}' registered. - + - - A NotebookProvider with valid providerId must be passed to this method + + No Manager found no notebook provider found - - No Manager found + + Notebook Manager for notebook {0} does not have a content manager. Cannot perform operations on it Notebook Manager for notebook {0} does not have a server manager. Cannot perform operations on it - - Notebook Manager for notebook {0} does not have a content manager. Cannot perform operations on it - Notebook Manager for notebook {0} does not have a session manager. 
Cannot perform operations on it - + + A NotebookProvider with valid providerId must be passed to this method + + A NotebookProvider with valid providerId must be passed to this method - + + + Clear all saved accounts + + + Learn More + Manage Show Details - - Learn More - - - Clear all saved accounts - - + - - Preview Features - - - Enable unreleased preview features - - - Show connect dialog on startup + + Enable/disable obsolete API usage notification Obsolete API Notification - - Enable/disable obsolete API usage notification + + Enable unreleased preview features - + + Preview Features + + + Show connect dialog on startup + + Edit Data Session Failed To Connect - + - - Profiler - - - Not connected - - - XEvent Profiler Session stopped unexpectedly on the server {0}. + + The XEvent Profiler session for {0} has lost events. Error while starting new session - - The XEvent Profiler session for {0} has lost events. + + XEvent Profiler Session stopped unexpectedly on the server {0}. - + + Not connected + + + Profiler + + Show Actions @@ -613,54 +613,54 @@ Resource Viewer - + - - Information - - - Warning - - - Error - - - Show Details + + Close Copy - - Close - - - Back + + Error Hide Details - - - - OK + + Information + + Back + + + Show Details + + + Warning + + + Cancel - - - - is required. + + OK + + Invalid input. Numeric value expected. - + + is required. + + The index {0} is invalid. - + blank @@ -671,15 +671,15 @@ Show Actions - + - - Loading - Loading completed - + + Loading + + Invalid value @@ -687,96 +687,93 @@ {0}. {1} - + - - Loading - Loading completed - + + Loading + + modelview code editor for view model. 
- + Could not find component for type {0} - + Changing editor types on unsaved files is unsupported - + - - Select Top 1000 - - - Take 10 - - - Script as Execute + + Edit Data Script as Alter - - Edit Data - Script as Create Script as Drop - + + Script as Execute + + Take 10 + + Select Top 1000 + - - No script was returned when calling select script on object - - - Select - Create - - Insert - - - Update - Delete - - No script was returned when scripting as {0} on object {1} - - - Scripting Failed + + Insert No script was returned when scripting as {0} - + + No script was returned when scripting as {0} on object {1} + + No script was returned when calling select script on object + + Scripting Failed + + Select + + Update + disconnected - + Extension - + - - Active tab background color for vertical tabs - Color for borders in dashboard Color for dashboard widget subtext - - Color for property values displayed in the properties container component Color for property names displayed in the properties container component + + Color for property values displayed in the properties container component + Toolbar overflow shadow color - + + Active tab background color for vertical tabs + + - - Identifier of the account type + + Contributes icons to account provider. (Optional) Icon which is used to represent the account in the UI. Either a file path or a themeable configuration - - Icon path when a light theme is used - Icon path when a dark theme is used - - Contributes icons to account provider. + + Icon path when a light theme is used - + + Identifier of the account type + + - - View applicable rules + View applicable rules for {0} - - Invoke Assessment - Invoke Assessment for {0} Export As Script - - View all rules and learn more on GitHub - Create HTML Report - - Report has been saved. Do you want to open it? - - - Open - Cancel - + + Open + + Report has been saved. Do you want to open it? 
+ + + View applicable rules + + + Invoke Assessment + + + View all rules and learn more on GitHub + + Nothing to show. Invoke assessment to get results - - Display Name - - - Target - - - Severity + + Database {0} is totally compliant with the best practices. Good job! Instance {0} is totally compliant with the best practices. Good job! - - Database {0} is totally compliant with the best practices. Good job! + + Display Name - + + Severity + + + Target + + - - API information - API Version: - - Default Ruleset Version: - - - SQL Server Instance Details - - - Version: - - - Edition: - - - Instance Name: - - - OS Version: - - - Message - Check ID + + Help Link + + + Message + Tags + + Instance Name: + Learn More - - SQL Assessment Report + + OS Version: + + + Default Ruleset Version: + + + API information + + + SQL Server Instance Details + + + Edition: + + + Version: + + + Error + + + Information + + + Warning Results for database @@ -911,54 +920,63 @@ Results for server - - Error - - - Warning - - - Information - - - Help Link - {0}: {1} item(s) - + + SQL Assessment Report + + Open in Azure Portal - + - - Backup name + + Add a file - - Recovery model - - - Backup type - - - Backup files + + Advanced Configuration Algorithm + + Backup files + + + Backup file path is required + + + Backup name + + + Backup the tail of the log + + + Backup type + Certificate or Asymmetric key - - Media + + Perform checksum before writing to media - - Backup to the existing media set + + Compression - - Backup to a new media set + + Only backup to file is supported + + + Continue on error + + + Copy-only backup + + + Encryption Append to the existing backup set @@ -966,41 +984,41 @@ Overwrite all existing backup sets - - New media set name + + Expiration + + + Media + + + Media name is required + + + Backup to the existing media set + + + Backup to a new media set New media set description - - Perform checksum before writing to media + + New media set name - - Verify backup when 
finished + + No certificate or asymmetric key is available - - Continue on error + + Recovery model - - Expiration - - - Set backup retain days - - - Copy-only backup - - - Advanced Configuration - - - Compression + + Reliability Set backup compression - - Encryption + + Set backup retain days Transaction log @@ -1008,29 +1026,8 @@ Truncate the transaction log - - Backup the tail of the log - - - Reliability - - - Media name is required - - - No certificate or asymmetric key is available - - - Add a file - - - Remove files - - - Invalid input. Value must be greater than or equal 0. - - - Script + + Verify backup when finished Backup @@ -1038,71 +1035,86 @@ Cancel - - Only backup to file is supported + + Invalid input. Value must be greater than or equal 0. - - Backup file path is required + + Script - + + Remove files + + Backup - + - - You must enable preview features in order to use backup + + Backup command is not supported for Azure SQL databases. Backup command is not supported outside of a database context. Please select a database and try again. - - Backup command is not supported for Azure SQL databases. 
+ + You must enable preview features in order to use backup Backup - + + + Asymmetric Key + + + Compress backup + + + Use the default server setting + + + Do not compress backup + Database + + Differential + + + Disk + Files and filegroups Full - - Differential - Transaction Log - - Disk - Url - - Use the default server setting - - - Compress backup - - - Do not compress backup - Server Certificate - - Asymmetric Key - - + + + Could not find chart to save + + + Saved Chart to path: {0} + + + Configure Chart + + + Copy as image + Create Insight @@ -1112,37 +1124,34 @@ My-Widget - - Configure Chart - - - Copy as image - - - Could not find chart to save + + PNG Save as image - - PNG - - - Saved Chart to path: {0} - - + + + Chart Type + + + Use column names as labels + Data Direction - - Vertical + + Data Type + + + Encoding Horizontal - - Use column names as labels + + Image Format Use first column as row label @@ -1150,20 +1159,20 @@ Legend Position - - Y Axis Label + + Number - - Y Axis Minimum Value + + Point - - Y Axis Maximum Value + + Vertical X Axis Label - - X Axis Minimum Value + + X Axis Maximum Date X Axis Maximum Value @@ -1171,39 +1180,45 @@ X Axis Minimum Date - - X Axis Maximum Date + + X Axis Minimum Value - - Data Type + + Y Axis Label - - Number + + Y Axis Maximum Value - - Point + + Y Axis Minimum Value - - Chart Type - - - Encoding - - - Image Format - - + Chart - + + + Bar + + + Failed to get rows for the dataset to chart. + + + Chart type '{0}' is not supported. + + + Count + + + Doughnut + Horizontal Bar - - Bar + + Image Line @@ -1214,51 +1229,36 @@ Scatter - - Time Series - - - Image - - - Count - Table - - Doughnut + + Time Series - - Failed to get rows for the dataset to chart. - - - Chart type '{0}' is not supported. - - + - - Built-in Charts - The maximum number of rows for charts to display. Warning: increasing this may impact performance. 
- + + Built-in Charts + + Close - + Series {0} - + Table does not contain a valid image - + Maximum row count for built-in charts has been exceeded, only the first {0} rows are used. To configure the value, you can open user settings and search for: 'builtinCharts.maxRowCount'. @@ -1266,36 +1266,36 @@ Don't Show Again - + - - Connecting: {0} - - - Running command: {0} - - - Opening new query: {0} - - - Cannot connect as no server information was provided - - - Could not open URL due to error {0} + + Are you sure you want to connect? This will connect to server {0} - - Are you sure you want to connect? - - - &&Open + + Connecting: {0} Connecting query file - + + Could not open URL due to error {0} + + + &&Open + + + Opening new query: {0} + + + Running command: {0} + + + Cannot connect as no server information was provided + + {0} was replaced with {1} in your user settings. @@ -1303,81 +1303,81 @@ {0} was replaced with {1} in your workspace settings. - + - - The maximum number of recently used connections to store in the connection list. + + Attempt to parse the contents of the clipboard when the connection dialog is opened or a paste is performed. Default SQL Engine to use. This drives default language provider in .sql files and the default to use when creating a new connection. - - Attempt to parse the contents of the clipboard when the connection dialog is opened or a paste is performed. + + The maximum number of recently used connections to store in the connection list. 
- + Connection Status - + - - Common id for the provider + + Options for connection Display Name for the provider - - Notebook Kernel Alias for the provider - Icon path for the server type - - Options for connection + + Notebook Kernel Alias for the provider - + + Common id for the provider + + - - User visible name for the tree provider - Id for the provider, must be the same as when registering the tree data provider and must start with `connectionDialog/` - + + User visible name for the tree provider + + - - Unique identifier for this container. + + Contributes a single or multiple dashboard containers for users to add to their dashboard. The container that will be displayed in the tab. - - Contributes a single or multiple dashboard containers for users to add to their dashboard. - - - No id in dashboard container specified for extension. - - - No container in dashboard container specified for extension. + + Unique identifier for this container. Exactly 1 dashboard container must be defined per space. + + No container in dashboard container specified for extension. + + + No id in dashboard container specified for extension. + Unknown container type defines in dashboard container for extension. - + The controlhost that will be displayed in this tab. - + The "{0}" section has invalid content. Please contact extension owner. - + The list of widgets or webviews that will be displayed in this tab. @@ -1385,52 +1385,52 @@ widgets or webviews are expected inside widgets-container for extension. - + The model-backed view that will be displayed in this tab. - + - - Unique identifier for this nav section. Will be passed to the extension for any requests. - - - (Optional) Icon which is used to represent this nav section in the UI. Either a file path or a themeable configuration - - - Icon path when a light theme is used - - - Icon path when a dark theme is used - - - Title of the nav section to show the user. 
+ + The list of dashboard containers that will be displayed in this navigation section. The container that will be displayed in this nav section. - - The list of dashboard containers that will be displayed in this navigation section. + + (Optional) Icon which is used to represent this nav section in the UI. Either a file path or a themeable configuration - - No title in nav section specified for extension. + + Icon path when a dark theme is used - - No container in nav section specified for extension. + + Icon path when a light theme is used - - Exactly 1 dashboard container must be defined per space. + + Unique identifier for this nav section. Will be passed to the extension for any requests. + + + Title of the nav section to show the user. NAV_SECTION within NAV_SECTION is an invalid container for extension. - + + No container in nav section specified for extension. + + + No title in nav section specified for extension. + + + Exactly 1 dashboard container must be defined per space. + + The webview that will be displayed in this tab. - + The list of widgets that will be displayed in this tab. @@ -1438,106 +1438,85 @@ The list of widgets is expected inside widgets-container for extension. - + + + Open installed features + + + Click to pin + + + Click to unpin + + + Collapse Widget + + + Delete Widget + Edit Exit + + Expand Widget + Refresh Show Actions - - Delete Widget - - - Click to unpin - - - Click to pin - - - Open installed features - - - Collapse Widget - - - Expand Widget - - + {0} is an unknown container. - + + + General + Home No connection information could be found for this dashboard - - General - - + - - Unique identifier for this tab. Will be passed to the extension for any requests. - - - Title of the tab to show the user. - - - Description of this tab that will be shown to the user. - - - Condition which must be true to show this item - - - Defines the connection types this tab is compatible with. 
Defaults to 'MSSQL' if not set - - - The container that will be displayed in this tab. + + Administration Whether or not this tab should always be shown or only when the user adds it. - - Whether or not this tab should be used as the Home tab for a connection type. + + The container that will be displayed in this tab. + + + Description of this tab that will be shown to the user. The unique identifier of the group this tab belongs to, value for home group: home. - - (Optional) Icon which is used to represent this tab in the UI. Either a file path or a themeable configuration + + Icon path when a dark theme is used Icon path when a light theme is used - - Icon path when a dark theme is used + + Unique identifier for this tab. Will be passed to the extension for any requests. - - Contributes a single or multiple tabs for users to add to their dashboard. + + Whether or not this tab should be used as the Home tab for a connection type. - - No title specified for extension. - - - No description specified to show. - - - No container specified for extension. - - - Exactly 1 dashboard container must be defined per space + + Title of the tab to show the user. Unique identifier for this tab group. @@ -1545,17 +1524,44 @@ Title of the tab group. + + Defines the connection types this tab is compatible with. Defaults to 'MSSQL' if not set + + + Condition which must be true to show this item + Contributes a single or multiple tab groups for users to add to their dashboard. + + Contributes a single or multiple tabs for users to add to their dashboard. + + + Exactly 1 dashboard container must be defined per space + + + No container specified for extension. + + + No description specified to show. + + + No title specified for extension. + No id specified for tab group. No title specified for tab group. - - Administration + + databases tab + + + Databases + + + (Optional) Icon which is used to represent this tab in the UI. 
Either a file path or a themeable configuration Monitoring @@ -1566,59 +1572,47 @@ Security - - Troubleshooting - Settings - - databases tab + + Troubleshooting - - Databases - - + - - Manage - Dashboard - + + Manage + + Manage - + property `icon` can be omitted or must be either a string or a literal like `{dark, light}` - + - - Defines a property to show on the dashboard + + Defines that this provider supports the dashboard - - What value to use as a label for the property + + Provider id (ex. MSSQL) - - What value in the object to access for the value + + Property values to show on dashboard - - Specify values to be ignored - - - Default value to show if ignored or no value + + Properties to show for database page A flavor for defining dashboard properties - - Id of the flavor - Condition to use this flavor @@ -1631,37 +1625,40 @@ Value to compare the field to - - Properties to show for database page + + Id of the flavor + + + Defines a property to show on the dashboard + + + Default value to show if ignored or no value + + + What value to use as a label for the property + + + Specify values to be ignored + + + What value in the object to access for the value Properties to show for server page - - Defines that this provider supports the dashboard - - - Provider id (ex. MSSQL) - - - Property values to show on dashboard - - + - - Condition which must be true to show this item + + Unique identifier for this tab. Will be passed to the extension for any requests. Whether to hide the header of the widget, default value is false - - The title of the container + + Condition which must be true to show this item - - The row of the component in the grid - - - The rowspan of the component in the grid. Default value is 1. Use '*' to set to number of rows in the grid. + + Extension tab is unknown or not installed. The column of the component in the grid @@ -1669,21 +1666,24 @@ The colspan of the component in the grid. Default value is 1. 
Use '*' to set to number of columns in the grid. - - Unique identifier for this tab. Will be passed to the extension for any requests. + + The row of the component in the grid - - Extension tab is unknown or not installed. + + The rowspan of the component in the grid. Default value is 1. Use '*' to set to number of rows in the grid. - + + The title of the container + + Database Properties - + - - Enable or disable the properties widget + + Compatibility Level Property values to show @@ -1691,14 +1691,20 @@ Display name of the property - - Value in the Database Info Object - Specify specific values to ignore - - Recovery Model + + Value in the Database Info Object + + + Customizes the database dashboard page + + + Enable or disable the properties widget + + + Customizes the database dashboard tabs Last Database Backup @@ -1706,30 +1712,24 @@ Last Log Backup - - Compatibility Level + + Search Owner - - Customizes the database dashboard page + + Recovery Model - - Search - - - Customizes the database dashboard tabs - - + Server Properties - + - - Enable or disable the properties widget + + Computer Name Property values to show @@ -1740,38 +1740,38 @@ Value in the Server Info Object - - Version - - - Edition - - - Computer Name - - - OS Version - - - Search - Customizes the server dashboard page + + Enable or disable the properties widget + Customizes the Server dashboard tabs - + + Edition + + + Search + + + OS Version + + + Version + + Home - + Failed to change database - + Show Actions @@ -1779,100 +1779,85 @@ Actions + + Filtered search list to {0} items + No matching item found Filtered search list to 1 item - - Filtered search list to {0} items - - + Name - - Schema - Type - + + Schema + + - - loading objects + + Unable to load databases + + + Unable to load objects loading databases - - loading objects completed. - loading databases completed. + + loading objects + + + loading objects completed. 
+ Search by name of type (t:, v:, f:, or sp:) Search databases - - Unable to load objects - - - Unable to load databases - - + Run Query - + - - Loading {0} - - - Loading {0} completed - Auto Refresh: OFF Last Updated: {0} {1} + + Loading {0} completed + + + Loading {0} + No results to show - + - - Adds a widget that can query a server or database and display the results in multiple ways - as a chart, summarized count, and more - - - Unique Identifier used for caching the results of the insight. - - - SQL query to run. This should return exactly 1 resultset. - - - [Optional] path to a file that contains a query. Use if 'query' is not set - - - [Optional] Auto refresh interval in minutes, if not set, there will be no auto refresh - - - Which actions to use - Target database for the action; can use the format '${ columnName }' to use a data driven column name. Target server for the action; can use the format '${ columnName }' to use a data driven column name. + + Which actions to use + Target user for the action; can use the format '${ columnName }' to use a data driven column name. @@ -1882,12 +1867,27 @@ Contributes insights to the dashboard palette. - + + [Optional] Auto refresh interval in minutes, if not set, there will be no auto refresh + + + Unique Identifier used for caching the results of the insight. + + + SQL query to run. This should return exactly 1 resultset. + + + [Optional] path to a file that contains a query. Use if 'query' is not set + + + Adds a widget that can query a server or database and display the results in multiple ways - as a chart, summarized count, and more + + Chart cannot be displayed with the given data - + Displays results of a query as a chart on the dashboard @@ -1895,121 +1895,106 @@ Maps 'column name' -> color. for example add 'column1': red to ensure this column uses a red color - - Indicates preferred position and visibility of the chart legend. 
These are the column names from your query, and map to the label of each chart entry - - - If dataDirection is horizontal, setting this to true uses the first columns value for the legend. - If dataDirection is vertical, setting this to true will use the columns names for the legend. Defines whether the data is read from a column (vertical) or a row (horizontal). For time series this is ignored as direction must be vertical. + + If dataDirection is horizontal, setting this to true uses the first columns value for the legend. + + + Indicates preferred position and visibility of the chart legend. These are the column names from your query, and map to the label of each chart entry + If showTopNData is set, showing only top N data in the chart. - + - - Minimum value of the y axis - - - Maximum value of the y axis + + Label for the x axis Label for the y axis - - Minimum value of the x axis - Maximum value of the x axis - - Label for the x axis + + Minimum value of the x axis - + + Maximum value of the y axis + + + Minimum value of the y axis + + Indicates data property of a data set for a chart. - + For each column in a resultset, displays the value in row 0 as a count followed by the column name. Supports '1 Healthy', '3 Unhealthy' for example, where 'Healthy' is the column name and 1 is the value in row 1 cell 1 - + - - Displays an image, for example one returned by an R query using ggplot2 + + Is this encoded as hex, base64 or some other format? What format is expected - is this a JPEG, PNG or other format? - - Is this encoded as hex, base64 or some other format? + + Displays an image, for example one returned by an R query using ggplot2 - + Displays the results in a simple table - + + + Unable to load dashboard properties + Loading properties Loading properties completed - - Unable to load dashboard properties - - + - - Database Connections - - - data source connections - - - data source groups - Saved connections are sorted by the dates they were added. 
Saved connections are sorted by their display names alphabetically. + + Database Connections + + + data source groups + + + data source connections + Controls sorting order of saved connections and connection groups. - - Startup Configuration - True for the Servers view to be shown on launch of Azure Data Studio default; false if the last opened view should be shown - + + Startup Configuration + + - - Identifier of the view. Use this to register a data provider through `vscode.window.registerTreeDataProviderForView` API. Also to trigger activating your extension by registering `onView:${id}` event to `activationEvents`. - - - The human-readable name of the view. Will be shown - - - Condition which must be true to show this view - - - Contributes views to the editor - - - Contributes views to Data Explorer container in the Activity bar - Contributes views to contributed views container @@ -2019,16 +2004,31 @@ A view with id `{0}` is already registered in the view container `{1}` + + Contributes views to the editor + + + Contributes views to Data Explorer container in the Activity bar + + + property `{0}` can be omitted or must be of type `string` + views must be an array property `{0}` is mandatory and must be of type `string` - - property `{0}` can be omitted or must be of type `string` + + Identifier of the view. Use this to register a data provider through `vscode.window.registerTreeDataProviderForView` API. Also to trigger activating your extension by registering `onView:${id}` event to `activationEvents`. - + + The human-readable name of the view. 
Will be shown + + + Condition which must be true to show this view + + Servers @@ -2039,7 +2039,7 @@ Show Connections - + Disconnect @@ -2047,34 +2047,34 @@ Refresh - + Show Edit Data SQL pane on startup - + - - Run - Dispose Edit Failed With Error: - - Stop - - - Show SQL Pane - Close SQL Pane - + + Run + + + Show SQL Pane + + + Stop + + Max Rows: - + Delete Row @@ -2082,33 +2082,45 @@ Revert Current Row - + - - Save As CSV - - - Save As JSON - - - Save As Excel - - - Save As XML - Copy Copy With Headers + + Save As CSV + + + Save As Excel + + + Save As JSON + + + Save As XML + Select All - + - - Dashboard Tabs ({0}) + + When + + + Id + + + Dashboard Insights ({0}) + + + Name + + + Description Id @@ -2116,48 +2128,33 @@ Title - - Description + + Dashboard Tabs ({0}) - - Dashboard Insights ({0}) - - - Id - - - Name - - - When - - + - - Gets extension information from the gallery + + Extension '{0}' not found. Extension id - - Extension '{0}' not found. + + Gets extension information from the gallery - + - - Show Recommendations - Install Extensions Author an Extension... - - - - Don't Show Again + + Show Recommendations + + Azure Data Studio has extension recommendations. @@ -2168,82 +2165,73 @@ Once installed, you can select the Visualizer icon to visualize your query resul Install All - - Show Recommendations + + Don't Show Again The scenario type for extension recommendations must be provided. - + + Show Recommendations + + This extension is recommended by Azure Data Studio. - + + + Alerts + Jobs Notebooks - - Alerts + + Operators Proxies - - Operators - - + - - Name - - - Last Occurrence - - - Enabled + + Category Name Delay Between Responses (in secs) - - Category Name + + Enabled - + + Last Occurrence + + + Name + + - - Success - - - Error - - - Refresh - - - New Job - - - Run + + The operator was deleted successfully : The job was successfully started. - - Stop - : The job was successfully stopped. 
- - Edit Job + + Cancel - - Open + + Delete Alert + + + Are you sure you'd like to delete the alert '{0}'? Delete Job @@ -2251,56 +2239,8 @@ Once installed, you can select the Visualizer icon to visualize your query resul Are you sure you'd like to delete the job '{0}'? - - Could not delete job '{0}'. -Error: {1} - - - The job was successfully deleted - - - New Step - - - Delete Step - - - Are you sure you'd like to delete the step '{0}'? - - - Could not delete step '{0}'. -Error: {1} - - - The job step was successfully deleted - - - New Alert - - - Edit Alert - - - Delete Alert - - - Cancel - - - Are you sure you'd like to delete the alert '{0}'? - - - Could not delete alert '{0}'. -Error: {1} - - - The alert was successfully deleted - - - New Operator - - - Edit Operator + + Are you sure you'd like to delete the notebook '{0}'? Delete Operator @@ -2308,104 +2248,167 @@ Error: {1} Are you sure you'd like to delete the operator '{0}'? - - Could not delete operator '{0}'. -Error: {1} - - - The operator was deleted successfully - - - New Proxy - - - Edit Proxy - Delete Proxy Are you sure you'd like to delete the proxy '{0}'? - - Could not delete proxy '{0}'. -Error: {1} + + Delete Step + + + Are you sure you'd like to delete the step '{0}'? + + + The alert was successfully deleted + + + The job was successfully deleted + + + The notebook was successfully deleted The proxy was deleted successfully - - New Notebook Job + + The job step was successfully deleted - - Edit + + Edit Alert - - Open Template Notebook + + Edit Job - - Delete + + Edit Operator - - Are you sure you'd like to delete the notebook '{0}'? + + Edit Proxy + + + Could not delete alert '{0}'. +Error: {1} + + + Could not delete job '{0}'. +Error: {1} Could not delete notebook '{0}'. Error: {1} - - The notebook was successfully deleted + + Could not delete operator '{0}'. +Error: {1} - - Pin + + Could not delete proxy '{0}'. +Error: {1} + + + Could not delete step '{0}'. 
+Error: {1} + + + Error + + + New Alert + + + New Job + + + New Operator + + + New Proxy + + + New Step + + + Refresh + + + Run + + + Stop + + + Success + + + Open Delete - - Unpin + + Delete - - Rename + + Edit + + + New Notebook Job Open Latest Run - + + Open Template Notebook + + + Pin + + + Rename + + + Unpin + + + + Message + Step ID Step Name - - Message - - + Steps - + - - Name - - - Last Run - - - Next Run + + Category Enabled - - Status + + Last Run - - Category + + Last Run Outcome + + + Name + + + Next Run + + + Previous Runs Runnable @@ -2413,28 +2416,28 @@ Error: {1} Schedule - - Last Run Outcome - - - Previous Runs - - - No Steps available for this job. + + Status Error: - + + No Steps available for this job. + + + + Job Error: + Date Created: Notebook Error: - - Job Error: + + Past Runs Pinned @@ -2442,53 +2445,50 @@ Error: {1} Recent Runs - - Past Runs - - + - - Name - - - Target Database - Last Run - - Next Run - - - Status - Last Run Outcome + + Name + + + Next Run + Previous Runs - - No Steps available for this job. + + Status + + + Target Database Error: + + No Steps available for this job. 
+ Notebook Error: - + - - Name - Email Address Enabled - + + Name + + Account Name @@ -2502,31 +2502,13 @@ Error: {1} Enabled - + - - Insert - Cancel - - Image location - - - This computer - - - Online - - - Image URL - - - Enter image path - - - Enter image URL + + Insert Browse @@ -2537,14 +2519,26 @@ Error: {1} Local + + This computer + + + Image location + + + Image URL + + + Enter image path + Remote - - Text to display + + Online - - Text to display + + Enter image URL Address @@ -2552,63 +2546,66 @@ Error: {1} Link to an existing file or web page - - - - More + + Text to display - - Edit + + Text to display + + + + + Remove parameter cell + + + Clear Result Close - - Convert Cell - - - Run Cells Above - - - Run Cells Below - Insert Code Above Insert Code Below - - Insert Text Above - - - Insert Text Below - Collapse Cell + + Convert Cell + + + Edit + Expand Cell Make parameter cell - - Remove parameter cell + + Insert Text Above - - Clear Result + + Insert Text Below - + + More + + + Run Cells Above + + + Run Cells Below + + Add cell - - Code cell - - - Text cell + + Delete Move cell down @@ -2616,25 +2613,31 @@ Error: {1} Move cell up - - Delete - Add cell Code cell + + Code cell + + + Text cell + Text cell - + Parameters - + + + Error on last run. Click to run again + Please select active cell and try again @@ -2644,33 +2647,30 @@ Error: {1} Cancel execution - - Error on last run. 
Click to run again - - + - - Expand code cell contents - Collapse code cell contents - + + Expand code cell contents + + Bold - - Italic - - - Underline + + Code Highlight - - Code + + Image + + + Italic Link @@ -2681,15 +2681,24 @@ Error: {1} Ordered list - - Image - Markdown preview toggle - off + + Underline + + + Insert image + + + Insert link + Heading + + Markdown View + Heading 1 @@ -2702,215 +2711,146 @@ Error: {1} Paragraph - - Insert link - - - Insert image - Rich Text View Split View - - Markdown View - - + + + Error rendering component: {0} + No {0}renderer could be found for output. It has the following MIME types: {1} - - safe - No component could be found for selector {0} - - Error rendering component: {0} + + safe - + Click on - - + Code - or - - + Text - - - to add a code or text cell + + + Code Add a code cell + + + Text + Add a text cell - + + to add a code or text cell + + StdIn: - + - - <i>Double-click to edit</i> - <i>Add content here...</i> - + + <i>Double-click to edit</i> + + - - Find - - - Find - - - Previous match - - - Next match - Close - - Your search returned a large number of results, only the first 999 matches will be highlighted. + + Find {0} of {1} + + Next match + No Results - + + Previous match + + + Find + + + Your search returned a large number of results, only the first 999 matches will be highlighted. + + + + Cell + Add code Add text - - Create File - - - Could not display contents: {0} - - - Add cell - - - Code cell - - - Text cell - - - Run all - - - Cell - - - Views - - - Editor - - - Code - - - Text - - - Run Cells - - - < Previous - - - Next > - cell with URI {0} was not found in this model Run Cells failed - See error in output of the currently selected cell for more information. 
- + + Code + + + Add cell + + + Code cell + + + Create File + + + Could not display contents: {0} + + + Editor + + + Next > + + + < Previous + + + Run Cells + + + Run all + + + Text + + + Text cell + + + Views + + - - New Notebook - - - New Notebook - - - Set Workspace And Open - - - SQL kernel: stop Notebook execution when error occurs in a cell. - - - (Preview) show all kernels for the current notebook provider. - - - Allow notebooks to run Azure Data Studio commands. - - - Enable double click to edit for text cells in notebooks - - - Text is displayed as Rich Text (also known as WYSIWYG). - - - Markdown is displayed on the left, with a preview of the rendered text on the right. - - - Text is displayed as Markdown. - - - The default editing mode used for text cells - - - (Preview) Save connection name in notebook metadata. - - - Controls the line height used in the notebook markdown preview. This number is relative to the font size. - - - (Preview) Show rendered notebook in diff editor. - - - The maximum number of changes stored in the undo history for the notebook Rich Text editor. - - - Use absolute file paths when linking to other notebooks. - - - Enable incremental grid rendering for notebooks. This will improve the initial rendering time for large notebooks. There may be performance issues when interacting with the notebook while the rest of the grids are rendering. - - - Notebook Views - - - (Preview) Enable Notebook Views - - - Search Notebooks - Configure glob patterns for excluding files and folders in fulltext searches and quick open. Inherits all glob patterns from the `#files.exclude#` setting. Read more about glob patterns [here](https://code.visualstudio.com/docs/editor/codebasics#_advanced-search-options). @@ -2920,26 +2860,8 @@ Error: {1} Additional check on the siblings of a matching file. Use $(basename) as variable for the matching file name. - - This setting is deprecated and now falls back on "search.usePCRE2". - - - Deprecated. 
Consider "search.usePCRE2" for advanced regex feature support. - - - When enabled, the searchService process will be kept alive instead of being shut down after an hour of inactivity. This will keep the file search cache in memory. - - - Controls whether to use `.gitignore` and `.ignore` files when searching for files. - - - Controls whether to use global `.gitignore` and `.ignore` files when searching for files. - - - Whether to include results from a global symbol search in the file results for Quick Open. - - - Whether to include results from recently opened files in the file results for Quick Open. + + Controls sorting order of editor history in quick open when filtering. History entries are sorted by relevance based on the filter value used. More relevant entries appear first. @@ -2947,15 +2869,81 @@ Error: {1} History entries are sorted by recency. More recently opened entries appear first. - - Controls sorting order of editor history in quick open when filtering. + + New Notebook + + + New Notebook + + + Allow notebooks to run Azure Data Studio commands. + + + The default editing mode used for text cells + + + Enable double click to edit for text cells in notebooks + + + Enable incremental grid rendering for notebooks. This will improve the initial rendering time for large notebooks. There may be performance issues when interacting with the notebook while the rest of the grids are rendering. + + + Text is displayed as Markdown. + + + Controls the line height used in the notebook markdown preview. This number is relative to the font size. + + + The maximum number of changes stored in the undo history for the notebook Rich Text editor. + + + Text is displayed as Rich Text (also known as WYSIWYG). + + + (Preview) Save connection name in notebook metadata. + + + (Preview) show all kernels for the current notebook provider. + + + (Preview) Show rendered notebook in diff editor. 
+ + + Markdown is displayed on the left, with a preview of the rendered text on the right. + + + SQL kernel: stop Notebook execution when error occurs in a cell. + + + Use absolute file paths when linking to other notebooks. + + + Whether to use the newer version of the markdown renderer for Notebooks. This may result in markdown being rendered differently than previous versions. + + + Notebook Views + + + (Preview) Enable Notebook Views + + + Controls the positioning of the actionbar on rows in the search view. + + + Position the actionbar to the right when the search view is narrow, and immediately after the content when the search view is wide. + + + Always position the actionbar to the right. + + + Controls whether the search results will be collapsed or expanded. + + + Files with less than 10 results are expanded. Others are collapsed. Controls whether to follow symlinks while searching. - - Search case-insensitively if the pattern is all lowercase, otherwise, search case-sensitively. - Controls whether the search view should read or modify the shared find clipboard on macOS. @@ -2965,47 +2953,17 @@ Error: {1} This setting is deprecated. Please use the search view's context menu instead. - - Files with less than 10 results are expanded. Others are collapsed. + + When enabled, the searchService process will be kept alive instead of being shut down after an hour of inactivity. This will keep the file search cache in memory. - - Controls whether the search results will be collapsed or expanded. + + Whether to include results from recently opened files in the file results for Quick Open. - - Controls whether to open Replace Preview when selecting or replacing a match. + + Whether to include results from a global symbol search in the file results for Quick Open. - - Controls whether to show line numbers for search results. - - - Whether to use the PCRE2 regex engine in text search. This enables using some advanced regex features like lookahead and backreferences. 
However, not all PCRE2 features are supported - only features that are also supported by JavaScript. - - - Deprecated. PCRE2 will be used automatically when using regex features that are only supported by PCRE2. - - - Position the actionbar to the right when the search view is narrow, and immediately after the content when the search view is wide. - - - Always position the actionbar to the right. - - - Controls the positioning of the actionbar on rows in the search view. - - - Search all files as you type. - - - Enable seeding search from the word nearest the cursor when the active editor has no selection. - - - Update workspace search query to the editor's selected text when focusing the search view. This happens either on click or when triggering the `workbench.views.search.focus` command. - - - When `#search.searchOnType#` is enabled, controls the timeout in milliseconds between a character being typed and the search starting. Has no effect when `search.searchOnType` is disabled. - - - Double clicking selects the word under the cursor. + + Configure effect of double clicking a result in a search editor. Double clicking opens the result in the active editor group. @@ -3013,8 +2971,44 @@ Error: {1} Double clicking opens the result in the editor group to the side, creating one if it does not yet exist. - - Configure effect of double clicking a result in a search editor. + + Double clicking selects the word under the cursor. + + + Search all files as you type. + + + When `#search.searchOnType#` is enabled, controls the timeout in milliseconds between a character being typed and the search starting. Has no effect when `search.searchOnType` is disabled. + + + Update workspace search query to the editor's selected text when focusing the search view. This happens either on click or when triggering the `workbench.views.search.focus` command. + + + Enable seeding search from the word nearest the cursor when the active editor has no selection. 
+ + + Controls whether to show line numbers for search results. + + + Search case-insensitively if the pattern is all lowercase, otherwise, search case-sensitively. + + + Controls sorting order of search results. + + + Whether to use the PCRE2 regex engine in text search. This enables using some advanced regex features like lookahead and backreferences. However, not all PCRE2 features are supported - only features that are also supported by JavaScript. + + + Controls whether to open Replace Preview when selecting or replacing a match. + + + Search Notebooks + + + Results are sorted by count per file, in ascending order. + + + Results are sorted by count per file, in descending order. Results are sorted by folder and file names, in alphabetical order. @@ -3022,91 +3016,82 @@ Error: {1} Results are sorted by file names ignoring folder order, in alphabetical order. - - Results are sorted by file extensions, in alphabetical order. - Results are sorted by file last modified date, in descending order. - - Results are sorted by count per file, in descending order. + + Results are sorted by file extensions, in alphabetical order. - - Results are sorted by count per file, in ascending order. + + Controls whether to use global `.gitignore` and `.ignore` files when searching for files. - - Controls sorting order of search results. + + Controls whether to use `.gitignore` and `.ignore` files when searching for files. - + + Deprecated. PCRE2 will be used automatically when using regex features that are only supported by PCRE2. + + + This setting is deprecated and now falls back on "search.usePCRE2". + + + Deprecated. Consider "search.usePCRE2" for advanced regex feature support. + + + Set Workspace And Open + + - - Loading kernels... - - - Changing kernel... - Attach to Kernel - - Loading contexts... - Change Connection - - Select Connection - - - localhost - - - No Kernel - - - This notebook cannot run with parameters as the kernel is not supported. 
Please use the supported kernels and format. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). - - - This notebook cannot run with parameters until a parameter cell is added. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). - - - This notebook cannot run with parameters until there are parameters added to the parameter cell. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). + + Changing kernel... Clear Results - - Editor - - - Create New View - - - Unable to find view: {0} - - - Trusted - - - Not Trusted - Collapse Cells Expand Cells - - Run with Parameters + + This notebook cannot run with parameters as the kernel is not supported. Please use the supported kernels and format. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). + + + Loading kernels... + + + Loading contexts... + + + localhost + + + New Notebook + + + Create New View None - - New Notebook + + No Kernel + + + This notebook cannot run with parameters until a parameter cell is added. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). + + + This notebook cannot run with parameters until there are parameters added to the parameter cell. [Learn more](https://docs.microsoft.com/sql/azure-data-studio/notebooks/notebooks-parameterization). Find Next String @@ -3114,41 +3099,59 @@ Error: {1} Find Previous String - + + Editor + + + Run with Parameters + + + Select Connection + + + Trusted + + + Not Trusted + + + Unable to find view: {0} + + Notebook Editor - + + + Notebooks + Search Results Search path not found: {0} - - Notebooks - - + - - You have not opened any folder that contains notebooks/books. + + Cancel Search - - Open Notebooks + + Clear Search Results - - The result set only contains a subset of all matches. 
Please be more specific in your search to narrow down the results. + + Collapse All - - Search in progress... - + + Expand All - - No results found in '{0}' excluding '{1}' - + + Toggle Collapse and Expand - - No results found in '{0}' - + + Search returned {0} results in {1} files No results found excluding '{0}' - @@ -3156,34 +3159,34 @@ Error: {1} No results found. Review your settings for configured exclusions and check your gitignore files - + + No results found in '{0}' - + + + No results found in '{0}' excluding '{1}' - + + + Open Notebooks + + + Open Settings + Search again Search again in all files - - Open Settings + + Search in progress... - - - Search returned {0} results in {1} files + + The result set only contains a subset of all matches. Please be more specific in your search to narrow down the results. - - Toggle Collapse and Expand + + You have not opened any folder that contains notebooks/books. - - Cancel Search - - - Expand All - - - Collapse All - - - Clear Search Results - - + Search: Type Search Term and press Enter to search or Escape to cancel @@ -3191,10 +3194,13 @@ Error: {1} Search - + - - Insert cells + + Cancel + + + Insert Select cell sources @@ -3202,101 +3208,98 @@ Error: {1} Error: Unable to generate thumbnails. + + Insert cells + Untitled Cell : {0} - - Insert - - - Cancel - - + Cell Awaiting Input - - Loading - cell with URI {0} was not found in this model - - Starting execution - - - Running cell {0} of {1} - Run Cells failed - See error in output of the currently selected cell for more information. + + Loading + Run all - - - - Unable to remove view + + Running cell {0} of {1} + + Starting execution + + + Are you sure you want to delete view "{0}"? &&Delete + + Error on last run. Click to run again + Insert Cells + + More + Run cell Cancel execution - - Error on last run. Click to run again - Unable to navigate to notebook cell. View Cell In Notebook - - More + + Unable to remove view - + Please run this cell to view outputs. 
- + This view is empty. Add a cell to this view by clicking the Insert Cells button. - + - - Configure View - - - View Name - - - This field is required. - - - This view name has already been taken. + + Cancel Save - - Cancel + + This field is required. - + + View Name + + + This view name has already been taken. + + + Configure View + + Copy failed with error {0} @@ -3307,7 +3310,7 @@ Error: {1} Show table - + No {0} renderer could be found for output. It has the following MIME types: {1} @@ -3315,77 +3318,110 @@ Error: {1} (safe) - + Error displaying Plotly graph: {0} - + - - No connections found. - Add Connection - - - - Server Group color palette used in the Object Explorer viewlet. + + No connections found. + + Auto-expand Server Groups in the Object Explorer viewlet. + + Server Group color palette used in the Object Explorer viewlet. + (Preview) Use the new async server tree for the Servers view and Connection Dialog with support for new features such as dynamic node filtering. - + - - Data + + Built-in Charts Connection - - Query Editor + + Dashboard + + + Data Notebook - - Dashboard - Profiler - - Built-in Charts + + Query Editor - + - - Specifies view templates + + Profiler Filters Specifies session templates - - Profiler Filters + + Specifies view templates - + + + New Session + + + Clear Data + + + Are you sure you want to clear the data? + + + Clear Filter + + + Are you sure you want to clear the filters? + + + Edit Columns + + + Filter… + + + Find Next String + + + Find Previous String + + + Toggle Collapsed Panel + + + Auto Scroll: Off + + + Auto Scroll: On + Connect Disconnect - - Start - - - New Session + + Launch Profiler Pause @@ -3396,46 +3432,16 @@ Error: {1} Stop - - Clear Data + + Start - - Are you sure you want to clear the data? 
- - - Auto Scroll: On - - - Auto Scroll: Off - - - Toggle Collapsed Panel - - - Edit Columns - - - Find Next String - - - Find Previous String - - - Launch Profiler - - - Filter… - - - Clear Filter - - - Are you sure you want to clear the filters? - - + - - Select View + + Details + + + Label Select Session @@ -3443,82 +3449,64 @@ Error: {1} Select Session: + + Select View + Select View: - - Text - - - Label - Value - - Details + + Text - + - - Find - - - Find - - - Previous match - - - Next match - Close - - Your search returned a large number of results, only the first 999 matches will be highlighted. + + Find {0} of {1} + + Next match + No Results - + + Previous match + + + Find + + + Your search returned a large number of results, only the first 999 matches will be highlighted. + + Profiler editor for event text. Readonly - + - - Events (Filtered): {0}/{1} - Events: {0} + + Events (Filtered): {0}/{1} + Event Count - + - - Save As CSV - - - Save As JSON - - - Save As Excel - - - Save As XML - - - Results encoding will not be saved when exporting to JSON, remember to save with desired encoding once file is created. - - - Save to file is not supported by the backing data source + + Chart Copy @@ -3526,8 +3514,8 @@ Error: {1} Copy With Headers - - Select All + + Results encoding will not be saved when exporting to JSON, remember to save with desired encoding once file is created. Maximize @@ -3535,28 +3523,40 @@ Error: {1} Restore - - Chart + + Save As CSV + + + Save As Excel + + + Save As JSON + + + Save As XML + + + Save to file is not supported by the backing data source + + + Select All Visualizer - + - - Choose SQL Language + + A connection using engine {0} exists. To change please disconnect or change connection Change SQL language provider - - SQL Language Flavor - Change SQL Engine Provider - - A connection using engine {0} exists. 
To change please disconnect or change connection + + Choose SQL Language No text editor active at this time @@ -3564,133 +3564,136 @@ Error: {1} Select Language Provider - - - - XML Showplan + + SQL Language Flavor + + Results grid Max row count for filtering/sorting has been exceeded. To update it, navigate to User Settings and change the setting: 'queryEditor.results.inMemoryDataProcessingThreshold' - + + XML Showplan + + - - Focus on Current Query - - - Run Query - - - Run Current Query - - - Copy Query With Results - - - Successfully copied query and results. - - - Run Current Query with Actual Plan + + Toggle Focus Between Query And Results Cancel Query - - Refresh IntelliSense Cache + + Copy Query With Results - - Toggle Query Results - - - Toggle Focus Between Query And Results - - - Editor parameter is required for a shortcut to be executed + + Focus on Current Query Parse Query - - Commands completed successfully + + Please connect to a server Command failed: - - Please connect to a server + + Commands completed successfully - + + Successfully copied query and results. 
+ + + Editor parameter is required for a shortcut to be executed + + + Refresh IntelliSense Cache + + + Run Current Query + + + Run Current Query with Actual Plan + + + Run Query + + + Toggle Query Results + + - - Message Panel - Copy Copy All - - - - Query Results + + Message Panel + + New Query - - Query Editor - - - When true, column headers are included when saving results as CSV - - - The custom delimiter to use between values when saving as CSV - - - Character(s) used for seperating rows when saving results as CSV - - - Character used for enclosing text fields when saving results as CSV - - - File encoding used when saving results as CSV - - - When true, XML output will be formatted when saving results as XML - - - File encoding used when saving results as XML - - - Enable results streaming; contains few minor visual issues - - - Configuration options for copying results from the Results View - - - Configuration options for copying multi-line results from the Results View + + The default chart type to use when opening Chart Viewer from a Query Results Controls the max number of rows allowed to do filtering and sorting in memory. If the number is exceeded, sorting and filtering will be disabled. Warning: Increasing this may impact performance. - - Whether to open the file in Azure Data Studio after the result is saved. - Should execution time be shown for individual batches Word wrap messages - - The default chart type to use when opening Chart Viewer from a Query Results + + Prompt to save generated SQL files - - Tab coloring will be disabled + + Configuration options for copying results from the Results View + + + Configuration options for copying multi-line results from the Results View + + + Whether to open the file in Azure Data Studio after the result is saved. 
+ + + The custom delimiter to use between values when saving as CSV + + + File encoding used when saving results as CSV + + + When true, column headers are included when saving results as CSV + + + Character(s) used for separating rows when saving results as CSV + + + Character used for enclosing text fields when saving results as CSV + + + File encoding used when saving results as XML + + + When true, XML output will be formatted when saving results as XML + + + Enable results streaming; contains few minor visual issues + + + Controls whether to show the connection info for a tab in the title. + + + Controls how to color tabs based on the server group of their active connection The top border of each editor tab will be colored to match the relevant server group @@ -3698,137 +3701,140 @@ Error: {1} Each editor tab's background color will match the relevant server group - - Controls how to color tabs based on the server group of their active connection + + Tab coloring will be disabled - - Controls whether to show the connection info for a tab in the title. + + Query Editor - - Prompt to save generated SQL files + + Query Results Set keybinding workbench.action.query.shortcut{0} to run the shortcut text as a procedure call or query execution. Any selected text in the query editor will be passed as a parameter at the end of your query, or you can reference it with {arg} - + - - New Query - - - Run + + Actual Cancel - - Explain - - - Actual - - - Disconnect - Change Connection - - Connect - - - Enable SQLCMD - - - Disable SQLCMD - - - Select Database - Failed to change database Failed to change database: {0} + + Connect + + + Disable SQLCMD + + + Disconnect + + + Enable SQLCMD + + + Explain + + + New Query + Export as Notebook - + + Run + + + Select Database + + Query Editor - + - - Results - Messages - - - - Time Elapsed + + Results - - Row Count + + + + Executing query... {0} rows - - Executing query...
- - - Execution Status + + Row Count Selection Summary + + Execution Status + Average: {0} Count: {1} Sum: {2} - + + Time Elapsed + + - - Results Grid and Messages - - - Controls the font family. - - - Controls the font weight. - - - Controls the font size in pixels. - - - Controls the letter spacing in pixels. - - - Controls the row height in pixels + + Auto size the columns width on initial results. Could have performance problems with large number of columns or large cells Controls the cell padding in pixels - - Auto size the columns width on inital results + + Controls the font family. + + + Controls the font size in pixels. + + + Controls the font weight. + + + Controls the letter spacing in pixels. The maximum width in pixels for auto-sized columns - + + Results Grid and Messages + + + Controls the row height in pixels + + - - Toggle Query History + + Clear All History Delete - - Clear All History + + Pause Query History Capture + + + Start Query History Capture Open Query Run Query Toggle Query History capture - - Pause Query History Capture + + Toggle Query History - - Start Query History Capture - - + - - succeeded - failed - + + succeeded + + No queries to display. Query History QueryHistory - + - - QueryHistory + + &&Query History + && denotes a mnemonic - - Whether Query History capture is enabled. If false queries executed will not be captured. + + Query History Clear All History Pause Query History Capture Start Query History Capture + + Whether Query History capture is enabled. If false queries executed will not be captured.
+ + + QueryHistory + View - - &&Query History - && denotes a mnemonic - - - Query History - - + Query Plan - + Query Plan Editor - + - - Operation - - - Object - - - Est Cost - - - Est Subtree Cost - - - Actual Rows - - - Est Rows - Actual Executions - - Est CPU Cost - - - Est IO Cost - - - Parallel - Actual Rebinds - - Est Rebinds - Actual Rewinds + + Actual Rows + + + Est CPU Cost + + + Est Cost + + + Est IO Cost + + + Est Rebinds + Est Rewinds + + Est Rows + + + Est Subtree Cost + + + Object + + + Operation + + + Parallel + Partitioned Top Operations - + New Deployment... - + Resource Viewer - + Refresh - + - - Error opening link : {0} - Error executing command '{0}' : {1} - + + Error opening link : {0} + + Resource Viewer Tree - + + + Path to the resource icon. + Identifier of the resource. The human-readable name of the view. Will be shown - - Path to the resource icon. - Contributes resource to the resource view - - property `{0}` is mandatory and must be of type `string` - property `{0}` can be omitted or must be of type `string` - - - - Restore + + property `{0}` is mandatory and must be of type `string` + + Restore - + + Restore + + - - You must enable preview features in order to use restore + + Restore command is not supported for Azure SQL databases. Restore command is not supported outside of a server context. Please select a server or database and try again. - - Restore command is not supported for Azure SQL databases. 
+ + You must enable preview features in order to use restore Restore - + + + Edit Data + + + Refresh + + + Script as Alter + + + Script as Alter + Script as Create Script as Drop - - Select Top 1000 - Script as Execute - - Script as Alter - - - Edit Data - - + Select Top 1000 - - Take 10 - Script as Create - - Script as Execute - - - Script as Alter - Script as Drop - - Refresh + + Script as Execute - + + Take 10 + + + Select Top 1000 + + An error occurred refreshing node '{0}': {1} - + {0} in progress tasks - - View - - - Tasks - &&Tasks && denotes a mnemonic - + + Tasks + + + View + + Toggle Tasks - + - - succeeded + + canceled + + + canceling failed @@ -4099,25 +4105,22 @@ Error: {1} not started - - canceled + + succeeded - - canceling - - + No task history to display. + + Task error + Task history TaskHistory - - Task error - - + Cancel @@ -4128,62 +4131,50 @@ Error: {1} Script - + - - There is no data provider registered that can provide view data. - - - Refresh - Collapse All Error running command {1}: {0}. This is likely caused by the extension that contributes {1}. - - - - OK + + There is no data provider registered that can provide view data. + + Refresh + + + Close - + + OK + + + + No, don't show again + + + No + Preview features enhance your experience in Azure Data Studio by giving you full access to new features and improvements. You can learn more about preview features [here]({0}). Would you like to enable preview features? Yes (recommended) - - No - - - No, don't show again - - + - - This feature page is in preview. Preview features introduce new functionalities that are on track to becoming a permanent part the product. They are stable, but need additional accessibility improvements. We welcome your early feedback while they are under development. - - - Preview - Create a connection Connect to a database instance through the connection dialog. - - Run a query - - - Interact with data through a query editor. 
Create a notebook Launch a notebook to create or open an existing notebook Deploy a server Create a new instance of a relational data service on the platform of your choice. - - Resources + + Documentation - - History + + Visit the documentation center for quickstarts, how-to guides, and references for PowerShell, APIs, etc. - - Name + + Extensions - - Location Getting Started - - Show more Discover the capabilities offered by Azure Data Studio and learn how to make the most of them. - - Show welcome page on startup + + History - - Useful Links Learn more - + + Location + + Show more + + Name + + Preview + + This feature page is in preview. Preview features introduce new functionalities that are on track to becoming a permanent part of the product. They are stable, but need additional accessibility improvements. We welcome your early feedback while they are under development. + + Resources + + Run a query + + Interact with data through a query editor. + + Show All + + Show welcome page on startup + + Useful Links + + Introduction to Azure Data Studio Notebooks | Data Exposed + + Overview of Azure Data Studio + + Videos + + + Extend the functionality of Azure Data Studio by installing extensions developed by us/Microsoft as well as the third-party community (you!). + Connections + + Discover top features, recently opened files, and recommended extensions on the Welcome page. For more information on how to get started in Azure Data Studio, check out our videos and documentation. + + Extensions + + Finish + + 5 + + 4 + + Get started creating your own notebook or collection of notebooks in a single place.
+ Connect, query, and manage your connections from SQL Server, Azure, and more. - - 1 + + Customize Azure Data Studio based on your preferences. You can configure Settings like autosave and tab size, personalize your Keyboard Shortcuts, and switch to a Color Theme of your liking. Next @@ -4264,91 +4288,55 @@ Error: {1} Notebooks - - Get started creating your own notebook or collection of notebooks in a single place. - - - 2 - - - Extensions - - - Extend the functionality of Azure Data Studio by installing extensions developed by us/Microsoft as well as the third-party community (you!). - - - 3 - - - Settings - - - Customize Azure Data Studio based on your preferences. You can configure Settings like autosave and tab size, personalize your Keyboard Shortcuts, and switch to a Color Theme of your liking. - - - 4 - - - Welcome Page - - - Discover top features, recently opened files, and recommended extensions on the Welcome page. For more information on how to get started in Azure Data Studio, check out our videos and documentation. - - - 5 - - - Finish - - - User Welcome Tour - - - Hide Welcome Tour + + 1 Read more + + Settings + + + 3 + + + 2 + + + Welcome Page + + + User Welcome Tour + Help - + + Hide Welcome Tour + + - + + Would you like to take a quick tour of Azure Data Studio? + + + Welcome! 
+ + + Close quick tour bar + + + Details + + + OK + + Welcome - - SQL Admin Pack - - - SQL Admin Pack - - - Admin Pack for SQL Server is a collection of popular database administration extensions to help you manage SQL Server - - - SQL Server Agent - - - SQL Server Profiler - - - SQL Server Import - - - SQL Server Dacpac - - - Powershell - - - Write and execute PowerShell scripts using Azure Data Studio's rich query editor - - - Data Virtualization - - - Virtualize data with SQL Server 2019 and create external tables using interactive wizards + + Welcome PostgreSQL @@ -4356,36 +4344,63 @@ Error: {1} Connect, query, and manage Postgres databases with Azure Data Studio - - Support for {0} is already installed. + + SQL Admin Pack - - The window will reload after installing additional support for {0}. + + Admin Pack for SQL Server is a collection of popular database administration extensions to help you manage SQL Server - - Installing additional support for {0}... + + Background color for the Welcome page. - - Support for {0} with id {1} could not be found. + + Data Virtualization - - New connection - - - New query - - - New notebook + + Virtualize data with SQL Server 2019 and create external tables using interactive wizards Deploy a server - - Welcome + + Support for {0} is already installed. + + + Support for {0} with id {1} could not be found. + + + Install + + + Install additional support for {0} + + + Install {0} keymap + + + Installed + + + {0} support is already installed + + + {0} keymap is already installed + + + Installing additional support for {0}... New + + New connection + + + New notebook + + + New query + Open… @@ -4395,183 +4410,171 @@ Error: {1} Open folder… - - Start Tour - - - Close quick tour bar - - - Would you like to take a quick tour of Azure Data Studio? - - - Welcome! 
Open folder {0} with path {1} - - Install + + PowerShell - - Install {0} keymap + + Write and execute PowerShell scripts using Azure Data Studio's rich query editor - - Install additional support for {0} + + SQL Admin Pack - - Installed + + SQL Server Agent - - {0} keymap is already installed + + SQL Server Dacpac - - {0} support is already installed + + SQL Server Import - - OK + + SQL Server Profiler - - Details + + Start Tour - - Background color for the Welcome page. + + The window will reload after installing additional support for {0}. - + Azure Data Studio - - Start - - - New connection - - - New query - - - New notebook - - - Open file - - - Open file - - - Deploy - - - New Deployment… - - - Recent - - - More... - - - No recent folders - - - Help - - - Getting started - - - Documentation - - - Report issue or feature request - - - GitHub repository - - - Release notes - - - Show welcome page on startup - - - Customize - - - Extensions - - - Download extensions that you need, including the SQL Server Admin pack and more - - - Keyboard Shortcuts - - - Find your favorite commands and customize them - - - Color theme - - - Make the editor and your code look the way you love - - - Learn - - - Find and run all commands - - - Rapidly access and search commands from the Command Palette ({0}) - Discover what's new in the latest release New monthly blog posts each month showcasing our new features + + Color theme + + Make the editor and your code look the way you love + + Customize + + Deploy + + Download extensions that you need, including the SQL Server Admin pack and more + + Extensions + Follow us on Twitter Keep up to date with how the community is using Azure Data Studio and to talk directly with the engineers. - + + Getting started + + GitHub repository + + Help + + Keyboard Shortcuts + + Find your favorite commands and customize them + + Learn + + More...
+ + + New connection + + + New Deployment… + + + New notebook + + + New query + + + No recent folders + + + Open file + + + Open file + + + Documentation + + + Recent + + + Release notes + + + Report issue or feature request + + + Find and run all commands + + + Rapidly access and search commands from the Command Palette ({0}) + + + Show welcome page on startup + + + Start + + + + Add an account + + + Close + + + You didn't select any authentication provider. Please try again. + + + There is no linked account. Please add an account. + + + You have no clouds enabled. Go to Settings -> Search Azure Account Configuration -> Enable at least one cloud + Accounts Linked accounts - - Close - - - There is no linked account. Please add an account. - - - Add an account - - - You have no clouds enabled. Go to Settings -> Search Azure Account Configuration -> Enable at least one cloud - - - You didn't select any authentication provider. Please try again. - - + Error adding account - + You need to refresh the credentials for this account. - + Close @@ -4582,7 +4585,7 @@ Error: {1} Refresh account was canceled by the user - + Azure account @@ -4590,7 +4593,7 @@ Error: {1} Azure tenant - + Copy & Open @@ -4604,41 +4607,41 @@ Error: {1} Website - + Cannot start auto OAuth. An auto OAuth is already in progress. - + - - Connection is required in order to interact with adminservice - No Handler Registered - - - - Connection is required in order to interact with Assessment Service + + Connection is required in order to interact with adminservice + + No Handler Registered - - - - Advanced Properties + + Connection is required in order to interact with Assessment Service + + Discard - + + Advanced Properties + + Server Description (optional) - + Clear List @@ -4646,66 +4649,66 @@ Error: {1} Recent connections list cleared - - Yes - - - No - Are you sure you want to delete all the connections from the list? 
- - Yes - - - No - - - Delete - Get Current Connection String Connection string not available + + No + No active connection available - + + Yes + + + No + + + Yes + + + Delete + + - - Browse - - - Type here to filter the list - - - Filter connections + + Connection Browser Tree Applying filter - - Removing filter - Filter applied + + Filter connections + + + Type here to filter the list + Filter removed - - Saved Connections + + Removing filter + + + Browse Saved Connections - - Connection Browser Tree + + Saved Connections - + Connection error @@ -4719,103 +4722,103 @@ Error: {1} If you have previously connected you may need to re-run kinit. - + - - Connection + + Connection type Connecting - - Connection type - - - Recent + + Connection Connection Details - - Connect - Cancel + + Connect + Recent Connections No recent connection - + + Recent + + + + Are you sure you want to cancel this connection? + Failed to get Azure account token for connection Connection Not Accepted - - Yes - No - - Are you sure you want to cancel this connection? + + Yes - + - - Add an account... - - - <Default> - - - Loading... - - - Server group - - - <Default> - Add new group... - - <Do not save> - - - {0} is required. - - - {0} will be trimmed. - - - Remember password + + Advanced... Account - - Refresh account credentials - Azure AD tenant Name (optional) - - Advanced... + + Add an account... + + + {0} will be trimmed. You must select an account - + + {0} is required. + + + Refresh account credentials + + + <Default> + + + <Default> + + + Loading... + + + <Do not save> + + + Remember password + + + Server group + + Connected to @@ -4826,113 +4829,113 @@ Error: {1} Unsaved Connections - + - - Open dashboard extensions + + Cancel OK - - Cancel + + Open dashboard extensions No dashboard extensions are installed at this time. Go to Extension Manager to explore recommended extensions. 
Step {0} - + - - Done - Cancel - + + Done + + Initialize edit data session failed: - + - - OK - - - Close + + Copy details Action - - Copy details + + Close - + + OK + Error - - Warning + + Ignore Info - - Ignore + + Warning - + - - Selected path - - - Files of type + + Discard OK - - Discard + + Files of type - + + Selected path + Select a file - + File browser tree FileBrowserTree - + - - An error occured while loading the file browser. - File browser error - + + An error occurred while loading the file browser. + + All files - + Copy Cell - + - - No Connection Profile was passed to insights flyout + + There was an error parsing the insight config; could not find query array/string or queryfile Insights error There was an error reading the query file: - - There was an error parsing the insight config; could not find query array/string or queryfile + + No Connection Profile was passed to insights flyout - + + + Insights + + Item Details + + Items + Item - - - - - - Value - - Insights - - - Items - - - Item Details - + Could not find query file at any of the following paths : {0} - + - - Failed - - - Succeeded - - - Retry + + Between Retries Cancelled - - In Progress - - - Status Unknown - Executing - - Waiting for Thread - - - Between Retries + + Failed Idle - - Suspended + + In Progress - - [Obsolete] - - - Yes + + Never Run No Not Scheduled - - Never Run + + [Obsolete] - + + Retry + + + Status Unknown + + + Succeeded + + + Suspended + + + Waiting for Thread + + + Yes + + - - Connection is required in order to interact with JobManagementService - No Handler Registered - + + Connection is required in order to interact with JobManagementService + + SQL - + - - Cell execution cancelled + + Command executed successfully Query execution was canceled + + No kernel is available for this notebook + The session for this notebook is not yet ready + + Cell execution 
cancelled + The session for this notebook will start momentarily - - No kernel is available for this notebook - - - Command executed successfully - - + - - An error occurred while starting the notebook session - Server did not start for unknown reason + + An error occurred while starting the notebook session + Kernel {0} was not found. The default kernel will be used instead. - + - - Select Connection - localhost - + + Select Connection + + + + Can't find notebook manager for provider {0} + + + Changing context failed: {0} + + + Failed to change kernel due to error: {0} + + + Failed to change kernel. Kernel {0} will be used. Error was: {1} + + + Failed to delete cell. + # Injected-Parameters @@ -5086,146 +5104,128 @@ Error: {1} Please select a connection to run cells for this kernel - - Failed to delete cell. - - - Failed to change kernel. Kernel {0} will be used. Error was: {1} - - - Failed to change kernel due to error: {0} - - - Changing context failed: {0} + + A client session error occurred when closing the notebook: {0} Could not start session: {0} - - A client session error occurred when closing the notebook: {0} - - - Can't find notebook manager for provider {0} - - + - - No URI was passed when creating a notebook manager - Notebook provider does not exist - + + No URI was passed when creating a notebook manager + + A view with the name {0} already exists in this notebook. - + Untitled View - + - - SQL kernel error - A connection must be chosen to run notebook cells + + SQL kernel error + Displaying Top {0} rows. - + + + Markdown + Rich Text Split View - - Markdown - - + - - nbformat v{0}.{1} not recognized + + Data for {0} is expected to be a string or an Array of strings This file does not have a valid notebook format + + nbformat v{0}.{1} not recognized + Cell type {0} unknown Output type {0} not recognized - - Data for {0} is expected to be a string or an Array of strings - Output type {0} not recognized - + - - Identifier of the notebook provider. 
+ + Optional execution target this magic indicates, for example Spark vs SQL What file extensions should be registered to this notebook provider - - What kernels should be standard with this notebook provider - - - Contributes notebook providers. - - - Name of the cell magic, such as '%%sql'. + + Optional set of kernels this is valid for, e.g. python3, pyspark, sql The cell language to be used if this cell magic is included in the cell - - Optional execution target this magic indicates, for example Spark vs SQL + + Name of the cell magic, such as '%%sql'. - - Optional set of kernels this is valid for, e.g. python3, pyspark, sql + + Identifier of the notebook provider. + + + What kernels should be standard with this notebook provider Contributes notebook language. - + + Contributes notebook providers. + + Loading... - + - - Refresh - - - Edit Connection - Disconnect + + Show Active Connections + New Connection New Server Group + + Edit Connection + Edit Server Group - - Show Active Connections - - - Show All Connections + + Refresh Delete Connection @@ -5233,7 +5233,10 @@ Error: {1} Delete Group - + + Show All Connections + + Failed to create Object Explorer session @@ -5241,34 +5244,34 @@ Error: {1} Multiple errors: - + - - Cannot expand as the required connection provider '{0}' was not found + + Firewall dialog canceled User canceled - - Firewall dialog canceled + + Cannot expand as the required connection provider '{0}' was not found - + Loading... 
- + - - Recent Connections - Servers + + Recent Connections + Servers - + Sort by event @@ -5276,31 +5279,55 @@ Error: {1} Sort by column - - Profiler + + Cancel OK - - Cancel + + Profiler - + - - Clear all + + Add a clause Apply - - OK - Cancel - - Filters + + Clear all + + + Contains + + + Field + + + Is Not Null + + + Is Null + + + Load Filter + + + Not Contains + + + Not Starts With + + + OK + + + Operator Remove this clause @@ -5308,144 +5335,120 @@ Error: {1} Save Filter - - Load Filter + + Starts With - - Add a clause - - - Field - - - Operator + + Filters Value - - Is Null - - - Is Not Null - - - Contains - - - Not Contains - - - Starts With - - - Not Starts With - - + Commit row failed: + + Canceling the query failed: {0} + + + Line {0} + Started executing query at Started executing query "{0}" - - Line {0} - - - Canceling the query failed: {0} - Update cell failed: - + + + Copy failed with error {0} + + + Batch execution time: {0} + Execution failed due to an unexpected error: {0} {1} Total execution time: {0} - - Started executing query at Line {0} - Started executing batch {0} - - Batch execution time: {0} + + Started executing query at Line {0} - - Copy failed with error {0} - - + Failed to save results. - - Choose Results File - - - CSV (Comma delimited) - - - JSON - - - Excel Workbook - - - XML - - - Plain Text - - - Saving file... - Successfully saved results to {0} Open file - - - - From + + CSV (Comma delimited) - - To + + Excel Workbook + + + JSON + + + Plain Text + + + XML + + + Choose Results File + + + Saving file... + + + + + Add my client IP + + + Add my subnet IP range Create new firewall rule - - OK + + Firewall rule Cancel + + OK + Your client IP address does not have access to the server. Sign in to an Azure account and create a new firewall rule to enable access. 
Learn more about firewall settings - - Firewall rule + + From - - Add my client IP + + To - - Add my subnet IP range - - + Error adding account @@ -5453,46 +5456,16 @@ Error: {1} Firewall rule error - + - - Backup file path - - - Target database - - - Restore - - - Restore database - - - Database - - - Backup file - Restore database - - Cancel + + Backup file path - - Script - - - Source - - - Restore from - - - Backup file path is required. - - - Please enter one or more file paths separated by commas + + Backup sets to restore Database @@ -5500,26 +5473,26 @@ Error: {1} Destination - - Restore to + + File type - - Restore plan + + Files - - Backup sets to restore - - - Restore database files as - - - Restore database file details + + General Logical file Name - - File type + + Backup file path is required. + + + Please enter one or more file paths separated by commas + + + Options Original File Name @@ -5527,56 +5500,86 @@ Error: {1} Restore as + + Restore database files as + + + Restore database file details + + + Backup file + + + Cancel + + + Database + + + Restore + + + Restore database + + + Script + + + Restore from + Restore options - - Tail-Log backup + + Restore plan + + + Restore to Server connections - - General + + Source - - Files + + Tail-Log backup - - Options + + Target database - + - - Backup Files - All Files - + + Backup Files + + + + Group name is required. + Server Groups - - OK - - - Cancel - Server group name - - Group name is required. + + Group color Group description - - Group color + + Cancel - + + OK + + Add server group @@ -5584,25 +5587,25 @@ Error: {1} Edit server group - + 1 or more tasks are in progress. Are you sure you want to quit? 
- - Yes - No - - - - Show Getting Started + + Yes + + Getting &&Started && denotes a mnemonic - + + Show Getting Started + + \ No newline at end of file diff --git a/resources/xlf/en/xml-language-features.xlf b/resources/xlf/en/xml-language-features.xlf index c95e74fea8..a77bc19285 100644 --- a/resources/xlf/en/xml-language-features.xlf +++ b/resources/xlf/en/xml-language-features.xlf @@ -1,11 +1,11 @@ - - XML Language Features - Provides rich language support for XML. - + + XML Language Features + + \ No newline at end of file