Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Erweiterungen für Datenbankverwaltungstools für Windows
Adds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Fügt Azure Data Studio zusätzliche Windows-spezifische Funktionen hinzu.
Properties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Skripts generieren...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Für handleLaunchSsmsMinPropertiesDialogCommand wurde kein ConnectionContext angegeben.
Could not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ Der Objekt-Explorer-Knoten konnte aus dem connectionContext nicht ermittelt werden: {0}
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Für handleLaunchSsmsMinPropertiesDialogCommand wurde kein ConnectionContext angegeben.
No connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ Aus connectionContext wurde kein connectionProfile bereitgestellt: {0}
Launching dialog...
- Launching dialog...
+ Das Dialogfeld wird gestartet...
Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Fehler beim Aufruf von SsmsMin mit den Argumenten "{0}": {1}
diff --git a/resources/xlf/de/agent.de.xlf b/resources/xlf/de/agent.de.xlf
index 7ba6a72ef5..1363ae2974 100644
--- a/resources/xlf/de/agent.de.xlf
+++ b/resources/xlf/de/agent.de.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ SQL Server-Integrationsdienstpaket
SQL Server Agent Service Account
diff --git a/resources/xlf/de/azurecore.de.xlf b/resources/xlf/de/azurecore.de.xlf
index 00611c7c05..df15556fd0 100644
--- a/resources/xlf/de/azurecore.de.xlf
+++ b/resources/xlf/de/azurecore.de.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: Alle Konten aktualisieren
Refresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: Anmelden
Select Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Zu Servern hinzufügen
Clear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ Keine Ressourcen gefunden.
diff --git a/resources/xlf/de/cms.de.xlf b/resources/xlf/de/cms.de.xlf
index 88ae0b30ea..8b6d8c5b28 100644
--- a/resources/xlf/de/cms.de.xlf
+++ b/resources/xlf/de/cms.de.xlf
@@ -4,11 +4,11 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Zentrale SQL Server-Verwaltungsserver
Support for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Unterstützung für die Verwaltung zentraler SQL Server-Verwaltungsserver
Central Management Servers
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Servergruppe aktualisieren
Delete
@@ -48,7 +48,7 @@
Add Central Management Server
- Add Central Management Server
+ Zentralen Verwaltungsserver hinzufügen
Delete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ MSSQL-Konfiguration
Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Optional] Protokollieren Sie die Debugausgabe in der Konsole (Ansicht > Ausgabe), und wählen Sie dann den entsprechenden Ausgabekanal aus der Dropdownliste aus.
[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Optional] Protokollebene für Back-End-Dienste. Azure Data Studio generiert bei jedem Start einen Dateinamen, und wenn die Datei bereits vorhanden ist, werden die Protokolleinträge an diese Datei angehängt. Zur Bereinigung alter Protokolldateien finden Sie die Einstellungen logRetentionMinutes und logFilesRemovalLimit. Beim Standard-tracingLevel wird nicht viel protokolliert. Das Ändern der Ausführlichkeit kann zu umfangreicher Protokollierung und hohen Speicherplatzanforderungen für die Protokolle führen. "Error" beinhaltet "Critical", "Warning" beinhaltet "Error", "Information" beinhaltet "Warning", und "Verbose" beinhaltet "Information".
Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Anzahl von Minuten, für die Protokolldateien für Back-End-Dienste aufbewahrt werden sollen. Der Standardwert ist 1 Woche.
Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Die maximale Anzahl alter Dateien, die beim Start entfernt werden sollen, bei denen die mssql.logRetentionMinutes abgelaufen sind. Dateien, die aufgrund dieser Einschränkung nicht bereinigt werden, werden beim nächsten Start von Azure Data Studio bereinigt.
[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Optional] Keine Warnungen zu nicht unterstützten Plattformen anzeigen
Recovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Tarif
Compatibility Level
@@ -168,11 +168,11 @@
Name (optional)
- Name (optional)
+ Name (optional)
Custom name of the connection
- Custom name of the connection
+ Benutzerdefinierter Name der Verbindung
Server
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Name der SQL Server-Instanz
Server Description
- Server Description
+ Serverbeschreibung
Description of the SQL Server instance
- Description of the SQL Server instance
+ Beschreibung der SQL Server-Instanz
Authentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Gibt die Methode der Authentifizierung bei SQL Server an.
SQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory: universell mit MFA-Unterstützung
User name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Gibt die Benutzer-ID an, die beim Herstellen einer Verbindung mit der Datenquelle verwendet werden soll
Password
@@ -224,63 +224,63 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Gibt das Kennwort an, das beim Herstellen einer Verbindung mit der Datenquelle verwendet werden soll.
Application intent
- Application intent
+ Anwendungszweck
Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Deklariert den Anwendungsworkloadtyp beim Herstellen einer Verbindung mit einem Server.
Asynchronous processing
- Asynchronous processing
+ Asynchrone Verarbeitung
When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Bei TRUE wird die Verwendung der asynchronen Funktionalität im .NET Framework-Datenanbieter ermöglicht.
Connect timeout
- Connect timeout
+ Verbindungstimeout
The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Die Zeitspanne (in Sekunden), die auf eine Verbindung mit dem Server gewartet wird, bevor der Versuch beendet und ein Fehler generiert wird.
Current language
- Current language
+ Aktuelle Sprache
The SQL Server language record name
- The SQL Server language record name
+ Der Datensatzname der SQL Server-Sprache
Column encryption
- Column encryption
+ Spaltenverschlüsselung
Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Standardeinstellung für die Spaltenverschlüsselung für alle Befehle in der Verbindung
Encrypt
- Encrypt
+ Verschlüsseln
When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Bei TRUE verwendet SQL Server die SSL-Verschlüsselung für alle Daten, die zwischen Client und Server gesendet werden, wenn auf dem Server ein Zertifikat installiert ist.
Persist security info
- Persist security info
+ Sicherheitsinformationen dauerhaft speichern
When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Bei FALSE werden sicherheitsrelevante Informationen, z. B. das Kennwort, nicht als Teil der Verbindung zurückgegeben.
Trust server certificate
@@ -288,43 +288,43 @@
When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Bei TRUE (und encrypt=true) verwendet SQL Server die SSL-Verschlüsselung für alle Daten, die zwischen Client und Server gesendet werden, ohne das Serverzertifikat zu überprüfen.
Attached DB file name
- Attached DB file name
+ Angefügter DB-Dateiname
The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Der Name der primären Datei einer anfügbaren Datenbank, einschließlich des vollständigen Pfadnamens
Context connection
- Context connection
+ Kontextverbindung
When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Bei TRUE wird angegeben, dass die Verbindung aus dem SQL-Serverkontext stammen muss. Nur verfügbar bei Ausführung im SQL Server-Prozess.
Port
- Port
+ Port
Connect retry count
- Connect retry count
+ Anzahl der Verbindungswiederholungen
Number of attempts to restore connection
- Number of attempts to restore connection
+ Anzahl der Versuche zur Verbindungswiederherstellung
Connect retry interval
- Connect retry interval
+ Intervall für Verbindungswiederholung
Delay between attempts to restore connection
- Delay between attempts to restore connection
+ Verzögerung zwischen Versuchen zur Verbindungswiederherstellung
Application name
@@ -332,47 +332,47 @@
The name of the application
- The name of the application
+ Der Name der Anwendung
Workstation Id
- Workstation Id
+ Arbeitsstations-ID
The name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Der Name der Arbeitsstation, die eine Verbindung mit SQL Server herstellt
Pooling
- Pooling
+ Pooling
When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Bei TRUE wird das Verbindungsobjekt aus dem entsprechenden Pool abgerufen oder bei Bedarf erstellt und dem entsprechenden Pool hinzugefügt.
Max pool size
- Max pool size
+ Maximale Poolgröße
The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Die maximale Anzahl der im Pool zulässigen Verbindungen
Min pool size
- Min pool size
+ Mindestgröße für Pool
The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Die Mindestanzahl der im Pool zulässigen Verbindungen
Load balance timeout
- Load balance timeout
+ Timeout für Lastenausgleich
The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Die Mindestzeit (in Sekunden), die diese Verbindung im Pool verbleibt, bevor sie zerstört wird
Replication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Wird von SQL Server in der Replikation verwendet.
Attach DB filename
- Attach DB filename
+ DB-Dateinamen anfügen
Failover partner
- Failover partner
+ Failoverpartner
The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Der Name oder die Netzwerkadresse der Instanz von SQL Server, die als Failoverpartner fungiert
Multi subnet failover
- Multi subnet failover
+ Multisubnetzfailover
Multiple active result sets
- Multiple active result sets
+ Mehrere aktive Resultsets
When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Bei TRUE können mehrere Resultsets zurückgegeben und aus einer Verbindung gelesen werden.
Packet size
- Packet size
+ Paketgröße
Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Größe der Netzwerkpakete (in Byte), die für die Kommunikation mit einer Instanz von SQL Server verwendet werden
Type system version
- Type system version
+ Type System Version
Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Gibt an, welches Servertypsystem der Anbieter über den DataReader verfügbar macht.
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ Der zentrale Verwaltungsserver "{0}" wurde nicht gefunden oder ist offline.
No resources found
- No resources found
+ Keine Ressourcen gefunden.
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Zentralen Verwaltungsserver hinzufügen...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ Die Gruppe zentraler Verwaltungsserver enthält bereits einen registrierten Server namens "{0}".
Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ Der registrierte Server "{0}" konnte nicht hinzugefügt werden.
Are you sure you want to delete
- Are you sure you want to delete
+ Möchten Sie den Löschvorgang durchführen?
Yes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Beschreibung der Servergruppe
{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ "{0}" weist bereits eine Servergruppe namens "{1}" auf.
Are you sure you want to delete
- Are you sure you want to delete
+ Möchten Sie den Löschvorgang durchführen?
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ Sie können keinen freigegebenen registrierten Server hinzufügen, dessen Name dem des Konfigurationsservers entspricht.
diff --git a/resources/xlf/de/dacpac.de.xlf b/resources/xlf/de/dacpac.de.xlf
index 7839b1db56..f4e31707d1 100644
--- a/resources/xlf/de/dacpac.de.xlf
+++ b/resources/xlf/de/dacpac.de.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ Sie können den Status der Skriptgenerierung in der Aufgabenansicht anzeigen, sobald der Assistent geschlossen ist. Das generierte Skript wird nach Abschluss geöffnet.
Generating deploy plan failed '{0}'
diff --git a/resources/xlf/de/import.de.xlf b/resources/xlf/de/import.de.xlf
index 3b91c12c0a..fe4380b6e6 100644
--- a/resources/xlf/de/import.de.xlf
+++ b/resources/xlf/de/import.de.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Dieser Vorgang war nicht erfolgreich. Versuchen Sie es mit einer anderen Eingabedatei.
Refresh
diff --git a/resources/xlf/de/mssql.de.xlf b/resources/xlf/de/mssql.de.xlf
index 8761df5955..d5545989b9 100644
--- a/resources/xlf/de/mssql.de.xlf
+++ b/resources/xlf/de/mssql.de.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Dateien hochladen
New directory
- New directory
+ Neues Verzeichnis
Delete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Neues Notebook
Open Notebook
- Open Notebook
+ Notebook öffnen
Tasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Aufgaben und Informationen zu Ihrem SQL Server-Big Data-Cluster
SQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Spark-Auftrag übermitteln
New Spark Job
- New Spark Job
+ Neuer Spark-Auftrag
View Spark History
- View Spark History
+ Spark-Verlauf anzeigen
View Yarn History
- View Yarn History
+ YARN-Verlauf anzeigen
Tasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ Pakete installieren
Configure Python for Notebooks
- Configure Python for Notebooks
+ Python für Notebooks konfigurieren
Cluster Status
- Cluster Status
+ Clusterstatus
Search: Servers
- Search: Servers
+ Suche: Server
Search: Clear Search Server Results
- Search: Clear Search Server Results
+ Suche: Suchserverergebnisse löschen
Service Endpoints
- Service Endpoints
+ Dienstendpunkte
MSSQL configuration
- MSSQL configuration
+ MSSQL-Konfiguration
Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Optional] Protokollieren Sie die Debugausgabe in der Konsole (Ansicht > Ausgabe), und wählen Sie dann den entsprechenden Ausgabekanal aus der Dropdownliste aus.
[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Optional] Protokollebene für Back-End-Dienste. Azure Data Studio generiert bei jedem Start einen Dateinamen, und wenn die Datei bereits vorhanden ist, werden die Protokolleinträge an diese Datei angehängt. Zur Bereinigung alter Protokolldateien finden Sie die Einstellungen logRetentionMinutes und logFilesRemovalLimit. Beim Standard-tracingLevel wird nicht viel protokolliert. Das Ändern der Ausführlichkeit kann zu umfangreicher Protokollierung und hohen Speicherplatzanforderungen für die Protokolle führen. "Error" beinhaltet "Critical", "Warning" beinhaltet "Error", "Information" beinhaltet "Warning", und "Verbose" beinhaltet "Information".
Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Anzahl von Minuten, für die Protokolldateien für Back-End-Dienste aufbewahrt werden sollen. Der Standardwert ist 1 Woche.
Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Die maximale Anzahl alter Dateien, die beim Start entfernt werden sollen, bei denen die mssql.logRetentionMinutes abgelaufen sind. Dateien, die aufgrund dieser Einschränkung nicht bereinigt werden, werden beim nächsten Start von Azure Data Studio bereinigt.
[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Optional] Keine Warnungen zu nicht unterstützten Plattformen anzeigen
Recovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Tarif
Compatibility Level
@@ -224,11 +224,11 @@
Name (optional)
- Name (optional)
+ Name (optional)
Custom name of the connection
- Custom name of the connection
+ Benutzerdefinierter Name der Verbindung
Server
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Name der SQL Server-Instanz
Database
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ Der Name des ursprünglichen Katalogs oder der ersten Datenbank in der Datenquelle
Authentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Gibt die Methode der Authentifizierung bei SQL Server an.
SQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory: universell mit MFA-Unterstützung
User name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Gibt die Benutzer-ID an, die beim Herstellen einer Verbindung mit der Datenquelle verwendet werden soll.
Password
@@ -280,63 +280,63 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Gibt das Kennwort an, das beim Herstellen einer Verbindung mit der Datenquelle verwendet werden soll.
Application intent
- Application intent
+ Anwendungszweck
Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Deklariert den Anwendungsworkloadtyp beim Herstellen einer Verbindung mit einem Server.
Asynchronous processing
- Asynchronous processing
+ Asynchrone Verarbeitung
When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Bei TRUE wird die Verwendung der asynchronen Funktionalität im .NET Framework-Datenanbieter ermöglicht.
Connect timeout
- Connect timeout
+ Verbindungstimeout
The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Die Zeitspanne (in Sekunden), die auf eine Verbindung mit dem Server gewartet werden muss, bevor der Versuch beendet und ein Fehler generiert wird
Current language
- Current language
+ Aktuelle Sprache
The SQL Server language record name
- The SQL Server language record name
+ Der Datensatzname der SQL Server-Sprache
Column encryption
- Column encryption
+ Spaltenverschlüsselung
Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Standardeinstellung für die Spaltenverschlüsselung für alle Befehle in der Verbindung
Encrypt
- Encrypt
+ Verschlüsseln
When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Bei TRUE verwendet SQL Server die SSL-Verschlüsselung für alle Daten, die zwischen Client und Server gesendet werden, wenn auf dem Server ein Zertifikat installiert ist.
Persist security info
- Persist security info
+ Sicherheitsinformationen dauerhaft speichern
When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Bei FALSE werden sicherheitsrelevante Informationen, z. B. das Kennwort, nicht als Teil der Verbindung zurückgegeben.
Trust server certificate
@@ -344,43 +344,43 @@
When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Bei TRUE (und encrypt=true) verwendet SQL Server die SSL-Verschlüsselung für alle Daten, die zwischen Client und Server gesendet werden, ohne das Serverzertifikat zu überprüfen.
Attached DB file name
- Attached DB file name
+ Angefügter DB-Dateiname
The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Der Name der primären Datei einer anfügbaren Datenbank, einschließlich des vollständigen Pfadnamens
Context connection
- Context connection
+ Kontextverbindung
When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Bei TRUE wird angegeben, dass die Verbindung aus dem SQL Server-Kontext stammen muss. Nur verfügbar bei Ausführung im SQL Server-Prozess.
Port
- Port
+ Port
Connect retry count
- Connect retry count
+ Anzahl von Verbindungswiederholungen
Number of attempts to restore connection
- Number of attempts to restore connection
+ Anzahl der Versuche zur Verbindungswiederherstellung
Connect retry interval
- Connect retry interval
+ Intervall für Verbindungswiederholungen
Delay between attempts to restore connection
- Delay between attempts to restore connection
+ Verzögerung zwischen Versuchen zur Verbindungswiederherstellung
Application name
@@ -388,47 +388,47 @@
The name of the application
- The name of the application
+ Der Name der Anwendung
Workstation Id
- Workstation Id
+ Arbeitsstations-ID
The name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Der Name der Arbeitsstation, die eine Verbindung mit SQL Server herstellt
Pooling
- Pooling
+ Pooling
When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Bei TRUE wird das Verbindungsobjekt aus dem entsprechenden Pool abgerufen oder bei Bedarf erstellt und dem entsprechenden Pool hinzugefügt.
Max pool size
- Max pool size
+ Maximale Poolgröße
The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Die maximale Anzahl der im Pool zulässigen Verbindungen
Min pool size
- Min pool size
+ Mindestgröße für Pool
The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Die Mindestanzahl der im Pool zulässigen Verbindungen
Load balance timeout
- Load balance timeout
+ Timeout beim Lastenausgleich
The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Die Mindestzeit (in Sekunden), die diese Verbindung im Pool verbleibt, bevor sie zerstört wird
Replication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Wird von SQL Server bei der Replikation verwendet.
Attach DB filename
- Attach DB filename
+ DB-Dateinamen anfügen
Failover partner
- Failover partner
+ Failoverpartner
The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Der Name oder die Netzwerkadresse der Instanz von SQL Server, die als Failoverpartner fungiert
Multi subnet failover
- Multi subnet failover
+ Multisubnetzfailover
Multiple active result sets
- Multiple active result sets
+ Mehrere aktive Resultsets
When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Bei TRUE können mehrere Resultsets über eine Verbindung zurückgegeben und gelesen werden.
Packet size
- Packet size
+ Paketgröße
Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Größe der Netzwerkpakete (in Byte), die für die Kommunikation mit einer Instanz von SQL Server verwendet werden
Type system version
- Type system version
+ TypSystemversion
Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Gibt an, welches Servertypsystem vom Anbieter über den DataReader verfügbar gemacht wird.
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ Von der Antwort wurde keine Batch-ID für Spark-Aufträge zurückgegeben.{0}[Fehler] {1}
No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ Innerhalb der Antwort wird kein Protokoll zurückgegeben.{0}[Fehler] {1}
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ Parameter für SparkJobSubmissionModel sind ungültig.
submissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs ist ungültig.
livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId ist ungültig.
Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Timeout beim Abrufen der Anwendungs-ID. {0}[Protokoll] {1}
Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ Die localFilePath- oder hdfsFolderPath-Eigenschaft wurde nicht angegeben.
Property Path is not specified.
- Property Path is not specified.
+ Der Eigenschaftspfad wurde nicht angegeben.
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ Parameter für SparkJobSubmissionDialog sind ungültig.
New Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Senden
{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Spark-Auftragsübermittlung:
.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Start der Spark-Auftragsübermittlung ..........................
Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Wählen Sie SQL Server mit Big Data-Cluster aus.
No Sql Server is selected.
- No Sql Server is selected.
+ Es ist keine SQL Server-Instanz ausgewählt.
Error Get File Path: {0}
- Error Get File Path: {0}
+ Fehler beim Abrufen des Dateipfads: {0}
Invalid Data Structure
- Invalid Data Structure
+ Ungültige Datenstruktur
Unable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ WebHDFS-Client kann aufgrund fehlender Optionen nicht erstellt werden: ${0}
'${0}' is undefined.
- '${0}' is undefined.
+ "${0}" ist nicht definiert.
Bad Request
- Bad Request
+ Fehlerhafte Anforderung.
Unauthorized
- Unauthorized
+ Nicht autorisiert
Forbidden
- Forbidden
+ Unzulässig
Not Found
@@ -724,7 +724,7 @@
Internal Server Error
- Internal Server Error
+ Interner Serverfehler.
Unknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ Unerwartete Umleitung
Please provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Geben Sie das Kennwort für die Verbindung mit HDFS an:
Session for node {0} does not exist
- Session for node {0} does not exist
+ Die Sitzung für den Knoten "{0}" ist nicht vorhanden.
Error notifying of node change: {0}
- Error notifying of node change: {0}
+ Fehler bei Benachrichtigung über Knotenänderung: {0}
Root
- Root
+ Stamm
HDFS
- HDFS
+ HDFS
Data Services
- Data Services
+ Data Services
NOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ HINWEIS: Diese Datei wurde zur Vorschau bei "{0}" abgeschnitten.
The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ Die Datei wurde zur Vorschau bei "{0}" abgeschnitten.
ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo ist nicht definiert.
ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ "ConnectionInfo.options" ist nicht definiert.
Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ Einige Eigenschaften fehlen in connectionInfo.options: {0}
Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ Die Aktion "{0}" wird für diesen Handler nicht unterstützt.
Cannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ Der Link "{0}" kann nicht geöffnet werden, weil nur HTTP- und HTTPS-Links unterstützt werden.
Download and open '{0}'?
- Download and open '{0}'?
+ "{0}" herunterladen und öffnen?
Could not find the specified file
- Could not find the specified file
+ Die angegebene Datei wurde nicht gefunden.
File open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ Fehler bei der Anforderung zum Öffnen von Dateien: {0} {1}
Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Fehler beim Beenden von Notebook-Server: {0}
Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ Der Notebook-Vorgang wurde vorzeitig beendet. Fehler: {0}, StdErr-Ausgabe: {1}
Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Von Jupyter gesendeter Fehler: {0}
... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter wird bei "{0}" ausgeführt.
... Starting Notebook server
- ... Starting Notebook server
+ ... Der Notebook-Server wird gestartet.
Unexpected setting type {0}
- Unexpected setting type {0}
+ Unerwarteter Einstellungstyp "{0}".
Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ Eine Sitzung kann nicht gestartet werden, der Manager ist noch nicht initialisiert.
Spark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Für Spark-Kernel ist eine Verbindung mit einer Masterinstanz eines Big Data-Clusters in SQL Server erforderlich.
Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ Fehler beim Herunterfahren des Notebook-Servers: {0}
Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ Notebook-Abhängigkeiten werden installiert.
Python download is complete
- Python download is complete
+ Der Python-Download ist abgeschlossen.
Error while downloading python setup
- Error while downloading python setup
+ Fehler beim Herunterladen von Python-Setup.
Downloading python package
- Downloading python package
+ Das Python-Paket wird heruntergeladen.
Unpacking python package
- Unpacking python package
+ Python-Paket entpacken
Error while creating python installation directory
- Error while creating python installation directory
+ Fehler beim Erstellen des Python-Installationsverzeichnisses.
Error while unpacking python bundle
- Error while unpacking python bundle
+ Fehler beim Entpacken des Python-Pakets.
Installing Notebook dependencies
- Installing Notebook dependencies
+ Notebook-Abhängigkeiten werden installiert.
Installing Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Notebook-Abhängigkeiten werden installiert. Weitere Informationen finden Sie in der Aufgabenansicht.
Notebook dependencies installation is complete
- Notebook dependencies installation is complete
+ Die Installation von Notebook-Abhängigkeiten ist abgeschlossen.
Cannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Die vorhandene Python-Installation kann nicht überschrieben werden, während Python ausgeführt wird.
Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ Eine weitere Python-Installation wird derzeit ausgeführt.
Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python ist am spezifischen Speicherort bereits vorhanden. Die Installation wird übersprungen.
Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ Fehler beim Installieren von Notebook-Abhängigkeiten: {0}
Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Lokaler Python-Code für die Plattform "{0}" wird auf "{1}" heruntergeladen.
Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Zum Ausführen von Notebooks erforderliche Pakete werden installiert...
... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Jupyter-Installation abgeschlossen.
Installing SparkMagic...
- Installing SparkMagic...
+ SparkMagic wird installiert...
A notebook path is required
- A notebook path is required
+ Ein Notebook-Pfad ist erforderlich.
Notebooks
- Notebooks
+ Notebooks
Only .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Nur IPYNB-Notebooks werden unterstützt.
Are you sure you want to reinstall?
- Are you sure you want to reinstall?
+ Möchten Sie eine Neuinstallation durchführen?
Configure Python for Notebooks
- Configure Python for Notebooks
+ Python für Notebooks konfigurieren
Install
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Python-Installationsspeicherort
Select
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Diese Installation wird einige Zeit in Anspruch nehmen. Es wird empfohlen, die Anwendung erst zu schließen, wenn die Installation abgeschlossen ist.
The specified install location is invalid.
- The specified install location is invalid.
+ Der angegebene Installationsspeicherort ist ungültig.
No python installation was found at the specified location.
- No python installation was found at the specified location.
+ Am angegebenen Speicherort wurde keine Python-Installation gefunden.
Python installation was declined.
- Python installation was declined.
+ Die Python-Installation wurde abgelehnt.
Installation Type
- Installation Type
+ Installationstyp
New Python installation
- New Python installation
+ Neue Python-Installation
Use existing Python installation
- Use existing Python installation
+ Vorhandene Python-Installation verwenden
Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Fehler beim Öffnen der Datei "{0}": {1}
Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Fehler beim Öffnen der Datei "{0}": {1}
Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Fehler beim Öffnen der Datei "{0}": {1}
Missing file : {0}
- Missing file : {0}
+ Fehlende Datei: {0}
This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Dieser Beispielcode lädt die Datei in einen Datenrahmen und zeigt die ersten 10 Ergebnisse an.
No notebook editor is active
- No notebook editor is active
+ Es ist kein Notebook-Editor aktiv.
Code
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ Welche Art von Zelle möchten Sie hinzufügen?
Notebooks
- Notebooks
+ Notebooks
SQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ SQL Server-Bereitstellungserweiterung für Azure Data Studio
Provides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Bietet eine Notebook-basierte Oberfläche zum Bereitstellen von Microsoft SQL Server.
Deploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ SQL Server in Docker bereitstellen...
Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ SQL Server-Big Data-Cluster bereitstellen...
Deploy SQL Server…
- Deploy SQL Server…
+ SQL Server bereitstellen...
Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ SQL Server-Containerimage
Run SQL Server container image with Docker
- Run SQL Server container image with Docker
+ SQL Server-Containerimage mit Docker ausführen
SQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ Mit dem Big Data-Cluster von SQL Server können Sie skalierbare Cluster von SQL Server-, Spark- und HDFS-Containern bereitstellen, die in Kubernetes ausgeführt werden.
Version
@@ -52,23 +52,23 @@
SQL Server 2019
- SQL Server 2019
+ SQL Server 2019
./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb
./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynb
SQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Big Data-Cluster in SQL Server 2019 CTP 3.1
Deployment target
- Deployment target
+ Bereitstellungsziel
New Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Ein in Python geschriebenes Befehlszeilen-Hilfsprogramm, das Clusteradministratoren den Bootstrap und die Verwaltung des Big Data-Clusters über REST-APIs ermöglicht
mssqlctl
- mssqlctl
+ mssqlctl
A command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Mit einem Befehlszeilentool können Sie Befehle für Kubernetes-Cluster ausführen.
kubectl
- kubectl
+ kubectl
Provides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Bietet die Möglichkeit, eine Anwendung in isolierten Containern zu paketieren und auszuführen.
Docker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Ein Befehlszeilentool zum Verwalten von Azure-Ressourcen
Azure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ "Package.json" wurde nicht gefunden, oder der Name/Herausgeber wurde nicht festgelegt.
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ Das Notebook "{0}" ist nicht vorhanden.
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Bereitstellungsoptionen auswählen
Open Notebook
- Open Notebook
+ Notebook öffnen
Tool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ Fehler beim Laden der Erweiterung: {0}. In der Ressourcentypdefinition in "package.json" wurde ein Fehler festgestellt. Details finden Sie in der Debugkonsole.
The resource type: {0} is not defined
- The resource type: {0} is not defined
+ Der Ressourcentyp "{0}" ist nicht definiert.
diff --git a/resources/xlf/de/schema-compare.de.xlf b/resources/xlf/de/schema-compare.de.xlf
index e247db6fb2..dfc97441ea 100644
--- a/resources/xlf/de/schema-compare.de.xlf
+++ b/resources/xlf/de/schema-compare.de.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server-Schemavergleich
SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Der SQL Server-Schemavergleich für Azure Data Studio unterstützt den Vergleich der Schemas von Datenbanken und DACPACs.
Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ Die Optionen haben sich geändert. Möchten Sie den Vergleich wiederholen und neu anzeigen?
Schema Compare Options
@@ -52,311 +52,311 @@
Include Object Types
- Include Object Types
+ Objekttypen einschließen
Ignore Table Options
- Ignore Table Options
+ Tabellenoptionen ignorieren
Ignore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Semikolon zwischen Anweisungen ignorieren
Ignore Route Lifetime
- Ignore Route Lifetime
+ Routenlebensdauer ignorieren
Ignore Role Membership
- Ignore Role Membership
+ Rollenmitgliedschaft ignorieren
Ignore Quoted Identifiers
- Ignore Quoted Identifiers
+ Bezeichner in Anführungszeichen ignorieren
Ignore Permissions
- Ignore Permissions
+ Berechtigungen ignorieren
Ignore Partition Schemes
- Ignore Partition Schemes
+ Partitionsschemas ignorieren
Ignore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Objektplatzierung im Partitionsschema ignorieren
Ignore Not For Replication
- Ignore Not For Replication
+ "Nicht zur Replikation" ignorieren
Ignore Login Sids
- Ignore Login Sids
+ Anmelde-SIDs ignorieren
Ignore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Sperrhinweise für Indizes ignorieren
Ignore Keyword Casing
- Ignore Keyword Casing
+ Groß-/Kleinschreibung bei Schlüsselwort ignorieren
Ignore Index Padding
- Ignore Index Padding
+ Indexauffüllung ignorieren
Ignore Index Options
- Ignore Index Options
+ Indexoptionen ignorieren
Ignore Increment
- Ignore Increment
+ Inkrement ignorieren
Ignore Identity Seed
- Ignore Identity Seed
+ ID-Startwert ignorieren
Ignore User Settings Objects
- Ignore User Settings Objects
+ Benutzereinstellungsobjekte ignorieren
Ignore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ FilePath für Volltextkatalog ignorieren
Ignore Whitespace
- Ignore Whitespace
+ Leerraum ignorieren
Ignore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ WITH NOCHECK bei ForeignKeys ignorieren
Verify Collation Compatibility
- Verify Collation Compatibility
+ Sortierungskompatibilität überprüfen
Unmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Unveränderliche Objektwarnungen
Treat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Überprüfungsfehler als Warnungen behandeln
Script Refresh Module
- Script Refresh Module
+ Skriptaktualisierungsmodul
Script New Constraint Validation
- Script New Constraint Validation
+ Überprüfung neuer Einschränkungen per Skript
Script File Size
- Script File Size
+ Skriptdateigröße
Script Deploy StateChecks
- Script Deploy StateChecks
+ StateChecks per Skript bereitstellen
Script Database Options
- Script Database Options
+ Skriptdatenbankoptionen
Script Database Compatibility
- Script Database Compatibility
+ Skriptdatenbankkompatibilität
Script Database Collation
- Script Database Collation
+ Skriptdatenbanksortierung
Run Deployment Plan Executors
- Run Deployment Plan Executors
+ Executors von Bereitstellungsplan ausführen
Register DataTier Application
- Register DataTier Application
+ DataTier-Anwendung registrieren
Populate Files On File Groups
- Populate Files On File Groups
+ Dateien in Dateigruppen auffüllen
No Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Keine ALTER-Anweisungen zum Ändern von CLR-Typen.
Include Transactional Scripts
- Include Transactional Scripts
+ Transaktionsskripts einschließen
Include Composite Objects
- Include Composite Objects
+ Zusammengesetzte Objekte einschließen
Allow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Unsichere Datenverschiebung bei Sicherheit auf Zeilenebene zulassen
Ignore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ WITH NOCHECK bei CHECK CONSTRAINT ignorieren
Ignore Fill Factor
- Ignore Fill Factor
+ Füllfaktor ignorieren
Ignore File Size
- Ignore File Size
+ Dateigröße ignorieren
Ignore Filegroup Placement
- Ignore Filegroup Placement
+ Dateigruppenplatzierung ignorieren
Do Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ Replizierte Objekte nicht ändern
Do Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ Change Data Capture-Objekte nicht ändern
Disable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ DDL-Trigger deaktivieren und wieder aktivieren
Deploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Datenbank im Einzelbenutzermodus bereitstellen
Create New Database
- Create New Database
+ Neue Datenbank erstellen
Compare Using Target Collation
- Compare Using Target Collation
+ Anhand der Zielsortierung vergleichen
Comment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Set Var-Deklarationen auskommentieren
Block When Drift Detected
- Block When Drift Detected
+ Bei erkannter Abweichung blockieren
Block On Possible Data Loss
- Block On Possible Data Loss
+ Bei möglichem Datenverlust blockieren
Backup Database Before Changes
- Backup Database Before Changes
+ Datenbank vor Änderungen sichern
Allow Incompatible Platform
- Allow Incompatible Platform
+ Inkompatible Plattform zulassen
Allow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Löschen blockierender Assemblys zulassen
Drop Constraints Not In Source
- Drop Constraints Not In Source
+ Nicht in der Quelle enthaltene Einschränkungen löschen
Drop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Nicht in der Quelle enthaltene DML-Trigger löschen
Drop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Nicht in der Quelle enthaltene erweiterte Eigenschaften löschen
Drop Indexes Not In Source
- Drop Indexes Not In Source
+ Nicht in der Quelle enthaltene Indizes löschen
Ignore File And Log File Path
- Ignore File And Log File Path
+ Datei- und Protokolldateipfad ignorieren
Ignore Extended Properties
- Ignore Extended Properties
+ Erweiterte Eigenschaften ignorieren
Ignore Dml Trigger State
- Ignore Dml Trigger State
+ DML-Triggerstatus ignorieren
Ignore Dml Trigger Order
- Ignore Dml Trigger Order
+ DML-Triggerreihenfolge ignorieren
Ignore Default Schema
- Ignore Default Schema
+ Standardschema ignorieren
Ignore Ddl Trigger State
- Ignore Ddl Trigger State
+ Ddl-Triggerstatus ignorieren
Ignore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ddl-Triggerreihenfolge ignorieren
Ignore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ FilePath von Kryptografieanbieter ignorieren
Verify Deployment
- Verify Deployment
+ Bereitstellung überprüfen
Ignore Comments
- Ignore Comments
+ Kommentare ignorieren
Ignore Column Collation
- Ignore Column Collation
+ Spaltensortierung ignorieren
Ignore Authorizer
- Ignore Authorizer
+ Autorisierer ignorieren
Ignore AnsiNulls
- Ignore AnsiNulls
+ AnsiNulls ignorieren
Generate SmartDefaults
- Generate SmartDefaults
+ SmartDefaults generieren
Drop Statistics Not In Source
- Drop Statistics Not In Source
+ Nicht in der Quelle enthaltene Statistiken löschen
Drop Role Members Not In Source
- Drop Role Members Not In Source
+ Nicht in der Quelle enthaltene Rollenmitglieder löschen
Drop Permissions Not In Source
- Drop Permissions Not In Source
+ Nicht in der Quelle enthaltene Berechtigungen löschen
Drop Objects Not In Source
- Drop Objects Not In Source
+ Nicht in der Quelle enthaltene Objekte löschen
Ignore Column Order
- Ignore Column Order
+ Spaltenreihenfolge ignorieren
Aggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggers
Defaults
@@ -436,7 +436,7 @@
File Tables
- File Tables
+ Dateitabellen
Full Text Catalogs
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Skalarwertfunktionen
Search Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeys
Synonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Tabellenwertfunktionen
User Defined Data Types
- User Defined Data Types
+ Benutzerdefinierte Datentypen
User Defined Table Types
- User Defined Table Types
+ Benutzerdefinierte Tabellentypen
Clr User Defined Types
- Clr User Defined Types
+ Benutzerdefinierte Clr-Typen
Users
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Servertrigger
Specifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Gibt an, dass eine Assembly bei der Veröffentlichung immer gelöscht und neu erstellt werden soll, wenn Unterschiede vorliegen, statt eine ALTER ASSEMBLY-Anweisung auszugeben.
Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Bei TRUE wird die Datenbank vor der Bereitstellung auf den Einzelbenutzermodus festgelegt.
Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Diese Einstellung legt fest, wie die Sortierung der Datenbank während der Bereitstellung verarbeitet wird. Standardmäßig wird die Sortierung der Zieldatenbank aktualisiert, wenn sie nicht mit der von der Quelle angegebenen Sortierung übereinstimmt. Wenn diese Option festgelegt ist, muss die Sortierung der Zieldatenbank (oder des Servers) verwendet werden.
Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
+ Gibt an, ob Rollenmitglieder, die nicht in der Datenbank-Momentaufnahmedatei (DACPAC) definiert sind, aus der Zieldatenbank gelöscht werden, wenn Sie Aktualisierungen an einer Datenbank veröffentlichen.
Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ Anwendungsdatei der Datenschicht (.DACPAC)
Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ Es wurde ein anderes Quellschema ausgewählt. Möchten Sie einen Vergleich durchführen?
A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ Es wurde ein anderes Zielschema ausgewählt. Möchten Sie einen Vergleich durchführen?
Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Es wurden verschiedene Quell- und Zielschemas ausgewählt. Möchten Sie einen Vergleich durchführen?
Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Details vergleichen
Are you sure you want to update the target?
- Are you sure you want to update the target?
+ Möchten Sie das Ziel aktualisieren?
Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Klicken Sie auf "Vergleichen", um den Vergleich zu aktualisieren.
Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Skript zum Bereitstellen von Änderungen am Ziel generieren
No changes to script
- No changes to script
+ Keine Änderungen am Skript
Apply changes to target
- Apply changes to target
+ Änderungen auf das Ziel anwenden
No changes to apply
- No changes to apply
+ Keine Änderungen zur Anwendung vorhanden.
Delete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔
Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Der Vergleich wird gestartet. Dies kann einen Moment dauern.
To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Um zwei Schemas zu vergleichen, wählen Sie zunächst ein Quellschema und ein Zielschema aus, und klicken Sie dann auf "Vergleichen".
No schema differences were found.
- No schema differences were found.
+ Es wurden keine Schemaunterschiede gefunden.
Schema Compare failed: {0}
- Schema Compare failed: {0}
+ Fehler beim Schemavergleich: {0}
Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ "Skript generieren" ist aktiviert, wenn das Ziel eine Datenbank ist.
Apply is enabled when the target is a database
- Apply is enabled when the target is a database
+ "Anwenden" ist aktiviert, wenn das Ziel eine Datenbank ist.
Compare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ Fehler beim Abbrechen des Schemavergleichs: {0}
Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ Fehler beim Generieren des Skripts: {0}
Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ Fehler beim Anwenden des Schemavergleichs: {0}
Switch direction
- Switch direction
+ Richtung wechseln
Switch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ SCMP-Datei öffnen
Load source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Quelle und Ziel sowie die in einer SCMP-Datei gespeicherten Optionen laden
Open
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ Fehler beim Öffnen von SCMP: {0}
Save .scmp file
- Save .scmp file
+ SCMP-Datei speichern
Save source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Quelle und Ziel, Optionen und ausgeschlossene Elemente speichern
Save
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ Fehler beim Speichern der SCMP-Datei: {0}
diff --git a/resources/xlf/es/admin-tool-ext-win.es.xlf b/resources/xlf/es/admin-tool-ext-win.es.xlf
index 37ac15829f..36b3214a8d 100644
--- a/resources/xlf/es/admin-tool-ext-win.es.xlf
+++ b/resources/xlf/es/admin-tool-ext-win.es.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Extensiones de herramienta de administración de bases de datos para WindowsAdds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Agrega funcionalidad adicional específica de Windows a Azure Data StudioProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Generar scripts...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ No se proporciona ConnectionContext para handleLaunchSsmsMinPropertiesDialogCommandCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ No se ha podido determinar el nodo del Explorador de objetos desde connectionContext: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ No se proporciona ConnectionContext para handleLaunchSsmsMinPropertiesDialogCommandNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ No se proporciona connectionProfile desde connectionContext: {0}Launching dialog...
- Launching dialog...
+ Iniciando el diálogo...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Error al llamar a SsmsMin con los argumentos "{0}" - {1}
diff --git a/resources/xlf/es/agent.es.xlf b/resources/xlf/es/agent.es.xlf
index fc4f998845..a397016693 100644
--- a/resources/xlf/es/agent.es.xlf
+++ b/resources/xlf/es/agent.es.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ Paquete de servicio de integración de SQL ServerSQL Server Agent Service Account
diff --git a/resources/xlf/es/azurecore.es.xlf b/resources/xlf/es/azurecore.es.xlf
index 79e65677d8..ab13c87e69 100644
--- a/resources/xlf/es/azurecore.es.xlf
+++ b/resources/xlf/es/azurecore.es.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: Actualizar todas las cuentasRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: Iniciar sesiónSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Agregar a servidoresClear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ No se han encontrado recursos
diff --git a/resources/xlf/es/cms.es.xlf b/resources/xlf/es/cms.es.xlf
index b775d6e7ef..7790b094ae 100644
--- a/resources/xlf/es/cms.es.xlf
+++ b/resources/xlf/es/cms.es.xlf
@@ -4,23 +4,23 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Servidores de administración central de SQL ServerSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Compatibilidad con la administración de servidores de administración central de SQL ServerCentral Management Servers
- Central Management Servers
+ Servidores de administración centralMicrosoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerCentral Management Servers
- Central Management Servers
+ Servidores de administración centralRefresh
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Actualizar grupo de servidoresDelete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ Nuevo registro de servidor...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ Nuevo grupo de servidores...Add Central Management Server
- Add Central Management Server
+ Agregar servidor de administración centralDelete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ Configuración de MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Opcional] Registre la salida de depuración a la consola (Ver -> Salida) y después seleccione el canal de salida apropiado del menú desplegable[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Opcional] Nivel de registro para servicios back-end. Azure Data Studio genera un nombre de archivo cada vez que se inicia y, si el archivo ya existe, las entradas de registros se anexan a ese archivo. Para la limpieza de archivos de registro antiguos, consulte la configuración de logRetentionMinutes y logFilesRemovalLimit. El valor predeterminado tracingLevel no registra mucho. El cambio de detalle podría dar lugar a amplios requisitos de registro y espacio en disco para los registros. Error incluye Crítico, Advertencia incluye Error, Información incluye Advertencia y Detallado incluye InformaciónNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Número de minutos para conservar los archivos de registro de los servicios back-end. El valor predeterminado es 1 semana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Número máximo de archivos antiguos para quitar al iniciarse y que tienen expirado el valor mssql.logRetentionMinutes. Los archivos que no se limpien debido a esta limitación se limpiarán la próxima vez que se inicie Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Opcional] No mostrar advertencias de plataforma no compatibleRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Plan de tarifaCompatibility Level
@@ -164,15 +164,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nombre (opcional)Custom name of the connection
- Custom name of the connection
+ Nombre personalizado de la conexiónServer
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nombre de la instancia de SQL ServerServer Description
- Server Description
+ Descripción del servidorDescription of the SQL Server instance
- Description of the SQL Server instance
+ Descripción de la instancia de SQL ServerAuthentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Especifica el método de autenticación con SQL ServerSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory: universal con compatibilidad con MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica el ID de usuario que se utilizará al conectarse al origen de datosPassword
@@ -224,155 +224,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica la contraseña que se utilizará al conectarse al origen de datosApplication intent
- Application intent
+ Intención de la aplicaciónDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Declara el tipo de carga de trabajo de la aplicación al conectarse a un servidorAsynchronous processing
- Asynchronous processing
+ Procesamiento asincrónicoWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Cuando es true, habilita el uso de la funcionalidad asincrónica en el proveedor de datos de .NET FrameworkConnect timeout
- Connect timeout
+ Tiempo de espera de conexiónThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ El tiempo (en segundos) para esperar una conexión con el servidor antes de finalizar el intento y generar un errorCurrent language
- Current language
+ Idioma actualThe SQL Server language record name
- The SQL Server language record name
+ El nombre del registro de idioma de SQL ServerColumn encryption
- Column encryption
+ Cifrado de columnasDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Configuración predeterminada de cifrado de columnas para todos los comandos de la conexiónEncrypt
- Encrypt
+ CifrarWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Cuando es true, SQL Server usa el cifrado SSL para todos los datos enviados entre el cliente y el servidor si el servidor tiene un certificado instaladoPersist security info
- Persist security info
+ Información de seguridad persistenteWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Cuando es false, la información confidencial, como la contraseña, no se devuelve como parte de la conexiónTrust server certificate
- Trust server certificate
+ Certificado de servidor de confianzaWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Cuando es true (y encrypt=true), SQL Server usa el cifrado SSL para todos los datos enviados entre el cliente y el servidor sin validar el certificado de servidorAttached DB file name
- Attached DB file name
+ Nombre de archivo de base de datos adjuntoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ El nombre del archivo principal, incluido el nombre completo de la ruta de acceso, de una base de datos adjuntableContext connection
- Context connection
+ Conexión contextualWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Cuando es true, indica que la conexión debe producirse desde el contexto de SQL Server. Disponible solo cuando se ejecuta en el proceso de SQL ServerPort
- Port
+ PuertoConnect retry count
- Connect retry count
+ Recuento de reintentos de conexiónNumber of attempts to restore connection
- Number of attempts to restore connection
+ Número de intentos para restaurar la conexiónConnect retry interval
- Connect retry interval
+ Intervalo del reintento de conexiónDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Retardo entre intentos de restaurar la conexiónApplication name
- Application name
+ Nombre de la aplicaciónThe name of the application
- The name of the application
+ El nombre de la aplicaciónWorkstation Id
- Workstation Id
+ Id. de estación de trabajoThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ El nombre de la estación de trabajo que se conecta a SQL ServerPooling
- Pooling
+ AgrupaciónWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Cuando es true, el objeto de conexión se extrae del grupo adecuado, o si es necesario, se crea y se agrega al grupo adecuadoMax pool size
- Max pool size
+ Tamaño máximo del grupoThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ El número máximo de conexiones permitidas en el grupoMin pool size
- Min pool size
+ Tamaño mínimo del grupoThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ El número mínimo de conexiones permitidas en el grupoLoad balance timeout
- Load balance timeout
+ Tiempo de espera del equilibrio de cargaThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ La cantidad mínima de tiempo (en segundos) para que esta conexión permanezca en el grupo antes de que se destruyaReplication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Utilizado por SQL Server en replicaciónAttach DB filename
- Attach DB filename
+ Adjuntar nombre de archivo de base de datosFailover partner
- Failover partner
+ Socio de conmutación por errorThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ El nombre o la dirección de red de la instancia de SQL Server que actúa como asociado para la conmutación por errorMulti subnet failover
- Multi subnet failover
+ Conmutación por error de varias subredesMultiple active result sets
- Multiple active result sets
+ Varios conjuntos de resultados activosWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Cuando es true, se pueden devolver y leer varios conjuntos de resultados desde una conexiónPacket size
- Packet size
+ Tamaño del paqueteSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Tamaño en bytes de los paquetes de red utilizados para comunicarse con una instancia de SQL ServerType system version
- Type system version
+ Tipo de versión del sistemaIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica qué sistema de tipo de servidor el proveedor expondrá entonces a través de DataReader
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ No se pudo encontrar el servidor de administración central {0} o está sin conexiónNo resources found
- No resources found
+ No se han encontrado recursos
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Agregar servidor de administración central...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ El grupo de servidores de administración central ya tiene un servidor registrado con el nombre {0}Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ No se ha podido agregar el servidor registrado {0}Are you sure you want to delete
- Are you sure you want to delete
+ ¿Está seguro de que desea eliminar?Yes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Descripción del grupo de servidores{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} ya tiene un grupo de servidores con el nombre {1}Are you sure you want to delete
- Are you sure you want to delete
+ ¿Está seguro de que desea eliminar?
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ No puede agregar un servidor registrado compartido con el mismo nombre que el servidor de configuración
diff --git a/resources/xlf/es/dacpac.es.xlf b/resources/xlf/es/dacpac.es.xlf
index aaa28f839c..6acc2acea9 100644
--- a/resources/xlf/es/dacpac.es.xlf
+++ b/resources/xlf/es/dacpac.es.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ Puede ver el estado de la generación de scripts en la vista Tareas una vez que se cierra el asistente. El script generado se abrirá cuando se complete.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/es/import.es.xlf b/resources/xlf/es/import.es.xlf
index 470c539849..95204abb09 100644
--- a/resources/xlf/es/import.es.xlf
+++ b/resources/xlf/es/import.es.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Esta operación no se completó correctamente. Pruebe con un archivo de entrada diferente.Refresh
diff --git a/resources/xlf/es/mssql.es.xlf b/resources/xlf/es/mssql.es.xlf
index a7b64133c4..28b86360da 100644
--- a/resources/xlf/es/mssql.es.xlf
+++ b/resources/xlf/es/mssql.es.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Cargar archivosNew directory
- New directory
+ Nuevo directorioDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Nuevo NotebookOpen Notebook
- Open Notebook
+ Abrir NotebookTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Tareas e información sobre el clúster de macrodatos de SQL ServerSQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Enviar trabajo de SparkNew Spark Job
- New Spark Job
+ Nuevo trabajo de SparkView Spark History
- View Spark History
+ Ver el historial de SparkView Yarn History
- View Yarn History
+ Ver historial de YarnTasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ Instalar paquetesConfigure Python for Notebooks
- Configure Python for Notebooks
+ Configurar Python para NotebooksCluster Status
- Cluster Status
+ Estado del clústerSearch: Servers
- Search: Servers
+ Buscar: ServidoresSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ Buscar: Borrar los resultados del servidor de búsquedaService Endpoints
- Service Endpoints
+ Puntos de conexión de servicioMSSQL configuration
- MSSQL configuration
+ Configuración de MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Opcional] Registre la salida de depuración en la consola (Ver -> Salida) y después seleccione el canal de salida apropiado del menú desplegable[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Opcional] Nivel de registro para servicios back-end. Azure Data Studio genera un nombre de archivo cada vez que se inicia y, si el archivo ya existe, las entradas de registros se anexan a ese archivo. Para la limpieza de archivos de registro antiguos, consulte la configuración de logRetentionMinutes y logFilesRemovalLimit. El valor predeterminado tracingLevel no registra mucho. El cambio de detalle podría dar lugar a amplios requisitos de registro y espacio en disco para los registros. Error incluye Crítico, Advertencia incluye Error, Información incluye Advertencia y Detallado incluye Información.Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Número de minutos para conservar los archivos de registro para los servicios back-end. El valor predeterminado es 1 semana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Número máximo de archivos antiguos que se van a quitar al iniciarse y que tienen expirado el valor mssql.logRetentionMinutes. Los archivos que no se limpien por esta limitación se limpiarán la próxima vez que se inicie Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Opcional] No mostrar advertencias de plataformas no compatiblesRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Plan de tarifaCompatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nombre (opcional)Custom name of the connection
- Custom name of the connection
+ Nombre personalizado de la conexiónServer
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nombre de la instancia de SQL ServerDatabase
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ El nombre del catálogo inicial o de la base de datos en el origen de datosAuthentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Especifica el método de autenticación con SQL ServerSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory: universal con compatibilidad con MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica el ID de usuario que se utilizará al conectarse al origen de datosPassword
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica la contraseña que se utilizará al conectarse al origen de datosApplication intent
- Application intent
+ Intención de la aplicaciónDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Declara el tipo de carga de trabajo de la aplicación al conectarse a un servidorAsynchronous processing
- Asynchronous processing
+ Procesamiento asincrónicoWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Cuando es true, habilita el uso de la funcionalidad asincrónica en el proveedor de datos de .NET FrameworkConnect timeout
- Connect timeout
+ Tiempo de espera de conexiónThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ El tiempo (en segundos) para esperar una conexión con el servidor antes de finalizar el intento y generar un errorCurrent language
- Current language
+ Idioma actualThe SQL Server language record name
- The SQL Server language record name
+ El nombre del registro de idioma de SQL ServerColumn encryption
- Column encryption
+ Cifrado de columnasDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Configuración predeterminada de cifrado de columnas para todos los comandos de la conexiónEncrypt
- Encrypt
+ CifrarWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Cuando es true, SQL Server usa el cifrado SSL para todos los datos enviados entre el cliente y el servidor si el servidor tiene un certificado instaladoPersist security info
- Persist security info
+ Información de seguridad persistenteWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Cuando es false, la información de tipo confidencial, como la contraseña, no se devuelve como parte de la conexiónTrust server certificate
- Trust server certificate
+ Certificado de servidor de confianzaWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Cuando es true (y encrypt=true), SQL Server usa el cifrado SSL para todos los datos enviados entre el cliente y el servidor sin validar el certificado de servidorAttached DB file name
- Attached DB file name
+ Nombre del archivo de base de datos adjuntoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ El nombre del archivo principal, incluido el nombre completo de la ruta de acceso, de una base de datos adjuntableContext connection
- Context connection
+ Conexión contextualWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Cuando es true, indica que la conexión debe ser del contexto de SQL Server. Disponible solo cuando se ejecuta en el proceso de SQL ServerPort
- Port
+ PuertoConnect retry count
- Connect retry count
+ Recuento de reintentos de conexiónNumber of attempts to restore connection
- Number of attempts to restore connection
+ Número de intentos para restaurar la conexiónConnect retry interval
- Connect retry interval
+ Intervalo de reintento de conexiónDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Retraso entre intentos para restaurar la conexiónApplication name
- Application name
+ Nombre de la aplicaciónThe name of the application
- The name of the application
+ El nombre de la aplicaciónWorkstation Id
- Workstation Id
+ Id. de estación de trabajoThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ El nombre de la estación de trabajo que se conecta a SQL ServerPooling
- Pooling
+ AgrupaciónWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Cuando es true, el objeto de conexión se extrae del grupo adecuado, o si es necesario, se crea y se agrega al grupo adecuadoMax pool size
- Max pool size
+ Tamaño máximo del grupoThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ El número máximo de conexiones permitidas en el grupoMin pool size
- Min pool size
+ Tamaño mínimo del grupoThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ El número mínimo de conexiones permitidas en el grupoLoad balance timeout
- Load balance timeout
+ Tiempo de espera del equilibrio de cargaThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ La cantidad mínima de tiempo (en segundos) para que esta conexión permanezca en el grupo antes de que se destruyaReplication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Utilizado por SQL Server en replicaciónAttach DB filename
- Attach DB filename
+ Nombre de archivo de la base de datos que se va a adjuntarFailover partner
- Failover partner
+ Asociado de conmutación por errorThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ El nombre o la dirección de red de la instancia de SQL Server que actúa como asociado para la conmutación por errorMulti subnet failover
- Multi subnet failover
+ Conmutación por error de varias subredesMultiple active result sets
- Multiple active result sets
+ Múltiples conjuntos de resultados activosWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Cuando es true, se pueden devolver y leer varios conjuntos de resultados desde una conexiónPacket size
- Packet size
+ Tamaño del paqueteSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Tamaño en bytes de los paquetes de red utilizados para comunicarse con una instancia de SQL ServerType system version
- Type system version
+ Versión del sistema de tiposIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica qué sistema de tipos del servidor expondrá el proveedor a través de DataReader
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ No se devuelve ningún identificador de lote de trabajo de Spark de la respuesta.{0}[Error] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ No se devuelve ningún registro dentro de la respuesta.{0}[Error] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ Los parámetros de SparkJobSubmissionModel no son válidossubmissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs no es válido. livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId no es válido. Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Se agotó el tiempo de espera al obtener el identificador de aplicación. {0}[Registro] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ No se especifica la propiedad localFilePath o hdfsFolderPath. Property Path is not specified.
- Property Path is not specified.
+ No se especifica la propiedad Ruta de acceso.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ Los parámetros de SparkJobSubmissionDialog no son válidosNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Enviar{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Envío de trabajo de Spark:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Inicio del envío del trabajo de Spark ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ Introduzca un nombre...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Clúster de SparkPath to a .jar or .py file
- Path to a .jar or .py file
+ Ruta de acceso a un archivo .jar o .pyThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ El archivo local seleccionado se cargará en HDFS: {0}JAR/py File
- JAR/py File
+ Archivo JAR/pyMain Class
- Main Class
+ Clase principalArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ Argumentos de línea de comandos utilizados en la clase principal; si hay varios argumentos, deben separarse con un espacio.Property Job Name is not specified.
- Property Job Name is not specified.
+ No se especifica la propiedad Nombre del trabajo.Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ No se especifica la propiedad Archivo JAR/py.Property Main Class is not specified.
- Property Main Class is not specified.
+ No se especifica la propiedad Clase principal.{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} no existe en el clúster o se produjo una excepción.The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ El archivo HDFS especificado no existe. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ Error al localizar el archivo debido a un error: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ Archivos JAR de referenciaJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ Archivos JAR que se colocarán en el directorio de trabajo del ejecutor. La ruta de acceso del archivo JAR debe ser una ruta de acceso de HDFS. Varias rutas deben dividirse por punto y coma (;)Reference py Files
- Reference py Files
+ Archivos py de referenciaPy Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Archivos Py que se colocarán en el directorio de trabajo del ejecutor. La ruta de acceso del archivo debe ser una ruta de acceso HDFS. Varias rutas deben dividirse por punto y coma (;)Reference Files
- Reference Files
+ Archivos de referenciaFiles to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Archivos que se colocarán en el directorio de trabajo del ejecutor. La ruta de acceso del archivo debe ser una ruta de acceso HDFS. Varias rutas deben dividirse por punto y coma (;)Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Seleccione SQL Server con un clúster de macrodatos.No Sql Server is selected.
- No Sql Server is selected.
+ No hay ningún servidor SQL Server seleccionado.Error Get File Path: {0}
- Error Get File Path: {0}
+ Error al obtener ruta de acceso del archivo: {0}Invalid Data Structure
- Invalid Data Structure
+ Estructura de datos no válidaUnable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ No se ha podido crear el cliente de WebHDFS debido a que faltan opciones: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ "${0}" es indefinido.Bad Request
- Bad Request
+ Solicitud incorrectaUnauthorized
- Unauthorized
+ No autorizadoForbidden
- Forbidden
+ ProhibidoNot Found
- Not Found
+ No encontradoInternal Server Error
- Internal Server Error
+ Error interno del servidorUnknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ Redirección inesperadaPlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Proporcione la contraseña para conectarse a HDFS:Session for node {0} does not exist
- Session for node {0} does not exist
+ La sesión para el nodo {0} no existeError notifying of node change: {0}
- Error notifying of node change: {0}
+ Error al notificar el cambio de nodo: {0}Root
- Root
+ RaízHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Servicios de datosNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ AVISO: Este archivo se ha truncado en {0} para la vista previa.The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ El archivo se ha truncado en {0} para la vista previa.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo no está definido.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options no está definido.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ Faltan algunas propiedades en connectionInfo.options: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ No se admite la acción {0} para este controladorCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ No se puede abrir el vínculo {0} porque solo se admiten los vínculos HTTP y HTTPSDownload and open '{0}'?
- Download and open '{0}'?
+ ¿Descargar y abrir "{0}"?Could not find the specified file
- Could not find the specified file
+ No se pudo encontrar el archivo especificadoFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ Error en la solicitud de apertura de archivo: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Error al detener el servidor de Notebook: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ El proceso de Notebook se cerró prematuramente con el error: {0}, Salida de StdErr: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Error enviado desde Jupyter: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter se está ejecutando en {0}... Starting Notebook server
- ... Starting Notebook server
+ ... Iniciando el servidor de NotebookUnexpected setting type {0}
- Unexpected setting type {0}
+ Tipo de configuración inesperado {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ No se puede iniciar una sesión: el administrador aún no está inicializadoSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Los kernels de Spark requieren una conexión a una instancia maestra del clúster de macrodatos de SQL Server.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ Error en el apagado del servidor de Notebook: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ La instalación de dependencias de notebook está en cursoPython download is complete
- Python download is complete
+ La descarga de Python está completaError while downloading python setup
- Error while downloading python setup
+ Error al descargar el programa de instalación de PythonDownloading python package
- Downloading python package
+ Descargando paquete de PythonUnpacking python package
- Unpacking python package
+ Desempaquetando el paquete de PythonError while creating python installation directory
- Error while creating python installation directory
+ Error al crear el directorio de instalación de PythonError while unpacking python bundle
- Error while unpacking python bundle
+ Error al desempaquetar el paquete de PythonInstalling Notebook dependencies
- Installing Notebook dependencies
+ Instalación de dependencias de NotebookInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Instalación de dependencias de Notebook, consulte la vista Tareas para obtener más informaciónNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ La instalación de las dependencias de Notebook se ha completadoCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ No se puede sobrescribir la instalación de Python existente mientras Python se está ejecutando.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ Otra instalación de Python está actualmente en curso.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python ya existe en la ubicación específica. Omitiendo la instalación.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ Error al instalar las dependencias de Notebook: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Descarga de Python local para la plataforma: {0} a {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Instalación de paquetes necesarios para ejecutar Notebooks...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Instalación de Jupyter completa.Installing SparkMagic...
- Installing SparkMagic...
+ Instalación de SparkMagic...A notebook path is required
- A notebook path is required
+ Se requiere una ruta de acceso del bloc de notasNotebooks
- Notebooks
+ NotebooksOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Solo se admiten los Notebooks de tipo .ipynbAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ ¿Está seguro de que desea volver a instalar?Configure Python for Notebooks
- Configure Python for Notebooks
+ Configurar Python para NotebooksInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Ubicación de instalación de PythonSelect
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Esta instalación llevará algún tiempo. Se recomienda no cerrar la aplicación hasta que se complete la instalación.The specified install location is invalid.
- The specified install location is invalid.
+ La ubicación de instalación especificada no es válida.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ No se encontró ninguna instalación de Python en la ubicación especificada.Python installation was declined.
- Python installation was declined.
+ La instalación de Python se rechazó.Installation Type
- Installation Type
+ Tipo de instalaciónNew Python installation
- New Python installation
+ Nueva instalación de PythonUse existing Python installation
- Use existing Python installation
+ Usar la instalación de Python existenteOpen file {0} failed: {1}
- Open file {0} failed: {1}
+ Error en la apertura del archivo {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Error en la apertura del archivo {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Error en la apertura del archivo {0}: {1}Missing file : {0}
- Missing file : {0}
+ Falta el archivo: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Este código de ejemplo carga el archivo en un marco de datos y muestra los primeros 10 resultados.No notebook editor is active
- No notebook editor is active
+ Ningún editor de blocs de notas está activoCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ ¿Qué tipo de celda desea agregar?Notebooks
- Notebooks
+ NotebooksSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Extensión de implementación de SQL Server para Azure Data StudioProvides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Proporciona una experiencia basada en cuadernos para implementar Microsoft SQL ServerDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Implementar SQL Server en Docker...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ Implementar clúster de macrodatos de SQL Server...Deploy SQL Server…
- Deploy SQL Server…
+ Implementar SQL Server...Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ Imagen de contenedor de SQL ServerRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Ejecutar la imagen de contenedor de SQL Server con DockerSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ El clúster de macrodatos de SQL Server le permite implementar clústeres escalables de contenedores de SQL Server, Spark y HDFS que se ejecutan en KubernetesVersion
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Clúster de macrodatos de SQL Server 2019 CTP 3.1Deployment target
- Deployment target
+ Destino de implementaciónNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Una utilidad de línea de comandos escrita en Python que permite a los administradores de clústeres arrancar y administrar el clúster de macrodatos a través de las API RESTmssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Una herramienta de línea de comandos que le permite ejecutar comandos en clústeres de Kuberneteskubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Proporciona la capacidad de empaquetar y ejecutar una aplicación en contenedores aisladosDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Una herramienta de línea de comandos para administrar recursos de AzureAzure CLI
- Azure CLI
+ CLI de Azure
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ No se pudo encontrar package.json o el nombre/editor no está establecido
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ El bloc de notas {0} no existe
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Seleccione las opciones de implementaciónOpen Notebook
- Open Notebook
+ Abrir NotebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ No se ha podido cargar la extensión {0}. Error detectado en la definición de tipo de recurso en package.json, compruebe la consola de depuración para obtener más información.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ El tipo de recurso {0} no está definido
diff --git a/resources/xlf/es/schema-compare.es.xlf b/resources/xlf/es/schema-compare.es.xlf
index 8d4bab3ba0..66809f7c83 100644
--- a/resources/xlf/es/schema-compare.es.xlf
+++ b/resources/xlf/es/schema-compare.es.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ Comparación de esquemas de SQL ServerSQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ La comparación de esquemas de SQL Server para Azure Data Studio admite la comparación de los esquemas de bases de datos y paquetes DAC.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ Las opciones han cambiado. ¿Volver a comparar para ver la comparación?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ Opciones generalesInclude Object Types
- Include Object Types
+ Incluir tipos de objetoIgnore Table Options
- Ignore Table Options
+ Ignorar opciones de tablaIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Ignorar punto y coma entre instruccionesIgnore Route Lifetime
- Ignore Route Lifetime
+ Ignorar la vigencia de la rutaIgnore Role Membership
- Ignore Role Membership
+ Ignorar la pertenencia a rolesIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ Ignorar identificadores entrecomilladosIgnore Permissions
- Ignore Permissions
+ Ignorar permisosIgnore Partition Schemes
- Ignore Partition Schemes
+ Ignorar esquemas de particiónIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Ignorar la colocación de objetos en el esquema de particiónIgnore Not For Replication
- Ignore Not For Replication
+ Ignorar NOT FOR REPLICATIONIgnore Login Sids
- Ignore Login Sids
+ Ignorar SID de inicio de sesiónIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Ignorar sugerencias de bloqueo en índicesIgnore Keyword Casing
- Ignore Keyword Casing
+ Ignorar mayúsculas y minúsculas en palabras claveIgnore Index Padding
- Ignore Index Padding
+ Ignorar relleno de índiceIgnore Index Options
- Ignore Index Options
+ Ignorar opciones de índiceIgnore Increment
- Ignore Increment
+ Ignorar incrementoIgnore Identity Seed
- Ignore Identity Seed
+ Ignorar inicialización de identidadIgnore User Settings Objects
- Ignore User Settings Objects
+ Ignorar objetos de configuración de usuarioIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ Ignorar FilePath de catálogo de texto completoIgnore Whitespace
- Ignore Whitespace
+ Ignorar espacio en blancoIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Ignorar WITH NOCHECK en claves externasVerify Collation Compatibility
- Verify Collation Compatibility
+ Verificar la compatibilidad de la intercalaciónUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Advertencias de objetos no modificablesTreat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Tratar los errores de verificación como advertenciasScript Refresh Module
- Script Refresh Module
+ Módulo de actualización de scriptScript New Constraint Validation
- Script New Constraint Validation
+ Incluir en el script la validación de las restricciones nuevasScript File Size
- Script File Size
+ Tamaño del archivo de scriptScript Deploy StateChecks
- Script Deploy StateChecks
+ Comprobaciones de estado de la implementación del scriptScript Database Options
- Script Database Options
+ Opciones de base de datos de scriptScript Database Compatibility
- Script Database Compatibility
+ Compatibilidad de bases de datos de scriptScript Database Collation
- Script Database Collation
+ Intercalación de base de datos de scriptRun Deployment Plan Executors
- Run Deployment Plan Executors
+ Ejecutar ejecutores del plan de implementaciónRegister DataTier Application
- Register DataTier Application
+ Registrar la aplicación de capa de datosPopulate Files On File Groups
- Populate Files On File Groups
+ Rellenar archivos en grupos de archivosNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ No hay instrucciones de modificación para cambiar los tipos CLRInclude Transactional Scripts
- Include Transactional Scripts
+ Incluir scripts transaccionalesInclude Composite Objects
- Include Composite Objects
+ Incluir objetos compuestosAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Permitir el movimiento de datos de seguridad de nivel de fila no seguroIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Omitir cláusula WITH NOCHECK en restricciones CHECKIgnore Fill Factor
- Ignore Fill Factor
+ Ignorar factor de rellenoIgnore File Size
- Ignore File Size
+ Ignorar tamaño de archivoIgnore Filegroup Placement
- Ignore Filegroup Placement
+ Ignorar la colocación del grupo de archivosDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ No modificar objetos replicadosDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ No modificar los objetos de Change Data CaptureDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ Deshabilitar y volver a habilitar los desencadenadores DDLDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Implementar base de datos en modo de usuario únicoCreate New Database
- Create New Database
+ Crear nueva base de datosCompare Using Target Collation
- Compare Using Target Collation
+ Comparar con la intercalación de destinoComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Convertir en comentario las declaraciones SETVARBlock When Drift Detected
- Block When Drift Detected
+ Bloquear cuando se detecte una desviaciónBlock On Possible Data Loss
- Block On Possible Data Loss
+ Bloquear en caso de posible pérdida de datosBackup Database Before Changes
- Backup Database Before Changes
+ Copia de seguridad de la base de datos antes de los cambiosAllow Incompatible Platform
- Allow Incompatible Platform
+ Permitir plataforma no compatibleAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Permitir la eliminación de ensamblados de bloqueoDrop Constraints Not In Source
- Drop Constraints Not In Source
+ Quitar limitaciones que no estén en el origenDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Quitar desencadenadores DML que no estén en el origenDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Quitar las propiedades extendidas que no estén en el origenDrop Indexes Not In Source
- Drop Indexes Not In Source
+ Quitar los índices que no estén en el origenIgnore File And Log File Path
- Ignore File And Log File Path
+ Ignorar archivo y ruta de acceso del archivo de registroIgnore Extended Properties
- Ignore Extended Properties
+ Ignorar propiedades extendidasIgnore Dml Trigger State
- Ignore Dml Trigger State
+ Ignorar el estado del desencadenador DMLIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ Ignorar el orden del desencadenador de DMLIgnore Default Schema
- Ignore Default Schema
+ Ignorar esquema predeterminadoIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ Ignorar el estado del desencadenador de DDLIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ignorar el orden del desencadenador de DDLIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ Ignorar ruta de archivos del proveedor de cifradoVerify Deployment
- Verify Deployment
+ Verificar la implementaciónIgnore Comments
- Ignore Comments
+ Ignorar comentariosIgnore Column Collation
- Ignore Column Collation
+ Ignorar intercalación de columnasIgnore Authorizer
- Ignore Authorizer
+ Ignorar autorizadorIgnore AnsiNulls
- Ignore AnsiNulls
+ Ignorar AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ Generar SmartDefaultsDrop Statistics Not In Source
- Drop Statistics Not In Source
+ Quitar las estadísticas que no estén en origenDrop Role Members Not In Source
- Drop Role Members Not In Source
+ Quitar miembros de rol que no estén en origenDrop Permissions Not In Source
- Drop Permissions Not In Source
+ Quitar permisos que no estén en origenDrop Objects Not In Source
- Drop Objects Not In Source
+ Quitar objetos que no estén en el origenIgnore Column Order
- Ignore Column Order
+ Ignorar el orden de las columnasAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ Desencadenadores de base de datosDefaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ Tablas de archivosFull Text Catalogs
- Full Text Catalogs
+ Catálogos de texto completoFull Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Funciones con valores escalaresSearch Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Funciones con valores de tablaUser Defined Data Types
- User Defined Data Types
+ Tipos de datos definidos por el usuarioUser Defined Table Types
- User Defined Table Types
+ Tipos de tabla definidos por el usuarioClr User Defined Types
- Clr User Defined Types
+ Tipos definidos por el usuario de CLRUsers
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Desencadenadores de servidorSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Especifica que la publicación siempre debe quitar y volver a crear un ensamblado si hay una diferencia en lugar de emitir una instrucción ALTER ASSEMBLYSpecifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Si es true, la base de datos se establece en modo de usuario único antes de implementar.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Esta configuración determina cómo se controla la intercalación de la base de datos durante la implementación; de forma predeterminada, la intercalación de la base de datos de destino se actualizará si no coincide con la intercalación especificada por el origen. Cuando se establece esta opción, se debe usar la intercalación de la base de datos de destino (o del servidor).Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ Especifica si los miembros de rol que no están definidos en el archivo de instantánea de base de datos (.dacpac) se quitarán de la base de datos de destino al publicar actualizaciones en una base de datos.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ Archivo de aplicación de capa de datos (.dacpac)Database
@@ -972,7 +972,7 @@
No active connections
- No active connections
+ Sin conexiones activasSchema Compare
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ Se ha seleccionado un esquema de origen diferente. ¿Comparar para ver la comparación?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ Se ha seleccionado un esquema de destino diferente. ¿Comparar para ver la comparación?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Se han seleccionado diferentes esquemas de origen y destino. ¿Comparar para ver la comparación?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Comparar detallesAre you sure you want to update the target?
- Are you sure you want to update the target?
+ ¿Está seguro de que desea actualizar el destino?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Presione Comparar para actualizar la comparación.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Generar script para implementar cambios en el destinoNo changes to script
- No changes to script
+ No hay cambios en el scriptApply changes to target
- Apply changes to target
+ Aplicar cambios al destinoNo changes to apply
- No changes to apply
+ No hay cambios que aplicarDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Inicializando la comparación. Esto podría tardar un momento.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Para comparar dos esquemas, seleccione primero un esquema de origen y un esquema de destino y, a continuación, presione Comparar.No schema differences were found.
- No schema differences were found.
+ No se encontraron diferencias de esquema.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ Error en la comparación de esquemas: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ La generación de script se habilita cuando el destino es una base de datosApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ Aplicar está habilitado cuando el destino es una base de datosCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ Error al cancelar la comparación de esquemas: "{0}"Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ Error al generar el script: "{0}"Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ Error al aplicar la comparación de esquemas: "{0}"Switch direction
- Switch direction
+ Cambiar direcciónSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ Abrir archivo .scmpLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Cargar el origen, el destino y las opciones guardadas en un archivo .scmpOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ Error al abrir scmp: "{0}"Save .scmp file
- Save .scmp file
+ Guardar archivo .scmpSave source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Guardar origen y destino, opciones y elementos excluidosSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ Error al guardar scmp: "{0}"
diff --git a/resources/xlf/fr/admin-tool-ext-win.fr.xlf b/resources/xlf/fr/admin-tool-ext-win.fr.xlf
index 8fed7be7f9..dea506010c 100644
--- a/resources/xlf/fr/admin-tool-ext-win.fr.xlf
+++ b/resources/xlf/fr/admin-tool-ext-win.fr.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Extensions des outils d'administration de base de données pour WindowsAdds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Ajoute d'autres fonctionnalités spécifiques de Windows à Azure Data StudioProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Générer des scripts...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Aucun ConnectionContext fourni pour handleLaunchSsmsMinPropertiesDialogCommandCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ Impossible de déterminer le nœud de l'Explorateur d'objets à partir de connectionContext : {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Aucun ConnectionContext fourni pour handleLaunchSsmsMinPropertiesDialogCommandNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ Aucun connectionProfile fourni par connectionContext : {0}Launching dialog...
- Launching dialog...
+ Lancement de la boîte de dialogue...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Erreur d'appel de SsmsMin avec les arguments '{0}' - {1}
diff --git a/resources/xlf/fr/agent.fr.xlf b/resources/xlf/fr/agent.fr.xlf
index cba581885b..31f9cb7c22 100644
--- a/resources/xlf/fr/agent.fr.xlf
+++ b/resources/xlf/fr/agent.fr.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ Package SQL Server Integration ServicesSQL Server Agent Service Account
diff --git a/resources/xlf/fr/azurecore.fr.xlf b/resources/xlf/fr/azurecore.fr.xlf
index cc134abc61..6d7b8e38d6 100644
--- a/resources/xlf/fr/azurecore.fr.xlf
+++ b/resources/xlf/fr/azurecore.fr.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure : Actualiser tous les comptesRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure : Se connecterSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Ajouter aux serveursClear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ Aucune ressource trouvée
diff --git a/resources/xlf/fr/cms.fr.xlf b/resources/xlf/fr/cms.fr.xlf
index bd9879ea4a..0229c70fd6 100644
--- a/resources/xlf/fr/cms.fr.xlf
+++ b/resources/xlf/fr/cms.fr.xlf
@@ -4,11 +4,11 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Serveurs de gestion centralisée SQL ServerSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Prise en charge de la gestion des serveurs de gestion centralisée SQL ServerCentral Management Servers
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Actualiser le groupe de serveursDelete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ Nouvelle inscription de serveur...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ Nouveau groupe de serveurs...Add Central Management Server
- Add Central Management Server
+ Ajouter un serveur de gestion centraliséeDelete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ Configuration de MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Facultatif] Journaliser la sortie de débogage dans la console (Vue -> Sortie) et sélectionner le canal de sortie approprié dans la liste déroulante[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Facultatif] Niveau de journalisation des services de back-end. Azure Data Studio génère un nom de fichier à chaque démarrage et, si le fichier existe déjà, ajoute les entrées de journal à ce fichier. Pour nettoyer les anciens fichiers journaux, consultez les paramètres logRetentionMinutes et logFilesRemovalLimit. Le niveau de suivi par défaut correspond à une faible journalisation. Si vous changez le niveau de détail, vous pouvez obtenir une journalisation massive nécessitant de l'espace disque pour les journaux. Le niveau Erreur inclut le niveau Critique, le niveau Avertissement inclut le niveau Erreur, le niveau Informations inclut le niveau Avertissement et le niveau Détail inclut le niveau InformationsNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Nombre de minutes de conservation des fichiers journaux pour les services de back-end. La valeur par défaut est 1 semaine.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Nombre maximal d'anciens fichiers ayant dépassé mssql.logRetentionMinutes à supprimer au démarrage. Les fichiers qui ne sont pas nettoyés du fait de cette limitation le sont au prochain démarrage d'Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Facultatif] Ne pas afficher les avertissements de plateforme non prise en chargeRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Niveau tarifaireCompatibility Level
@@ -168,11 +168,11 @@
Name (optional)
- Name (optional)
+ Nom (facultatif)Custom name of the connection
- Custom name of the connection
+ Nom personnalisé de la connexionServer
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nom de l'instance SQL ServerServer Description
- Server Description
+ Description du serveurDescription of the SQL Server instance
- Description of the SQL Server instance
+ Description de l'instance SQL ServerAuthentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Spécifie la méthode d'authentification avec SQL ServerSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - Authentification universelle avec prise en charge de MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indique l'identifiant utilisateur à utiliser pour la connexion à la source de donnéesPassword
@@ -224,107 +224,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indique le mot de passe à utiliser pour la connexion à la source de donnéesApplication intent
- Application intent
+ Intention d'applicationDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Déclare le type de charge de travail d'application pendant la connexion à un serveurAsynchronous processing
- Asynchronous processing
+ Traitement asynchroneWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Quand la valeur est true, permet d'utiliser la fonctionnalité asynchrone dans le fournisseur de données .Net FrameworkConnect timeout
- Connect timeout
+ Délai d'expiration de la connexionThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Durée d'attente (en secondes) d'une connexion au serveur avant de terminer la tentative et de générer une erreurCurrent language
- Current language
+ Langue actuelleThe SQL Server language record name
- The SQL Server language record name
+ Nom d'enregistrement de la langue de SQL ServerColumn encryption
- Column encryption
+ Chiffrement de colonneDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Paramètre par défaut de chiffrement de colonne pour toutes les commandes sur la connexionEncrypt
- Encrypt
+ ChiffrerWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Quand la valeur est true, SQL Server utilise le chiffrement SSL pour toutes les données envoyées entre le client et le serveur si le serveur a un certificat installéPersist security info
- Persist security info
+ Conserver les informations de sécuritéWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Quand la valeur est false, les informations de sécurité, comme le mot de passe, ne sont pas retournées dans le cadre de la connexionTrust server certificate
- Trust server certificate
+ Approuver le certificat de serveurWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Quand la valeur est true (et encrypt=true), SQL Server utilise le chiffrement SSL pour toutes les données envoyées entre le client et le serveur sans valider le certificat de serveurAttached DB file name
- Attached DB file name
+ Nom de fichier de base de données attachéThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Nom de fichier principal, y compris le nom de chemin complet, d'une base de données pouvant être attachéeContext connection
- Context connection
+ Connexion contextuelleWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Quand la valeur est true, indique que la connexion doit provenir du contexte du serveur SQL. Disponible seulement en cas d'exécution dans le processus SQL ServerPort
- Port
+ PortConnect retry count
- Connect retry count
+ Nombre de tentatives de connexionNumber of attempts to restore connection
- Number of attempts to restore connection
+ Nombre de tentatives de restauration de connexionConnect retry interval
- Connect retry interval
+ Intervalle entre les tentatives de connexionDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Délai entre les tentatives de restauration de connexionApplication name
@@ -332,47 +332,47 @@
The name of the application
- The name of the application
+ Nom de l'applicationWorkstation Id
- Workstation Id
+ ID de station de travailThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Nom de la station de travail se connectant à SQL ServerPooling
- Pooling
+ RegroupementWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Quand la valeur est true, l'objet de connexion est tiré du pool approprié ou, si nécessaire, créé et ajouté dans le pool appropriéMax pool size
- Max pool size
+ Taille maximale du poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Nombre maximal de connexions autorisées dans le poolMin pool size
- Min pool size
+ Taille minimale du poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Nombre minimal de connexions autorisées dans le poolLoad balance timeout
- Load balance timeout
+ Délai d'expiration de l'équilibrage de chargeThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Durée de vie minimale (en secondes) de cette connexion dans le pool avant d'être détruiteReplication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Utilisé par SQL Server dans la réplicationAttach DB filename
- Attach DB filename
+ Nom de fichier de base de données à attacherFailover partner
- Failover partner
+ Partenaire de basculementThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Nom ou adresse réseau de l'instance de SQL Server servant de partenaire de basculementMulti subnet failover
- Multi subnet failover
+ Basculement multi-sous-réseauMultiple active result sets
- Multiple active result sets
+ Plusieurs jeux de résultats actifsWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Quand la valeur est true, les jeux de résultats peuvent être retournés et lus sur une même connexionPacket size
- Packet size
+ Taille de paquetSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Taille en octets des paquets réseau utilisés pour communiquer avec une instance de SQL ServerType system version
- Type system version
+ Version du système de typeIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indique le système de type serveur que le fournisseur expose par le biais de DataReader
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ Le serveur de gestion centralisée {0} est introuvable ou hors ligneNo resources found
- No resources found
+ Aucune ressource trouvée
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Ajouter un serveur de gestion centralisée...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ Le groupe de serveurs de gestion centralisée a déjà un serveur inscrit nommé {0}Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ Impossible d'ajouter le serveur inscrit {0}Are you sure you want to delete
- Are you sure you want to delete
+ Voulez-vous vraiment supprimerYes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Description du groupe de serveurs{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} a déjà un groupe de serveurs nommé {1}Are you sure you want to delete
- Are you sure you want to delete
+ Voulez-vous vraiment supprimer
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ Vous ne pouvez pas ajouter un serveur inscrit partagé du même nom que le serveur de configuration
diff --git a/resources/xlf/fr/dacpac.fr.xlf b/resources/xlf/fr/dacpac.fr.xlf
index ffa8306bcb..a115c83f1f 100644
--- a/resources/xlf/fr/dacpac.fr.xlf
+++ b/resources/xlf/fr/dacpac.fr.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ Vous pouvez voir l'état de la génération de script dans la vue Tâches une fois l'Assistant fermé. Le script s'ouvre dès qu'il est généré.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/fr/import.fr.xlf b/resources/xlf/fr/import.fr.xlf
index aae6c945b4..21d0c1d1dc 100644
--- a/resources/xlf/fr/import.fr.xlf
+++ b/resources/xlf/fr/import.fr.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Cette opération a échoué. Essayez un autre fichier d'entrée.Refresh
diff --git a/resources/xlf/fr/mssql.fr.xlf b/resources/xlf/fr/mssql.fr.xlf
index c6d9ba3871..68c5467cef 100644
--- a/resources/xlf/fr/mssql.fr.xlf
+++ b/resources/xlf/fr/mssql.fr.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Charger des fichiersNew directory
- New directory
+ Nouveau répertoireDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Nouveau notebookOpen Notebook
- Open Notebook
+ Ouvrir le notebookTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Tâches et informations concernant votre cluster Big Data SQL ServerSQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Envoyer le travail SparkNew Spark Job
- New Spark Job
+ Nouveau travail SparkView Spark History
- View Spark History
+ Voir l'historique SparkView Yarn History
- View Yarn History
+ Voir l'historique YarnTasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ Installer les packagesConfigure Python for Notebooks
- Configure Python for Notebooks
+ Configurer Python pour NotebooksCluster Status
- Cluster Status
+ État du clusterSearch: Servers
- Search: Servers
+ Recherche : ServeursSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ Recherche : Effacer les résultats de la recherche de serveursService Endpoints
- Service Endpoints
+ Points de terminaison de serviceMSSQL configuration
- MSSQL configuration
+ Configuration de MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Facultatif] Journaliser la sortie de débogage dans la console (Voir -> Sortie) et sélectionner le canal de sortie approprié dans la liste déroulante[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Facultatif] Niveau de journalisation des services de back-end. Azure Data Studio génère un nom de fichier à chaque démarrage et, si le fichier existe déjà, ajoute les entrées de journal à ce fichier. Pour nettoyer les anciens fichiers journaux, consultez les paramètres logRetentionMinutes et logFilesRemovalLimit. Le niveau de suivi par défaut correspond à une faible journalisation. Si vous changez le niveau de détail, vous pouvez obtenir une journalisation massive nécessitant de l'espace disque pour les journaux. Le niveau Erreur inclut le niveau Critique, le niveau Avertissement inclut le niveau Erreur, le niveau Informations inclut le niveau Avertissement et le niveau Détail inclut le niveau InformationsNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Nombre de minutes de conservation des fichiers journaux pour les services de back-end. La durée par défaut est 1 semaine.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Nombre maximal d'anciens fichiers ayant dépassé mssql.logRetentionMinutes à supprimer au démarrage. Les fichiers qui ne sont pas nettoyés du fait de cette limitation le sont au prochain démarrage d'Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Facultatif] Ne pas afficher les avertissements de plateforme non prise en chargeRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Niveau tarifaireCompatibility Level
@@ -224,11 +224,11 @@
Name (optional)
- Name (optional)
+ Nom (facultatif)Custom name of the connection
- Custom name of the connection
+ Nom personnalisé de la connexionServer
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nom de l'instance SQL ServerDatabase
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ Nom du catalogue ou de la base de données initiaux dans la source de donnéesAuthentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Spécifie la méthode d'authentification avec SQL ServerSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - Authentification universelle avec prise en charge de MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indique l'identifiant utilisateur à utiliser pour la connexion à la source de donnéesPassword
@@ -280,107 +280,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indique le mot de passe à utiliser pour la connexion à la source de donnéesApplication intent
- Application intent
+ Intention d'applicationDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Déclare le type de charge de travail de l'application pendant la connexion à un serveurAsynchronous processing
- Asynchronous processing
+ Traitement asynchroneWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Quand la valeur est true, permet d'utiliser la fonctionnalité asynchrone dans le fournisseur de données .Net FrameworkConnect timeout
- Connect timeout
+ Délai d'expiration de la connexionThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Durée d'attente (en secondes) d'une connexion au serveur avant de terminer la tentative et de générer une erreurCurrent language
- Current language
+ Langue actuelleThe SQL Server language record name
- The SQL Server language record name
+ Nom d'enregistrement de la langue de SQL ServerColumn encryption
- Column encryption
+ Chiffrement de colonneDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Paramètre par défaut de chiffrement de colonne pour toutes les commandes sur la connexionEncrypt
- Encrypt
+ ChiffrerWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Quand la valeur est true, SQL Server utilise le chiffrement SSL pour toutes les données envoyées entre le client et le serveur si le serveur a un certificat installéPersist security info
- Persist security info
+ Conserver les informations de sécuritéWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Quand la valeur est false, les informations de sécurité sensibles, comme le mot de passe, ne sont pas retournées dans le cadre de la connexionTrust server certificate
- Trust server certificate
+ Approuver le certificat de serveurWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Quand la valeur est true (et encrypt=true), SQL Server utilise le chiffrement SSL pour toutes les données envoyées entre le client et le serveur sans valider le certificat de serveurAttached DB file name
- Attached DB file name
+ Nom de fichier de base de données attachéThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Nom de fichier principal, y compris le nom de chemin complet, d'une base de données pouvant être attachéeContext connection
- Context connection
+ Connexion contextuelleWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Quand la valeur est true, indique que la connexion doit provenir du contexte du serveur SQL. Disponible uniquement en cas d'exécution dans le processus SQL ServerPort
- Port
+ PortConnect retry count
- Connect retry count
+ Nombre de tentatives de connexionNumber of attempts to restore connection
- Number of attempts to restore connection
+ Nombre de tentatives de restauration de connexionConnect retry interval
- Connect retry interval
+ Intervalle entre les tentatives de connexionDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Délai entre les tentatives de restauration de connexionApplication name
@@ -388,47 +388,47 @@
The name of the application
- The name of the application
+ Nom de l'applicationWorkstation Id
- Workstation Id
+ ID de station de travailThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Nom de la station de travail se connectant à SQL ServerPooling
- Pooling
+ RegroupementWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Quand la valeur est true, l'objet de connexion est tiré du pool approprié ou, si nécessaire, est créé et ajouté au pool appropriéMax pool size
- Max pool size
+ Taille maximale du poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Nombre maximal de connexions autorisées dans le poolMin pool size
- Min pool size
+ Taille minimale du poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Nombre minimal de connexions autorisées dans le poolLoad balance timeout
- Load balance timeout
+ Délai d'expiration de l'équilibrage de chargeThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Durée de vie minimale (en secondes) de cette connexion dans le pool avant d'être détruiteReplication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Utilisé par SQL Server dans la réplicationAttach DB filename
- Attach DB filename
+ Attacher le nom de fichier de base de donnéesFailover partner
- Failover partner
+ Partenaire de basculementThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Nom ou adresse réseau de l'instance de SQL Server servant de partenaire de basculementMulti subnet failover
- Multi subnet failover
+ Basculement de plusieurs sous-réseauxMultiple active result sets
- Multiple active result sets
+ Plusieurs jeux de résultats actifsWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Quand la valeur est true, plusieurs jeux de résultats peuvent être retournés et lus sur une même connexionPacket size
- Packet size
+ Taille de paquetSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Taille en octets des paquets réseau utilisés pour communiquer avec une instance de SQL ServerType system version
- Type system version
+ Version du système de typeIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indique le système de type serveur que le fournisseur expose par le biais de DataReader
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ Aucun ID de lot de travail Spark n'est retourné dans la réponse.{0}[Erreur] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ Aucun journal n'est retourné dans la réponse.{0}[Erreur] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ Les paramètres de SparkJobSubmissionModel ne sont pas autoriséssubmissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs n'est pas valide.livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId n'est pas valide.Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Le délai d'attente d'obtention de l'ID d'application a expiré. {0}[Journal] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ La propriété localFilePath ou hdfsFolderPath n'est pas spécifiée.Property Path is not specified.
- Property Path is not specified.
+ La propriété Path n'est pas spécifiée.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ Les paramètres de SparkJobSubmissionDialog ne sont pas autorisésNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Envoyer{0} Spark Job Submission:
- {0} Spark Job Submission:
+ Envoi du travail Spark {0} :.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Début de l'envoi du travail Spark ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ Entrer un nom...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Cluster SparkPath to a .jar or .py file
- Path to a .jar or .py file
+ Chemin d'un fichier .jar ou .pyThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ Le fichier local sélectionné sera chargé dans HDFS : {0}JAR/py File
- JAR/py File
+ Fichier JAR/pyMain Class
- Main Class
+ Classe principaleArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ Arguments de ligne de commande utilisés dans votre classe principale, plusieurs arguments doivent être séparés par un espace.Property Job Name is not specified.
- Property Job Name is not specified.
+ La propriété Nom du travail n'est pas spécifiée.Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ La propriété Fichier JAR/py n'est pas spécifiée.Property Main Class is not specified.
- Property Main Class is not specified.
+ La propriété Classe principale n'est pas spécifiée.{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} n'existe pas dans le cluster ou une exception est levée.The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ Le fichier HDFS spécifié n'existe pas. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ Erreur de localisation du fichier en raison de l'erreur : {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ Fichiers JAR de référenceJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ Fichiers JAR à placer dans le répertoire de travail de l'exécuteur. Le chemin de fichier JAR doit être un chemin HDFS. Plusieurs chemins doivent être séparés par un point-virgule (;)Reference py Files
- Reference py Files
+ Fichiers py de référencePy Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Fichiers py à placer dans le répertoire de travail de l'exécuteur. Le chemin de fichier doit être un chemin HDFS. Plusieurs chemins doivent être séparés par un point-virgule (;)Reference Files
- Reference Files
+ Fichiers de référenceFiles to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Fichiers à placer dans le répertoire de travail de l'exécuteur. Le chemin de fichier doit être un chemin HDFS. Plusieurs chemins doivent être séparés par un point-virgule (;)Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Sélectionnez SQL Server avec le cluster Big Data.No Sql Server is selected.
- No Sql Server is selected.
+ Aucun serveur SQL sélectionné.Error Get File Path: {0}
- Error Get File Path: {0}
+ Erreur d'obtention du chemin de fichier : {0}Invalid Data Structure
- Invalid Data Structure
+ Structure de données non valideUnable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ Impossible de créer le client WebHDFS en raison d'options manquantes : ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}' n'est pas défini.Bad Request
- Bad Request
+ Demande incorrecteUnauthorized
- Unauthorized
+ Non autoriséForbidden
- Forbidden
+ InterditNot Found
@@ -724,7 +724,7 @@
Internal Server Error
- Internal Server Error
+ Erreur de serveur interneUnknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ Redirection inattenduePlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Fournissez le mot de passe de connexion à HDFS :Session for node {0} does not exist
- Session for node {0} does not exist
+ La session du nœud {0} n'existe pasError notifying of node change: {0}
- Error notifying of node change: {0}
+ Erreur de notification du changement de nœud : {0}Root
- Root
+ RacineHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Services de donnéesNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ REMARQUE : Ce fichier a été tronqué au niveau de {0} pour l'aperçu.The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ Le fichier a été tronqué au niveau de {0} pour l'aperçu.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo n'est pas défini.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options n'est pas défini.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ Des propriétés sont manquantes dans connectionInfo.options : {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ L'action {0} n'est pas prise en charge pour ce gestionnaireCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ Impossible d'ouvrir le lien {0}, car seuls les liens HTTP et HTTPS sont pris en chargeDownload and open '{0}'?
- Download and open '{0}'?
+ Télécharger et ouvrir '{0}' ?Could not find the specified file
- Could not find the specified file
+ Fichier spécifié introuvableFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ La demande d'ouverture de fichier a échoué avec l'erreur : {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Erreur d'arrêt du serveur de notebook : {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ Le processus du notebook s'est terminé prématurément avec l'erreur : {0}, sortie StdErr : {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Erreur envoyée par Jupyter : {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter est en cours d'exécution sur {0}... Starting Notebook server
- ... Starting Notebook server
+ ... Démarrage du serveur de notebookUnexpected setting type {0}
- Unexpected setting type {0}
+ Type de paramètre inattendu {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ Impossible de démarrer une session, le gestionnaire n'est pas encore initialiséSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Les noyaux Spark nécessitent une connexion à une instance maître de cluster Big Data SQL Server.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ L'arrêt du serveur de notebook a échoué : {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ L'installation des dépendances de notebook est en coursPython download is complete
- Python download is complete
+ Le téléchargement de Python est terminéError while downloading python setup
- Error while downloading python setup
+ Erreur de téléchargement du programme d'installation de PythonDownloading python package
- Downloading python package
+ Téléchargement du package PythonUnpacking python package
- Unpacking python package
+ Décompression du package PythonError while creating python installation directory
- Error while creating python installation directory
+ Erreur de création du répertoire d'installation de PythonError while unpacking python bundle
- Error while unpacking python bundle
+ Erreur de décompression du bundle PythonInstalling Notebook dependencies
- Installing Notebook dependencies
+ Installation des dépendances de notebookInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Installation des dépendances de notebook, consultez la vue Tâches pour plus d'informationsNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ L'installation des dépendances de notebook est terminéeCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Impossible de remplacer l'installation de Python existante si Python est en cours d'exécution.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ Une autre installation de Python est actuellement en cours.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python existe déjà à l'emplacement spécifié. Installation ignorée.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ L'installation des dépendances de notebook a échoué avec l'erreur : {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Téléchargement de la version locale de Python pour la plateforme : {0} dans {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Installation des packages nécessaires pour exécuter Notebooks...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Installation de Jupyter effectuée.Installing SparkMagic...
- Installing SparkMagic...
+ Installation de SparkMagic...A notebook path is required
- A notebook path is required
+ Un chemin de notebook est nécessaireNotebooks
- Notebooks
+ NotebooksOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Seuls les notebooks .ipynb sont pris en chargeAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ Voulez-vous vraiment le réinstaller ?Configure Python for Notebooks
- Configure Python for Notebooks
+ Configurer Python pour NotebooksInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Emplacement d'installation de PythonSelect
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Cette installation prendra un certain temps. Nous vous recommandons de ne pas fermer l'application avant la fin de l'installation.The specified install location is invalid.
- The specified install location is invalid.
+ L'emplacement d'installation spécifié n'est pas valide.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ Aucune installation de Python n'a été trouvée à l'emplacement spécifié.Python installation was declined.
- Python installation was declined.
+ L'installation de Python a été refusée.Installation Type
- Installation Type
+ Type d'installationNew Python installation
- New Python installation
+ Nouvelle installation de PythonUse existing Python installation
- Use existing Python installation
+ Utiliser l'installation existante de PythonOpen file {0} failed: {1}
- Open file {0} failed: {1}
+ L'ouverture du fichier {0} a échoué : {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ L'ouverture du fichier {0} a échoué : {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ L'ouverture du fichier {0} a échoué : {1}Missing file : {0}
- Missing file : {0}
+ Fichier manquant : {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Cet exemple de code charge le fichier dans un cadre de données et affiche les 10 premiers résultats.No notebook editor is active
- No notebook editor is active
+ Aucun éditeur de notebook actifCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ Quel type de cellule voulez-vous ajouter ?Notebooks
- Notebooks
+ NotebooksSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Extension de déploiement SQL Server pour Azure Data StudioProvides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Fournit une expérience de notebook pour déployer Microsoft SQL ServerDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Déployer SQL Server sur Docker...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ Déployer le cluster Big Data SQL Server...Deploy SQL Server…
- Deploy SQL Server…
+ Déployer SQL Server...Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ Image de conteneur SQL ServerRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Exécuter l'image de conteneur SQL Server avec DockerSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ Le cluster Big Data SQL Server vous permet de déployer des clusters scalables de conteneurs SQL Server, Spark et HDFS s'exécutant sur KubernetesVersion
@@ -52,23 +52,23 @@
SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Cluster Big Data SQL Server 2019 CTP 3.1Deployment target
- Deployment target
+ Cible de déploiementNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Utilitaire de ligne de commande écrit en Python qui permet aux administrateurs de cluster de démarrer et de gérer le cluster Big Data via des API RESTmssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Outil de ligne de commande qui vous permet d'exécuter des commandes sur des clusters Kuberneteskubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Permet de packager et d'exécuter une application dans des conteneurs isolésDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Outil de ligne de commande pour gérer les ressources AzureAzure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ package.json est introuvable ou le nom/l'éditeur n'est pas défini
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ Le notebook {0} n'existe pas
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Sélectionner les options de déploiementOpen Notebook
- Open Notebook
+ Ouvrir le notebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ Le chargement de l'extension {0} a échoué. Erreur détectée dans la définition du type de ressource dans package.json, consultez la console de débogage pour plus d'informations.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ Le type de ressource {0} n'est pas défini
diff --git a/resources/xlf/fr/schema-compare.fr.xlf b/resources/xlf/fr/schema-compare.fr.xlf
index be6dd29695..5623cf8843 100644
--- a/resources/xlf/fr/schema-compare.fr.xlf
+++ b/resources/xlf/fr/schema-compare.fr.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ Comparaison de schémas SQL ServerSQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ La comparaison de schémas SQL Server pour Azure Data Studio prend en charge la comparaison des schémas de bases de données et des fichiers dacpac.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ Les options ont changé. Relancer la comparaison pour voir les différences ?Schema Compare Options
@@ -52,311 +52,311 @@
Include Object Types
- Include Object Types
+ Inclure des types d'objetIgnore Table Options
- Ignore Table Options
+ Ignorer les options de tableIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Ignorer le point-virgule entre les instructionsIgnore Route Lifetime
- Ignore Route Lifetime
+ Ignorer la durée de vie de la routeIgnore Role Membership
- Ignore Role Membership
+ Ignorer l'appartenance à un rôleIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ Ignorer les identificateurs entre guillemetsIgnore Permissions
- Ignore Permissions
+ Ignorer les autorisationsIgnore Partition Schemes
- Ignore Partition Schemes
+ Ignorer les schémas de partitionIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Ignorer le placement d'objet sur le schéma de partitionIgnore Not For Replication
- Ignore Not For Replication
+ Ignorer l'option Pas pour la réplicationIgnore Login Sids
- Ignore Login Sids
+ Ignorer les SID de connexionIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Ignorer les indicateurs de verrou sur les indexIgnore Keyword Casing
- Ignore Keyword Casing
+ Ignorer la casse des mots clésIgnore Index Padding
- Ignore Index Padding
+ Ignorer le remplissage d'indexIgnore Index Options
- Ignore Index Options
+ Ignorer les options d'indexIgnore Increment
- Ignore Increment
+ Ignorer l'incrémentIgnore Identity Seed
- Ignore Identity Seed
+ Ignorer la valeur initiale d'identitéIgnore User Settings Objects
- Ignore User Settings Objects
+ Ignorer les objets des paramètres utilisateurIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ Ignorer le chemin de fichier du catalogue de texte intégralIgnore Whitespace
- Ignore Whitespace
+ Ignorer les espaces blancsIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Ignorer WITH NOCHECK sur les clés étrangèresVerify Collation Compatibility
- Verify Collation Compatibility
+ Vérifier la compatibilité du classementUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Avertissements des objets non modifiablesTreat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Traiter les erreurs de vérification comme des avertissementsScript Refresh Module
- Script Refresh Module
+ Module d'actualisation de scriptScript New Constraint Validation
- Script New Constraint Validation
+ Générer un script pour la validation des nouvelles contraintesScript File Size
- Script File Size
+ Taille du fichier de scriptScript Deploy StateChecks
- Script Deploy StateChecks
+ StateChecks du déploiement de scriptScript Database Options
- Script Database Options
+ Options de base de données de scriptScript Database Compatibility
- Script Database Compatibility
+ Compatibilité de base de données de scriptScript Database Collation
- Script Database Collation
+ Classement de base de données de scriptRun Deployment Plan Executors
- Run Deployment Plan Executors
+ Exécuter des exécuteurs de plan de déploiementRegister DataTier Application
- Register DataTier Application
+ Inscrire l'application de la couche DonnéesPopulate Files On File Groups
- Populate Files On File Groups
+ Remplir les fichiers dans des groupes de fichiersNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Aucune instruction ALTER pour changer les types CLRInclude Transactional Scripts
- Include Transactional Scripts
+ Inclure des scripts transactionnelsInclude Composite Objects
- Include Composite Objects
+ Inclure des objets compositesAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Autoriser le déplacement non sécurisé des données de sécurité au niveau des lignesIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Ignorer WITH NO CHECK sur les contraintes de validationIgnore Fill Factor
- Ignore Fill Factor
+ Ignorer le facteur de remplissageIgnore File Size
- Ignore File Size
+ Ignorer la taille du fichierIgnore Filegroup Placement
- Ignore Filegroup Placement
+ Ignorer le placement du groupe de fichiersDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ Ne pas modifier les objets répliquésDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ Ne pas modifier les objets de capture des changements de donnéesDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ Désactiver et réactiver les déclencheurs DDLDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Déployer la base de données en mode mono-utilisateurCreate New Database
- Create New Database
+ Créer une base de donnéesCompare Using Target Collation
- Compare Using Target Collation
+ Comparer à l'aide du classement cibleComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Mettre en commentaire les déclarations SetVarBlock When Drift Detected
- Block When Drift Detected
+ Bloquer en cas de dérive détectéeBlock On Possible Data Loss
- Block On Possible Data Loss
+ Bloquer en cas de perte de données possibleBackup Database Before Changes
- Backup Database Before Changes
+ Sauvegarder la base de données avant les changementsAllow Incompatible Platform
- Allow Incompatible Platform
+ Autoriser la plateforme incompatibleAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Autoriser la suppression des assemblys bloquantsDrop Constraints Not In Source
- Drop Constraints Not In Source
+ Supprimer les contraintes qui ne sont pas dans la sourceDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Supprimer les déclencheurs DML qui ne sont pas dans la sourceDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Supprimer les propriétés étendues qui ne sont pas dans la sourceDrop Indexes Not In Source
- Drop Indexes Not In Source
+ Supprimer les index qui ne sont pas dans la sourceIgnore File And Log File Path
- Ignore File And Log File Path
+ Ignorer le chemin de fichier et de fichier journalIgnore Extended Properties
- Ignore Extended Properties
+ Ignorer les propriétés étenduesIgnore Dml Trigger State
- Ignore Dml Trigger State
+ Ignorer l'état des déclencheurs DMLIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ Ignorer l'ordre des déclencheurs DMLIgnore Default Schema
- Ignore Default Schema
+ Ignorer le schéma par défautIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ Ignorer l'état des déclencheurs DDLIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ignorer l'ordre des déclencheurs DDLIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ Ignorer la propriété FilePath du fournisseur de chiffrementVerify Deployment
- Verify Deployment
+ Vérifier le déploiementIgnore Comments
- Ignore Comments
+ Ignorer les commentairesIgnore Column Collation
- Ignore Column Collation
+ Ignorer le classement de colonneIgnore Authorizer
- Ignore Authorizer
+ Ignorer l'autorisateurIgnore AnsiNulls
- Ignore AnsiNulls
+ Ignorer AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ Générer des SmartDefaultsDrop Statistics Not In Source
- Drop Statistics Not In Source
+ Supprimer les statistiques qui ne sont pas dans la sourceDrop Role Members Not In Source
- Drop Role Members Not In Source
+ Supprimer les membres de rôle qui ne sont pas dans la sourceDrop Permissions Not In Source
- Drop Permissions Not In Source
+ Supprimer les autorisations qui ne sont pas dans la sourceDrop Objects Not In Source
- Drop Objects Not In Source
+ Supprimer les objets qui ne sont pas dans la sourceIgnore Column Order
- Ignore Column Order
+ Ignorer l'ordre des colonnesAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,7 +436,7 @@
File Tables
- File Tables
+ Tables de fichiersFull Text Catalogs
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Fonctions scalairesSearch Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Fonctions tableUser Defined Data Types
- User Defined Data Types
+ Types de données définis par l'utilisateurUser Defined Table Types
- User Defined Table Types
+ Types de table définis par l'utilisateurClr User Defined Types
- Clr User Defined Types
+ Types CLR définis par l'utilisateurUsers
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Déclencheurs de serveurSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Spécifie que la publication doit toujours supprimer et recréer un assembly en cas de différence au lieu d'envoyer une instruction ALTER ASSEMBLYSpecifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Si la valeur est true, la base de données est définie sur le mode mono-utilisateur avant le déploiement.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Ce paramètre définit la façon dont le classement de la base de données est géré pendant le déploiement. Par défaut, le classement de la base de données cible est mis à jour s'il ne correspond pas au classement spécifié par la source. Quand cette option est définie, le classement de la base de données (ou du serveur) cible doit être utilisé.Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ Spécifie si les membres de rôle qui ne sont pas définis dans le fichier d'instantané de base de données (.dacpac) sont supprimés de la base de données cible quand vous publiez des mises à jour sur une base de données.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ Fichier d'application de la couche Données (.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ Un autre schéma source a été sélectionné. Comparer pour voir les différences ?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ Un autre schéma cible a été sélectionné. Comparer pour voir les différences ?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Vous avez sélectionné des schémas cible et source différents. Comparer pour voir les différences ?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Comparer les détailsAre you sure you want to update the target?
- Are you sure you want to update the target?
+ Voulez-vous vraiment mettre à jour la cible ?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Appuyez sur Comparer pour actualiser la comparaison.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Générer un script pour déployer les changements sur la cibleNo changes to script
- No changes to script
+ Aucun changement au scriptApply changes to target
- Apply changes to target
+ Appliquer les changements à la cibleNo changes to apply
- No changes to apply
+ Aucun changement à appliquerDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Initialisation de la comparaison. Cette opération peut durer un certain temps.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Pour comparer deux schémas, sélectionnez d'abord un schéma source et un schéma cible, puis appuyez sur Comparer.No schema differences were found.
- No schema differences were found.
+ Aucune différence de schéma.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ La comparaison de schémas a échoué : {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ La génération de script est activée quand la cible est une base de donnéesApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ L'option Appliquer est activée quand la cible est une base de donnéesCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ L'annulation de la comparaison de schémas a échoué : '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ La génération de script a échoué : '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ L'application de la comparaison de schémas a échoué '{0}'Switch direction
- Switch direction
+ Changer le sensSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ Ouvrir le fichier .scmpLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Charger la source, la cible et les options enregistrées dans un fichier .scmpOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ L'ouverture de scmp a échoué : '{0}'Save .scmp file
- Save .scmp file
+ Enregistrer le fichier .scmpSave source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Enregistrer la source et la cible, les options et les éléments exclusSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ L'enregistrement de scmp a échoué : '{0}'
diff --git a/resources/xlf/it/admin-tool-ext-win.it.xlf b/resources/xlf/it/admin-tool-ext-win.it.xlf
index 96c4d77bc5..9960253d3e 100644
--- a/resources/xlf/it/admin-tool-ext-win.it.xlf
+++ b/resources/xlf/it/admin-tool-ext-win.it.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Estensioni degli strumenti di amministrazione del database per WindowsAdds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Consente di aggiungere altre funzionalità specifiche di Windows ad Azure Data StudioProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Genera script...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Non è stato specificato alcun elemento ConnectionContext per handleLaunchSsmsMinPropertiesDialogCommandCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ Non è stato possibile determinare il nodo di Esplora oggetti da connectionContext: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Non è stato specificato alcun elemento ConnectionContext per handleLaunchSsmsMinPropertiesDialogCommandNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ Non è stato specificato alcun elemento connectionProfile da connectionContext: {0}Launching dialog...
- Launching dialog...
+ Avvio della finestra di dialogo...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Si è verificato un errore durante la chiamata di SsmsMin con gli argomenti '{0}' - {1}
diff --git a/resources/xlf/it/agent.it.xlf b/resources/xlf/it/agent.it.xlf
index d987871bed..3328a805e6 100644
--- a/resources/xlf/it/agent.it.xlf
+++ b/resources/xlf/it/agent.it.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ Pacchetto SQL Server Integration ServicesSQL Server Agent Service Account
diff --git a/resources/xlf/it/azurecore.it.xlf b/resources/xlf/it/azurecore.it.xlf
index ea936d0eed..5705e1e1ca 100644
--- a/resources/xlf/it/azurecore.it.xlf
+++ b/resources/xlf/it/azurecore.it.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: Aggiorna tutti gli accountRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: AccediSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Aggiungi ai serverClear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ Non sono state trovate risorse
diff --git a/resources/xlf/it/cms.it.xlf b/resources/xlf/it/cms.it.xlf
index f2a29a4ce8..f26c91fd88 100644
--- a/resources/xlf/it/cms.it.xlf
+++ b/resources/xlf/it/cms.it.xlf
@@ -4,23 +4,23 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Server di gestione centrale di SQL ServerSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Supporto per la gestione di Server di gestione centrale di SQL ServerCentral Management Servers
- Central Management Servers
+ Server di gestione centraleMicrosoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerCentral Management Servers
- Central Management Servers
+ Server di gestione centraleRefresh
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Aggiorna gruppo di serverDelete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ Registrazione nuovo server...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ Nuovo gruppo di server...Add Central Management Server
- Add Central Management Server
+ Aggiungi server di gestione centraleDelete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ Configurazione di MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Facoltativo] Registrare l'output di debug nella console (Visualizza -> Output), quindi selezionare il canale di output appropriato dall'elenco a discesa[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Facoltativo] Livello di registrazione per i servizi back-end. Azure Data Studio genera un nome file a ogni avvio e, se il file esiste già, le voci del log vengono aggiunte a tale file. Per la pulizia dei file di log meno recenti, vedere le impostazioni logRetentionMinutes e logFilesRemovalLimit. Con l'impostazione predefinita di tracingLevel, la quantità di dati registrata non è eccessiva. Se si cambia il livello di dettaglio, la registrazione potrebbe diventare eccessiva e richiedere un notevole spazio su disco per i log. Il livello Error include quello Critical, il livello Warning include quello Error, il livello Information include quello Warning e il livello Verbose include quello InformationNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Numero di minuti per la conservazione dei file di log per i servizi di back-end. L'impostazione predefinita è 1 settimana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Numero massimo di file meno recenti da rimuovere all'avvio per cui è scaduto il tempo impostato con mssql.logRetentionMinutes. I file che non vengono rimossi a causa di questa limitazione verranno rimossi al successivo avvio di Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Facoltativo] Non visualizzare avvisi su piattaforme non supportateRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Piano tariffarioCompatibility Level
@@ -164,15 +164,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nome (facoltativo)Custom name of the connection
- Custom name of the connection
+ Nome personalizzato della connessioneServer
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nome dell'istanza di SQL ServerServer Description
- Server Description
+ Descrizione serverDescription of the SQL Server instance
- Description of the SQL Server instance
+ Descrizione dell'istanza di SQL ServerAuthentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Specifica il metodo di autenticazione con SQL ServerSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - Universale con supporto MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica l'ID utente da usare per la connessione all'origine datiPassword
@@ -224,155 +224,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica la password da usare per la connessione all'origine datiApplication intent
- Application intent
+ Finalità dell'applicazioneDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Dichiara il tipo di carico di lavoro dell'applicazione durante la connessione a un serverAsynchronous processing
- Asynchronous processing
+ Elaborazione asincronaWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Se è true, consente l'utilizzo della funzionalità asincrona nel provider di dati .NET Framework.Connect timeout
- Connect timeout
+ Timeout di connessioneThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Intervallo di tempo (in secondi) in cui attendere la connessione al server prima di interrompere il tentativo e generare un erroreCurrent language
- Current language
+ Lingua correnteThe SQL Server language record name
- The SQL Server language record name
+ Nome del record di lingua di SQL ServerColumn encryption
- Column encryption
+ Crittografia di colonnaDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Impostazione di crittografia di colonna predefinita per tutti i comandi della connessioneEncrypt
- Encrypt
+ CrittografaWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Se è true, SQL Server usa la crittografia SSL per tutti i dati scambiati tra il client e il server, se nel server è installato un certificatoPersist security info
- Persist security info
+ Salva in modo permanente le informazioni di sicurezzaWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Se è false, le informazioni sensibili dal punto di vista della sicurezza, come la password, non vengono restituite nell'ambito della connessioneTrust server certificate
- Trust server certificate
+ Considera attendibile il certificato del serverWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Se è true (e encrypt=true), SQL Server usa la crittografia SSL per tutti i dati inviati tra il client e il server senza convalidare il certificato del serverAttached DB file name
- Attached DB file name
+ Nome file DB collegatoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Nome del file primario, incluso il nome completo del percorso, di un database collegabileContext connection
- Context connection
+ Connessione al contestoWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Se è true, indica che la connessione deve provenire dal contesto SQL Server. Disponibile solo quando è in esecuzione nel processo SQL Server.Port
- Port
+ PortaConnect retry count
- Connect retry count
+ Conteggio tentativi di connessioneNumber of attempts to restore connection
- Number of attempts to restore connection
+ Numero di tentativi di ripristino della connessioneConnect retry interval
- Connect retry interval
+ Intervallo tentativi di connessioneDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Ritardo tra tentativi di ripristino della connessioneApplication name
- Application name
+ Nome dell'applicazioneThe name of the application
- The name of the application
+ Nome dell'applicazioneWorkstation Id
- Workstation Id
+ ID workstationThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Nome della workstation che si connette a SQL ServerPooling
- Pooling
+ PoolingWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Se è true, l'oggetto connessione viene prelevato dal pool appropriato oppure, se necessario, viene creato e aggiunto al pool appropriatoMax pool size
- Max pool size
+ Dimensioni massime del poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Numero massimo di connessioni consentite nel poolMin pool size
- Min pool size
+ Dimensioni minime del poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Numero minimo di connessioni consentite nel poolLoad balance timeout
- Load balance timeout
+ Timeout durante il bilanciamento del caricoThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Tempo minimo (in secondi) in cui la connessione rimane attiva nel pool prima di essere eliminata definitivamenteReplication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Usato da SQL Server nella replicaAttach DB filename
- Attach DB filename
+ Collega nome file DBFailover partner
- Failover partner
+ Partner di failoverThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Nome o indirizzo di rete dell'istanza di SQL Server che funge da partner di failoverMulti subnet failover
- Multi subnet failover
+ Failover su più subnetMultiple active result sets
- Multiple active result sets
+ Multiple Active Result SetWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Se è true, possono essere restituiti e letti più set di risultati da un'unica connessionePacket size
- Packet size
+ Dimensioni del pacchettoSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Dimensioni in byte dei pacchetti di rete usati per comunicare con un'istanza di SQL ServerType system version
- Type system version
+ Versione del sistema di tipiIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica il sistema di tipi di server esposto dal provider tramite l'oggetto DataReader
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ Il server di gestione centrale {0} non è stato trovato oppure è offlineNo resources found
- No resources found
+ Non sono state trovate risorse
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Aggiungi server di gestione centrale...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ Il gruppo di server di gestione centrale include già un server registrato denominato {0}Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ Non è stato possibile aggiungere il server registrato {0}Are you sure you want to delete
- Are you sure you want to delete
+ EliminareYes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Descrizione gruppo di server{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} include già un gruppo di server denominato {1}Are you sure you want to delete
- Are you sure you want to delete
+ Eliminare
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ Non è possibile aggiungere un server registrato condiviso con lo stesso nome del server di configurazione
diff --git a/resources/xlf/it/dacpac.it.xlf b/resources/xlf/it/dacpac.it.xlf
index cbc8fb6e24..80427d5364 100644
--- a/resources/xlf/it/dacpac.it.xlf
+++ b/resources/xlf/it/dacpac.it.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ È possibile visualizzare lo stato della generazione dello script nella visualizzazione attività dopo la chiusura della procedura guidata. Lo script generato verrà aperto dopo il completamento.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/it/import.it.xlf b/resources/xlf/it/import.it.xlf
index 96521e743c..ad79459ddd 100644
--- a/resources/xlf/it/import.it.xlf
+++ b/resources/xlf/it/import.it.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Questa operazione non è riuscita. Provare con un file di input diverso.Refresh
diff --git a/resources/xlf/it/mssql.it.xlf b/resources/xlf/it/mssql.it.xlf
index 9921b1fb1b..96bf03b823 100644
--- a/resources/xlf/it/mssql.it.xlf
+++ b/resources/xlf/it/mssql.it.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Carica fileNew directory
- New directory
+ Nuova directoryDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Nuovo notebookOpen Notebook
- Open Notebook
+ Apri notebookTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Attività e informazioni sul cluster Big Data di SQL ServerSQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Invia processo SparkNew Spark Job
- New Spark Job
+ Nuovo processo SparkView Spark History
- View Spark History
+ Visualizza cronologia di SparkView Yarn History
- View Yarn History
+ Visualizza cronologia di YARNTasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ Installa pacchettiConfigure Python for Notebooks
- Configure Python for Notebooks
+ Configura Python per NotebooksCluster Status
- Cluster Status
+ Stato clusterSearch: Servers
- Search: Servers
+ Ricerca: ServerSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ Ricerca: Cancella i risultati della ricerca dei serverService Endpoints
- Service Endpoints
+ Endpoint servizioMSSQL configuration
- MSSQL configuration
+ Configurazione di MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Facoltativo] Registrare l'output di debug nella console (Visualizza -> Output), quindi selezionare il canale di output appropriato dall'elenco a discesa[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Facoltativo] Livello di registrazione per i servizi back-end. Azure Data Studio genera un nome file a ogni avvio e, se il file esiste già, le voci del log vengono aggiunte a tale file. Per la pulizia dei file di log meno recenti, vedere le impostazioni logRetentionMinutes e logFilesRemovalLimit. Con l'impostazione predefinita di tracingLevel, la quantità di dati registrata non è eccessiva. Se si cambia il livello di dettaglio, la registrazione potrebbe diventare eccessiva e richiedere un notevole spazio su disco per i log. Il livello Error include quello Critical, il livello Warning include quello Error, il livello Information include quello Warning e il livello Verbose include quello InformationNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Numero di minuti per la conservazione dei file di log per i servizi di back-end. L'impostazione predefinita è 1 settimana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Numero massimo di file meno recenti da rimuovere all'avvio per cui è scaduto il tempo impostato con mssql.logRetentionMinutes. I file che non vengono rimossi a causa di questa limitazione verranno rimossi al successivo avvio di Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Facoltativo] Non visualizzare avvisi su piattaforme non supportateRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Piano tariffarioCompatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nome (facoltativo)Custom name of the connection
- Custom name of the connection
+ Nome personalizzato della connessioneServer
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nome dell'istanza di SQL ServerDatabase
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ Nome del database o del catalogo iniziale nell'origine datiAuthentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Specifica il metodo di autenticazione con SQL ServerSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - Universale con supporto MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica l'ID utente da usare per la connessione all'origine datiPassword
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica la password da usare per la connessione all'origine datiApplication intent
- Application intent
+ Finalità dell'applicazioneDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Dichiara il tipo di carico di lavoro dell'applicazione durante la connessione a un serverAsynchronous processing
- Asynchronous processing
+ Elaborazione asincronaWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Se è true, consente l'utilizzo della funzionalità asincrona nel provider di dati .NET Framework.Connect timeout
- Connect timeout
+ Timeout di connessioneThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Intervallo di tempo (in secondi) in cui attendere la connessione al server prima di interrompere il tentativo e generare un erroreCurrent language
- Current language
+ Lingua correnteThe SQL Server language record name
- The SQL Server language record name
+ Nome del record di lingua di SQL ServerColumn encryption
- Column encryption
+ Crittografia di colonnaDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Impostazione di crittografia di colonna predefinita per tutti i comandi della connessioneEncrypt
- Encrypt
+ CrittografaWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Se è true, SQL Server usa la crittografia SSL per tutti i dati scambiati tra il client e il server, se nel server è installato un certificatoPersist security info
- Persist security info
+ Salva in modo permanente le informazioni di sicurezzaWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Se è false, le informazioni sensibili dal punto di vista della sicurezza, come la password, non vengono restituite nell'ambito della connessioneTrust server certificate
- Trust server certificate
+ Considera attendibile il certificato del serverWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Se è true (e encrypt=true), SQL Server usa la crittografia SSL per tutti i dati inviati tra il client e il server senza convalidare il certificato del serverAttached DB file name
- Attached DB file name
+ Nome file DB collegatoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Nome del file primario, incluso il nome completo del percorso, di un database collegabileContext connection
- Context connection
+ Connessione al contestoWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Se è true, indica che la connessione deve provenire dal contesto SQL Server. Disponibile solo quando è in esecuzione nel processo SQL Server.Port
- Port
+ PortaConnect retry count
- Connect retry count
+ Conteggio tentativi di connessioneNumber of attempts to restore connection
- Number of attempts to restore connection
+ Numero di tentativi di ripristino della connessioneConnect retry interval
- Connect retry interval
+ Intervallo tentativi di connessioneDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Ritardo tra tentativi di ripristino della connessioneApplication name
- Application name
+ Nome dell'applicazioneThe name of the application
- The name of the application
+ Nome dell'applicazioneWorkstation Id
- Workstation Id
+ ID workstationThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Nome della workstation che si connette a SQL ServerPooling
- Pooling
+ PoolingWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Se è true, l'oggetto connessione viene prelevato dal pool appropriato oppure, se necessario, viene creato e aggiunto al pool appropriatoMax pool size
- Max pool size
+ Dimensioni massime del poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Numero massimo di connessioni consentite nel poolMin pool size
- Min pool size
+ Dimensioni minime del poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Numero minimo di connessioni consentite nel poolLoad balance timeout
- Load balance timeout
+ Timeout durante il bilanciamento del caricoThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Tempo minimo (in secondi) in cui la connessione rimane attiva nel pool prima di essere eliminata definitivamenteReplication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Usato da SQL Server nella replicaAttach DB filename
- Attach DB filename
+ Nome file del DB da collegareFailover partner
- Failover partner
+ Partner di failoverThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Nome o indirizzo di rete dell'istanza di SQL Server che funge da partner di failoverMulti subnet failover
- Multi subnet failover
+ Failover su più subnetMultiple active result sets
- Multiple active result sets
+ Multiple Active Result SetsWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Se è true, possono essere restituiti e letti più set di risultati da un'unica connessionePacket size
- Packet size
+ Dimensioni del pacchettoSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Dimensioni in byte dei pacchetti di rete usati per comunicare con un'istanza di SQL ServerType system version
- Type system version
+ Versione del sistema di tipiIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica il sistema di tipi di server esposto dal provider tramite l'oggetto DataReader
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ La risposta non ha restituito alcun ID batch di processo Spark.{0}[Errore] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ Nella risposta non è stato restituito alcun log.{0}[Errore] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ I parametri per SparkJobSubmissionModel non sono validisubmissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs non è valido. livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId non è valido.Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Timeout del recupero dell'ID applicazione. {0}[Log] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ La proprietà localFilePath o hdfsFolderPath non è specificata. Property Path is not specified.
- Property Path is not specified.
+ La proprietà Path non è specificata.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ I parametri per SparkJobSubmissionDialog non sono validiNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Invia{0} Spark Job Submission:
- {0} Spark Job Submission:
+ Invio processo Spark {0}:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Invia processo Spark - Inizio ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ Immettere un nome...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Cluster SparkPath to a .jar or .py file
- Path to a .jar or .py file
+ Percorso di un file con estensione jar o pyThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ Il file locale selezionato verrà caricato in HDFS: {0}JAR/py File
- JAR/py File
+ File JAR/pyMain Class
- Main Class
+ Classe principaleArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ Argomenti della riga di comando usati nella classe principale. Separare più argomenti con uno spazio.Property Job Name is not specified.
- Property Job Name is not specified.
+ La proprietà Nome processo non è specificata.Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ La proprietà File JAR/py non è specificata.Property Main Class is not specified.
- Property Main Class is not specified.
+ La proprietà Classe principale non è specificata.{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} non esiste nel cluster oppure è stata generata un'eccezione.The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ Il file HDFS specificato non esiste. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ Si è verificato un errore durante l'individuazione del file. Errore: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ File JAR di riferimentoJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ File con estensione jar da inserire nella directory di lavoro dell'executor. Il percorso dei file deve essere di tipo HDFS. Separare più percorsi con punti e virgola (;)Reference py Files
- Reference py Files
+ File py di riferimentoPy Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ File con estensione py da inserire nella directory di lavoro dell'executor. Il percorso dei file deve essere di tipo HDFS. Separare più percorsi con punti e virgola (;)Reference Files
- Reference Files
+ File di riferimentoFiles to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ File da inserire nella directory di lavoro dell'executor. Il percorso dei file deve essere di tipo HDFS. Separare più percorsi con punti e virgola (;)Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Selezionare SQL Server con Cluster Big Data.No Sql Server is selected.
- No Sql Server is selected.
+ Non è stata selezionata alcuna istanza di SQL Server.Error Get File Path: {0}
- Error Get File Path: {0}
+ Si è verificato un errore durante il recupero del percorso del file: {0}Invalid Data Structure
- Invalid Data Structure
+ Struttura dei dati non validaUnable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ Non è possibile creare il client WebHDFS a causa di opzioni mancanti: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}' non è definito.Bad Request
- Bad Request
+ Richiesta non validaUnauthorized
- Unauthorized
+ Non autorizzatoForbidden
- Forbidden
+ Accesso negatoNot Found
- Not Found
+ Non trovatoInternal Server Error
- Internal Server Error
+ Errore interno del serverUnknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ Reindirizzamento imprevistoPlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Specificare la password per la connessione a HDFS:Session for node {0} does not exist
- Session for node {0} does not exist
+ La sessione per il nodo {0} non esisteError notifying of node change: {0}
- Error notifying of node change: {0}
+ Si è verificato un errore durante la notifica della modifica del nodo: {0}Root
- Root
+ RadiceHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Servizi datiNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ AVVISO: questo file è stato troncato alla posizione {0} per l'anteprima.The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ Il file è stato troncato alla posizione {0} per l'anteprima.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo non è definito.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options non è definito.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ In connectionInfo.options mancano alcune proprietà: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ L'azione {0} non è supportata per questo gestoreCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ Non è possibile aprire il collegamento {0} perché sono supportati solo i collegamenti HTTP e HTTPSDownload and open '{0}'?
- Download and open '{0}'?
+ Scaricare e aprire '{0}'?Could not find the specified file
- Could not find the specified file
+ Non è stato possibile trovare il file specificatoFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ La richiesta di apertura file non è riuscita. Errore: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Si è verificato un errore durante l'arresto del server di Notebook: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ Il processo di Notebook è stato chiuso prematuramente. Errore: {0}. Output di STDERR: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Errore restituito da Jupyter: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter è in esecuzione alla posizione {0}... Starting Notebook server
- ... Starting Notebook server
+ ... Avvio del server NotebookUnexpected setting type {0}
- Unexpected setting type {0}
+ Tipo di impostazione imprevisto: {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ Non è possibile avviare una sessione. Il gestore non è ancora inizializzatoSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Con i kernel Spark è richiesta una connessione a un'istanza master del cluster Big Data di SQL Server.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ L'arresto del server di Notebook non è riuscito: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ L'installazione delle dipendenze di Notebook è in corsoPython download is complete
- Python download is complete
+ Il download di Python è stato completatoError while downloading python setup
- Error while downloading python setup
+ Si è verificato un errore durante il download della configurazione di PythonDownloading python package
- Downloading python package
+ Download del pacchetto PythonUnpacking python package
- Unpacking python package
+ Decompressione del pacchetto PythonError while creating python installation directory
- Error while creating python installation directory
+ Si è verificato un errore durante la creazione della directory di installazione di PythonError while unpacking python bundle
- Error while unpacking python bundle
+ Si è verificato un errore durante la decompressione del bundle di PythonInstalling Notebook dependencies
- Installing Notebook dependencies
+ Installazione delle dipendenze di NotebookInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Installazione delle dipendenze di Notebook. Per altre informazioni, vedere la visualizzazione attivitàNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ L'installazione delle dipendenze di Notebook è stata completataCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Non è possibile sovrascrivere l'installazione esistente di Python mentre Python è in esecuzione.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ È già in corso un'altra installazione di Python.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python esiste già nel percorso specificato. L'installazione verrà ignorata.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ L'installazione delle dipendenze di Notebook non è riuscita. Errore: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Download della versione locale di Python per la piattaforma {0} in {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Installazione dei pacchetti obbligatori per l'esecuzione di Notebooks...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Installazione di Jupyter completata.Installing SparkMagic...
- Installing SparkMagic...
+ Installazione di SparkMagic...A notebook path is required
- A notebook path is required
+ È necessario specificare il percorso di un notebookNotebooks
- Notebooks
+ NotebooksOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Sono supportati solo notebook con estensione ipynbAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ Reinstallare?Configure Python for Notebooks
- Configure Python for Notebooks
+ Configura Python per NotebooksInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Percorso di installazione di PythonSelect
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Questa installazione richiede tempo. Non chiudere l'applicazione fino al completamento dell'installazione.The specified install location is invalid.
- The specified install location is invalid.
+ Il percorso di installazione specificato non è valido.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ Non è stata trovata alcuna installazione di Python nel percorso specificato.Python installation was declined.
- Python installation was declined.
+ L'installazione di Python è stata rifiutata.Installation Type
- Installation Type
+ Tipo di installazioneNew Python installation
- New Python installation
+ Nuova installazione di PythonUse existing Python installation
- Use existing Python installation
+ Usa l'installazione esistente di PythonYes
- S??
+ SìNo
@@ -484,7 +484,7 @@
This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Questo esempio di codice consente di caricare il file in un frame di dati e visualizzare i primi 10 risultati.Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Apertura del file {0} non riuscita: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Apertura del file {0} non riuscita: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Apertura del file {0} non riuscita: {1}Missing file : {0}
- Missing file : {0}
+ File mancante: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Questo esempio di codice consente di caricare il file in un frame di dati e visualizzare i primi 10 risultati.No notebook editor is active
- No notebook editor is active
+ Non ci sono editor di notebook attiviCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ Quale tipo di cella si vuole aggiungere?Notebooks
- Notebooks
+ NotebooksSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Estensione Distribuzione SQL Server per Azure Data StudioProvides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Offre un'esperienza basata su notebook per la distribuzione di Microsoft SQL ServerDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Distribuisci SQL Server in Docker…Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ Distribuisci cluster Big Data di SQL Server…Deploy SQL Server…
- Deploy SQL Server…
+ Distribuisci SQL Server…Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ Immagine del contenitore di SQL ServerRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Esegue l'immagine del contenitore di SQL Server con DockerSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ Il cluster Big Data di SQL Server consente di distribuire cluster scalabili di contenitori SQL Server, Spark e HDFS in esecuzione in KubernetesVersion
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Cluster Big Data di SQL Server 2019 CTP 3.1Deployment target
- Deployment target
+ Destinazione di distribuzioneNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Utilità da riga di comando scritta in Python che consente agli amministratori di eseguire il bootstrap e gestire il cluster Big Data tramite API RESTmssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Uno strumento da riga di comando consente di eseguire comandi su cluster Kuberneteskubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Consente di creare il pacchetto dell'applicazione ed eseguirla in contenitori isolatiDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Strumento da riga di comando per la gestione delle risorse di AzureAzure CLI
- Azure CLI
+ Interfaccia della riga di comando di Azure
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ Non è stato possibile trovare il file package.json oppure il nome o l'editore non è impostato
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ Il notebook {0} non esiste
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Seleziona le opzioni di distribuzioneOpen Notebook
- Open Notebook
+ Apri notebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ Non è stato possibile caricare l'estensione: {0}. È stato rilevato un errore nella definizione dei tipi di risorse nel file package.json. Per dettagli, vedere la console di debug.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ Il tipo di risorsa {0} non è definito
diff --git a/resources/xlf/it/schema-compare.it.xlf b/resources/xlf/it/schema-compare.it.xlf
index 90c122e8b4..cec46dcc9e 100644
--- a/resources/xlf/it/schema-compare.it.xlf
+++ b/resources/xlf/it/schema-compare.it.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ Confronto schema di SQL ServerSQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ La funzionalità Confronto schema di SQL Server per Azure Data Studio supporta il confronto degli schemi di database e pacchetti di applicazione livello dati.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ Le opzioni sono state modificate. Ripetere il confronto?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ Opzioni generaliInclude Object Types
- Include Object Types
+ Includi tipi di oggettoIgnore Table Options
- Ignore Table Options
+ Ignora opzioni di tabellaIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Ignora punto e virgola tra istruzioniIgnore Route Lifetime
- Ignore Route Lifetime
+ Ignora durata routeIgnore Role Membership
- Ignore Role Membership
+ Ignora appartenenza al ruoloIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ Ignora identificatori delimitatiIgnore Permissions
- Ignore Permissions
+ Ignora autorizzazioniIgnore Partition Schemes
- Ignore Partition Schemes
+ Ignora schemi di partizioneIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Ignora posizione oggetto nello schema di partizioneIgnore Not For Replication
- Ignore Not For Replication
+ Ignora non per la replicaIgnore Login Sids
- Ignore Login Sids
+ Ignora sid di accessoIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Ignora hint di blocco in indiciIgnore Keyword Casing
- Ignore Keyword Casing
+ Ignora maiuscole/minuscole parole chiaveIgnore Index Padding
- Ignore Index Padding
+ Ignora spaziatura indiceIgnore Index Options
- Ignore Index Options
+ Ignora opzioni di indiceIgnore Increment
- Ignore Increment
+ Ignora incrementoIgnore Identity Seed
- Ignore Identity Seed
+ Ignora valore di inizializzazione IdentityIgnore User Settings Objects
- Ignore User Settings Objects
+ Ignora oggetti impostazioni utenteIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ Ignora percorso del file di catalogo full-textIgnore Whitespace
- Ignore Whitespace
+ Ignora spazio vuotoIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Ignora WITH NOCHECK in chiavi esterneVerify Collation Compatibility
- Verify Collation Compatibility
+ Verifica compatibilità regole di confrontoUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Avvisi per oggetti non modificabiliTreat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Considera errori di verifica come avvisiScript Refresh Module
- Script Refresh Module
+ Crea script per modulo di aggiornamentoScript New Constraint Validation
- Script New Constraint Validation
+ Crea script per convalida nuovi vincoliScript File Size
- Script File Size
+ Dimensioni file di scriptScript Deploy StateChecks
- Script Deploy StateChecks
+ Crea script per verifiche stato di distribuzioneScript Database Options
- Script Database Options
+ Crea script per opzioni databaseScript Database Compatibility
- Script Database Compatibility
+ Crea script per compatibilità databaseScript Database Collation
- Script Database Collation
+ Crea script per regole di confronto databaseRun Deployment Plan Executors
- Run Deployment Plan Executors
+ Esegui executor di piani di distribuzioneRegister DataTier Application
- Register DataTier Application
+ Registra applicazione del livello datiPopulate Files On File Groups
- Populate Files On File Groups
+ Popola file in gruppi di fileNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Nessuna istruzione ALTER per modificare i tipi CLRInclude Transactional Scripts
- Include Transactional Scripts
+ Includi script transazionaliInclude Composite Objects
- Include Composite Objects
+ Includi oggetti compositiAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Consenti spostamento dati con sicurezza a livello di riga non sicuroIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Ignora WITH NOCHECK in vincoli CHECKIgnore Fill Factor
- Ignore Fill Factor
+ Ignora fattore di riempimentoIgnore File Size
- Ignore File Size
+ Ignora dimensioni fileIgnore Filegroup Placement
- Ignore Filegroup Placement
+ Ignora posizione filegroupDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ Non modificare oggetti replicatiDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ Non modificare oggetti Change Data CaptureDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ Disabilita e riabilita trigger DDLDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Distribuisci database in modalità utente singoloCreate New Database
- Create New Database
+ Crea nuovo databaseCompare Using Target Collation
- Compare Using Target Collation
+ Confronta usando regole di confronto di destinazioneComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Imposta come commento le dichiarazioni SetVarBlock When Drift Detected
- Block When Drift Detected
+ Blocca se viene rilevata una deviazioneBlock On Possible Data Loss
- Block On Possible Data Loss
+ Blocca in caso di possibile perdita di datiBackup Database Before Changes
- Backup Database Before Changes
+ Esegui backup del database prima delle modificheAllow Incompatible Platform
- Allow Incompatible Platform
+ Consenti piattaforma incompatibileAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Consenti rimozione assembly di bloccoDrop Constraints Not In Source
- Drop Constraints Not In Source
+ Rimuovi vincoli non nell'origineDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Rimuovi trigger DML non nell'origineDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Rimuovi proprietà estese non nell'origineDrop Indexes Not In Source
- Drop Indexes Not In Source
+ Rimuovi indici non nell'origineIgnore File And Log File Path
- Ignore File And Log File Path
+ Ignora percorso di file e file di logIgnore Extended Properties
- Ignore Extended Properties
+ Ignora proprietà esteseIgnore Dml Trigger State
- Ignore Dml Trigger State
+ Ignora stato trigger DmlIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ Ignora ordine trigger DmlIgnore Default Schema
- Ignore Default Schema
+ Ignora schema predefinitoIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ Ignora stato trigger DdlIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ignora ordine trigger DdlIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ Ignora percorso file del provider del servizio di crittografiaVerify Deployment
- Verify Deployment
+ Verifica distribuzioneIgnore Comments
- Ignore Comments
+ Ignora commentiIgnore Column Collation
- Ignore Column Collation
+ Ignora regole di confronto delle colonneIgnore Authorizer
- Ignore Authorizer
+ Ignora provider di autorizzazioniIgnore AnsiNulls
- Ignore AnsiNulls
+ Ignora AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ Genera impostazioni predefinite intelligentiDrop Statistics Not In Source
- Drop Statistics Not In Source
+ Rimuovi statistiche non nell'origineDrop Role Members Not In Source
- Drop Role Members Not In Source
+ Rimuovi membri del ruolo non nell'origineDrop Permissions Not In Source
- Drop Permissions Not In Source
+ Rimuovi autorizzazioni non nell'origineDrop Objects Not In Source
- Drop Objects Not In Source
+ Rimuovi oggetti non nell'origineIgnore Column Order
- Ignore Column Order
+ Ignora ordine delle colonneAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ Tabelle fileFull Text Catalogs
- Full Text Catalogs
+ Cataloghi full-textFull Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Funzioni a valori scalariSearch Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Funzioni con valori di tabellaUser Defined Data Types
- User Defined Data Types
+ Tipi di dati definiti dall'utenteUser Defined Table Types
- User Defined Table Types
+ Tipi di tabella definiti dall'utenteClr User Defined Types
- Clr User Defined Types
+ Tipi CLR definiti dall'utenteUsers
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Trigger serverSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Specifica che se durante la pubblicazione viene rilevata una differenza, invece di eseguire un'istruzione ALTER ASSEMBLY viene sempre rimosso e ricreato un assemblySpecifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Se è true, il database viene impostato sulla modalità utente singolo prima della distribuzione.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Questa impostazione indica come vengono gestite le regole di confronto del database durante la distribuzione. Per impostazione predefinita, le regole di confronto del database di destinazione verranno aggiornate se non corrispondono alle regole di confronto specificate dall'origine. Quando questa opzione è impostata, è necessario usare le regole di confronto del server o del database di destinazione.Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ Specifica se i membri del ruolo non distribuiti nel file snapshot del database (con estensione dacpac) vengono rimossi dal database di destinazione quando si pubblicano aggiornamenti in un database.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ File dell'applicazione livello dati (estensione dacpac)Database
@@ -972,7 +972,7 @@
No active connections
- No active connections
+ Non ci sono connessioni attiveSchema Compare
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ È stato selezionato uno schema di origine diverso. Eseguire il confronto?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ È stato selezionato uno schema di destinazione diverso. Eseguire il confronto?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Sono stati selezionati schemi di origine e di destinazione diversi. Eseguire il confronto?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Confronta dettagliAre you sure you want to update the target?
- Are you sure you want to update the target?
+ Aggiornare la destinazione?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Fare clic su Confronta per aggiornare il confronto.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Genera script per distribuire le modifiche nella destinazioneNo changes to script
- No changes to script
+ Non sono presenti modifiche per cui creare lo scriptApply changes to target
- Apply changes to target
+ Applica modifiche alla destinazioneNo changes to apply
- No changes to apply
+ Non sono presenti modifiche da applicareDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Inizializzazione del confronto. L'operazione potrebbe richiedere qualche minuto.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Per confrontare due schemi, selezionare lo schema di origine e quello di destinazione, quindi fare clic su Confronta.No schema differences were found.
- No schema differences were found.
+ Non sono state trovate differenze dello schema.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ Confronto schema non riuscito: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ L'opzione Genera script è abilitata quando la destinazione è un databaseApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ L'opzione Applica è abilitata quando la destinazione è un databaseCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ L'annullamento del confronto schema non è riuscito: '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ Generazione dello script non riuscita: '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ L'applicazione del confronto schema non è riuscita: '{0}'Switch direction
- Switch direction
+ Cambia direzioneSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ Apri file con estensione scmpLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Carica origine, destinazione e opzioni salvate in un file con estensione scmpOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ L'apertura del file scmp non è riuscita: '{0}'Save .scmp file
- Save .scmp file
+ Salva file con estensione scmpSave source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Salva origine e destinazione, opzioni ed elementi esclusiSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ Il salvataggio del file scmp non è riuscito: '{0}'
diff --git a/resources/xlf/ja/admin-tool-ext-win.ja.xlf b/resources/xlf/ja/admin-tool-ext-win.ja.xlf
index 12af433db4..edfae86600 100644
--- a/resources/xlf/ja/admin-tool-ext-win.ja.xlf
+++ b/resources/xlf/ja/admin-tool-ext-win.ja.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Windows 用データベース管理ツール拡張機能Adds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Azure Data Studio に Windows 特有の他の機能を追加しますProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ スクリプトの生成...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ handleLaunchSsmsMinPropertiesDialogCommand に提供された ConnectionContext がありませんCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ connectionContext からオブジェクト エクスプローラー ノードを判別できませんでした: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ handleLaunchSsmsMinPropertiesDialogCommand に提供された ConnectionContext がありませんNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ connectionContext から提供された connectionProfile はありません: {0}Launching dialog...
- Launching dialog...
+ ダイアログを起動しています...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ 引数 '{0}' を指定した SsmsMin の呼び出しでエラーが発生しました - {1}
diff --git a/resources/xlf/ja/agent.ja.xlf b/resources/xlf/ja/agent.ja.xlf
index ec0330e1f8..3bf9119245 100644
--- a/resources/xlf/ja/agent.ja.xlf
+++ b/resources/xlf/ja/agent.ja.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ SQL Server Integration Service パッケージSQL Server Agent Service Account
diff --git a/resources/xlf/ja/azurecore.ja.xlf b/resources/xlf/ja/azurecore.ja.xlf
index 7d7f46256c..6c1ee0173e 100644
--- a/resources/xlf/ja/azurecore.ja.xlf
+++ b/resources/xlf/ja/azurecore.ja.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: すべてのアカウントを更新するRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: サインインSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ サーバーへの追加Clear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ リソースが見つかりません
diff --git a/resources/xlf/ja/cms.ja.xlf b/resources/xlf/ja/cms.ja.xlf
index 21a8ddad93..d154172375 100644
--- a/resources/xlf/ja/cms.ja.xlf
+++ b/resources/xlf/ja/cms.ja.xlf
@@ -4,23 +4,23 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ SQL Server 中央管理サーバーSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ SQL Server 中央管理サーバーの管理のサポートCentral Management Servers
- Central Management Servers
+ 中央管理サーバーMicrosoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerCentral Management Servers
- Central Management Servers
+ 中央管理サーバーRefresh
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ サーバー グループの更新Delete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ 新しいサーバー登録...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ 新しいサーバー グループ...Add Central Management Server
- Add Central Management Server
+ 中央管理サーバーの追加Delete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ MSSQL 構成Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [オプション] コンソールへのデバッグ出力をログに記録し ([表示] -> [出力])、ドロップダウンから適切な出力チャネルを選択します[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [オプション] バックエンド サービスのログ レベル。Azure Data Studio は、起動のたびにファイル名を生成し、ファイルが既に存在する場合は、ログ エントリが対象ファイルに追加されます。古いログ ファイルのクリーンアップについては、logRetentionMinutes および logFilesRemovalLimit の設定を参照してください。既定の tracingLevel の場合、ログに記録される数が多くありません。詳細レベルを変更すると、詳細なログが記録され、ログのためのディスク容量が必要になる場合があります。「エラー」には「重大」が、「警告」には「エラー」が、「情報」には「警告」が、「詳細」には「情報」が含まれますNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ バックエンド サービスのログ ファイルを保持する分数。既定値は 1 週間です。Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ mssql.logRetentionMinutes の有効期限が切れた、起動時に削除する古いファイルの最大数。この制限のためにクリーンアップされないファイルは、Azure Data Studio を次回起動するとクリーンアップされます。[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [オプション] サポートされていないプラットフォームの警告を表示しないRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ 価格レベルCompatibility Level
@@ -164,15 +164,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ 名前 (オプション)Custom name of the connection
- Custom name of the connection
+ 接続のカスタム名Server
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server インスタンスの名前Server Description
- Server Description
+ サーバーの説明Description of the SQL Server instance
- Description of the SQL Server instance
+ SQL Server インスタンスの説明Authentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ SQL Server での認証方法を指定しますSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - MFA サポート付きユニバーサルUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ データ ソースへの接続時に使用されるユーザー ID を示しますPassword
@@ -224,155 +224,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ データ ソースへの接続時に使用するパスワードを示しますApplication intent
- Application intent
+ アプリケーションの目的Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ サーバーに接続するときにアプリケーションのワークロードの種類を宣言しますAsynchronous processing
- Asynchronous processing
+ 非同期処理When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ True の場合は、.Net Framework データ プロバイダーの非同期機能を使用できるようになりますConnect timeout
- Connect timeout
+ 接続タイムアウトThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ サーバーへの接続が確立されるまでに待機する時間 (秒) です。この時間が経過すると接続要求を終了し、エラーを生成しますCurrent language
- Current language
+ 現在の言語The SQL Server language record name
- The SQL Server language record name
+ SQL Server Language レコード名Column encryption
- Column encryption
+ 列暗号化Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 接続上のすべてのコマンドの既定の列暗号化設定Encrypt
- Encrypt
+ 暗号化When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ true の場合、SQL Server では、サーバーに証明書がインストールされている場合、クライアントとサーバー間で送信されるすべてのデータに SSL 暗号化が使用されます。Persist security info
- Persist security info
+ セキュリティ情報を保持するWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ False の場合、パスワードなどのセキュリティによる保護が要求される情報は、接続しても返されませんTrust server certificate
- Trust server certificate
+ サーバー証明書を信頼するWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ True (および encrypt=true) の場合、SQL Server では、サーバー証明書を検証せずに、クライアントとサーバー間で送信されるすべてのデータに対して SSL 暗号化が使用されますAttached DB file name
- Attached DB file name
+ 接続された DB ファイル名The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 完全なパス名を含む、接続可能なデータベースのプライマリ ファイル名Context connection
- Context connection
+ コンテキスト接続When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ True の場合は、接続元が SQL Server のコンテキストであることを示します。SQL Server のプロセスで実行する場合のみ使用できますPort
- Port
+ ポートConnect retry count
- Connect retry count
+ 接続の再試行回数Number of attempts to restore connection
- Number of attempts to restore connection
+ 接続を復元するための試行回数Connect retry interval
- Connect retry interval
+ 接続の再試行間隔Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 接続を復元するための試行間の遅延Application name
- Application name
+ アプリケーション名The name of the application
- The name of the application
+ アプリケーションの名前Workstation Id
- Workstation Id
+ ワークステーション IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ SQL Server に接続しているワークステーションの名前Pooling
- Pooling
+ プーリングWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ True の場合、接続オブジェクトが適切なプールから取得されるか、または、必要に応じて接続オブジェクトが作成され、適切なプールに追加されますMax pool size
- Max pool size
+ 最大プール サイズThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ プールに保持できる最大接続数Min pool size
- Min pool size
+ 最小プール サイズThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ プール内で行える接続の最小数Load balance timeout
- Load balance timeout
+ 負荷分散タイムアウトThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ この接続が破棄されるまでにプールに存在できる最低限の時間 (秒)Replication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ レプリケーション時に SQL Server によって使用されますAttach DB filename
- Attach DB filename
+ 接続する DB ファイル名Failover partner
- Failover partner
+ フェールオーバー パートナーThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ フェールオーバー パートナーとして機能する SQL Server インスタンスの名前またはネットワーク アドレスMulti subnet failover
- Multi subnet failover
+ マルチ サブネット フェールオーバーMultiple active result sets
- Multiple active result sets
+ 複数のアクティブな結果セットWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ True の場合は、1 つの接続から複数の結果セットが返され、これらを読み取ることができますPacket size
- Packet size
+ パケット サイズSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ SQL Server インスタンスとの通信に使われるネットワーク パケットのサイズ (バイト)Type system version
- Type system version
+ 型システムのバージョンIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ DataReader を通してプロバイダーが公開するサーバーの型システムを示します
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ 中央管理サーバー {0} が見つからないか、オフラインですNo resources found
- No resources found
+ リソースが見つかりません
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ 中央管理サーバーを追加します...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ 中央管理サーバー グループには、既に {0} という名前の登録済みサーバーがありますCould not add the Registered Server {0}
- Could not add the Registered Server {0}
+ 登録済みサーバー {0} を追加できませんでしたAre you sure you want to delete
- Are you sure you want to delete
+ 削除しますかYes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ サーバー グループの説明{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} には既に {1} という名前のサーバー グループがありますAre you sure you want to delete
- Are you sure you want to delete
+ 削除しますか
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ 構成サーバーと同じ名前の共有登録サーバーを追加することはできません
diff --git a/resources/xlf/ja/dacpac.ja.xlf b/resources/xlf/ja/dacpac.ja.xlf
index 3517d65969..9866521a00 100644
--- a/resources/xlf/ja/dacpac.ja.xlf
+++ b/resources/xlf/ja/dacpac.ja.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ ウィザードを閉じた後、タスク ビューでスクリプト生成の状態を表示できます。完了すると、生成されたスクリプトが開きます。Generating deploy plan failed '{0}'
diff --git a/resources/xlf/ja/import.ja.xlf b/resources/xlf/ja/import.ja.xlf
index 0e68485570..86a5c5c76c 100644
--- a/resources/xlf/ja/import.ja.xlf
+++ b/resources/xlf/ja/import.ja.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ この操作は失敗しました。別の入力ファイルをお試しください。Refresh
diff --git a/resources/xlf/ja/mssql.ja.xlf b/resources/xlf/ja/mssql.ja.xlf
index c1e9afaecf..42daa1e79a 100644
--- a/resources/xlf/ja/mssql.ja.xlf
+++ b/resources/xlf/ja/mssql.ja.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ ファイルのアップロードNew directory
- New directory
+ 新しいディレクトリDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ 新しいノートブックOpen Notebook
- Open Notebook
+ ノートブックを開くTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ SQL Server Big Data Cluster に関するタスクと情報SQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Spark ジョブの送信New Spark Job
- New Spark Job
+ 新しい Spark ジョブView Spark History
- View Spark History
+ Spark 履歴の表示View Yarn History
- View Yarn History
+ Yarn 履歴を表示Tasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ パッケージのインストールConfigure Python for Notebooks
- Configure Python for Notebooks
+ ノートブック用 Python の構成Cluster Status
- Cluster Status
+ クラスター状態Search: Servers
- Search: Servers
+ 検索: サーバーSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ 検索: 検索サーバーの結果を消去するService Endpoints
- Service Endpoints
+ サービス エンドポイントMSSQL configuration
- MSSQL configuration
+ MSSQL 構成Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [オプション] コンソールへのデバッグ出力 ([表示] -> [出力]) をログに記録し、ドロップダウンから適切な出力チャネルを選択します[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [省略可能] バックエンド サービスのログ レベル。Azure Data Studio は開始のたびにファイル名を生成し、そのファイルが既に存在する場合にはログ エントリが対象ファイルに追加されます。古いログ ファイルのクリーンアップについては、logRetentionMinutes と logFilesRemovalLimit の設定を参照してください。既定の tracingLevel では多くはログに記録されません。詳細レベルを変更すると、詳細なログが記録され、ログのためのディスク容量が必要になる場合があります。エラーには重大が含まれ、警告にはエラーが含まれ、情報には警告が含まれ、詳細には情報が含まれますNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ バックエンド サービスのログ ファイルを保持する分数。既定値は 1 週間です。Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ mssql.logRetentionMinutes の有効期限が切れた、起動時に削除する古いファイルの最大数。この制限のためにクリーンアップされないファイルは、Azure Data Studio の次回の起動時にクリーンアップされます。[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [オプション] サポートされていないプラットフォームの警告を表示しないRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ 価格レベルCompatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ 名前 (オプション)Custom name of the connection
- Custom name of the connection
+ 接続のカスタム名Server
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server インスタンスの名前Database
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ データ ソース内の初期カタログまたはデータベースの名前Authentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ SQL Server での認証方法を指定しますSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - MFA サポート付きユニバーサルUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ データ ソースへの接続時に使用するユーザー ID を示しますPassword
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ データ ソースへの接続時に使用するパスワードを示します。Application intent
- Application intent
+ アプリケーションの目的Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ サーバーに接続するときにアプリケーションのワークロードの種類を宣言しますAsynchronous processing
- Asynchronous processing
+ 非同期処理When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ True の場合は、.Net Framework データ プロバイダーの非同期機能を使用できるようになりますConnect timeout
- Connect timeout
+ 接続タイムアウトThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ サーバーへの接続が確立されるまでに待機する時間 (秒) です。この時間が経過すると接続要求を終了し、エラーを生成しますCurrent language
- Current language
+ 現在の言語The SQL Server language record name
- The SQL Server language record name
+ SQL Server の言語レコード名Column encryption
- Column encryption
+ 列暗号化Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 接続上のすべてのコマンドの既定の列暗号化設定Encrypt
- Encrypt
+ 暗号化When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ True の場合、SQL Server は、サーバーに証明書がインストールされている場合は、クライアントとサーバー間で送信されるすべてのデータに SSL 暗号化を使用しますPersist security info
- Persist security info
+ セキュリティ情報を保持するWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ False の場合、パスワードなどのセキュリティによる保護が要求される情報は、接続しても返されませんTrust server certificate
- Trust server certificate
+ サーバー証明書を信頼するWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ True の場合 (および encrypt=true)、SQL Server では、サーバー証明書を検証せずにクライアントとサーバーの間で送信されるすべてのデータに対し、SSL 暗号化が使用されますAttached DB file name
- Attached DB file name
+ アタッチされた DB ファイル名The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 完全なパス名を含む、アタッチ可能なデータベースのプライマリ ファイル名Context connection
- Context connection
+ コンテキスト接続When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ True の場合は、接続元が SQL Server のコンテキストであることを示します。SQL Server のプロセスで実行する場合のみ使用できますPort
- Port
+ ポートConnect retry count
- Connect retry count
+ 接続の再試行回数Number of attempts to restore connection
- Number of attempts to restore connection
+ 接続を復元するための試行回数Connect retry interval
- Connect retry interval
+ 接続の再試行間隔Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 接続を復元するための試行間の遅延Application name
- Application name
+ アプリケーション名The name of the application
- The name of the application
+ アプリケーションの名前Workstation Id
- Workstation Id
+ ワークステーション IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ SQL Server に接続しているワークステーションの名前Pooling
- Pooling
+ プーリングWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ True の場合、接続オブジェクトが適切なプールから取得されるか、または、必要に応じて接続オブジェクトが作成され、適切なプールに追加されますMax pool size
- Max pool size
+ 最大プール サイズThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ プールに保持できる最大接続数Min pool size
- Min pool size
+ 最小プール サイズThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ プールに保持できる最小接続数Load balance timeout
- Load balance timeout
+ 負荷分散タイムアウトThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ この接続が破棄される前にプールに存在する最小時間 (秒)Replication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ レプリケーション時に SQL Server によって使用されますAttach DB filename
- Attach DB filename
+ アタッチする DB ファイル名Failover partner
- Failover partner
+ フェールオーバー パートナーThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ フェールオーバー パートナーとして機能する SQL Server のインスタンスの名前かネットワーク アドレスMulti subnet failover
- Multi subnet failover
+ マルチ サブネット フェールオーバーMultiple active result sets
- Multiple active result sets
+ 複数のアクティブな結果セットWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ True の場合は、1 つの接続から複数の結果セットが返され、これらを読み取ることができますPacket size
- Packet size
+ パケット サイズSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ SQL Server インスタンスとの通信に使用されるネットワーク パケットのバイト数Type system version
- Type system version
+ 型システム バージョンIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ DataReader を通してプロバイダーが公開するサーバー タイプのシステムを示します
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ 応答から Spark ジョブ バッチ ID は返されません。{0}[エラー] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ 応答で返されたログはありません。{0}[エラー] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ SparkJobSubmissionModel のパラメーターが無効ですsubmissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs が無効です。livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId が無効です。Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ アプリケーション ID の取得がタイムアウトしました。{0}[ログ] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ プロパティ localFilePath または hdfsFolderPath が指定されていません。Property Path is not specified.
- Property Path is not specified.
+ プロパティ パスが指定されていません。
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ SparkJobSubmissionDialog のパラメーターが無効ですNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ 送信{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Spark ジョブの送信:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Spark ジョブの送信開始 ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ 名前を入力します...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Spark クラスターPath to a .jar or .py file
- Path to a .jar or .py file
+ .jar ファイルまたは .py ファイルへのパスThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ 選択したローカル ファイルが HDFS にアップロードされます: {0}JAR/py File
- JAR/py File
+ JAR/py ファイルMain Class
- Main Class
+ メイン クラスArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ メイン クラスで使用されるコマンド ライン引数。複数の引数はスペースで区切る必要があります。Property Job Name is not specified.
- Property Job Name is not specified.
+ プロパティ ジョブ名が指定されていません。Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ プロパティ JAR/py ファイルが指定されていません。Property Main Class is not specified.
- Property Main Class is not specified.
+ プロパティ メイン クラスが指定されていません。{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} がクラスターに存在しないか、例外がスローされました。The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ 指定された HDFS ファイルが存在しません。Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ エラーが原因でファイルの検索でエラーが発生しました: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ 参照 JarJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ 実行プログラム作業ディレクトリに配置される Jar。Jar パスは HDFS パスにする必要があります。複数のパスの場合、セミコロン (;) で区切らなければなりませんReference py Files
- Reference py Files
+ 参照 py ファイルPy Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 実行プログラム作業ディレクトリに配置される Py ファイル。ファイル パスは HDFS パスにする必要があります。複数のパスの場合、セミコロン (;) で区切らなければなりませんReference Files
- Reference Files
+ 参照ファイルFiles to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 実行プログラム作業ディレクトリに配置されるファイル。ファイル パスは HDFS パスにする必要があります。複数のパスの場合、セミコロン (;) で区切らなければなりませんPlease select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Big Data Cluster が含まれる SQL Server を選択してください。No Sql Server is selected.
- No Sql Server is selected.
+ SQL Server が選択されていません。Error Get File Path: {0}
- Error Get File Path: {0}
+ ファイル パスの取得でエラーが発生しました: {0}Invalid Data Structure
- Invalid Data Structure
+ 無効なデータ構造Unable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ オプションが不足しているため、WebHDFS クライアントを作成できません: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}' は未定義です。Bad Request
- Bad Request
+ 無効な要求Unauthorized
- Unauthorized
+ 許可されていませんForbidden
- Forbidden
+ 禁止Not Found
- Not Found
+ 見つかりませんInternal Server Error
- Internal Server Error
+ 内部サーバー エラーUnknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ 予期しないリダイレクトPlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ HDFS に接続するためのパスワードを入力してください:Session for node {0} does not exist
- Session for node {0} does not exist
+ ノード {0} のセッションが存在しませんError notifying of node change: {0}
- Error notifying of node change: {0}
+ ノード変更の通知でエラーが発生しました: {0}Root
- Root
+ ルートHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Data ServicesNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ 注意: このファイルはプレビュー用に {0} で切り捨てられました。The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ ファイルはプレビューのために {0} で切り捨てられました。ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo が定義されていません。ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options が定義されていません。Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ connectionInfo.options で一部のプロパティが不足しています: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ このハンドラーではアクション {0} はサポートされていませんCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ HTTP リンクと HTTPS リンクのみがサポートされているため、リンク {0} を開くことができませんDownload and open '{0}'?
- Download and open '{0}'?
+ '{0}' をダウンロードして開きますか?Could not find the specified file
- Could not find the specified file
+ 指定されたファイルが見つかりませんでしたFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ ファイルを開く要求に失敗しました。エラー: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ ノートブック サーバーの停止でエラーが発生しました: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ ノートブック プロセスが途中で終了しました。エラー: {0}。StdErr 出力: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Jupyter から送信されたエラー: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter は {0} で実行中です... Starting Notebook server
- ... Starting Notebook server
+ ... ノートブック サーバーを起動していますUnexpected setting type {0}
- Unexpected setting type {0}
+ 予期しない設定の種類 {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ セッションを開始できません。マネージャーがまだ初期化されていませんSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Spark カーネルは、SQL Server Big Data Cluster マスター インスタンスへの接続を必要とします。Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ ノートブック サーバーをシャットダウンできませんでした: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ ノートブック依存関係のインストールが進行中ですPython download is complete
- Python download is complete
+ Python ダウンロードが完了しましたError while downloading python setup
- Error while downloading python setup
+ Python セットアップのダウンロード中にエラーが発生しましたDownloading python package
- Downloading python package
+ Python パッケージをダウンロードしていますUnpacking python package
- Unpacking python package
+ Python パッケージをアンパックしていますError while creating python installation directory
- Error while creating python installation directory
+ Python インストール ディレクトリの作成中にエラーが発生しましたError while unpacking python bundle
- Error while unpacking python bundle
+ Python バンドルのアンパック中にエラーが発生しましたInstalling Notebook dependencies
- Installing Notebook dependencies
+ ノートブック依存関係のインストールInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ ノートブック依存関係をインストールしています。詳細については [タスク] ビューを参照してくださいNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ ノートブック依存関係のインストールが完了しましたCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Python の実行中は、既存の Python インストールを上書きできません。Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ 別の Python インストールが現在進行中です。Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ 指定された場所に Python が既に存在します。インストールをスキップします。Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ ノートブック依存関係のインストールに失敗しました。エラー: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ プラットフォーム {0} 用のローカル Python を {1} にダウンロードしていますInstalling required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ ノートブックを実行するために必要なパッケージをインストールしています...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Jupyter インストールが完了しました。Installing SparkMagic...
- Installing SparkMagic...
+ SparkMagic をインストールしています...A notebook path is required
- A notebook path is required
+ ノートブック パスが必要ですNotebooks
- Notebooks
+ ノートブックOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ .ipynb ノートブックのみがサポートされていますAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ 再インストールしてもよろしいですか?Configure Python for Notebooks
- Configure Python for Notebooks
+ ノートブック用 Python の構成Install
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Python インストール場所Select
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ このインストールには時間がかかります。インストールが完了するまでアプリケーションを閉じないようにお勧めします。The specified install location is invalid.
- The specified install location is invalid.
+ 指定されたインストール場所が無効です。No python installation was found at the specified location.
- No python installation was found at the specified location.
+ 指定された場所に Python インストールが見つかりませんでした。Python installation was declined.
- Python installation was declined.
+ Python インストールが拒否されました。Installation Type
- Installation Type
+ インストールの種類New Python installation
- New Python installation
+ 新しい Python インストールUse existing Python installation
- Use existing Python installation
+ 既存の Python インストールを使用するOpen file {0} failed: {1}
- Open file {0} failed: {1}
+ ファイル {0} を開くことができませんでした: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ ファイル {0} を開くことができませんでした: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ ファイル {0} を開くことができませんでした: {1}Missing file : {0}
- Missing file : {0}
+ ファイルが見つかりません: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ このサンプル コードは、ファイルをデータ フレームに読み込み、最初の 10 件の結果を示します。No notebook editor is active
- No notebook editor is active
+ アクティブなノートブック エディターがありませんCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ 追加するセルの種類を指定してください。Notebooks
- Notebooks
+ ノートブックSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Azure Data Studio 用の SQL Server 配置拡張機能Provides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Microsoft SQL Server を展開するためのノートブック ベースのエクスペリエンスを提供しますDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ SQL Server を Docker に展開します...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ SQL Server Big Data Cluster を展開します…Deploy SQL Server…
- Deploy SQL Server…
+ SQL Server の展開...Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ SQL Server コンテナー イメージRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Docker を使用して SQL Server コンテナー イメージを実行するSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ SQL Server Big Data Cluster を使用すると、Kubernetes で実行されている SQL Server、Spark、および HDFS のコンテナーのスケーラブルなクラスターをデプロイできます。Version
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ SQL Server 2019 Big Data Cluster CTP 3.1Deployment target
- Deployment target
+ 配置ターゲットNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Python で作成されたコマンドライン ユーティリティを使用すると、クラスター管理者は REST API を介してビッグ データ クラスターをブートストラップおよび管理できますmssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ コマンドライン ツールを使用すると、Kubernetes クラスターに対してコマンドを実行できます。kubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ 分離されたコンテナーでアプリケーションをパッケージ化して実行する機能を提供しますDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Azure リソースを管理するためのコマンド ライン ツールAzure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ package.json が見つからないか、名前/発行元が設定されていません
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ ノートブック {0} が存在しません
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ 配置オプションを選択しますOpen Notebook
- Open Notebook
+ ノートブックを開くTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ 拡張機能を読み込めませんでした: {0}。package.json のリソースの種類の定義でエラーが検出されました。詳しくは、デバッグ コンソールを確認してください。The resource type: {0} is not defined
- The resource type: {0} is not defined
+ リソースの種類: {0} が定義されていません
diff --git a/resources/xlf/ja/schema-compare.ja.xlf b/resources/xlf/ja/schema-compare.ja.xlf
index 3b97c69c7c..afee9beb59 100644
--- a/resources/xlf/ja/schema-compare.ja.xlf
+++ b/resources/xlf/ja/schema-compare.ja.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server スキーマの比較SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Azure Data Studio 用 SQL Server スキーマ比較では、データベースと dacpac のスキーマを比較できます。Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ オプションが変更されました。再比較して比較結果を表示しますか?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ 全般オプションInclude Object Types
- Include Object Types
+ オブジェクトの種類を含めるIgnore Table Options
- Ignore Table Options
+ テーブルのオプションを無視するIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ ステートメント間のセミコロンを無視するIgnore Route Lifetime
- Ignore Route Lifetime
+ ルートの有効期間を無視するIgnore Role Membership
- Ignore Role Membership
+ ロール メンバーシップを無視するIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ 引用符で囲まれた識別子を無視するIgnore Permissions
- Ignore Permissions
+ アクセス許可を無視するIgnore Partition Schemes
- Ignore Partition Schemes
+ パーティション構成を無視するIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ パーティション構成でのオブジェクトの位置を無視するIgnore Not For Replication
- Ignore Not For Replication
+ レプリケーション用以外を無視するIgnore Login Sids
- Ignore Login Sids
+ ログイン SID を無視するIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ インデックスのロック ヒントを無視するIgnore Keyword Casing
- Ignore Keyword Casing
+ キーワードの文字種を無視するIgnore Index Padding
- Ignore Index Padding
+ インデックス パディングを無視するIgnore Index Options
- Ignore Index Options
+ インデックス オプションを無視するIgnore Increment
- Ignore Increment
+ 増分を無視するIgnore Identity Seed
- Ignore Identity Seed
+ IDENTITY シードを無視するIgnore User Settings Objects
- Ignore User Settings Objects
+ ユーザー設定オブジェクトを無視するIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ フルテキスト カタログ ファイル パスを無視するIgnore Whitespace
- Ignore Whitespace
+ 空白を無視するIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Foreign Key の With Nocheck を無視するVerify Collation Compatibility
- Verify Collation Compatibility
+ 照合順序の互換性を確認するUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ 変更できないオブジェクトの警告Treat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ 検証エラーを警告として扱うScript Refresh Module
- Script Refresh Module
+ スクリプトでの更新モジュールScript New Constraint Validation
- Script New Constraint Validation
+ 新しい制約の検証をスクリプトで作成するScript File Size
- Script File Size
+ スクリプト ファイル サイズScript Deploy StateChecks
- Script Deploy StateChecks
+ スクリプトでのデプロイ状態のチェックScript Database Options
- Script Database Options
+ スクリプトでのデータベース オプションScript Database Compatibility
- Script Database Compatibility
+ データベース互換性のスクリプトを作成するScript Database Collation
- Script Database Collation
+ データベース照合順序のスクリプトを作成するRun Deployment Plan Executors
- Run Deployment Plan Executors
+ 配置計画実行プログラムの実行Register DataTier Application
- Register DataTier Application
+ データ層アプリケーションの登録Populate Files On File Groups
- Populate Files On File Groups
+ ファイル グループに対してファイルを作成するNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ CLR 型を変更する ALTER ステートメントがないInclude Transactional Scripts
- Include Transactional Scripts
+ トランザクション スクリプトを含めるInclude Composite Objects
- Include Composite Objects
+ 複合オブジェクトを含めるAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ 安全でない行レベル セキュリティ データ移動を許可するIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ CHECK 制約の With No check を無視するIgnore Fill Factor
- Ignore Fill Factor
+ FILL FACTOR を無視するIgnore File Size
- Ignore File Size
+ ファイル サイズを無視するIgnore Filegroup Placement
- Ignore Filegroup Placement
+ ファイル グループの配置を無視するDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ レプリケートされたオブジェクトを変更しないDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ 変更データ キャプチャ オブジェクトを変更しないDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ DDL トリガーを無効にし、再び有効にするDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ シングル ユーザー モードでデータベースを配置するCreate New Database
- Create New Database
+ 新しいデータベースの作成Compare Using Target Collation
- Compare Using Target Collation
+ ターゲットの照合順序を使用して比較するComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ SetVar 宣言をコメント アウトするBlock When Drift Detected
- Block When Drift Detected
+ ドリフトが検出されたときにブロックするBlock On Possible Data Loss
- Block On Possible Data Loss
+ データ損失の可能性がある場合にブロックするBackup Database Before Changes
- Backup Database Before Changes
+ 変更前にデータベースをバックアップするAllow Incompatible Platform
- Allow Incompatible Platform
+ 互換性のないプラットフォームを許可するAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ ブロックしているアセンブリの削除を許可するDrop Constraints Not In Source
- Drop Constraints Not In Source
+ ソース内にない制約を削除するDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ ソース内にない DML トリガーを削除するDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ ソース内にない拡張プロパティを削除するDrop Indexes Not In Source
- Drop Indexes Not In Source
+ ソース内にないインデックスを削除するIgnore File And Log File Path
- Ignore File And Log File Path
+ ファイルおよびログ ファイル パスを無視するIgnore Extended Properties
- Ignore Extended Properties
+ 拡張プロパティを無視するIgnore Dml Trigger State
- Ignore Dml Trigger State
+ DML トリガーの状態を無視するIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ DML トリガーの順序を無視するIgnore Default Schema
- Ignore Default Schema
+ 既定のスキーマを無視するIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ DDL トリガーの状態を無視するIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ DDL トリガーの順序を無視するIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ 暗号化プロバイダーのファイル パスを無視するVerify Deployment
- Verify Deployment
+ 配置を確認するIgnore Comments
- Ignore Comments
+ コメントを無視するIgnore Column Collation
- Ignore Column Collation
+ 列の照合順序を無視するIgnore Authorizer
- Ignore Authorizer
+ 承認者を無視するIgnore AnsiNulls
- Ignore AnsiNulls
+ AnsiNulls を無視するGenerate SmartDefaults
- Generate SmartDefaults
+ SmartDefaults の生成Drop Statistics Not In Source
- Drop Statistics Not In Source
+ ソース内にない統計を削除するDrop Role Members Not In Source
- Drop Role Members Not In Source
+ ソースに含まれないロール メンバーを削除するDrop Permissions Not In Source
- Drop Permissions Not In Source
+ ソース内にないアクセス許可を削除するDrop Objects Not In Source
- Drop Objects Not In Source
+ ソース内にないオブジェクトを削除するIgnore Column Order
- Ignore Column Order
+ 列の順序を無視するAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ ファイル テーブルFull Text Catalogs
- Full Text Catalogs
+ フルテキスト カタログFull Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ スカラー値関数Search Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ テーブル値関数User Defined Data Types
- User Defined Data Types
+ ユーザー定義データ型User Defined Table Types
- User Defined Table Types
+ ユーザー定義テーブル型Clr User Defined Types
- Clr User Defined Types
+ Clr ユーザー定義型Users
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ サーバー トリガーSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ アセンブリに相違がある場合、発行では ALTER ASSEMBLY ステートメントを発行するのではなく、常にアセンブリを削除して作成し直すことを指定します。Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ true の場合、データベースは配置前にシングル ユーザー モードに設定されます。Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ この設定は、配置時にデータベースの照合順序をどのように処理するかを決定します。既定では、ターゲット データベースの照合順序がソースで指定された照合順序と一致しない場合に更新されます。このオプションを設定した場合は、ターゲット データベース (またはサーバー) の照合順序が使用されます。Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ データベースに更新を公開するとき、データベース スナップショット (.dacpac) ファイルで定義されていないロール メンバーをターゲット データベースから削除するかどうかを指定します。</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ データ層アプリケーション ファイル (.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ 別のソース スキーマが選択されました。比較を実行して相違点を表示しますか?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ 別のターゲット スキーマが選択されました。比較を実行して相違点を表示しますか?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ 異なるソース スキーマとターゲット スキーマが選択されています。比較を実行して相違点を表示しますか?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ 詳細の比較Are you sure you want to update the target?
- Are you sure you want to update the target?
+ ターゲットを更新しますか?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ 比較を更新するには、[比較] をクリックします。Generate script to deploy changes to target
- Generate script to deploy changes to target
+ ターゲットに変更を配置するスクリプトを生成しますNo changes to script
- No changes to script
+ スクリプトに変更はありませんApply changes to target
- Apply changes to target
+ ターゲットに変更を適用するNo changes to apply
- No changes to apply
+ 適用する変更はありませんDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ 比較を初期化しています。しばらく時間がかかる場合があります。To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ 2 つのスキーマを比較するには、最初にソース スキーマとターゲット スキーマを選択し、[比較] を押します。No schema differences were found.
- No schema differences were found.
+ スキーマの違いは見つかりませんでした。Schema Compare failed: {0}
- Schema Compare failed: {0}
+ スキーマ比較に失敗しました: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ ターゲットがデータベースの場合にスクリプトの生成が有効になりますApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ ターゲットがデータベースの場合に適用が有効になりますCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ スキーマ比較を取り消すことができませんでした: '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ スクリプトを生成できませんでした: '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ スキーマ比較を適用できませんでした '{0}'Switch direction
- Switch direction
+ 方向の切り替えSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ .scmp ファイルを開くLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ .scmp ファイルに保存されたソース、ターゲット、およびオプションを読み込みますOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ scmp を開くことができませんでした: '{0}'Save .scmp file
- Save .scmp file
+ .scmp ファイルを保存Save source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ ソース、ターゲット、オプション、および除外された要素を保存しますSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ scmp を保存できませんでした: '{0}'
diff --git a/resources/xlf/ko/admin-tool-ext-win.ko.xlf b/resources/xlf/ko/admin-tool-ext-win.ko.xlf
index e7079f5709..cae47371d6 100644
--- a/resources/xlf/ko/admin-tool-ext-win.ko.xlf
+++ b/resources/xlf/ko/admin-tool-ext-win.ko.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Windows용 데이터베이스 관리 도구 확장Adds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Azure Data Studio에 Windows 관련 추가 기능 추가Properties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ 스크립트 생성...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ handleLaunchSsmsMinPropertiesDialogCommand의 ConnectionContext가 제공되지 않았습니다.Could not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ connectionContext에서 개체 탐색기 노드를 확인할 수 없습니다. {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ handleLaunchSsmsMinPropertiesDialogCommand의 ConnectionContext가 제공되지 않았습니다.No connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ connectionContext에서 제공된 connectionProfile 없음: {0}Launching dialog...
- Launching dialog...
+ 대화 상자를 시작하는 중...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ '{0}' 인수로 SsmsMin을 호출하는 동안 오류 발생 - {1}
diff --git a/resources/xlf/ko/agent.ko.xlf b/resources/xlf/ko/agent.ko.xlf
index 059ba671a2..13427d8b74 100644
--- a/resources/xlf/ko/agent.ko.xlf
+++ b/resources/xlf/ko/agent.ko.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ SQL Server Integration Service 패키지SQL Server Agent Service Account
diff --git a/resources/xlf/ko/azurecore.ko.xlf b/resources/xlf/ko/azurecore.ko.xlf
index becc082494..0166081a99 100644
--- a/resources/xlf/ko/azurecore.ko.xlf
+++ b/resources/xlf/ko/azurecore.ko.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: 모든 계정 새로 고침Refresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: 로그인Select Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ 서버에 추가Clear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ 리소스를 찾을 수 없음
diff --git a/resources/xlf/ko/cms.ko.xlf b/resources/xlf/ko/cms.ko.xlf
index fcca881286..debddcda84 100644
--- a/resources/xlf/ko/cms.ko.xlf
+++ b/resources/xlf/ko/cms.ko.xlf
@@ -4,11 +4,11 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ SQL Server 중앙 관리 서버Support for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ SQL Server 중앙 관리 서버 관리 지원Central Management Servers
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ 서버 그룹 새로 고침Delete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ 새 서버 등록...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ 새 서버 그룹...Add Central Management Server
- Add Central Management Server
+ 중앙 관리 서버 추가Delete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ MSSQL 구성Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [선택 사항] 디버그 출력을 콘솔에 로그한 다음([보기] -> [출력]) 드롭다운에서 적절한 출력 채널을 선택합니다.[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [선택 사항] 백 엔드 서비스의 로그 수준입니다. Azure Data Studio는 시작할 때마다 파일 이름을 생성하며 파일이 이미 있으면 로그 항목이 해당 파일에 추가됩니다. 이전 로그 파일을 정리하려면 logRetentionMinutes 및 logFilesRemovalLimit 설정을 참조하세요. 기본 tracingLevel에서는 많이 기록되지 않습니다. 세부 정보 표시를 변경하면 로깅이 광범위해지고 로그의 디스크 공간 요구 사항이 커질 수 있습니다. 오류이면 중요가 포함되고 경고이면 오류가 포함되고 정보이면 경고가 포함되고 자세한 정보 표시이면 정보가 포함됩니다.Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ 백 엔드 서비스의 로그 파일을 유지하는 시간(분)입니다. 기본값은 1주일입니다.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ 시작 시 제거할 mssql.logRetentionMinutes가 만료된 이전 파일의 최대 수입니다. 이 제한으로 인해 정리되지 않은 파일은 다음에 Azure Data Studio를 시작할 때 정리됩니다.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [선택 사항] 지원되지 않는 플랫폼 경고 표시 안 함Recovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ 가격 책정 계층Compatibility Level
@@ -168,11 +168,11 @@
Name (optional)
- Name (optional)
+ 이름(선택 사항)Custom name of the connection
- Custom name of the connection
+ 연결의 사용자 지정 이름Server
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server 인스턴스의 이름Server Description
- Server Description
+ 서버 설명Description of the SQL Server instance
- Description of the SQL Server instance
+ SQL Server 인스턴스에 대한 설명Authentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ SQL Server로 인증하는 방법을 지정합니다.SQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - MFA 지원을 포함한 유니버설User name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ 데이터 소스에 연결할 때 사용할 사용자 ID를 나타냅니다.Password
@@ -224,107 +224,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ 데이터 소스에 연결할 때 사용할 암호를 나타냅니다.Application intent
- Application intent
+ 애플리케이션 의도Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ 서버에 연결할 때 애플리케이션 워크로드 유형을 선언합니다.Asynchronous processing
- Asynchronous processing
+ 비동기 처리When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ true이면 .Net Framework 데이터 공급자에서 비동기 기능을 사용하도록 설정합니다.Connect timeout
- Connect timeout
+ 연결 시간 제한The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ 시도를 종료하고 오류를 생성하기 전에 서버의 연결을 기다리는 시간(초)입니다.Current language
- Current language
+ 현재 언어The SQL Server language record name
- The SQL Server language record name
+ SQL Server 언어 레코드 이름Column encryption
- Column encryption
+ 열 암호화Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 연결의 모든 명령에 대한 기본 열 암호화 설정Encrypt
- Encrypt
+ 암호화When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ true이면 SQL Server는 서버에 인증서가 설치된 경우 클라이언트와 서버 간에 전송되는 모든 데이터에 대해 SSL 암호화를 사용합니다.Persist security info
- Persist security info
+ 보안 정보 유지When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ false이면 암호와 같이 보안에 민감한 정보는 연결의 일부로 반환되지 않습니다.Trust server certificate
- Trust server certificate
+ 서버 인증서 신뢰When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ true(및 encrypt=true)이면 SQL Server는 서버 인증서의 유효성을 검사하지 않고 클라이언트와 서버 간에 전송되는 모든 데이터에 대해 SSL 암호화를 사용합니다.Attached DB file name
- Attached DB file name
+ 연결된 DB 파일 이름The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 연결 가능한 데이터베이스의 전체 경로 이름을 포함한 기본 파일의 이름Context connection
- Context connection
+ 컨텍스트 연결When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ true이면 SQL 서버 컨텍스트에서 연결해야 한다는 것을 나타냅니다. SQL Server 프로세스에서 실행 중인 경우에만 사용 가능Port
- Port
+ 포트Connect retry count
- Connect retry count
+ 연결 다시 시도 횟수Number of attempts to restore connection
- Number of attempts to restore connection
+ 연결을 복원하려는 시도 횟수Connect retry interval
- Connect retry interval
+ 연결 다시 시도 간격Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 연결을 복원하려는 시도 간 지연Application name
@@ -332,47 +332,47 @@
The name of the application
- The name of the application
+ 애플리케이션의 이름Workstation Id
- Workstation Id
+ 워크스테이션 IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ SQL Server에 연결하는 워크스테이션의 이름Pooling
- Pooling
+ 풀링When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ true이면 연결 개체를 적절한 풀에서 가져오거나 필요한 경우 만들어서 적당한 풀에 추가합니다.Max pool size
- Max pool size
+ 최대 풀 크기The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ 풀에서 허용되는 최대 연결 수Min pool size
- Min pool size
+ 최소 풀 크기The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ 풀에서 허용되는 최소 연결 수Load balance timeout
- Load balance timeout
+ 부하 분산 시간 제한The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ 이 연결이 삭제되기 전에 풀에 유지되는 최소 시간(초)입니다.Replication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ 복제에서 SQL Server가 사용Attach DB filename
- Attach DB filename
+ 연결할 DB 파일 이름Failover partner
- Failover partner
+ 장애 조치(Failover) 파트너The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ 장애 조치(failover) 파트너 역할을 하는 SQL Server 인스턴스의 이름 또는 네트워크 주소Multi subnet failover
- Multi subnet failover
+ 다중 서브넷 장애 조치(failover)Multiple active result sets
- Multiple active result sets
+ 여러 개의 활성 결과 집합When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ true이면 하나의 연결에서 여러 결과 집합을 반환하고 읽을 수 있습니다.Packet size
- Packet size
+ 패킷 크기Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ SQL Server 인스턴스와 통신하는 데 사용되는 네트워크 패킷의 크기(바이트)Type system version
- Type system version
+ 형식 시스템 버전Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ 공급자가 DataReader를 통해 노출할 서버 유형 시스템을 나타냅니다.
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ 중앙 관리 서버 {0}을(를) 찾을 수 없거나 서버가 오프라인 상태입니다.No resources found
- No resources found
+ 리소스를 찾을 수 없습니다.
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ 중앙 관리 서버 추가...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ 중앙 관리 서버 그룹에 이름이 {0}인 등록된 서버가 이미 있습니다.Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ 등록된 서버 {0}을(를) 추가할 수 없습니다.Are you sure you want to delete
- Are you sure you want to delete
+ 삭제하시겠습니까?Yes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ 서버 그룹 설명{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0}에 이름이 {1}인 서버 그룹이 이미 있습니다.Are you sure you want to delete
- Are you sure you want to delete
+ 삭제하시겠습니까?
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ 구성 서버와 이름이 같은 등록된 공유 서버를 추가할 수 없습니다.
diff --git a/resources/xlf/ko/dacpac.ko.xlf b/resources/xlf/ko/dacpac.ko.xlf
index be8c98d677..9f4bb52e4c 100644
--- a/resources/xlf/ko/dacpac.ko.xlf
+++ b/resources/xlf/ko/dacpac.ko.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ 마법사가 닫히면 [작업 보기]에서 스크립트 생성 상태를 볼 수 있습니다. 완료되면 생성된 스크립트가 열립니다.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/ko/import.ko.xlf b/resources/xlf/ko/import.ko.xlf
index dbe512cdb1..20552244a4 100644
--- a/resources/xlf/ko/import.ko.xlf
+++ b/resources/xlf/ko/import.ko.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ 이 작업이 실패했습니다. 다른 입력 파일을 사용해 보세요.Refresh
diff --git a/resources/xlf/ko/mssql.ko.xlf b/resources/xlf/ko/mssql.ko.xlf
index 178a8a8c5b..ed65527489 100644
--- a/resources/xlf/ko/mssql.ko.xlf
+++ b/resources/xlf/ko/mssql.ko.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ 파일 업로드New directory
- New directory
+ 새 디렉터리Delete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ 새 노트북Open Notebook
- Open Notebook
+ 노트북 열기Tasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ SQL Server 빅 데이터 클러스터에 대한 작업 및 정보SQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Spark 작업 제출New Spark Job
- New Spark Job
+ 새 Spark 작업View Spark History
- View Spark History
+ Spark 기록 보기View Yarn History
- View Yarn History
+ Yarn 기록 보기Tasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ 패키지 설치Configure Python for Notebooks
- Configure Python for Notebooks
+ 노트북용 Python 구성Cluster Status
- Cluster Status
+ 클러스터 상태Search: Servers
- Search: Servers
+ 검색: 서버Search: Clear Search Server Results
- Search: Clear Search Server Results
+ 검색: 검색 서버 결과 지우기Service Endpoints
- Service Endpoints
+ 서비스 엔드포인트MSSQL configuration
- MSSQL configuration
+ MSSQL 구성Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [선택 사항] 디버그 출력을 콘솔에 기록한 다음([보기] -> [출력]) 드롭다운에서 적절한 출력 채널을 선택합니다.[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [선택 사항] 백 엔드 서비스의 로그 수준입니다. Azure Data Studio는 시작할 때마다 파일 이름을 생성하며 파일이 이미 있으면 로그 항목이 해당 파일에 추가됩니다. 이전 로그 파일을 정리하려면 logRetentionMinutes 및 logFilesRemovalLimit 설정을 참조하세요. 기본 tracingLevel에서는 많이 기록되지 않습니다. 세부 정보 표시를 변경하면 로깅이 광범위해지고 로그의 디스크 공간 요구 사항이 커질 수 있습니다. 오류이면 중요가 포함되고 경고이면 오류가 포함되고 정보이면 경고가 포함되고 자세한 정보 표시이면 정보가 포함됩니다.Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ 백 엔드 서비스의 로그를 유지할 시간(분)입니다. 기본값은 1주일입니다.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ 시작 시 제거할 mssql.logRetentionMinutes이 만료된 이전 파일의 최대 수입니다. 이 제한으로 인해 정리되지 않은 파일은 다음에 Azure Data Studio를 시작할 때 정리됩니다.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [선택 사항] 지원되지 않는 플랫폼 경고 표시 안 함Recovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ 가격 책정 계층Compatibility Level
@@ -224,11 +224,11 @@
Name (optional)
- Name (optional)
+ 이름(선택 사항)Custom name of the connection
- Custom name of the connection
+ 연결의 사용자 지정 이름Server
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server 인스턴스의 이름Database
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ 데이터 소스의 초기 카탈로그 또는 데이터베이스의 이름Authentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ SQL Server로 인증하는 방법을 지정합니다.SQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - MFA 지원을 포함한 유니버설User name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ 데이터 소스에 연결할 때 사용할 사용자 ID를 나타냅니다.Password
@@ -280,107 +280,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ 데이터 소스에 연결할 때 사용할 암호를 나타냅니다.Application intent
- Application intent
+ 애플리케이션 의도Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ 서버에 연결할 때 애플리케이션 워크로드 유형을 선언합니다.Asynchronous processing
- Asynchronous processing
+ 비동기 처리When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ true이면 .Net Framework 데이터 공급자에서 비동기 기능을 사용하도록 설정합니다.Connect timeout
- Connect timeout
+ 연결 시간 제한The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ 시도를 마치고 오류를 생성하기 전까지 서버 연결을 기다리는 시간(초)입니다.Current language
- Current language
+ 현재 언어The SQL Server language record name
- The SQL Server language record name
+ SQL Server 언어 레코드 이름Column encryption
- Column encryption
+ 열 암호화Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 연결의 모든 명령에 대한 기본 열 암호화 설정Encrypt
- Encrypt
+ 암호화When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ true이면 SQL Server는 서버에 인증서가 설치된 경우 클라이언트와 서버 간에 전송되는 모든 데이터에 대해 SSL 암호화를 사용합니다.Persist security info
- Persist security info
+ 보안 정보 유지When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ false이면 암호와 같은 보안에 중요한 정보가 연결의 일부로 반환되지 않습니다.Trust server certificate
- Trust server certificate
+ 서버 인증서 신뢰When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ true(및 encrypt=true)이면 SQL Server는 서버 인증서의 유효성을 검사하지 않고 클라이언트와 서버 간에 전송되는 모든 데이터에 대해 SSL 암호화를 사용합니다.Attached DB file name
- Attached DB file name
+ 연결된 DB 파일 이름The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 연결 가능한 데이터베이스의 전체 경로 이름을 포함한 기본 파일의 이름Context connection
- Context connection
+ 컨텍스트 연결When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ true이면 SQL 서버 컨텍스트에서 연결해야 한다는 것을 나타냅니다. SQL Server 프로세스에서 실행 중인 경우에만 사용 가능Port
- Port
+ 포트Connect retry count
- Connect retry count
+ 연결 다시 시도 횟수Number of attempts to restore connection
- Number of attempts to restore connection
+ 연결을 복원하려는 시도 횟수Connect retry interval
- Connect retry interval
+ 연결 다시 시도 간격Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 연결을 복원하려는 시도 간 지연Application name
@@ -388,47 +388,47 @@
The name of the application
- The name of the application
+ 애플리케이션의 이름Workstation Id
- Workstation Id
+ 워크스테이션 IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ SQL Server에 연결하는 워크스테이션의 이름Pooling
- Pooling
+ 풀링When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ true이면 연결 개체를 적절한 풀에서 가져오거나 필요한 경우 만들어 적절한 풀에 추가합니다.Max pool size
- Max pool size
+ 최대 풀 크기The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ 풀에서 허용되는 최대 연결 수Min pool size
- Min pool size
+ 최소 풀 크기The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ 풀에서 허용되는 최소 연결 수Load balance timeout
- Load balance timeout
+ 부하 분산 시간 제한The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ 이 연결이 삭제되기 전에 풀에서 유지되는 최소 시간(초)Replication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ 복제에서 SQL Server가 사용Attach DB filename
- Attach DB filename
+ 연결할 DB 파일 이름Failover partner
- Failover partner
+ 장애 조치(Failover) 파트너The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ 장애 조치(failover) 파트너 역할을 하는 SQL Server 인스턴스의 이름 또는 네트워크 주소Multi subnet failover
- Multi subnet failover
+ 다중 서브넷 장애 조치(failover)Multiple active result sets
- Multiple active result sets
+ 여러 개의 활성 결과 집합When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ true이면 여러 결과 집합을 하나의 연결에서 반환하고 읽을 수 있습니다.Packet size
- Packet size
+ 패킷 크기Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ SQL Server 인스턴스와 통신하는 데 사용되는 네트워크 패킷의 크기(바이트)Type system version
- Type system version
+ 형식 시스템 버전Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ 공급자가 DataReader를 통해 노출할 서버 유형 시스템을 나타냅니다.
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ 응답에서 Spark 작업 일괄 처리 ID가 반환되지 않습니다. {0}[오류] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ 응답 내에 로그가 반환되지 않았습니다. {0}[오류] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ SparkJobSubmissionModel의 매개 변수가 잘못되었습니다.submissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs가 잘못되었습니다.livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId가 잘못되었습니다.Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ 애플리케이션 ID 가져오기 시간이 초과되었습니다. {0}[로그] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ localFilePath 또는 hdfsFolderPath 속성을 지정하지 않았습니다.Property Path is not specified.
- Property Path is not specified.
+ 속성 경로가 지정되지 않았습니다.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ SparkJobSubmissionDialog에 대한 매개 변수가 잘못되었습니다.New Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ 제출{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Spark 작업 제출:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Spark 작업 제출 시작 ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ 이름 입력 ...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Spark 클러스터Path to a .jar or .py file
- Path to a .jar or .py file
+ .jar 또는 .py 파일의 경로The selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ 선택한 로컬 파일이 HDFS에 업로드됩니다. {0}JAR/py File
- JAR/py File
+ JAR/py 파일Main Class
- Main Class
+ 주 클래스Arguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ 주 클래스에 사용되는 명령줄 인수입니다. 여러 인수는 공백으로 구분해야 합니다.Property Job Name is not specified.
- Property Job Name is not specified.
+ 속성 작업 이름이 지정되지 않았습니다.Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ 속성 JAR/py 파일이 지정되지 않았습니다.Property Main Class is not specified.
- Property Main Class is not specified.
+ 속성 주 클래스를 지정하지 않았습니다.{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0}이(가) 클러스터에 없거나 예외가 throw되었습니다.The specified HDFS file does not exist. 
- The specified HDFS file does not exist.
+ 지정된 HDFS 파일이 없습니다. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ 파일을 찾는 동안 오류가 발생했습니다. 오류: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ 참조 JarJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ 실행기 작업 디렉터리에 배치할 Jar입니다. Jar 경로는 HDFS 경로여야 합니다. 여러 경로는 세미콜론(;)으로 분할해야 합니다.Reference py Files
- Reference py Files
+ 참조 py 파일Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 실행기 작업 디렉터리에 배치할 Py 파일입니다. 파일 경로는 HDFS 경로여야 합니다. 여러 경로는 세미콜론(;)으로 분할해야 합니다.Reference Files
- Reference Files
+ 참조 파일Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 실행기 작업 디렉터리에 배치할 파일입니다. 파일 경로는 HDFS 경로여야 합니다. 여러 경로는 세미콜론(;)으로 분할해야 합니다.Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ 빅 데이터 클러스터가 있는 SQL Server를 선택하세요.No Sql Server is selected.
- No Sql Server is selected.
+ SQL Server를 선택하지 않았습니다.Error Get File Path: {0}
- Error Get File Path: {0}
+ 파일 경로 가져오기 오류: {0}Invalid Data Structure
- Invalid Data Structure
+ 잘못된 데이터 구조Unable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ 누락된 옵션으로 인해 WebHDFS 클라이언트를 만들 수 없음: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}'이(가) 정의되지 않았습니다.Bad Request
- Bad Request
+ 잘못된 요청Unauthorized
- Unauthorized
+ 권한 없음Forbidden
- Forbidden
+ 금지됨Not Found
@@ -724,7 +724,7 @@
Internal Server Error
- Internal Server Error
+ 내부 서버 오류Unknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ 예기치 않은 리디렉션Please provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ HDFS에 연결하려면 암호를 제공하세요:Session for node {0} does not exist
- Session for node {0} does not exist
+ {0} 노드의 세션이 없습니다.Error notifying of node change: {0}
- Error notifying of node change: {0}
+ 노드 변경 알리기 오류: {0}Root
- Root
+ 루트HDFS
- HDFS
+ HDFSData Services
- Data Services
+ 데이터 서비스NOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ 알림: 이 파일은 미리 보기를 위해 {0}에서 잘렸습니다.The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ 파일이 미리 보기를 위해 {0}에서 잘렸습니다.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo가 정의되지 않았습니다.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options가 정의되지 않았습니다.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ connectionInfo.options에서 일부 누락된 속성: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ 이 처리기에 대해 {0} 작업이 지원되지 않습니다.Cannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ HTTP 및 HTTPS 링크만 지원되기 때문에 {0} 링크를 열 수 없습니다.Download and open '{0}'?
- Download and open '{0}'?
+ '{0}'을(를) 다운로드하고 열겠습니까?Could not find the specified file
- Could not find the specified file
+ 지정한 파일을 찾을 수 없습니다.File open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ {0} {1} 오류로 파일 열기 요청 실패Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ 노트북 서버 중지 오류: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ 노트북 프로세스가 조기에 종료되었습니다(오류: {0}). StdErr 출력: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Jupyter에서 보낸 오류: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter가 {0}에서 실행되고 있습니다.... Starting Notebook server
- ... Starting Notebook server
+ ... 노트북 서버 시작Unexpected setting type {0}
- Unexpected setting type {0}
+ 예기치 않은 설정 유형 {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ 세션을 시작할 수 없습니다. 관리자가 아직 초기화되지 않았습니다.Spark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Spark 커널은 SQL Server 빅 데이터 클러스터 마스터 인스턴스에 연결해야 합니다.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ 노트북 서버 종료 실패: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ 노트북 종속성 설치가 진행 중입니다.Python download is complete
- Python download is complete
+ Python 다운로드가 완료되었습니다.Error while downloading python setup
- Error while downloading python setup
+ Python 설정을 다운로드하는 동안 오류가 발생했습니다.Downloading python package
- Downloading python package
+ Python 패키지 다운로드Unpacking python package
- Unpacking python package
+ Python 패키지 압축 풀기Error while creating python installation directory
- Error while creating python installation directory
+ Python 설치 디렉터리를 만드는 동안 오류가 발생했습니다.Error while unpacking python bundle
- Error while unpacking python bundle
+ Python 번들의 압축을 푸는 동안 오류가 발생했습니다.Installing Notebook dependencies
- Installing Notebook dependencies
+ 노트북 종속성 설치Installing Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ 노트북 종속성을 설치합니다. 자세한 내용은 작업 보기를 참조하세요.Notebook dependencies installation is complete
- Notebook dependencies installation is complete
+ 노트북 종속성 설치가 완료되었습니다.Cannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Python이 실행되는 동안 기존 Python 설치를 덮어쓸 수 없습니다.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ 다른 Python 설치가 현재 진행 중입니다.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python이 특정 위치에 이미 있습니다. 설치를 건너뜁니다.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ 노트북 종속성 설치에 실패했습니다(오류: {0}).Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ {0} 플랫폼용 로컬 python을 {1}(으)로 다운로드하는 중Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ 노트북을 실행하는 데 필요한 패키지를 설치하는 중...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Jupyter 설치가 완료되었습니다.Installing SparkMagic...
- Installing SparkMagic...
+ SparkMagic을 설치하는 중...A notebook path is required
- A notebook path is required
+ 노트북 경로가 필요합니다.Notebooks
- Notebooks
+ 노트북Only .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ .ipynb 노트북만 지원됩니다.Are you sure you want to reinstall?
- Are you sure you want to reinstall?
+ 다시 설치하시겠습니까?Configure Python for Notebooks
- Configure Python for Notebooks
+ 노트북용 Python 구성Install
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Python 설치 위치Select
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ 이 설치에는 다소 시간이 걸릴 수 있습니다. 설치가 완료될 때까지 애플리케이션을 닫지 않는 것이 좋습니다.The specified install location is invalid.
- The specified install location is invalid.
+ 지정된 설치 위치가 잘못되었습니다.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ 지정된 위치에서 Python 설치를 찾을 수 없습니다.Python installation was declined.
- Python installation was declined.
+ Python 설치가 거부되었습니다.Installation Type
- Installation Type
+ 설치 유형New Python installation
- New Python installation
+ 새 Python 설치Use existing Python installation
- Use existing Python installation
+ 기존 Python 설치 사용Open file {0} failed: {1}
- Open file {0} failed: {1}
+ {0} 파일 열기 실패: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ {0} 파일 열기 실패: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ {0} 파일 열기 실패: {1}Missing file : {0}
- Missing file : {0}
+ 없는 파일: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ 이 샘플 코드는 파일을 데이터 프레임에 로드하고 처음 10개의 결과를 보여 줍니다.No notebook editor is active
- No notebook editor is active
+ 노트북 편집기가 활성 상태가 아님Code
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ 어떤 유형의 셀을 추가하시겠습니까?Notebooks
- Notebooks
+ 노트북SQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Azure Data Studio용 SQL Server 배포 확장Provides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Microsoft SQL Server 배포를 위한 노트북 기반 환경 제공Deploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Docker에 SQL Server 배포...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ SQL Server 빅 데이터 클러스터 배포...Deploy SQL Server…
- Deploy SQL Server…
+ SQL Server 배포...Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ SQL Server 컨테이너 이미지Run SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Docker를 사용하여 SQL Server 컨테이너 이미지 실행SQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ SQL Server 빅 데이터 클러스터를 사용하면 Kubernetes에서 실행되는 SQL Server, Spark 및 HDFS 컨테이너의 확장 가능한 클러스터를 배포할 수 있습니다.Version
@@ -52,23 +52,23 @@
SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ SQL Server 2019 빅 데이터 클러스터 CTP 3.1Deployment target
- Deployment target
+ 배포 대상New Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ 클러스터 관리자가 REST API를 통해 빅 데이터 클러스터를 부트스트랩하고 관리할 수 있도록 Python으로 작성된 명령줄 유틸리티mssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ 명령줄 도구를 사용하면 Kubernetes 클러스터에 대한 명령을 실행할 수 있습니다.kubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ 격리 컨테이너에서 애플리케이션을 패키지하고 실행하는 기능을 제공합니다.Docker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Azure 리소스 관리를 위한 명령줄 도구Azure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ package.json을 찾을 수 없거나 이름/게시자가 설정되지 않았습니다.
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ {0} 노트북이 없습니다.
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ 배포 옵션 선택Open Notebook
- Open Notebook
+ 노트북 열기Tool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ 확장을 로드하지 못했습니다: {0}. package.json의 리소스 종류 정의에서 오류가 발생했습니다. 자세한 내용은 디버그 콘솔을 참조하세요.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ 리소스 유형 {0}이(가) 정의되지 않았습니다.
diff --git a/resources/xlf/ko/schema-compare.ko.xlf b/resources/xlf/ko/schema-compare.ko.xlf
index 646fd7bf17..a5f5476152 100644
--- a/resources/xlf/ko/schema-compare.ko.xlf
+++ b/resources/xlf/ko/schema-compare.ko.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server 스키마 비교SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Azure Data Studio에 대한 SQL Server 스키마 비교는 데이터베이스 및 dacpacs의 스키마 비교를 지원합니다.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ 옵션이 변경되었습니다. 비교를 확인하려면 [다시 비교]를 누르세요.Schema Compare Options
@@ -52,311 +52,311 @@
Include Object Types
- Include Object Types
+ 개체 유형 포함Ignore Table Options
- Ignore Table Options
+ 테이블 옵션 무시Ignore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ 문 사이의 세미콜론 무시Ignore Route Lifetime
- Ignore Route Lifetime
+ 경로 수명 무시Ignore Role Membership
- Ignore Role Membership
+ 역할 멤버 자격 무시Ignore Quoted Identifiers
- Ignore Quoted Identifiers
+ 따옴표가 붙은 식별자 무시Ignore Permissions
- Ignore Permissions
+ 사용 권한 무시Ignore Partition Schemes
- Ignore Partition Schemes
+ 파티션 구성표 무시Ignore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ 파티션 구성표에서 개체 배치 무시Ignore Not For Replication
- Ignore Not For Replication
+ 복제용 아님 무시Ignore Login Sids
- Ignore Login Sids
+ 로그인 SID 무시Ignore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ 인덱스의 잠금 힌트 무시Ignore Keyword Casing
- Ignore Keyword Casing
+ 키워드 대/소문자 무시Ignore Index Padding
- Ignore Index Padding
+ 인덱스 패딩 무시Ignore Index Options
- Ignore Index Options
+ 인덱스 옵션 무시Ignore Increment
- Ignore Increment
+ 증가값 무시Ignore Identity Seed
- Ignore Identity Seed
+ ID 시드 무시Ignore User Settings Objects
- Ignore User Settings Objects
+ 사용자 설정 개체 무시Ignore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ 전체 텍스트 카탈로그 파일 경로 무시Ignore Whitespace
- Ignore Whitespace
+ 공백 무시Ignore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ ForeignKeys 확인 없이 무시Verify Collation Compatibility
- Verify Collation Compatibility
+ 데이터 정렬 호환성 확인Unmodifiable Object Warnings
- Unmodifiable Object Warnings
+ 수정할 수 없는 개체 경고Treat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ 확인 오류를 경고로 처리Script Refresh Module
- Script Refresh Module
+ 스크립트 새로 고침 모듈Script New Constraint Validation
- Script New Constraint Validation
+ 스크립트 새 제약 조건 유효성 검사Script File Size
- Script File Size
+ 스크립트 파일 크기Script Deploy StateChecks
- Script Deploy StateChecks
+ 스크립트 배포 StateChecksScript Database Options
- Script Database Options
+ 스크립트 데이터베이스 옵션Script Database Compatibility
- Script Database Compatibility
+ 스크립트 데이터베이스 호환성Script Database Collation
- Script Database Collation
+ 스크립트 데이터베이스 데이터 정렬Run Deployment Plan Executors
- Run Deployment Plan Executors
+ 배포 계획 실행기 실행Register DataTier Application
- Register DataTier Application
+ DataTier 애플리케이션 등록Populate Files On File Groups
- Populate Files On File Groups
+ 파일 그룹에 파일 채우기No Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Clr 형식을 변경하는 Alter 문 없음Include Transactional Scripts
- Include Transactional Scripts
+ 트랜잭션 스크립트 포함Include Composite Objects
- Include Composite Objects
+ 복합 객체 포함Allow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ 안전하지 않은 행 수준 보안 데이터 이동 허용Ignore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Check 제약 조건 확인 없이 무시Ignore Fill Factor
- Ignore Fill Factor
+ 채우기 비율 무시Ignore File Size
- Ignore File Size
+ 파일 크기 무시Ignore Filegroup Placement
- Ignore Filegroup Placement
+ 파일 그룹 배치 무시Do Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ 복제된 개체 변경 안 함Do Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ 변경 데이터 캡처 개체 변경 안 함Disable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ DDL 트리거를 해제한 후 다시 설정Deploy Database In Single User Mode
- Deploy Database In Single User Mode
+ 단일 사용자 모드에서 데이터베이스 배포Create New Database
- Create New Database
+ 새 데이터베이스 만들기Compare Using Target Collation
- Compare Using Target Collation
+ 대상 데이터 정렬을 사용하여 비교Comment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Set Var 선언을 주석으로 처리Block When Drift Detected
- Block When Drift Detected
+ 드리프트 검색 시 차단Block On Possible Data Loss
- Block On Possible Data Loss
+ 데이터 손실 가능성이 있는 경우 차단Backup Database Before Changes
- Backup Database Before Changes
+ 변경하기 전에 데이터베이스 백업Allow Incompatible Platform
- Allow Incompatible Platform
+ 호환되지 않는 플랫폼 허용Allow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ 차단 어셈블리 삭제 허용Drop Constraints Not In Source
- Drop Constraints Not In Source
+ 소스에 없는 제약 조건 삭제Drop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ 소스에 없는 DML 트리거 삭제Drop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ 소스에 없는 확장 속성 삭제Drop Indexes Not In Source
- Drop Indexes Not In Source
+ 소스에 없는 인덱스 삭제Ignore File And Log File Path
- Ignore File And Log File Path
+ 파일 및 로그 파일 경로 무시Ignore Extended Properties
- Ignore Extended Properties
+ 확장 속성 무시Ignore Dml Trigger State
- Ignore Dml Trigger State
+ Dml 트리거 상태 무시Ignore Dml Trigger Order
- Ignore Dml Trigger Order
+ Dml 트리거 순서 무시Ignore Default Schema
- Ignore Default Schema
+ 기본 스키마 무시Ignore Ddl Trigger State
- Ignore Ddl Trigger State
+ DDL 트리거 상태 무시Ignore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ddl 트리거 순서 무시Ignore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ 암호화 공급자 파일 경로 무시Verify Deployment
- Verify Deployment
+ 배포 확인Ignore Comments
- Ignore Comments
+ 주석 무시Ignore Column Collation
- Ignore Column Collation
+ 열 데이터 정렬 무시Ignore Authorizer
- Ignore Authorizer
+ 권한 부여자 무시Ignore AnsiNulls
- Ignore AnsiNulls
+ AnsiNulls 무시Generate SmartDefaults
- Generate SmartDefaults
+ SmartDefaults 생성Drop Statistics Not In Source
- Drop Statistics Not In Source
+ 소스에 없는 통계 삭제Drop Role Members Not In Source
- Drop Role Members Not In Source
+ 소스에 없는 역할 멤버 삭제Drop Permissions Not In Source
- Drop Permissions Not In Source
+ 소스에 없는 사용 권한 삭제Drop Objects Not In Source
- Drop Objects Not In Source
+ 소스에 없는 개체 삭제Ignore Column Order
- Ignore Column Order
+ 열 순서 무시Aggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ 데이터베이스 트리거Defaults
@@ -436,7 +436,7 @@
File Tables
- File Tables
+ 파일 테이블Full Text Catalogs
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ 스칼라 반환 함수Search Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ 테이블 반환 함수User Defined Data Types
- User Defined Data Types
+ 사용자 정의 데이터 형식User Defined Table Types
- User Defined Table Types
+ 사용자 정의 테이블 형식Clr User Defined Types
- Clr User Defined Types
+ CLR 사용자 정의 형식Users
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ 서버 트리거Specifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ ALTER ASSEMBLY 문을 실행하는 대신 차이가 있는 경우 게시에서 항상 어셈블리를 삭제하고 다시 만들도록 지정합니다.Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ true이면 배포하기 전에 데이터베이스가 단일 사용자 모드로 설정됩니다.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ 이 설정은 배포하는 동안 데이터베이스의 데이터 정렬을 처리하는 방법을 지정합니다. 기본적으로 대상 데이터베이스의 데이터 정렬이 소스에서 지정한 데이터 정렬과 일치하지 않으면 업데이트됩니다. 이 옵션을 설정하면 대상 데이터베이스(또는 서버)의 데이터 정렬을 사용해야 합니다.Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ 데이터베이스에 업데이트를 게시할 때 데이터베이스 스냅숏(.dacpac) 파일에 정의되지 않은 역할 멤버가 대상 데이터베이스에서 삭제되는지 여부를 지정합니다.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ 데이터 계층 애플리케이션 파일(.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ 다른 소스 스키마가 선택되었습니다. 비교를 확인하려면 [비교]를 누르세요.A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ 다른 대상 스키마가 선택되었습니다. 비교를 확인하려면 [비교]를 누르세요.Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ 다른 소스 및 대상 스키마가 선택되었습니다. 비교를 확인하려면 [비교]를 누르세요.Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ 세부 정보 비교Are you sure you want to update the target?
- Are you sure you want to update the target?
+ 대상을 업데이트하시겠습니까?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ [비교]를 눌러 비교를 새로 고칩니다.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ 대상에 변경 내용을 배포하는 스크립트 생성No changes to script
- No changes to script
+ 스크립트 변경 사항 없음Apply changes to target
- Apply changes to target
+ 대상에 변경 내용 적용No changes to apply
- No changes to apply
+ 적용할 변경 사항 없음Delete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ 비교를 초기화하는 중입니다. 어느 정도 시간이 걸릴 수 있습니다.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ 두 스키마를 비교하려면 먼저 소스 스키마 및 대상 스키마를 선택한 다음 [비교]를 누릅니다.No schema differences were found.
- No schema differences were found.
+ 스키마 차이점을 찾을 수 없습니다.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ 스키마 비교 실패: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ 대상이 데이터베이스이면 [스크립트 생성]이 사용하도록 설정됩니다.Apply is enabled when the target is a database
- Apply is enabled when the target is a database
+ 대상이 데이터베이스이면 [적용]이 사용하도록 설정됩니다.Compare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ 스키마 비교 취소 실패: '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ 스크립트 생성 실패: '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ 스키마 비교 적용 실패 '{0}'Switch direction
- Switch direction
+ 방향 전환Switch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ .scmp 파일 열기Load source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ .scmp 파일에 저장된 소스, 대상 및 옵션 로드Open
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ scmp 열기 실패: '{0}'Save .scmp file
- Save .scmp file
+ .scmp 파일 저장Save source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ 소스 및 대상, 옵션 및 제외된 요소 저장Save
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ scmp 저장 실패: '{0}'
diff --git a/resources/xlf/pt-br/admin-tool-ext-win.pt-BR.xlf b/resources/xlf/pt-br/admin-tool-ext-win.pt-BR.xlf
index 2632ed17d2..c97776ba11 100644
--- a/resources/xlf/pt-br/admin-tool-ext-win.pt-BR.xlf
+++ b/resources/xlf/pt-br/admin-tool-ext-win.pt-BR.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Extensões da Ferramenta de Administração de Banco de Dados para WindowsAdds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Adiciona funcionalidade adicional específica do Windows ao Azure Data StudioProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Gerar scripts...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Não foi fornecido nenhum ConnectionContext para handleLaunchSsmsMinPropertiesDialogCommandCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ Não foi possível determinar o nó do pesquisador de objetos de connectionContext: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Não foi fornecido nenhum ConnectionContext para handleLaunchSsmsMinPropertiesDialogCommandNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ Não foi fornecido nenhum connectionProfile do connectionContext: {0}Launching dialog...
- Launching dialog...
+ Iniciando a caixa de diálogo...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Erro ao chamar SsmsMin com os argumentos '{0}' – {1}
diff --git a/resources/xlf/pt-br/agent.pt-BR.xlf b/resources/xlf/pt-br/agent.pt-BR.xlf
index fd70f89cc0..2109331d07 100644
--- a/resources/xlf/pt-br/agent.pt-BR.xlf
+++ b/resources/xlf/pt-br/agent.pt-BR.xlf
@@ -40,7 +40,7 @@
Proxy name
- Nome de proxy
+ Nome do proxyCredential name
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ Pacote do serviço de integração do SQL ServerSQL Server Agent Service Account
diff --git a/resources/xlf/pt-br/azurecore.pt-BR.xlf b/resources/xlf/pt-br/azurecore.pt-BR.xlf
index ac9620d5d6..c6baec500c 100644
--- a/resources/xlf/pt-br/azurecore.pt-BR.xlf
+++ b/resources/xlf/pt-br/azurecore.pt-BR.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: atualizar todas as contasRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: entrarSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Adicionar aos servidoresClear Azure Account Token Cache
@@ -112,7 +112,7 @@
No Subscriptions found.
- Nenhuma assinatura encontrada.
+ Não foi encontrada nenhuma assinatura.
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ Não foi encontrado nenhum Recurso
diff --git a/resources/xlf/pt-br/cms.pt-BR.xlf b/resources/xlf/pt-br/cms.pt-BR.xlf
index bf85514eea..428caa5709 100644
--- a/resources/xlf/pt-br/cms.pt-BR.xlf
+++ b/resources/xlf/pt-br/cms.pt-BR.xlf
@@ -4,23 +4,23 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Servidores de gerenciamento central do SQL ServerSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Suporte para gerenciar os Servidores de Gerenciamento Central do SQL ServerCentral Management Servers
- Central Management Servers
+ Servidores de gerenciamento centralMicrosoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerCentral Management Servers
- Central Management Servers
+ Servidores de gerenciamento centralRefresh
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Atualizar grupo de servidoresDelete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ Novo registro de servidor...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ Novo grupo de servidores...Add Central Management Server
- Add Central Management Server
+ Adicionar o Servidor de Gerenciamento CentralDelete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ Configuração do MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Opcional] Registre em log a saída de depuração no console (Exibir -> Saída) e, em seguida, selecione o canal de saída apropriado no menu suspenso[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Opcional] Nível de log para serviços de back-end. O Azure Data Studio gera um nome de arquivo sempre que é iniciado e, quando o arquivo já existe, as entradas de logs são acrescentadas a esse arquivo. Para a limpeza de arquivos de log antigos, confira as configurações logRetentionMinutes e logFilesRemovalLimit. O tracingLevel padrão não registra uma grande quantidade de log. A alteração do nível de detalhes pode levar ao aumento dos requisitos de log e de espaço em disco para os logs. Erro inclui Crítico, Aviso inclui Erro, informações inclui Aviso e Detalhado inclui InformaçõesNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ O número de minutos para reter os arquivos de log dos serviços de back-end. O padrão é uma semana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Número máximo de arquivos antigos a serem removidos na inicialização, com mssql.logRetentionMinutes expirado. Os arquivos que não forem limpos devido a essa limitação serão limpos na próxima vez em que o Azure Data Studio for iniciado.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Opcional] Não mostrar avisos de plataforma sem suporteRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Tipo de PreçoCompatibility Level
@@ -164,15 +164,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nome (opcional)Custom name of the connection
- Custom name of the connection
+ Nome personalizado da conexãoServer
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nome da instância do SQL ServerServer Description
- Server Description
+ Descrição do servidorDescription of the SQL Server instance
- Description of the SQL Server instance
+ Descrição da instância do SQL ServerAuthentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Especifica o método de autenticação com o SQL ServerSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory – Universal com suporte para MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica a ID do usuário a ser usada ao conectar-se à fonte de dadosPassword
@@ -224,155 +224,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica a senha a ser usada ao conectar-se à fonte de dadosApplication intent
- Application intent
+ Intenção do aplicativoDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Declara o tipo de carga de trabalho do aplicativo ao conectar-se a um servidorAsynchronous processing
- Asynchronous processing
+ Processamento assíncronoWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Quando true, permite o uso da funcionalidade assíncrona no provedor de dados do .NET FrameworkConnect timeout
- Connect timeout
+ Tempo limite de conexãoThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ O período de tempo (em segundos) para aguardar uma conexão com o servidor antes de encerrar a tentativa e gerar um erroCurrent language
- Current language
+ Idioma atualThe SQL Server language record name
- The SQL Server language record name
+ O nome do registro de idioma do SQL ServerColumn encryption
- Column encryption
+ Criptografia de colunaDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Configuração de criptografia de coluna padrão para todos os comandos na conexãoEncrypt
- Encrypt
+ CriptografarWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Quando true, o SQL Server usa a criptografia SSL para todos os dados enviados entre o cliente e o servidor quando o servidor tem um certificado instaladoPersist security info
- Persist security info
+ Persistir as informações de segurançaWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Quando false, as informações confidenciais de segurança, como a senha, não são retornadas como parte da conexãoTrust server certificate
- Trust server certificate
+ Confiar no certificado do servidorWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Quando true (e encrypt = true), o SQL Server usa a criptografia SSL para todos os dados enviados entre o cliente e o servidor sem validar o certificado do servidorAttached DB file name
- Attached DB file name
+ Nome do arquivo de banco de dados anexadoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ O nome do arquivo principal, incluindo o nome do caminho completo, de um banco de dados anexávelContext connection
- Context connection
+ Conexão de contextoWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Quando true, indica que a conexão deve ser do contexto de SQL Server. Disponível somente quando executado no processo do SQL ServerPort
- Port
+ PortaConnect retry count
- Connect retry count
+ Contagem de nova tentativa de conexãoNumber of attempts to restore connection
- Number of attempts to restore connection
+ Número de tentativas para restaurar a conexãoConnect retry interval
- Connect retry interval
+ Intervalo de nova tentativa de conexãoDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Atraso entre tentativas de restauração de conexãoApplication name
- Application name
+ Nome do aplicativoThe name of the application
- The name of the application
+ O nome do aplicativoWorkstation Id
- Workstation Id
+ ID da estação de trabalhoThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ O nome da estação de trabalho que se conecta ao SQL ServerPooling
- Pooling
+ PoolingWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Quando true, o objeto de conexão é extraído do pool apropriado ou, se necessário, é criado e adicionado ao pool apropriadoMax pool size
- Max pool size
+ Tamanho máximo do poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ O número máximo de conexões permitidas no poolMin pool size
- Min pool size
+ Tamanho mínimo do poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ O número mínimo de conexões permitidas no poolLoad balance timeout
- Load balance timeout
+ Tempo limite de balanceamento de cargaThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ O período mínimo de tempo (em segundos) para que essa conexão exista no pool antes de ser destruídaReplication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Usado pelo SQL Server na replicaçãoAttach DB filename
- Attach DB filename
+ Anexar nome de arquivo de banco de dadosFailover partner
- Failover partner
+ Parceiro de failoverThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ O nome ou o endereço de rede da instância do SQL Server que atua como um parceiro de failoverMulti subnet failover
- Multi subnet failover
+ Failover de várias sub-redesMultiple active result sets
- Multiple active result sets
+ Vários conjuntos de resultados ativosWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Quando true, vários conjuntos de resultados podem ser retornados e lidos de uma conexãoPacket size
- Packet size
+ Tamanho do pacoteSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Tamanho em bytes dos pacotes de rede usados para comunicar-se com uma instância do SQL ServerType system version
- Type system version
+ Versão do sistema de tiposIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica qual sistema de tipos de servidor o provedor poderá expor por meio do DataReader
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ O servidor de gerenciamento central {0} não pôde ser encontrado ou está offlineNo resources found
- No resources found
+ Não foi encontrado nenhum recurso
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Adicionar servidor de gerenciamento central...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ O grupo de servidores de gerenciamento central já tem um servidor registrado com o nome {0}Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ Não foi possível adicionar o servidor registrado {0}Are you sure you want to delete
- Are you sure you want to delete
+ Tem certeza de que deseja excluirYes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Descrição do grupo de servidores{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} já tem um grupo de servidores com o nome {1}Are you sure you want to delete
- Are you sure you want to delete
+ Tem certeza de que deseja excluir
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ Não é possível adicionar um servidor registrado compartilhado com o mesmo nome que o Servidor de Configuração
diff --git a/resources/xlf/pt-br/dacpac.pt-BR.xlf b/resources/xlf/pt-br/dacpac.pt-BR.xlf
index b416832f5f..78b3e4fbca 100644
--- a/resources/xlf/pt-br/dacpac.pt-BR.xlf
+++ b/resources/xlf/pt-br/dacpac.pt-BR.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ Você pode exibir o status da geração de script na exibição de tarefas depois que o assistente for fechado. O script gerado será aberto quando concluído.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/pt-br/import.pt-BR.xlf b/resources/xlf/pt-br/import.pt-BR.xlf
index 88bc5e2bf2..85160b3c69 100644
--- a/resources/xlf/pt-br/import.pt-BR.xlf
+++ b/resources/xlf/pt-br/import.pt-BR.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Esta operação não teve êxito. Por favor, tente um arquivo de entrada diferente.Refresh
diff --git a/resources/xlf/pt-br/mssql.pt-BR.xlf b/resources/xlf/pt-br/mssql.pt-BR.xlf
index 5289e108f9..e074631bc5 100644
--- a/resources/xlf/pt-br/mssql.pt-BR.xlf
+++ b/resources/xlf/pt-br/mssql.pt-BR.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Carregar arquivosNew directory
- New directory
+ Novo diretórioDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Novo NotebookOpen Notebook
- Open Notebook
+ Abrir o NotebookTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Tarefas e informações sobre o cluster de Big data do SQL ServerSQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Enviar o Trabalho do SparkNew Spark Job
- New Spark Job
+ Novo trabalho do SparkView Spark History
- View Spark History
+ Ver o Histórico do SparkView Yarn History
- View Yarn History
+ Ver Histórico do YarnTasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ Instalar pacotesConfigure Python for Notebooks
- Configure Python for Notebooks
+ Configurar o Python para notebooksCluster Status
- Cluster Status
+ Status do clusterSearch: Servers
- Search: Servers
+ Pesquisa: servidoresSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ Pesquisa: limpar os resultados da pesquisa de servidoresService Endpoints
- Service Endpoints
+ Pontos de extremidade de serviçoMSSQL configuration
- MSSQL configuration
+ Configuração do MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Opcional] Registre em log a saída da depuração no console (Exibir -> Saída) e, em seguida, selecione o canal de saída apropriado no menu suspenso[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Opcional] Nível de log para serviços de back-end. O Azure Data Studio gera um nome de arquivo sempre que é iniciado e, quando o arquivo já existe, as entradas de logs são acrescentadas a esse arquivo. Para a limpeza de arquivos de log antigos, confira as configurações logRetentionMinutes e logFilesRemovalLimit. O tracingLevel padrão não registra uma grande quantidade de log. A alteração do nível de detalhes pode levar ao aumento dos requisitos de log e de espaço em disco para os logs. Erro inclui Crítico, Aviso inclui Erro, informações inclui Aviso e Detalhado inclui InformaçõesNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Número de minutos para reter os arquivos de log dos serviços de back-end. O padrão é uma semana.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Número máximo de arquivos antigos a serem removidos na inicialização com o mssql.logRetentionMinutes expirado. Os arquivos que não forem limpos devido a essa limitação serão limpos na próxima vez em que o Azure Data Studio for iniciado.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Opcional] Não mostrar os avisos de plataforma sem suporteRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Tipo de PreçoCompatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ Nome (opcional)Custom name of the connection
- Custom name of the connection
+ Nome personalizado da conexãoServer
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Nome da instância do SQL ServerDatabase
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ O nome do catálogo inicial ou do banco de dados na fonte de dadosAuthentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Especifica o método de autenticação com o SQL ServerSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory – Universal com suporte para MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Indica a ID do usuário a ser usada ao conectar-se à fonte de dadosPassword
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Indica a senha a ser usada ao conectar-se à fonte de dadosApplication intent
- Application intent
+ Intenção do aplicativoDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Declara o tipo de carga de trabalho do aplicativo ao conectar-se a um servidorAsynchronous processing
- Asynchronous processing
+ Processamento assíncronoWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Quando true, permite o uso da funcionalidade assíncrona no provedor de dados do .NET FrameworkConnect timeout
- Connect timeout
+ Tempo limite de conexãoThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ O período de tempo (em segundos) para aguardar uma conexão com o servidor antes de encerrar a tentativa e gerar um erroCurrent language
- Current language
+ Idioma atualThe SQL Server language record name
- The SQL Server language record name
+ O nome do registro de idioma do SQL ServerColumn encryption
- Column encryption
+ Criptografia de colunaDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ A configuração de criptografia de coluna padrão para todos os comandos na conexãoEncrypt
- Encrypt
+ CriptografarWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Quando true, o SQL Server usa a criptografia SSL para todos os dados enviados entre o cliente e o servidor, quando o servidor tem um certificado instaladoPersist security info
- Persist security info
+ Persistir as informações de segurançaWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Quando false, as informações confidenciais de segurança, como a senha, não são retornadas como parte da conexãoTrust server certificate
- Trust server certificate
+ Confiar no certificado do servidorWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Quando true (e encrypt = true), o SQL Server usa a criptografia SSL para todos os dados enviados entre o cliente e o servidor sem validar o certificado do servidorAttached DB file name
- Attached DB file name
+ Nome do arquivo de banco de dados anexadoThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ O nome do arquivo principal, incluindo o nome do caminho completo, de um banco de dados anexávelContext connection
- Context connection
+ Conexão de contextoWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Quando true, indica que a conexão deve ser do contexto do SQL Server. Disponível somente quando executado no processo do SQL ServerPort
- Port
+ PortaConnect retry count
- Connect retry count
+ Contagem de nova tentativa de conexãoNumber of attempts to restore connection
- Number of attempts to restore connection
+ Número de tentativas para restaurar a conexãoConnect retry interval
- Connect retry interval
+ Intervalo de nova tentativa de conexãoDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Atraso entre as tentativas de restauração de conexãoApplication name
- Application name
+ Nome do aplicativoThe name of the application
- The name of the application
+ O nome do aplicativoWorkstation Id
- Workstation Id
+ ID da estação de trabalhoThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ O nome da estação de trabalho que se conecta ao SQL ServerPooling
- Pooling
+ PoolingWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Quando true, o objeto de conexão é extraído do pool apropriado ou, se necessário, é criado e adicionado ao pool apropriadoMax pool size
- Max pool size
+ Tamanho máximo do poolThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ O número máximo de conexões permitidas no poolMin pool size
- Min pool size
+ Tamanho mínimo do poolThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ O número mínimo de conexões permitidas no poolLoad balance timeout
- Load balance timeout
+ Tempo limite de balanceamento de cargaThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ O período mínimo de tempo (em segundos) para que essa conexão exista no pool antes de ser destruídaReplication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Usado pelo SQL Server na replicaçãoAttach DB filename
- Attach DB filename
+ Anexar o nome do arquivo de banco de dadosFailover partner
- Failover partner
+ Parceiro de failoverThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ O nome ou endereço de rede da instância do SQL Server que atua como um parceiro de failoverMulti subnet failover
- Multi subnet failover
+ Failover de várias sub-redesMultiple active result sets
- Multiple active result sets
+ Vários conjuntos de resultados ativosWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Quando true, vários conjuntos de resultados podem ser retornados e lidos de uma conexãoPacket size
- Packet size
+ Tamanho do pacoteSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Tamanho em bytes dos pacotes de rede usados para comunicar-se com uma instância do SQL ServerType system version
- Type system version
+ Versão do sistema de tiposIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Indica qual sistema de tipos de servidor o provedor poderá expor por meio do DataReader
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ Não foi retornada nenhuma ID de lote de trabalho do Spark da resposta.{0}[Erro] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ Nenhum log foi retornado na resposta.{0}[Erro] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ Os parâmetros para SparkJobSubmissionModel são inválidossubmissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs é inválido. livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId é inválido. Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Tempo limite ao obter a ID do Aplicativo. {0}[Log] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ A propriedade localFilePath ou hdfsFolderPath não está especificada.Property Path is not specified.
- Property Path is not specified.
+ A propriedade Caminho não está especificada.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ Os parâmetros de SparkJobSubmissionDialog são inválidosNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Enviar{0} Spark Job Submission:
- {0} Spark Job Submission:
+ Envio do Trabalho do Spark {0}:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Início do Envio do Trabalho do Spark ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ Insira um nome...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Cluster do SparkPath to a .jar or .py file
- Path to a .jar or .py file
+ Caminho para um arquivo .jar ou .pyThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ O arquivo local selecionado será carregado no HDFS: {0}JAR/py File
- JAR/py File
+ Arquivo JAR/pyMain Class
- Main Class
+ Classe principalArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ Argumentos de linha de comando usados em sua classe principal. Vários argumentos devem ser divididos por espaço.Property Job Name is not specified.
- Property Job Name is not specified.
+ A propriedade Nome do Trabalho não está especificada.Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ A propriedade Arquivo JAR/py não está especificada.Property Main Class is not specified.
- Property Main Class is not specified.
+ A propriedade Classe Principal não está especificada.{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} não existe no Cluster ou uma exceção foi gerada.The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ O arquivo HDFS especificado não existe. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ Erro ao localizar o arquivo devido ao Erro: {0}Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Selecione o SQL Server com o Cluster de Big Data.No Sql Server is selected.
- No Sql Server is selected.
+ Nenhum SQL Server está selecionado.Error Get File Path: {0}
- Error Get File Path: {0}
+ Erro ao Obter o Caminho do Arquivo: {0}Invalid Data Structure
- Invalid Data Structure
+ Estrutura de Dados InválidaUnable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ Não é possível criar o cliente WebHDFS devido a opções ausentes: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}' é indefinido.Bad Request
- Bad Request
+ Solicitação InválidaUnauthorized
- Unauthorized
+ Não autorizadoForbidden
- Forbidden
+ ProibidoNot Found
- Not Found
+ Não encontradoInternal Server Error
- Internal Server Error
+ Erro interno do servidorUnknown Error
- Unknown Error
+ Erro desconhecidoUnexpected Redirect
- Unexpected Redirect
+ Redirecionamento inesperadoPlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Forneça a senha para conectar-se ao HDFS:Session for node {0} does not exist
- Session for node {0} does not exist
+ A sessão para o nó {0} não existeError notifying of node change: {0}
- Error notifying of node change: {0}
+ Erro ao notificar a alteração de nó: {0}Root
- Root
+ RaizHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Serviços de dadosNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ Aviso: este arquivo foi truncado em {0} para visualização. The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ O arquivo foi truncado em {0} para visualização.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo é indefinido.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options está indefinido.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ Algumas propriedades ausentes em connectionInfo.options: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ Não há suporte para a ação {0} para esse manipuladorCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ Não é possível abrir o link {0} porque somente há suporte para os links HTTP e HTTPSDownload and open '{0}'?
- Download and open '{0}'?
+ Baixar e abrir '{0}'?Could not find the specified file
- Could not find the specified file
+ Não foi possível localizar o arquivo especificadoFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ Falha na solicitação de abertura de arquivo com o erro: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Erro ao parar o servidor do notebook: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ O processo do Notebook foi encerrado prematuramente com erro: {0}, Saída de StdErr: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Erro enviado do Jupyter: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... O Jupyter está sendo executado em {0}... Starting Notebook server
- ... Starting Notebook server
+ ... Iniciando o servidor do NotebookUnexpected setting type {0}
- Unexpected setting type {0}
+ Tipo de configuração inesperado {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ Não é possível iniciar uma sessão. O gerenciador ainda não foi inicializadoSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Os kernels do Spark exigem uma conexão com uma instância mestra do cluster de Big Data do SQL Server.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ Falha no desligamento do servidor do notebook: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ A instalação de dependências do Notebook está em andamentoPython download is complete
- Python download is complete
+ O download do Python está concluídoError while downloading python setup
- Error while downloading python setup
+ Erro ao baixar a configuração do PythonDownloading python package
- Downloading python package
+ Baixando o pacote PythonUnpacking python package
- Unpacking python package
+ Descompactando o pacote PythonError while creating python installation directory
- Error while creating python installation directory
+ Erro ao criar o diretório de instalação do PythonError while unpacking python bundle
- Error while unpacking python bundle
+ Erro ao descompactar o pacote PythonInstalling Notebook dependencies
- Installing Notebook dependencies
+ Instalando as dependências do NotebookInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Instalando as dependências do Notebook. Confira a exibição Tarefas para obter mais informaçõesNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ A instalação das dependências do notebook está concluídaCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Não é possível substituir a instalação existente do Python enquanto o Python está em execução.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ Outra instalação do Python está em andamento no momento.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ O Python já existe no local específico. Ignorando a instalação.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ Falha na instalação das dependências do Notebook com o erro: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Baixando o Python local para a plataforma: {0} a {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Instalando os pacotes necessários para executar Notebooks...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ...Instalação do Jupyter concluída.Installing SparkMagic...
- Installing SparkMagic...
+ Instalando o SparkMagic...A notebook path is required
- A notebook path is required
+ É necessário um caminho de notebookNotebooks
- Notebooks
+ NotebooksOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Somente há suporte para os Notebooks .ipynbAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ Tem certeza de que deseja reinstalar?Configure Python for Notebooks
- Configure Python for Notebooks
+ Configurar o Python para notebooksInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Localização da Instalação do PythonSelect
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Esta instalação demorará um pouco. É recomendado não fechar o aplicativo até que a instalação seja concluída.The specified install location is invalid.
- The specified install location is invalid.
+ O local de instalação especificado é inválido.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ Não foi encontrada nenhuma instalação do Python no local especificado.Python installation was declined.
- Python installation was declined.
+ A instalação do Python foi recusada.Installation Type
- Installation Type
+ Tipo de instalaçãoNew Python installation
- New Python installation
+ Nova instalação do PythonUse existing Python installation
- Use existing Python installation
+ Usar a instalação existente do PythonOpen file {0} failed: {1}
- Open file {0} failed: {1}
+ Falha ao abrir o arquivo {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Falha ao abrir o arquivo {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Falha ao abrir o arquivo {0}: {1}Missing file : {0}
- Missing file : {0}
+ Arquivo ausente: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Este código de exemplo carrega o arquivo em um quadro de dados e mostra os primeiros 10 resultados.No notebook editor is active
- No notebook editor is active
+ Não há nenhum editor de notebook ativoCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ Que tipo de célula você deseja adicionar?Notebooks
- Notebooks
+ NotebooksSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Extensão de implantação do SQL Server para o Azure Data StudioProvides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Fornece uma experiência baseada em Notebook para implantar o Microsoft SQL ServerDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Implantar o SQL Server no Docker...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ Implantar o cluster de Big Data do SQL Server...Deploy SQL Server…
- Deploy SQL Server…
+ Implantar o SQL Server...Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ Imagem de contêiner do SQL ServerRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Executar a imagem de contêiner do SQL Server com o DockerSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ O cluster de Big Data do SQL Server permite implantar clusters escalonáveis de contêineres do SQL Server, do Spark e do HDFS em execução no KubernetesVersion
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Cluster de Big Data do SQL Server 2019 CTP 3.1Deployment target
- Deployment target
+ Destino de implantaçãoNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Um utilitário de linha de comando escrito em Python que permite que os administradores de cluster inicializem e gerenciem o cluster de Big Data por meio de APIs RESTmssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Uma ferramenta de linha de comando que permite executar comandos em clusters do Kuberneteskubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Fornece a capacidade de empacotar e executar um aplicativo em contêineres isoladosDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Uma ferramenta de linha de comando para gerenciar recursos do AzureAzure CLI
- Azure CLI
+ CLI do Azure
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ Não foi possível localizar o package.json ou o nome/editor não está definido
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ O notebook {0} não existe
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Selecione as opções de implantaçãoOpen Notebook
- Open Notebook
+ Abrir o NotebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ Falha ao carregar a extensão: {0}, Erro detectado na definição de tipo de recurso no package.json. Verifique o console de depuração para obter detalhes.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ O tipo de recurso: {0} não está definido
diff --git a/resources/xlf/pt-br/schema-compare.pt-BR.xlf b/resources/xlf/pt-br/schema-compare.pt-BR.xlf
index d82eb84dc0..404bf1c5de 100644
--- a/resources/xlf/pt-br/schema-compare.pt-BR.xlf
+++ b/resources/xlf/pt-br/schema-compare.pt-BR.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server Schema CompareSQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ O SQL Server Schema Compare para o Azure Data Studio dá suporte à comparação de esquemas de bancos de dados e dacpacs.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ As opções mudaram. Comparar novamente para ver a comparação?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ Opções geraisInclude Object Types
- Include Object Types
+ Incluir tipos de objetoIgnore Table Options
- Ignore Table Options
+ Ignorar as Opções de TabelaIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Ignorar ponto-e-vírgula entre instruçõesIgnore Route Lifetime
- Ignore Route Lifetime
+ Ignorar o Tempo de Vida da RotaIgnore Role Membership
- Ignore Role Membership
+ Ignorar Associação de funçãoIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ Ignorar identificadores entre aspasIgnore Permissions
- Ignore Permissions
+ Ignorar permissõesIgnore Partition Schemes
- Ignore Partition Schemes
+ Ignorar os Esquemas de PartiçãoIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Ignorar o Posicionamento de Objeto no Esquema de PartiçãoIgnore Not For Replication
- Ignore Not For Replication
+ Ignorar os que Não São para ReplicaçãoIgnore Login Sids
- Ignore Login Sids
+ Ignorar os SIDs de LogonIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Ignorar dicas de bloqueio em índicesIgnore Keyword Casing
- Ignore Keyword Casing
+ Ignorar o Uso de Maiúsculas e Minúsculas em Palavra-chaveIgnore Index Padding
- Ignore Index Padding
+ Ignorar o Preenchimento de ÍndiceIgnore Index Options
- Ignore Index Options
+ Ignorar as Opções de ÍndiceIgnore Increment
- Ignore Increment
+ Ignorar incrementoIgnore Identity Seed
- Ignore Identity Seed
+ Ignorar semente de identidadeIgnore User Settings Objects
- Ignore User Settings Objects
+ Ignorar os Objetos de Configurações do UsuárioIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ Ignorar o FilePath do Catálogo de Texto CompletoIgnore Whitespace
- Ignore Whitespace
+ Ignorar espaço em brancoIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Ignorar com NOCHECK em ForeignKeysVerify Collation Compatibility
- Verify Collation Compatibility
+ Verificar a Compatibilidade da OrdenaçãoUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Avisos de Objeto Não ModificávelTreat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Tratar os Erros de Verificação como AvisosScript Refresh Module
- Script Refresh Module
+ Módulo de atualização de scriptScript New Constraint Validation
- Script New Constraint Validation
+ Validação de nova restrição de scriptScript File Size
- Script File Size
+ Tamanho do arquivo de scriptScript Deploy StateChecks
- Script Deploy StateChecks
+ StateChecks de implantação de scriptScript Database Options
- Script Database Options
+ Opções de banco de dados de scriptScript Database Compatibility
- Script Database Compatibility
+ Compatibilidade do Banco de Dados de ScriptScript Database Collation
- Script Database Collation
+ Ordenação de Banco de Dados de ScriptRun Deployment Plan Executors
- Run Deployment Plan Executors
+ Executar executores do plano de implementaçãoRegister DataTier Application
- Register DataTier Application
+ Registrar o aplicativo DataTierPopulate Files On File Groups
- Populate Files On File Groups
+ Popular Arquivos em Grupos de ArquivosNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Sem instruções ALTER para alterar tipos CLRInclude Transactional Scripts
- Include Transactional Scripts
+ Incluir scripts transacionaisInclude Composite Objects
- Include Composite Objects
+ Incluir objetos compostosAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Permitir a Movimentação Não Segura de Dados de Segurança em Nível de LinhaIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Ignorar sem Nenhuma Verificação nas Restrições de VerificaçãoIgnore Fill Factor
- Ignore Fill Factor
+ Ignorar fator de preenchimentoIgnore File Size
- Ignore File Size
+ Ignorar o Tamanho do ArquivoIgnore Filegroup Placement
- Ignore Filegroup Placement
+ Ignorar o posicionamento do grupo de arquivosDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ Não Alterar os Objetos ReplicadosDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ Não Alterar os Objetos de Captura de Dados de AlteraçõesDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ Desabilitar e Reabilitar os Gatilhos DDLDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Implantar o Banco de Dados no Modo de Usuário ÚnicoCreate New Database
- Create New Database
+ Criar Novo Banco de DadosCompare Using Target Collation
- Compare Using Target Collation
+ Comparar Usando a Ordenação de DestinoComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Comentar as Declarações de Definição de VarBlock When Drift Detected
- Block When Drift Detected
+ Bloquear quando For Detectada DessincronizaçãoBlock On Possible Data Loss
- Block On Possible Data Loss
+ Bloquear em uma Possível Perda de DadosBackup Database Before Changes
- Backup Database Before Changes
+ Fazer Backup do Banco de Dados antes das AlteraçõesAllow Incompatible Platform
- Allow Incompatible Platform
+ Permitir plataforma incompatívelAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Permitir a Remoção de Assemblies de BloqueioDrop Constraints Not In Source
- Drop Constraints Not In Source
+ Remover as Restrições que Não Estão na OrigemDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Remover os Gatilhos DML que Não Estão na OrigemDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Remover as Propriedades Estendidas que Não Estão na OrigemDrop Indexes Not In Source
- Drop Indexes Not In Source
+ Remover os Índices que Não Estão na OrigemIgnore File And Log File Path
- Ignore File And Log File Path
+ Ignorar o Arquivo e o Caminho do Arquivo de LogIgnore Extended Properties
- Ignore Extended Properties
+ Ignorar as Propriedades EstendidasIgnore Dml Trigger State
- Ignore Dml Trigger State
+ Ignorar o Estado do Gatilho DMLIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ Ignorar a Ordem do Gatilho DMLIgnore Default Schema
- Ignore Default Schema
+ Ignorar o Esquema PadrãoIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ Ignorar o Estado do Gatilho de DDLIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Ignorar a Ordem do Gatilho DDLIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ Ignorar o FilePath do Provedor de CriptografiaVerify Deployment
- Verify Deployment
+ Verificar a ImplantaçãoIgnore Comments
- Ignore Comments
+ Ignorar comentáriosIgnore Column Collation
- Ignore Column Collation
+ Ignorar a Ordenação de ColunasIgnore Authorizer
- Ignore Authorizer
+ Ignorar o AutorizadorIgnore AnsiNulls
- Ignore AnsiNulls
+ Ignorar AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ Gerar SmartDefaultsDrop Statistics Not In Source
- Drop Statistics Not In Source
+ Remover as Estatísticas que Não Estão na OrigemDrop Role Members Not In Source
- Drop Role Members Not In Source
+ Remover os Membros da Função que Não Estão na OrigemDrop Permissions Not In Source
- Drop Permissions Not In Source
+ Remover as Permissões que Não Estão na OrigemDrop Objects Not In Source
- Drop Objects Not In Source
+ Remover os Objetos que Não Estão na OrigemIgnore Column Order
- Ignore Column Order
+ Ignorar a Ordem de ColunasAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ Tabelas de arquivosFull Text Catalogs
- Full Text Catalogs
+ Catálogos de texto completoFull Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Funções de valor escalarSearch Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Funções com valor de tabelaUser Defined Data Types
- User Defined Data Types
+ Tipos de dados definidos pelo usuárioUser Defined Table Types
- User Defined Table Types
+ Tipos de tabela definidos pelo usuárioClr User Defined Types
- Clr User Defined Types
+ Tipos definidos pelo usuário CLRUsers
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Gatilhos de servidorSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Especifica que a publicação sempre deve remover e recriar um assembly quando há uma diferença em vez de emitir uma instrução ALTER ASSEMBLYSpecifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Se true, o banco de dados é definido para o Modo de Usuário Único antes da implantação.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Essa configuração determina como a ordenação do banco de dados é manipulada durante a implantação. Por padrão, a ordenação do banco de dados de destino será atualizada se não corresponder à ordenação especificada pela origem. Quando essa opção estiver definida, a ordenação do banco de dados de destino (ou do servidor) deverá ser usada.Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ Especifica se os membros da função que não estão definidos no arquivo de instantâneo (.dacpac) do banco de dados serão removidos do banco de dados de destino quando você publicar as atualizações em um banco de dados.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ Arquivo do Aplicativo da Camada de Dados (.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ Um esquema de origem diferente foi selecionado. Comparar para ver a comparação?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ Um esquema de destino diferente foi selecionado. Comparar para ver a comparação?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Diferentes esquemas de origem e de destino foram selecionados. Comparar para ver a comparação?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Comparar os DetalhesAre you sure you want to update the target?
- Are you sure you want to update the target?
+ Tem certeza de que deseja atualizar o destino?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Pressione Comparar para atualizar a comparação.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Gerar script para implantar alterações no destinoNo changes to script
- No changes to script
+ Nenhuma alteração no scriptApply changes to target
- Apply changes to target
+ Aplicar alterações ao destinoNo changes to apply
- No changes to apply
+ Não há nenhuma alteração a ser aplicadaDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Inicializando a comparação. Isso pode demorar um pouco.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Para comparar dois esquemas, primeiro selecione um esquema de origem e um esquema de destino e, em seguida, pressione Comparar.No schema differences were found.
- No schema differences were found.
+ Não foi encontrada nenhuma diferença de esquema.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ Falha na comparação de esquema: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ Gerar script é habilitado quando o destino é um banco de dadosApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ Aplicar é habilitado quando o destino é um banco de dadosCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ Falha no cancelamento da comparação de esquema: '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ Falha ao gerar script: '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ Falha ao aplicar a comparação de esquema '{0}'Switch direction
- Switch direction
+ Alternar direçãoSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ Abrir arquivo .scmpLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Carregar a origem, o destino e as opções salvas em um arquivo .scmpOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ Falha ao abrir o SCMP: '{0}'Save .scmp file
- Save .scmp file
+ Salvar o arquivo .scmpSave source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Salvar a origem e o destino, as opções e os elementos excluídosSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ Falha ao salvar SCMP: '{0}'
diff --git a/resources/xlf/ru/admin-tool-ext-win.ru.xlf b/resources/xlf/ru/admin-tool-ext-win.ru.xlf
index ce689047d4..00262a85cd 100644
--- a/resources/xlf/ru/admin-tool-ext-win.ru.xlf
+++ b/resources/xlf/ru/admin-tool-ext-win.ru.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ Расширения средства администрирования баз данных для WindowsAdds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ Добавляет в Azure Data Studio дополнительные возможности для WindowsProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ Создание сценариев...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Не указан ConnectionContext для handleLaunchSsmsMinPropertiesDialogCommandCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ Не удалось определить узел обозревателя объектов из connectionContext: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ Не указан ConnectionContext для handleLaunchSsmsMinPropertiesDialogCommandNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ Не указан connectionProfile из connectionContext: {0}Launching dialog...
- Launching dialog...
+ Запуск диалогового окна...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ Ошибка при вызове SsmsMin с аргументами "{0}" — {1}
diff --git a/resources/xlf/ru/agent.ru.xlf b/resources/xlf/ru/agent.ru.xlf
index 736f47f3d6..ad3c356328 100644
--- a/resources/xlf/ru/agent.ru.xlf
+++ b/resources/xlf/ru/agent.ru.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ Пакет службы интеграции SQL ServerSQL Server Agent Service Account
diff --git a/resources/xlf/ru/azurecore.ru.xlf b/resources/xlf/ru/azurecore.ru.xlf
index cfe86e2219..8a7cab10f6 100644
--- a/resources/xlf/ru/azurecore.ru.xlf
+++ b/resources/xlf/ru/azurecore.ru.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: обновление всех учетных записейRefresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: входSelect Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ Добавить к серверамClear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ Ресурсы не найдены
diff --git a/resources/xlf/ru/cms.ru.xlf b/resources/xlf/ru/cms.ru.xlf
index 95a9320ab3..e1d8d313df 100644
--- a/resources/xlf/ru/cms.ru.xlf
+++ b/resources/xlf/ru/cms.ru.xlf
@@ -4,11 +4,11 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ Центральные серверы управления SQL ServerSupport for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ Поддержка управления центральными серверами управления SQL ServerCentral Management Servers
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ Обновить группу серверовDelete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ Новая регистрация сервера...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ Новая группа серверов...Add Central Management Server
- Add Central Management Server
+ Добавить центральный сервер управленияDelete
@@ -56,51 +56,51 @@
MSSQL configuration
- MSSQL configuration
+ Конфигурация MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
- Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
+ Нужно ли отображать столбцы BIT как числа (1 или 0)? Если задано значение false, столбцы BIT будут отображаться как "true" или "false"Should column definitions be aligned?
- Should column definitions be aligned?
+ Должны ли определения столбцов быть выровнены?Should data types be formatted as UPPERCASE, lowercase, or none (not formatted)
- Should data types be formatted as UPPERCASE, lowercase, or none (not formatted)
+ Следует ли форматировать типы данных в верхнем регистре, нижнем регистре или оставить без форматирования ("нет")Should keywords be formatted as UPPERCASE, lowercase, or none (not formatted)
- Should keywords be formatted as UPPERCASE, lowercase, or none (not formatted)
+ Следует ли форматировать ключевые слова в верхнем регистре, нижнем регистре или оставить без форматирования ("нет")should commas be placed at the beginning of each statement in a list e.g. ', mycolumn2' instead of at the end e.g. 'mycolumn1,'
- should commas be placed at the beginning of each statement in a list e.g. ', mycolumn2' instead of at the end e.g. 'mycolumn1,'
+ нужно ли ставить запятые в начале каждого оператора в списке, например ", mycolumn2", а не в конце, например "mycolumn1,"Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines
- Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines
+ Нужно ли разделять на отдельные строки ссылки на объекты в операторах SELECT? Например, для "SELECT C1, C2 FROM T1" как C1, так и C2 будут находиться на отдельных строках[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Дополнительно] Выведите выходные данные отладки в консоль (Вид -> Вывод), а затем выберите подходящий выходной канал в раскрывающемся списке[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Дополнительно] Уровень ведения журнала для серверных служб. Azure Data Studio создает имя файла при каждом запуске, а если такой файл уже существует, записи журналов добавляются в него. Для очистки старых файлов журналов см. описание параметров logRetentionMinutes и logFilesRemovalLimit. Параметр tracingLevel по умолчанию регистрирует не слишком многое. Изменение детализации может привести к тому, что журналы будут занимать слишком много места. Ошибка включает критический уровень, предупреждение включает ошибку, информационный уровень включает предупреждение, а подробный уровень включает информационный уровеньNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Срок хранения файлов журналов (в минутах) для серверных служб. По умолчанию задана 1 неделя.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Максимальное число старых файлов, удаляемых при запуске, с истекшим сроком mssql.logRetentionMinutes. Файлы, которые не были очищены из-за этого ограничения, очищаются при следующем запуске Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Дополнительно] Не показывать предупреждения для неподдерживаемых платформRecovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ Ценовая категорияCompatibility Level
@@ -168,11 +168,11 @@
Name (optional)
- Name (optional)
+ Имя (необязательно)Custom name of the connection
- Custom name of the connection
+ Настраиваемое имя подключенияServer
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Имя экземпляра SQL ServerServer Description
- Server Description
+ Описание сервераDescription of the SQL Server instance
- Description of the SQL Server instance
+ Описание экземпляра SQL ServerAuthentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Указывает способ проверки подлинности в SQL ServerSQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory — универсальный с поддержкой MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Указывает идентификатор пользователя, который необходимо использовать для подключения к источнику данныхPassword
@@ -224,107 +224,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Указывает пароль, который необходимо использовать для подключения к источнику данныхApplication intent
- Application intent
+ Намерение приложенияDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Объявляет тип рабочей нагрузки приложения при соединении с серверомAsynchronous processing
- Asynchronous processing
+ Асинхронная обработкаWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Когда задано значение true, разрешено использовать асинхронные функции в поставщике данных .NET FrameworkConnect timeout
- Connect timeout
+ Время ожидания подключенияThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Длительность (в секундах) ожидания при подключении к серверу, после чего попытка прекращается и выводится ошибкаCurrent language
- Current language
+ Текущий языкThe SQL Server language record name
- The SQL Server language record name
+ Имя записи языка SQL ServerColumn encryption
- Column encryption
+ Шифрование столбцовDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Параметр шифрования столбца по умолчанию для всех команд подключенияEncrypt
- Encrypt
+ ШифроватьWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Когда задано значение true, SQL Server использует шифрование SSL для всех данных, передаваемых между клиентом и сервером, если на сервере установлен сертификатPersist security info
- Persist security info
+ Сохранение сведений о безопасностиWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Если задано значение false, то секретные данные (например, пароль) не возвращаются в составе подключенияTrust server certificate
- Trust server certificate
+ Доверять сертификату сервераWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Когда задано значение true (и encrypt=true), SQL Server использует шифрование SSL для всех данных, передаваемых между клиентом и сервером без проверки сертификата сервераAttached DB file name
- Attached DB file name
+ Имя вложенного файла базы данныхThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Имя первичного файла прикрепляемой базы данных, включая полный путьContext connection
- Context connection
+ Контекстное подключениеWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Если задано значение true, указывает, что подключение должно быть произведено в контексте SQL Server. Доступно только при выполнении в процессе SQL ServerPort
- Port
+ ПортConnect retry count
- Connect retry count
+ Счетчик повторных попыток для подключенияNumber of attempts to restore connection
- Number of attempts to restore connection
+ Число попыток восстановления подключенияConnect retry interval
- Connect retry interval
+ Интервал повторных попыток для подключенияDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Задержка между попытками восстановления подключенияApplication name
@@ -332,47 +332,47 @@
The name of the application
- The name of the application
+ Имя приложенияWorkstation Id
- Workstation Id
+ Идентификатор рабочей станцииThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Имя рабочей станции, подключающейся к SQL ServerPooling
- Pooling
+ Объединение в пулWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Если задано значение true, объект соединения извлекается из соответствующего пула или при необходимости создается и добавляется в соответствующий пулMax pool size
- Max pool size
+ Максимальный размер пулаThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Максимально допустимое число подключений в пулеMin pool size
- Min pool size
+ Минимальный размер пулаThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Минимально допустимое число подключений в пулеLoad balance timeout
- Load balance timeout
+ Время ожидания при балансировке нагрузкиThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Минимальное время (в секундах), которое это подключение будет оставаться в пуле до уничтоженияReplication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Используется SQL Server при репликацииAttach DB filename
- Attach DB filename
+ Имя вложенного файла базы данныхFailover partner
- Failover partner
+ Партнер по обеспечению отработки отказаThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Имя или сетевой адрес экземпляра SQL Server, выступающего в роли партнера по обеспечению отработки отказаMulti subnet failover
- Multi subnet failover
+ Отработка отказа в нескольких подсетяхMultiple active result sets
- Multiple active result sets
+ Множественные активные результирующие наборыWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Если задано значение true, из одного подключения может быть возвращено и считано несколько результирующих наборовPacket size
- Packet size
+ Размер пакетаSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Размер (в байтах) сетевых пакетов, которые используются для взаимодействия с экземпляром SQL ServerType system version
- Type system version
+ Версия системы типовIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Указывает, какую систему типов сервера поставщик предоставит через DataReader
@@ -436,11 +436,11 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ Центральный сервер управления {0} не найден или отключенNo resources found
- No resources found
+ Ресурсы не найдены
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ Добавить центральный сервер управления...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ Группа центральных серверов управления уже содержит зарегистрированный сервер с именем {0}Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ Не удалось добавить зарегистрированный сервер {0}Are you sure you want to delete
- Are you sure you want to delete
+ Вы действительно хотите удалитьYes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ Описание группы серверов{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} уже имеет группу серверов с именем {1}Are you sure you want to delete
- Are you sure you want to delete
+ Вы действительно хотите удалить
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ Вы не можете добавить общий зарегистрированный сервер с таким же именем, что и сервер конфигурации
diff --git a/resources/xlf/ru/dacpac.ru.xlf b/resources/xlf/ru/dacpac.ru.xlf
index b2a4ac5b3b..1bf2527f46 100644
--- a/resources/xlf/ru/dacpac.ru.xlf
+++ b/resources/xlf/ru/dacpac.ru.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ Вы можете просмотреть состояние создания сценариев в представлении задач после закрытия мастера. После завершения созданный сценарий откроется.Generating deploy plan failed '{0}'
diff --git a/resources/xlf/ru/import.ru.xlf b/resources/xlf/ru/import.ru.xlf
index 4b570a6f7e..5a7ff751a8 100644
--- a/resources/xlf/ru/import.ru.xlf
+++ b/resources/xlf/ru/import.ru.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ Эта операция не была выполнена. Попробуйте использовать другой входной файл.Refresh
diff --git a/resources/xlf/ru/mssql.ru.xlf b/resources/xlf/ru/mssql.ru.xlf
index 12a57e4945..ec73d74251 100644
--- a/resources/xlf/ru/mssql.ru.xlf
+++ b/resources/xlf/ru/mssql.ru.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ Отправить файлыNew directory
- New directory
+ Создать каталогDelete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ Создать записную книжкуOpen Notebook
- Open Notebook
+ Открыть записную книжкуTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ Задачи и сведения о вашем кластере больших данных SQL ServerSQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ Отправить задание SparkNew Spark Job
- New Spark Job
+ Создать задание SparkView Spark History
- View Spark History
+ Просмотреть журнал SparkView Yarn History
- View Yarn History
+ Просмотреть журнал YARNTasks
@@ -88,75 +88,75 @@
Install Packages
- Install Packages
+ Установка пакетовConfigure Python for Notebooks
- Configure Python for Notebooks
+ Настройка Python для записных книжекCluster Status
- Cluster Status
+ Состояние кластераSearch: Servers
- Search: Servers
+ Поиск: СерверыSearch: Clear Search Server Results
- Search: Clear Search Server Results
+ Поиск: Очистить результаты поиска сервераService Endpoints
- Service Endpoints
+ Конечные точки службыMSSQL configuration
- MSSQL configuration
+ Конфигурация MSSQLShould BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
- Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
+ Нужно ли отображать столбцы BIT как числа (1 или 0)? Если задано значение false, столбцы BIT будут отображаться как "true" или "false"Should column definitions be aligned?
- Should column definitions be aligned?
+ Должны ли определения столбцов быть выровнены?Should data types be formatted as UPPERCASE, lowercase, or none (not formatted)
- Should data types be formatted as UPPERCASE, lowercase, or none (not formatted)
+ Следует ли форматировать типы данных в верхнем регистре, нижнем регистре или оставить без форматирования ("нет")Should keywords be formatted as UPPERCASE, lowercase, or none (not formatted)
- Should keywords be formatted as UPPERCASE, lowercase, or none (not formatted)
+ Следует ли форматировать ключевые слова в верхнем регистре, нижнем регистре или оставить без форматирования ("нет")should commas be placed at the beginning of each statement in a list e.g. ', mycolumn2' instead of at the end e.g. 'mycolumn1,'
- should commas be placed at the beginning of each statement in a list e.g. ', mycolumn2' instead of at the end e.g. 'mycolumn1,'
+ нужно ли ставить запятые в начале каждого оператора в списке, например ", mycolumn2", а не в конце, например "mycolumn1,"Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines
- Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines
+ Нужно ли разделять на отдельные строки ссылки на объекты в операторах SELECT? Например, для "SELECT C1, C2 FROM T1" как C1, так и C2 будут находиться на отдельных строках[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [Дополнительно] Выведите выходные данные отладки в консоль (Вид -> Вывод), а затем выберите подходящий выходной канал в раскрывающемся списке[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [Дополнительно] Уровень ведения журнала для серверных служб. Azure Data Studio создает имя файла при каждом запуске, а если такой файл уже существует, записи журналов добавляются в него. Для очистки старых файлов журналов см. описание параметров logRetentionMinutes и logFilesRemovalLimit. Параметр tracingLevel по умолчанию регистрирует не слишком многое. Изменение детализации может привести к тому, что журналы будут занимать слишком много места. Ошибка включает критический уровень, предупреждение включает ошибку, информационный уровень включает предупреждение, а подробный уровень включает информационный уровеньNumber of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ Срок хранения файлов журналов (в минутах) для серверных служб. По умолчанию задана 1 неделя.Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ Максимальное число старых файлов, удаляемых при запуске, с истекшим сроком mssql.logRetentionMinutes. Файлы, которые не были очищены из-за этого ограничения, очищаются при следующем запуске Azure Data Studio.[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [Дополнительно] Не показывать предупреждения для неподдерживаемых платформRecovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ Ценовая категорияCompatibility Level
@@ -224,11 +224,11 @@
Name (optional)
- Name (optional)
+ Имя (необязательно)Custom name of the connection
- Custom name of the connection
+ Настраиваемое имя подключенияServer
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ Имя экземпляра SQL ServerDatabase
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ Имя исходного каталога или базы данных в источнике данныхAuthentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ Указывает способ проверки подлинности в SQL ServerSQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory — универсальный с поддержкой MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ Указывает идентификатор пользователя, который необходимо использовать для подключения к источнику данныхPassword
@@ -280,107 +280,107 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ Указывает пароль, который необходимо использовать для подключения к источнику данныхApplication intent
- Application intent
+ Намерение приложенияDeclares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ Объявляет тип рабочей нагрузки приложения при соединении с серверомAsynchronous processing
- Asynchronous processing
+ Асинхронная обработкаWhen true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ Когда задано значение true, разрешено использовать асинхронные функции в поставщике данных .NET FrameworkConnect timeout
- Connect timeout
+ Время ожидания подключенияThe length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ Длительность (в секундах) ожидания при подключении к серверу, после чего попытка прекращается и выводится ошибкаCurrent language
- Current language
+ Текущий языкThe SQL Server language record name
- The SQL Server language record name
+ Имя записи языка SQL ServerColumn encryption
- Column encryption
+ Шифрование столбцовDefault column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ Параметр шифрования столбца по умолчанию для всех команд подключенияEncrypt
- Encrypt
+ ШифроватьWhen true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ Когда задано значение true, SQL Server использует шифрование SSL для всех данных, передаваемых между клиентом и сервером, если на сервере установлен сертификатPersist security info
- Persist security info
+ Сохранение сведений о безопасностиWhen false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ Если задано значение false, то секретные данные (например, пароль) не возвращаются в составе подключенияTrust server certificate
- Trust server certificate
+ Доверять сертификату сервераWhen true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ Когда задано значение true (и encrypt=true), SQL Server использует шифрование SSL для всех данных, передаваемых между клиентом и сервером без проверки сертификата сервераAttached DB file name
- Attached DB file name
+ Имя файла прикрепляемой базы данныхThe name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ Имя первичного файла прикрепляемой базы данных, включая полный путьContext connection
- Context connection
+ Контекстное подключениеWhen true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ Если задано значение true, указывает, что подключение должно быть произведено в контексте SQL Server. Доступно только при выполнении в процессе SQL ServerPort
- Port
+ ПортConnect retry count
- Connect retry count
+ Число повторных попыток подключенияNumber of attempts to restore connection
- Number of attempts to restore connection
+ Число попыток восстановления подключенияConnect retry interval
- Connect retry interval
+ Интервал повторных попыток для подключенияDelay between attempts to restore connection
- Delay between attempts to restore connection
+ Задержка между попытками восстановления подключенияApplication name
@@ -388,47 +388,47 @@
The name of the application
- The name of the application
+ Имя приложенияWorkstation Id
- Workstation Id
+ Идентификатор рабочей станцииThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ Имя рабочей станции, подключающейся к SQL ServerPooling
- Pooling
+ Объединение в пулWhen true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ Если задано значение true, объект соединения извлекается из соответствующего пула или при необходимости создается и добавляется в соответствующий пулMax pool size
- Max pool size
+ Максимальный размер пулаThe maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ Максимально допустимое число подключений в пулеMin pool size
- Min pool size
+ Минимальный размер пулаThe minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ Минимально допустимое число подключений в пулеLoad balance timeout
- Load balance timeout
+ Время ожидания при балансировке нагрузкиThe minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ Минимальное время (в секундах), которое это подключение будет оставаться в пуле до уничтоженияReplication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ Используется SQL Server при репликацииAttach DB filename
- Attach DB filename
+ Имя файла прикрепляемой базы данныхFailover partner
- Failover partner
+ Партнер по обеспечению отработки отказаThe name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ Имя или сетевой адрес экземпляра SQL Server, выступающего в роли партнера по обеспечению отработки отказаMulti subnet failover
- Multi subnet failover
+ Отработка отказа в нескольких подсетяхMultiple active result sets
- Multiple active result sets
+ Множественные активные результирующие наборыWhen true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ Если задано значение true, из одного подключения может быть возвращено и считано несколько результирующих наборовPacket size
- Packet size
+ Размер пакетаSize in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ Размер (в байтах) сетевых пакетов, которые используются для взаимодействия с экземпляром SQL ServerType system version
- Type system version
+ Версия системы типовIndicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ Указывает, какую серверную систему типов поставщик предоставит через DataReader
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ Идентификатор пакета задания Spark не возвращен из ответа.{0}[Ошибка] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ Журнал не возвращен в ответе.{0}[Ошибка] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ Недопустимые параметры для SparkJobSubmissionModelsubmissionArgs is invalid.
- submissionArgs is invalid.
+ Недопустимый submissionArgs.livyBatchId is invalid.
- livyBatchId is invalid.
+ Недопустимый livyBatchId.Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ Истекло время ожидания при получении идентификатора приложения. {0}[Журнал] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ Не указано свойство localFilePath или hdfsFolderPath.Property Path is not specified.
- Property Path is not specified.
+ Не указано свойство Path.
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ Недопустимые параметры для SparkJobSubmissionDialogNew Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ Отправить{0} Spark Job Submission:
- {0} Spark Job Submission:
+ Отправка задания Spark {0}:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... Начало отправки задания Spark ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ Введите имя...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Кластер SparkPath to a .jar or .py file
- Path to a .jar or .py file
+ Путь к файлу JAR или PYThe selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ Выбранный локальный файл будет отправлен в HDFS: {0}JAR/py File
- JAR/py File
+ Файл JAR/pyMain Class
- Main Class
+ Класс MainArguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ Аргументы командной строки, используемые в классе main; несколько аргументов следует разделять пробелом.Property Job Name is not specified.
- Property Job Name is not specified.
+ Не указано свойство "Имя задания".Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ Не указано свойство "Файл JAR/py".Property Main Class is not specified.
- Property Main Class is not specified.
+ Не указано свойство "Класс Main".{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} не существует в кластере, или возникло исключение.The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ Указанный файл HDFS не существует. Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ Не удалось обнаружить файл из-за ошибки: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ JAR-файлы ссылокJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ JAR-файлы, помещаемые в рабочий каталог исполнителя. Путь к JAR-файлу должен быть путем HDFS. Несколько путей следует разделять точкой с запятой (;)Reference py Files
- Reference py Files
+ PY-файлы ссылокPy Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Файлы PY, помещаемые в рабочий каталог исполнителя. Путь к файлу должен быть путем HDFS. Несколько путей следует разделять точкой с запятой (;)Reference Files
- Reference Files
+ Файлы ссылокFiles to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ Файлы, помещаемые в рабочий каталог исполнителя. Путь к файлу должен быть путем HDFS. Несколько путей следует разделять точкой с запятой (;)Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ Выберите SQL Server с кластером больших данных.No Sql Server is selected.
- No Sql Server is selected.
+ SQL Server не выбран.Error Get File Path: {0}
- Error Get File Path: {0}
+ Ошибка при получении пути к файлу: {0}Invalid Data Structure
- Invalid Data Structure
+ Недопустимая структура данныхUnable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ Не удалось создать клиент WebHDFS из-за отсутствующих параметров: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ "${0}" не определен.Bad Request
- Bad Request
+ Неправильный запросUnauthorized
- Unauthorized
+ Не авторизованоForbidden
- Forbidden
+ ЗапрещеноNot Found
@@ -724,7 +724,7 @@
Internal Server Error
- Internal Server Error
+ Внутренняя ошибка сервераUnknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ Неожиданное перенаправлениеPlease provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ Укажите пароль для подключения к HDFS:Session for node {0} does not exist
- Session for node {0} does not exist
+ Сеанс для узла {0} не существуетError notifying of node change: {0}
- Error notifying of node change: {0}
+ Ошибка при уведомлении об изменении узла: {0}Root
- Root
+ КореньHDFS
- HDFS
+ HDFSData Services
- Data Services
+ Службы данныхNOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ Уведомление. Этот файл был усечен в позиции {0} для предварительного просмотра.The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ Файл был усечен в позиции {0} для предварительного просмотра.ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo не определен.ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ ConnectionInfo.options не определен.Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ Отсутствуют некоторые свойства в connectionInfo.options: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ Действие {0} не поддерживается для этого обработчикаCannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ Не удается открыть ссылку {0}, так как поддерживаются только ссылки HTTP и HTTPSDownload and open '{0}'?
- Download and open '{0}'?
+ Скачать и открыть "{0}"?Could not find the specified file
- Could not find the specified file
+ Не удалось найти указанный файлFile open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ Запрос на открытие файла завершился с ошибкой: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ Ошибка при остановке сервера Notebook: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ Процесс Notebook преждевременно завершил работу с ошибкой: {0}, выходные данные StdErr: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ Ошибка, отправленная из Jupyter: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter выполняется в {0}... Starting Notebook server
- ... Starting Notebook server
+ ... Выполняется запуск сервера NotebookUnexpected setting type {0}
- Unexpected setting type {0}
+ Непредвиденный тип параметра {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ Не удается запустить сеанс: диспетчер еще не инициализированSpark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Ядрам Spark требуется подключение к главному экземпляру кластера больших данных SQL Server.Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ Сбой при завершении работы сервера Notebook: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ Выполняется установка зависимостей NotebookPython download is complete
- Python download is complete
+ Скачивание Python завершеноError while downloading python setup
- Error while downloading python setup
+ Ошибка при скачивании программы установки PythonDownloading python package
- Downloading python package
+ Идет скачивание пакета PythonUnpacking python package
- Unpacking python package
+ Распаковка пакета PythonError while creating python installation directory
- Error while creating python installation directory
+ Ошибка при создании каталога установки PythonError while unpacking python bundle
- Error while unpacking python bundle
+ Ошибка при распаковке пакета PythonInstalling Notebook dependencies
- Installing Notebook dependencies
+ Идет установка зависимостей NotebookInstalling Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ Установка зависимостей Notebook; дополнительные сведения см. в представлении задачNotebook dependencies installation is complete
- Notebook dependencies installation is complete
+ Установка зависимостей Notebook завершенаCannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ Не удается перезаписать существующую установку Python, пока Python выполняется.Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ Уже выполняется другая установка Python.Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python уже существует в указанном расположении. Установка пропускается.Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ Установка зависимостей Notebook завершилась с ошибкой: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ Скачивание локального Python для платформы: {0} в {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ Установка требуемых пакетов для запуска записных книжек...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Установка Jupyter завершена.Installing SparkMagic...
- Installing SparkMagic...
+ Установка SparkMagic...A notebook path is required
- A notebook path is required
+ Требуется путь к записной книжкеNotebooks
- Notebooks
+ Записные книжкиOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ Поддерживаются только записные книжки .ipynbAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ Вы действительно хотите переустановить?Configure Python for Notebooks
- Configure Python for Notebooks
+ Настройка Python для записных книжекInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Расположение установки PythonSelect
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ Эта установка займет некоторое время. Рекомендуется не закрывать приложение до завершения установки.The specified install location is invalid.
- The specified install location is invalid.
+ Указано недопустимое расположение установки.No python installation was found at the specified location.
- No python installation was found at the specified location.
+ В указанном расположении установка Python не найдена.Python installation was declined.
- Python installation was declined.
+ Установка Python была отклонена.Installation Type
- Installation Type
+ Тип установкиNew Python installation
- New Python installation
+ Новая установка PythonUse existing Python installation
- Use existing Python installation
+ Использовать существующую установку PythonOpen file {0} failed: {1}
- Open file {0} failed: {1}
+ Сбой при открытии файла {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Сбой при открытии файла {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ Не удалось открыть файл {0}: {1}Missing file : {0}
- Missing file : {0}
+ Отсутствует файл: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ Этот пример кода загружает файл в кадр данных и отображает первые 10 результатов.No notebook editor is active
- No notebook editor is active
+ Нет активного редактора записных книжекCode
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ Какой тип ячейки требуется добавить?Notebooks
- Notebooks
+ Записные книжкиSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Расширение развертывания SQL Server для Azure Data StudioProvides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ Обеспечивает развертывание Microsoft SQL Server на основе записных книжекDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ Развернуть SQL Server в Docker…Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ Развернуть кластер больших данных SQL Server…Deploy SQL Server…
- Deploy SQL Server…
+ Развернуть SQL Server…Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ Образ контейнера SQL ServerRun SQL Server container image with Docker
- Run SQL Server container image with Docker
+ Запустить образ контейнера SQL Server с помощью DockerSQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ Кластер больших данных SQL Server позволяет развертывать масштабируемые кластеры контейнеров SQL Server, Spark и HDFS, работающие на базе KubernetesVersion
@@ -52,23 +52,23 @@
SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ Кластер больших данных SQL Server 2019 CTP 3.1Deployment target
- Deployment target
+ Целевой объект развертыванияNew Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ Программа командной строки, написанная на Python, которая позволяет администраторам кластера осуществлять начальную загрузку кластера больших данных и управление им с помощью REST APImssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ Программа командной строки, позволяющая выполнять команды для кластеров Kuberneteskubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ Позволяет упаковать и запустить приложение в изолированных контейнерахDocker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ Программа командной строки для управления ресурсами AzureAzure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ Не удалось найти package.json, либо не задано имя или издатель
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ Записная книжка {0} не существует
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ Выберите параметры развертыванияOpen Notebook
- Open Notebook
+ Открыть NotebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ Сбой при загрузке расширения: {0}, обнаружена ошибка в определении типа ресурса в package.json, дополнительные сведения см. в консоли отладки.The resource type: {0} is not defined
- The resource type: {0} is not defined
+ Тип ресурса: {0} не определен
diff --git a/resources/xlf/ru/schema-compare.ru.xlf b/resources/xlf/ru/schema-compare.ru.xlf
index d0cbe95f73..3c5bf75bdf 100644
--- a/resources/xlf/ru/schema-compare.ru.xlf
+++ b/resources/xlf/ru/schema-compare.ru.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ Сравнение схем SQL ServerSQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Сравнение схем SQL Server для Azure Data Studio поддерживает сравнение схем баз данных и файлов DACPAC.Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ Параметры изменились. Выполнить повторное сравнение для просмотра его результатов?Schema Compare Options
@@ -52,311 +52,311 @@
Include Object Types
- Include Object Types
+ Включить типы объектовIgnore Table Options
- Ignore Table Options
+ Игнорировать параметры таблицыIgnore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ Игнорировать точки с запятой между операторамиIgnore Route Lifetime
- Ignore Route Lifetime
+ Игнорировать время существования маршрутаIgnore Role Membership
- Ignore Role Membership
+ Игнорировать членство в ролиIgnore Quoted Identifiers
- Ignore Quoted Identifiers
+ Игнорировать идентификаторы в кавычкахIgnore Permissions
- Ignore Permissions
+ Игнорировать разрешенияIgnore Partition Schemes
- Ignore Partition Schemes
+ Игнорировать схемы секционированияIgnore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ Пропускать размещение объекта в схеме секционированияIgnore Not For Replication
- Ignore Not For Replication
+ Игнорировать указание "Не для репликации"Ignore Login Sids
- Ignore Login Sids
+ Игнорировать идентификаторы SID имени входаIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ Игнорировать указания блокировки для индексовIgnore Keyword Casing
- Ignore Keyword Casing
+ Игнорировать регистр ключевых словIgnore Index Padding
- Ignore Index Padding
+ Игнорировать заполнение индексаIgnore Index Options
- Ignore Index Options
+ Игнорировать параметры индексаIgnore Increment
- Ignore Increment
+ Игнорировать приращениеIgnore Identity Seed
- Ignore Identity Seed
+ Игнорировать начальное значение IDENTITYIgnore User Settings Objects
- Ignore User Settings Objects
+ Игнорировать объекты параметров пользователейIgnore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ Игнорировать путь к файлу полнотекстового каталогаIgnore Whitespace
- Ignore Whitespace
+ Игнорировать пробелыIgnore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ Игнорировать со значением Nocheck для ForeignKeysVerify Collation Compatibility
- Verify Collation Compatibility
+ Проверить совместимость параметров сортировкиUnmodifiable Object Warnings
- Unmodifiable Object Warnings
+ Предупреждения о невозможности изменения объектаTreat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ Рассматривать ошибки проверок как предупрежденияScript Refresh Module
- Script Refresh Module
+ Включать в сценарий обновление модулейScript New Constraint Validation
- Script New Constraint Validation
+ Включать в сценарий проверку новых ограниченийScript File Size
- Script File Size
+ Включать в сценарий размер файлаScript Deploy StateChecks
- Script Deploy StateChecks
+ Включать в сценарий проверки состояния развертыванияScript Database Options
- Script Database Options
+ Включать в сценарий параметры базы данныхScript Database Compatibility
- Script Database Compatibility
+ Включать в сценарий совместимость базы данныхScript Database Collation
- Script Database Collation
+ Включать в сценарий параметры сортировки базы данныхRun Deployment Plan Executors
- Run Deployment Plan Executors
+ Запустить исполнители плана развертыванияRegister DataTier Application
- Register DataTier Application
+ Зарегистрировать приложение уровня данныхPopulate Files On File Groups
- Populate Files On File Groups
+ Заполнять файлы в файловых группахNo Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ Не использовать инструкции ALTER для изменения типов CLRInclude Transactional Scripts
- Include Transactional Scripts
+ Включить транзакционные сценарииInclude Composite Objects
- Include Composite Objects
+ Включить составные объектыAllow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ Разрешить небезопасное перемещение данных безопасности на уровне строкIgnore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ Игнорировать со значением "Без проверки" для параметра "Проверочные ограничения"Ignore Fill Factor
- Ignore Fill Factor
+ Игнорировать коэффициент заполненияIgnore File Size
- Ignore File Size
+ Игнорировать размер файлаIgnore Filegroup Placement
- Ignore Filegroup Placement
+ Игнорировать размещение файловой группыDo Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ Не изменять реплицированные объектыDo Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ Не изменять объекты отслеживания измененных данныхDisable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ Отключить и снова включить триггеры DDLDeploy Database In Single User Mode
- Deploy Database In Single User Mode
+ Развернуть базу данных в однопользовательском режимеCreate New Database
- Create New Database
+ Создать базу данныхCompare Using Target Collation
- Compare Using Target Collation
+ Сравнивать с помощью параметров сортировки целевого объектаComment Out Set Var Declarations
- Comment Out Set Var Declarations
+ Закомментировать объявления заданных переменныхBlock When Drift Detected
- Block When Drift Detected
+ Блокировать при обнаружении смещенияBlock On Possible Data Loss
- Block On Possible Data Loss
+ Блокировать при возможной потере данныхBackup Database Before Changes
- Backup Database Before Changes
+ Создать резервную копию базы данных перед изменениемAllow Incompatible Platform
- Allow Incompatible Platform
+ Разрешить несовместимые платформыAllow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ Разрешить удаление блокирующих сборокDrop Constraints Not In Source
- Drop Constraints Not In Source
+ Удалить ограничения, отсутствующие в источникеDrop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ Удалить триггеры DML, отсутствующие в источникеDrop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ Удалить расширенные свойства, отсутствующие в источникеDrop Indexes Not In Source
- Drop Indexes Not In Source
+ Удалить индексы, отсутствующие в источникеIgnore File And Log File Path
- Ignore File And Log File Path
+ Игнорировать путь к файлу и файлу журналаIgnore Extended Properties
- Ignore Extended Properties
+ Игнорировать расширенные свойстваIgnore Dml Trigger State
- Ignore Dml Trigger State
+ Игнорировать состояние триггеров DMLIgnore Dml Trigger Order
- Ignore Dml Trigger Order
+ Игнорировать порядок триггеров DMLIgnore Default Schema
- Ignore Default Schema
+ Игнорировать схему по умолчаниюIgnore Ddl Trigger State
- Ignore Ddl Trigger State
+ Игнорировать состояние триггеров DDLIgnore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ Игнорировать порядок триггеров DDLIgnore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ Игнорировать путь к файлу поставщика шифрованияVerify Deployment
- Verify Deployment
+ Проверить развертываниеIgnore Comments
- Ignore Comments
+ Игнорировать комментарииIgnore Column Collation
- Ignore Column Collation
+ Игнорировать параметры сортировки столбцовIgnore Authorizer
- Ignore Authorizer
+ Игнорировать авторизаторIgnore AnsiNulls
- Ignore AnsiNulls
+ Игнорировать AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ Создать SmartDefaultsDrop Statistics Not In Source
- Drop Statistics Not In Source
+ Удалить статистику, отсутствующую в источникеDrop Role Members Not In Source
- Drop Role Members Not In Source
+ Удалить члены ролей, отсутствующие в источникеDrop Permissions Not In Source
- Drop Permissions Not In Source
+ Удалить разрешения, отсутствующие в источникеDrop Objects Not In Source
- Drop Objects Not In Source
+ Удалить объекты, отсутствующие в источникеIgnore Column Order
- Ignore Column Order
+ Игнорировать порядок столбцовAggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,7 +436,7 @@
File Tables
- File Tables
+ Таблицы файловFull Text Catalogs
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ Скалярные функцииSearch Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ Функции с табличным значениемUser Defined Data Types
- User Defined Data Types
+ Определяемые пользователем типы данныхUser Defined Table Types
- User Defined Table Types
+ Определяемые пользователем табличные типыClr User Defined Types
- Clr User Defined Types
+ Определяемые пользователем типы данных CLRUsers
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ Триггеры сервераSpecifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ Указывает, что при обнаружении различий в процессе публикации всегда следует удалять и повторно создавать сборку, а не использовать оператор ALTER ASSEMBLY.Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ Если задано значение true, то перед развертыванием база данных переводится в однопользовательский режим.Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ Этот параметр указывает, как обрабатывать параметры сортировки базы данных при развертывании. По умолчанию параметры сортировки целевой базы данных будут обновлены, если они не соответствуют аналогичным параметрам, указанным в источнике. Когда этот параметр задан, требуется использовать параметры сортировки целевой базы данных (или сервера).Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ Определяет, будут ли члены ролей, которые не определены в моментальном снимке базы данных (DACPAC), удаляться из целевой базы данных при публикации обновлений в базе данных.</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ Файл приложения уровня данных (DACPAC)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ Выбрана другая исходная схема. Выполнить сравнение для просмотра его результатов?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ Выбрана другая целевая схема. Выполнить сравнение для просмотра его результатов?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ Выбраны другие исходная и целевая схемы. Выполнить сравнение для просмотра его результатов?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ Сравнить сведенияAre you sure you want to update the target?
- Are you sure you want to update the target?
+ Вы действительно хотите обновить целевой объект?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ Нажмите "Сравнить", чтобы обновить сравнение.Generate script to deploy changes to target
- Generate script to deploy changes to target
+ Создать сценарий для развертывания изменений в целевом объектеNo changes to script
- No changes to script
+ Нет изменений в сценарииApply changes to target
- Apply changes to target
+ Применить изменения к целевому объектуNo changes to apply
- No changes to apply
+ Нет изменений для примененияDelete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ Инициализация сравнения. Это может занять некоторое время.To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ Чтобы сравнить две схемы, сначала выберите исходную и целевую схему, а затем нажмите кнопку "Сравнить".No schema differences were found.
- No schema differences were found.
+ Различия в схемах не найдены.Schema Compare failed: {0}
- Schema Compare failed: {0}
+ Сбой при сравнении схем: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ Создание сценария включено, когда целевой объект является базой данныхApply is enabled when the target is a database
- Apply is enabled when the target is a database
+ Применение включено, когда целевой объект является базой данныхCompare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ Сбой при отмене сравнения схем: "{0}"Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ Сбой при создании сценария: "{0}"Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ Сбой при применении сравнения схем "{0}"Switch direction
- Switch direction
+ Сменить направлениеSwitch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ Открыть файл SCMPLoad source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ Загрузить источник, целевой объект и параметры, сохраненные в файле SCMPOpen
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ Сбой при открытии файла SCMP: "{0}"Save .scmp file
- Save .scmp file
+ Сохранить файл SCMPSave source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ Сохранить источник, целевой объект, параметры и исключенные элементыSave
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ Сбой при сохранении файла SCMP: "{0}"
diff --git a/resources/xlf/zh-hans/admin-tool-ext-win.zh-Hans.xlf b/resources/xlf/zh-hans/admin-tool-ext-win.zh-Hans.xlf
index 77759364a2..0a43349d42 100644
--- a/resources/xlf/zh-hans/admin-tool-ext-win.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/admin-tool-ext-win.zh-Hans.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ 适用于 Windows 的数据库管理工具扩展Adds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ 向 Azure Data Studio 添加其他特定于 Windows 的功能Properties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ 生成脚本...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ 未为 handleLaunchSsmsMinPropertiesDialogCommand 提供 ConnectionContextCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ 无法基于 connectionContext 确定对象资源管理器节点: {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ 未为 handleLaunchSsmsMinPropertiesDialogCommand 提供 ConnectionContextNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ 未从 connectionContext 提供任何 connectionProfile: {0}Launching dialog...
- Launching dialog...
+ 正在启动对话框...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ 使用参数“{0}”调用 SsmsMin 时出错 - {1}
diff --git a/resources/xlf/zh-hans/agent.zh-Hans.xlf b/resources/xlf/zh-hans/agent.zh-Hans.xlf
index 3fcd8958dc..29b3688ea1 100644
--- a/resources/xlf/zh-hans/agent.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/agent.zh-Hans.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ SQL Server 集成服务包SQL Server Agent Service Account
diff --git a/resources/xlf/zh-hans/azurecore.zh-Hans.xlf b/resources/xlf/zh-hans/azurecore.zh-Hans.xlf
index 1612e8c1d8..fb2b6f0851 100644
--- a/resources/xlf/zh-hans/azurecore.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/azurecore.zh-Hans.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: 刷新所有帐户Refresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: 登录Select Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ 添加到服务器Clear Azure Account Token Cache
diff --git a/resources/xlf/zh-hans/cms.zh-Hans.xlf b/resources/xlf/zh-hans/cms.zh-Hans.xlf
index 96ee271a3e..2ab51d6426 100644
--- a/resources/xlf/zh-hans/cms.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/cms.zh-Hans.xlf
@@ -4,23 +4,23 @@
SQL Server Central Management Servers
- SQL Server Central Management Servers
+ SQL Server 中央管理服务器Support for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+ 支持管理 SQL Server 中央管理服务器Central Management Servers
- Central Management Servers
+ 中央管理服务器Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerCentral Management Servers
- Central Management Servers
+ 中央管理服务器Refresh
@@ -28,7 +28,7 @@
Refresh Server Group
- Refresh Server Group
+ 刷新服务器组Delete
@@ -36,7 +36,7 @@
New Server Registration...
- New Server Registration...
+ 新建服务器注册...Delete
@@ -44,11 +44,11 @@
New Server Group...
- New Server Group...
+ 新建服务器组...Add Central Management Server
- Add Central Management Server
+ 添加中央管理服务器Delete
@@ -56,7 +56,7 @@
MSSQL configuration
- MSSQL configuration
+ MSSQL 配置Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -84,23 +84,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [可选]将调试输出记录到控制台(查看 -> 输出),然后从下拉列表中选择相应的输出通道[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [可选]后端服务的日志级别。Azure Data Studio 在每次启动时都会生成文件名;如果文件已存在,日志条目将追加到该文件。有关旧日志文件的清理,请参阅 logRetentionMinutes 和 logFilesRemovalLimit 设置。默认跟踪级别不记录太多。更改详细级别可能导致过多的日志记录和磁盘空间要求。错误包括“严重”,警告包括“错误”,信息包括“警告”,而详细级别包括“信息”Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ 将后端服务的日志文件保留的时长(分钟数)。默认为 1 周。Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ 启动时要删除的已超过 mssql.logRetentionMinutes 的旧文件的最大数量。下次启动 Azure Data Studio 时,不清理由于此限制而未清理的文件。[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [可选]不显示不受支持的平台警告Recovery Model
@@ -144,7 +144,7 @@
Pricing Tier
- Pricing Tier
+ 定价层Compatibility Level
@@ -164,15 +164,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ 名称(可选)Custom name of the connection
- Custom name of the connection
+ 连接的自定义名称Server
@@ -180,15 +180,15 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server 实例的名称Server Description
- Server Description
+ 服务器描述Description of the SQL Server instance
- Description of the SQL Server instance
+ SQL Server 实例的说明Authentication type
@@ -196,7 +196,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ 指定使用 SQL Server 进行身份验证的方法SQL Login
@@ -208,7 +208,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - 通用且支持 MFAUser name
@@ -216,7 +216,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ 指示连接到数据源时使用的用户 IDPassword
@@ -224,155 +224,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ 指示连接到数据源时使用的密码Application intent
- Application intent
+ 应用意图Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ 连接到服务器时声明应用程序工作负载类型Asynchronous processing
- Asynchronous processing
+ 异步处理When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ 如果为 true,则允许在 .Net Framework 数据提供程序中使用异步功能Connect timeout
- Connect timeout
+ 连接超时The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ 在终止尝试并生成错误之前等待连接到服务器的时长(以秒为单位)Current language
- Current language
+ 当前语言The SQL Server language record name
- The SQL Server language record name
+ SQL Server 语言记录名称Column encryption
- Column encryption
+ 列加密Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 连接上所有命令的默认列加密设置Encrypt
- Encrypt
+ 加密When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ 如果为 true,且服务器安装了证书,则 SQL Server 对客户端与服务器之间发送的所有数据使用 SSL 加密Persist security info
- Persist security info
+ 持久安全信息When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ 当为 false 时,安全敏感信息(如密码)不作为连接的一部分返回Trust server certificate
- Trust server certificate
+ 信任服务器证书When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ 当为 true(且加密=true)时,SQL Server 对客户端与服务器之间发送的所有数据使用 SSL 加密,而无需验证服务器证书Attached DB file name
- Attached DB file name
+ 附加的数据库文件名The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 可附加的数据库的主文件的名称(包括完整路径名称)Context connection
- Context connection
+ 上下文连接When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ 如果为 true,则指示连接应来自 SQL Server 上下文。仅在 SQL Server 进程中运行时可用Port
- Port
+ 端口Connect retry count
- Connect retry count
+ 连接重试计数Number of attempts to restore connection
- Number of attempts to restore connection
+ 尝试恢复连接的次数Connect retry interval
- Connect retry interval
+ 连接重试间隔Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 尝试恢复连接之间的延迟Application name
- Application name
+ 应用程序名称The name of the application
- The name of the application
+ 应用程序的名称Workstation Id
- Workstation Id
+ 工作站 IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ 连接到 SQL Server 的工作站的名称Pooling
- Pooling
+ 池When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ 如果为 true,则从相应的池中提取连接对象;如有必要,将创建连接对象并将其添加到相应的池中Max pool size
- Max pool size
+ 最大池大小The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ 池中允许的最大连接数Min pool size
- Min pool size
+ 最小池大小The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ 池中允许的最小连接数Load balance timeout
- Load balance timeout
+ 负载均衡超时The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ 此连接在销毁前在池中生存的最短时间(以秒为单位)Replication
@@ -380,47 +380,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ 由 SQL Server 在复制中使用Attach DB filename
- Attach DB filename
+ 附加数据库文件名Failover partner
- Failover partner
+ 故障转移合作伙伴The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ 充当故障转移合作伙伴的 SQL Server 实例的名称或网络地址Multi subnet failover
- Multi subnet failover
+ 多子网故障转移Multiple active result sets
- Multiple active result sets
+ 多个活动结果集When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ 如果为 true,则可返回多个结果集并从一个连接读取Packet size
- Packet size
+ 数据包大小Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ 用于与 SQL Server 实例通信的网络数据包的大小(以字节为单位)Type system version
- Type system version
+ 类型系统版本Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ 指示提供程序将通过 DataReader 公开的服务器类型系统
@@ -436,7 +436,7 @@
The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
+ 找不到中央管理服务器 {0} 或它处于脱机状态No resources found
@@ -448,7 +448,7 @@
Add Central Management Server...
- Add Central Management Server...
+ 添加中央管理服务器...
@@ -456,15 +456,15 @@
Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
+ 中央管理服务器组已具有名称为 {0} 的注册服务器Could not add the Registered Server {0}
- Could not add the Registered Server {0}
+ 无法添加已注册的服务器 {0}Are you sure you want to delete
- Are you sure you want to delete
+ 是否确实要删除Yes
@@ -492,15 +492,15 @@
Server Group Description
- Server Group Description
+ 服务器组说明{0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+ {0} 已具有名称为 {1} 的服务器组Are you sure you want to delete
- Are you sure you want to delete
+ 是否确实要删除
@@ -508,7 +508,7 @@
You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+ 不能添加与配置服务器同名的共享注册服务器
diff --git a/resources/xlf/zh-hans/dacpac.zh-Hans.xlf b/resources/xlf/zh-hans/dacpac.zh-Hans.xlf
index d772a34502..3c3e9e827e 100644
--- a/resources/xlf/zh-hans/dacpac.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/dacpac.zh-Hans.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ 关闭向导后,可在任务视图中查看脚本生成的状态。生成的脚本将在完成后打开。Generating deploy plan failed '{0}'
diff --git a/resources/xlf/zh-hans/import.zh-Hans.xlf b/resources/xlf/zh-hans/import.zh-Hans.xlf
index 59b6465baa..27aff584bb 100644
--- a/resources/xlf/zh-hans/import.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/import.zh-Hans.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ 此操作不成功。请尝试其他输入文件。Refresh
diff --git a/resources/xlf/zh-hans/mssql.zh-Hans.xlf b/resources/xlf/zh-hans/mssql.zh-Hans.xlf
index d408c8da9b..38b0784d20 100644
--- a/resources/xlf/zh-hans/mssql.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/mssql.zh-Hans.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ 上传文件New directory
- New directory
+ 新目录Delete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ 新笔记本Open Notebook
- Open Notebook
+ 打开笔记本Tasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ 有关 SQL Server 大数据群集的任务和信息SQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ 提交 Spark 作业New Spark Job
- New Spark Job
+ 新的 Spark 作业View Spark History
- View Spark History
+ 查看 Spark 历史记录View Yarn History
- View Yarn History
+ 查看 YARN 历史记录Tasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ 安装包Configure Python for Notebooks
- Configure Python for Notebooks
+ 为笔记本配置 PythonCluster Status
- Cluster Status
+ 群集状态Search: Servers
- Search: Servers
+ 搜索: 服务器Search: Clear Search Server Results
- Search: Clear Search Server Results
+ 搜索: 清除搜索服务器结果Service Endpoints
- Service Endpoints
+ 服务终结点MSSQL configuration
- MSSQL configuration
+ MSSQL 配置Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [可选]将调试输出记录到控制台(查看 -> 输出),然后从下拉列表中选择相应的输出通道[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [可选]后端服务的日志级别。Azure Data Studio 在每次启动时都会生成文件名;如果文件已存在,则日志条目将追加到该文件。有关旧日志文件的清理,请参阅 logRetentionMinutes 和 logFilesRemovalLimit 设置。默认跟踪级别不记录太多。更改详细级别可能导致过多的日志记录和磁盘空间要求。错误包括“严重”,警告包括“错误”,信息包括“警告”,而详细级别包括“信息”Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ 将后端服务的日志文件保留的时长(分钟数)。默认为 1 周。Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ 启动时要删除的已超过 mssql.logRetentionMinutes 的旧文件的最大数量。下次启动 Azure Data Studio 时,不清理由于此限制而未清理的文件。[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [可选]不显示不受支持的平台警告Recovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ 定价层Compatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ 名称(可选)Custom name of the connection
- Custom name of the connection
+ 连接的自定义名称Server
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server 实例的名称Database
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ 数据源中初始目录或数据库的名称Authentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ 指定使用 SQL Server 进行身份验证的方法SQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ Azure Active Directory - 通用且支持 MFAUser name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ 指示连接到数据源时使用的用户 IDPassword
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ 指示连接到数据源时使用的密码Application intent
- Application intent
+ 应用意图Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ 连接到服务器时声明应用程序工作负载类型Asynchronous processing
- Asynchronous processing
+ 异步处理When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ 如果为 true,则允许在 .Net Framework 数据提供程序中使用异步功能Connect timeout
- Connect timeout
+ 连接超时The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ 在终止尝试并生成错误之前等待连接到服务器的时长(以秒为单位)Current language
- Current language
+ 当前语言The SQL Server language record name
- The SQL Server language record name
+ SQL Server 语言记录名称Column encryption
- Column encryption
+ 列加密Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 连接上所有命令的默认列加密设置Encrypt
- Encrypt
+ 加密When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ 如果为 true,且服务器安装了证书,则 SQL Server 对客户端与服务器之间发送的所有数据使用 SSL 加密Persist security info
- Persist security info
+ 持久安全信息When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ 当为 false 时,安全敏感信息(如密码)不作为连接的一部分返回Trust server certificate
- Trust server certificate
+ 信任服务器证书When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ 当为 true(且加密=true)时,SQL Server 对客户端与服务器之间发送的所有数据使用 SSL 加密,而无需验证服务器证书Attached DB file name
- Attached DB file name
+ 附加的数据库文件名The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 可附加的数据库的主文件的名称(包括完整路径名称)Context connection
- Context connection
+ 上下文连接When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ 如果为 true,则指示连接应来自 SQL Server 上下文。仅在 SQL Server 进程中运行时可用Port
- Port
+ 端口Connect retry count
- Connect retry count
+ 连接重试计数Number of attempts to restore connection
- Number of attempts to restore connection
+ 尝试恢复连接的次数Connect retry interval
- Connect retry interval
+ 连接重试间隔Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 尝试恢复连接之间的延迟Application name
- Application name
+ 应用程序名称The name of the application
- The name of the application
+ 应用程序的名称Workstation Id
- Workstation Id
+ 工作站 IDThe name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ 连接到 SQL Server 的工作站的名称Pooling
- Pooling
+ 池When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ 如果为 true,则从相应的池中提取连接对象;如有必要,将创建连接对象并将其添加到相应的池中Max pool size
- Max pool size
+ 最大池大小The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ 池中允许的最大连接数Min pool size
- Min pool size
+ 最小池大小The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ 池中允许的最小连接数Load balance timeout
- Load balance timeout
+ 负载均衡超时The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ 此连接在销毁前在池中生存的最短时间(以秒为单位)Replication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ 由 SQL Server 在复制中使用Attach DB filename
- Attach DB filename
+ 附加数据库文件名Failover partner
- Failover partner
+ 故障转移合作伙伴The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ 充当故障转移合作伙伴的 SQL Server 实例的名称或网络地址Multi subnet failover
- Multi subnet failover
+ 多子网故障转移Multiple active result sets
- Multiple active result sets
+ 多个活动结果集When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ 如果为 true,则可返回多个结果集并从一个连接读取Packet size
- Packet size
+ 数据包大小Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ 用于与 SQL Server 实例通信的网络数据包的大小(以字节为单位)Type system version
- Type system version
+ 类型系统版本Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ 指示提供程序将通过 DataReader 公开的服务器类型系统
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ 未从响应中返回 Spark 作业批处理 ID。{0}[错误] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ 响应中未返回日志。{0}[错误] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ SparkJobSubmissionModel 的参数无效submissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs 无效。livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId 无效。Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ 获取应用程序 ID 超时。{0}[日志] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ 未指定 localFilePath 或 hdfsFolderPath 属性。Property Path is not specified.
- Property Path is not specified.
+ 未指定属性路径。
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ SparkJobSubmissionDialog 的参数无效New Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ 提交{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Spark 作业提交:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... 开始提交 Spark 作业 ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ 输入名称...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Spark 群集Path to a .jar or .py file
- Path to a .jar or .py file
+ .jar 或 .py 文件的路径The selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ 所选本地文件将上传到 HDFS: {0}JAR/py File
- JAR/py File
+ JAR/py 文件Main Class
- Main Class
+ 主类Arguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ 在主类中使用的命令行参数,多个参数应按空格隔开。Property Job Name is not specified.
- Property Job Name is not specified.
+ 未指定属性作业名。Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ 未指定属性 JAR/py 文件。Property Main Class is not specified.
- Property Main Class is not specified.
+ 未指定属性主类。{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ 群集中没有 {0} 或已引发异常。The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ 指定的 HDFS 文件不存在。Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ 由于错误导致查找文件时出错: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ 引用 JarJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ 要放置在执行器工作目录中的 Jar。Jar 路径必须是 HDFS 路径。多个路径应按分号(;)隔开Reference py Files
- Reference py Files
+ 引用 py 文件Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 要放置在执行器工作目录中的 Py 文件。文件路径必须是 HDFS 路径。多个路径应按分号(;)隔开Reference Files
- Reference Files
+ 引用文件Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 要放置在执行器工作目录中的文件。文件路径必须是 HDFS 路径。多个路径应按分号(;)隔开
@@ -656,15 +656,15 @@
Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ 请选择具有大数据群集的 SQL Server。No Sql Server is selected.
- No Sql Server is selected.
+ 未选择 SQL Server。Error Get File Path: {0}
- Error Get File Path: {0}
+ 获取文件路径时出错: {0}
@@ -676,7 +676,7 @@
Process exited with code {0}
- Process exited with code {0}
+ 进程已退出,代码为 {0}Invalid Data Structure
- Invalid Data Structure
+ 无效数据结构Unable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ 由于缺少选项,无法创建 WebHDFS 客户端: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ “${0}”未定义。Bad Request
- Bad Request
+ 请求错误Unauthorized
- Unauthorized
+ 未经授权Forbidden
- Forbidden
+ 已禁止Not Found
- Not Found
+ 未找到Internal Server Error
- Internal Server Error
+ 内部服务器错误Unknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ 意外重定向Please provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ 请提供连接到 HDFS 的密码:Session for node {0} does not exist
- Session for node {0} does not exist
+ 节点 {0} 的会话不存在Error notifying of node change: {0}
- Error notifying of node change: {0}
+ 通知节点更改时出错: {0}Root
- Root
+ 根HDFS
- HDFS
+ HDFSData Services
- Data Services
+ 数据服务NOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ 注意: 此文件已在 {0} 处截断以供预览。The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ 该文件已在 {0} 处截断以供预览。ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ ConnectionInfo 未定义。ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ 未定义 ConnectionInfo.optionsSome missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ connectionInfo.options 中缺少的一些属性: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ 此处理程序不支持操作 {0}Cannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ 无法打开链接 {0},因为仅支持 HTTP 和 HTTPS 链接Download and open '{0}'?
- Download and open '{0}'?
+ 下载并打开“{0}”?Could not find the specified file
- Could not find the specified file
+ 找不到指定的文件File open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ 文件打开请求失败,出现错误: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ 停止笔记本服务器时出错: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ 笔记本进程过早退出,出现错误: {0},StdErr 输出: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ 从 Jupyter 发送的错误: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter 正在 {0} 中运行... Starting Notebook server
- ... Starting Notebook server
+ ...正在启动笔记本服务器Unexpected setting type {0}
- Unexpected setting type {0}
+ 意外设置类型 {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ 无法启动会话,管理器尚未初始化Spark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Spark 内核需要连接到 SQL Server 大数据群集主实例。Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ 未能关闭笔记本服务器: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ 正在安装笔记本依赖项Python download is complete
- Python download is complete
+ Python 下载完毕Error while downloading python setup
- Error while downloading python setup
+ 下载 python 安装程序时出错Downloading python package
- Downloading python package
+ 正在下载 python 包Unpacking python package
- Unpacking python package
+ 正在解包 python 包Error while creating python installation directory
- Error while creating python installation directory
+ 创建 python 安装目录时出错Error while unpacking python bundle
- Error while unpacking python bundle
+ 解包 Python 捆绑包时出错Installing Notebook dependencies
- Installing Notebook dependencies
+ 正在安装笔记本依赖项Installing Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ 正在安装笔记本依赖项;有关详细信息,请参阅任务视图Notebook dependencies installation is complete
- Notebook dependencies installation is complete
+ 笔记本依赖项安装完成Cannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ 在 Python 运行时无法覆盖现有的 Python 安装。Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ 另一个 Python 安装正在进行中。Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ 特定位置已存在 Python。正在跳过安装。Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ 未能安装笔记本依赖项,错误: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ 正在为平台 {0} 将本地 python 下载到 {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ 正在安装运行笔记本所需的包...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Jupyter 安装完毕。Installing SparkMagic...
- Installing SparkMagic...
+ 正在安装 SparkMagic...A notebook path is required
- A notebook path is required
+ 需要笔记本路径Notebooks
- Notebooks
+ 笔记本Only .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ 仅支持 .ipynb 笔记本Are you sure you want to reinstall?
- Are you sure you want to reinstall?
+ 确定要重新安装吗?Configure Python for Notebooks
- Configure Python for Notebooks
+ 为笔记本配置 PythonInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Python 安装位置Select
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ 此安装需要一些时间。建议在安装完成之前不要关闭应用程序。The specified install location is invalid.
- The specified install location is invalid.
+ 指定的安装位置无效。No python installation was found at the specified location.
- No python installation was found at the specified location.
+ 在指定位置未找到 python 安装。Python installation was declined.
- Python installation was declined.
+ Python 安装被拒绝。Installation Type
- Installation Type
+ 安装类型New Python installation
- New Python installation
+ 新 Python 安装Use existing Python installation
- Use existing Python installation
+ 使用现有的 Python 安装Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 未能打开文件 {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 未能打开文件 {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 未能打开文件 {0}: {1}Missing file : {0}
- Missing file : {0}
+ 缺少文件: {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ 此示例代码将文件加载到数据帧中,并显示前 10 个结果。No notebook editor is active
- No notebook editor is active
+ 没有笔记本编辑器处于活动状态Code
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ 你要添加哪种类型的单元格?Notebooks
- Notebooks
+ 笔记本SQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ Azure Data Studio 的 SQL Server 部署扩展Provides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ 提供基于笔记本的体验来部署 Microsoft SQL ServerDeploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ 在 Docker 上部署 SQL Server...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ 部署 SQL Server 大数据群集...Deploy SQL Server…
- Deploy SQL Server…
+ 部署 SQL Server…Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ SQL Server 容器映像Run SQL Server container image with Docker
- Run SQL Server container image with Docker
+ 使用 Docker 运行 SQL Server 容器映像SQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ 借助 SQL Server 大数据群集,可部署在 Kubernetes 上运行的 SQL Server、Spark 和 HDFS 容器的可扩展群集Version
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ SQL Server 2019 大数据群集 CTP 3.1Deployment target
- Deployment target
+ 部署目标New Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ 一种用 Python 编写的命令行实用程序,使群集管理员能够通过 REST API 引导和管理大数据群集mssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ 可使用命令行工具针对 Kubernetes 群集运行命令kubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ 提供在隔离容器中打包和运行应用程序的能力Docker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ 用于管理 Azure 资源的命令行工具Azure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ 找不到 package.json 或未设置名称/发布者
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ 笔记本 {0} 不存在
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ 选择部署选项Open Notebook
- Open Notebook
+ 打开笔记本Tool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ 未能加载扩展: {0},在 package.json 的资源类型定义中检测到错误,请查看调试控制台了解详细信息。The resource type: {0} is not defined
- The resource type: {0} is not defined
+ 未定义资源类型: {0}
diff --git a/resources/xlf/zh-hans/schema-compare.zh-Hans.xlf b/resources/xlf/zh-hans/schema-compare.zh-Hans.xlf
index 1d6ab60b07..81294423f8 100644
--- a/resources/xlf/zh-hans/schema-compare.zh-Hans.xlf
+++ b/resources/xlf/zh-hans/schema-compare.zh-Hans.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server 架构比较SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Azure Data Studio 的 SQL Server 架构比较支持对数据库和 dacpacs 的架构进行比较。Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ 选项已更改。是否重新比较以查看比较结果?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ 常规选项Include Object Types
- Include Object Types
+ 包括对象类型Ignore Table Options
- Ignore Table Options
+ 忽略表选项Ignore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ 忽略语句之间的分号Ignore Route Lifetime
- Ignore Route Lifetime
+ 忽略路由生存期Ignore Role Membership
- Ignore Role Membership
+ 忽略角色成员资格Ignore Quoted Identifiers
- Ignore Quoted Identifiers
+ 忽略引用的标识符Ignore Permissions
- Ignore Permissions
+ 忽略权限Ignore Partition Schemes
- Ignore Partition Schemes
+ 忽略分区方案Ignore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ 忽略分区方案中的对象放置Ignore Not For Replication
- Ignore Not For Replication
+ 忽略不用于复制Ignore Login Sids
- Ignore Login Sids
+ 忽略登录 SIDIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ 忽略索引上的锁定提示Ignore Keyword Casing
- Ignore Keyword Casing
+ 忽略关键字大小写Ignore Index Padding
- Ignore Index Padding
+ 忽略索引填充Ignore Index Options
- Ignore Index Options
+ 忽略索引选项Ignore Increment
- Ignore Increment
+ 忽略增量Ignore Identity Seed
- Ignore Identity Seed
+ 忽略标识种子Ignore User Settings Objects
- Ignore User Settings Objects
+ 忽略用户设置对象Ignore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ 忽略全文目录文件路径Ignore Whitespace
- Ignore Whitespace
+ 忽略空格Ignore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ 忽略外键上的 WITH NOCHECKVerify Collation Compatibility
- Verify Collation Compatibility
+ 验证排序规则兼容性Unmodifiable Object Warnings
- Unmodifiable Object Warnings
+ 不可修改的对象警告Treat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ 将验证错误视为警告Script Refresh Module
- Script Refresh Module
+ 脚本刷新模块Script New Constraint Validation
- Script New Constraint Validation
+ 脚本新约束验证Script File Size
- Script File Size
+ 脚本文件大小Script Deploy StateChecks
- Script Deploy StateChecks
+ 脚本部署状态检查Script Database Options
- Script Database Options
+ 脚本数据库选项Script Database Compatibility
- Script Database Compatibility
+ 脚本数据库兼容性Script Database Collation
- Script Database Collation
+ 脚本数据库排序规则Run Deployment Plan Executors
- Run Deployment Plan Executors
+ 运行部署计划执行器Register DataTier Application
- Register DataTier Application
+ 注册数据层应用程序Populate Files On File Groups
- Populate Files On File Groups
+ 在文件组上填充文件No Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ 没有 Alter 语句来更改 Clr 类型Include Transactional Scripts
- Include Transactional Scripts
+ 包括事务脚本Include Composite Objects
- Include Composite Objects
+ 包括复合对象Allow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ 允许不安全行级安全数据移动Ignore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ 忽略检查约束上的 WITH NOCHECKIgnore Fill Factor
- Ignore Fill Factor
+ 忽略填充因子Ignore File Size
- Ignore File Size
+ 忽略文件大小Ignore Filegroup Placement
- Ignore Filegroup Placement
+ 忽略文件组放置Do Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ 不更改复制的对象Do Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ 不更改变更数据捕获对象Disable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ 禁用和重新启用 Ddl 触发器Deploy Database In Single User Mode
- Deploy Database In Single User Mode
+ 在单用户模式下部署数据库Create New Database
- Create New Database
+ 创建新数据库Compare Using Target Collation
- Compare Using Target Collation
+ 使用目标排序规则进行比较Comment Out Set Var Declarations
- Comment Out Set Var Declarations
+ 注释掉 Set Var 声明Block When Drift Detected
- Block When Drift Detected
+ 检测到漂移时阻止Block On Possible Data Loss
- Block On Possible Data Loss
+ 在可能发生数据丢失时阻止Backup Database Before Changes
- Backup Database Before Changes
+ 在更改前备份数据库Allow Incompatible Platform
- Allow Incompatible Platform
+ 允许不兼容的平台Allow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ 允许删除阻止程序集Drop Constraints Not In Source
- Drop Constraints Not In Source
+ 删除不在源中的约束Drop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ 删除未在源中的 Dml 触发器Drop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ 删除不在源中的扩展属性Drop Indexes Not In Source
- Drop Indexes Not In Source
+ 删除不在源中的索引Ignore File And Log File Path
- Ignore File And Log File Path
+ 忽略文件和日志文件路径Ignore Extended Properties
- Ignore Extended Properties
+ 忽略扩展属性Ignore Dml Trigger State
- Ignore Dml Trigger State
+ 忽略 Dml 触发状态Ignore Dml Trigger Order
- Ignore Dml Trigger Order
+ 忽略 Dml 触发顺序Ignore Default Schema
- Ignore Default Schema
+ 忽略默认架构Ignore Ddl Trigger State
- Ignore Ddl Trigger State
+ 忽略 Ddl 触发状态Ignore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ 忽略 Ddl 触发顺序Ignore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ 忽略加密提供程序文件路径Verify Deployment
- Verify Deployment
+ 验证部署Ignore Comments
- Ignore Comments
+ 忽略注释Ignore Column Collation
- Ignore Column Collation
+ 忽略列排序规则Ignore Authorizer
- Ignore Authorizer
+ 忽略授权者Ignore AnsiNulls
- Ignore AnsiNulls
+ 忽略 AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ 生成智能默认值Drop Statistics Not In Source
- Drop Statistics Not In Source
+ 删除未在源中的统计信息Drop Role Members Not In Source
- Drop Role Members Not In Source
+ 删除不在源中的角色成员Drop Permissions Not In Source
- Drop Permissions Not In Source
+ 删除未在源中的权限Drop Objects Not In Source
- Drop Objects Not In Source
+ 删除未在源中的对象Ignore Column Order
- Ignore Column Order
+ 忽略列顺序Aggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ 数据库触发器Defaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ 文件表Full Text Catalogs
- Full Text Catalogs
+ 全文目录Full Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ 标量值函数Search Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ 对称密钥Synonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ 表值函数User Defined Data Types
- User Defined Data Types
+ 用户定义的数据类型User Defined Table Types
- User Defined Table Types
+ 用户定义的表类型Clr User Defined Types
- Clr User Defined Types
+ Clr 用户定义的类型Users
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ 服务器触发器Specifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ 指定在存在差异时,发布应始终删除并重新创建程序集,而不是发出 ALTER ASSEMBLY 语句Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ 如果为 true,则在部署之前将数据库设置为“单用户模式”。Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ 此设置指示在部署期间如何处理数据库的排序规则;默认情况下,如果目标数据库的排序规则与源指定的排序规则不匹配,则该数据库的排序规则将更新。 设置此选项后,应使用目标数据库(或服务器)的排序规则。Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ 指定在将更新发布到数据库时,是否将从目标数据库中删除未在数据库快照(.dacpac)文件中定义的角色成员。</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ 数据层应用程序文件(.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ 已选择其他源架构。是否进行比较以查看比较结果?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ 已选择其他目标架构。是否进行比较以查看比较结果?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ 已选择其他源和目标架构。是否进行比较以查看比较结果?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ 比较详细信息Are you sure you want to update the target?
- Are you sure you want to update the target?
+ 确定要更新目标吗?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ 按“比较”以刷新比较结果。Generate script to deploy changes to target
- Generate script to deploy changes to target
+ 生成脚本以将更改部署到目标No changes to script
- No changes to script
+ 没有要编写到脚本的更改Apply changes to target
- Apply changes to target
+ 将更改应用于目标No changes to apply
- No changes to apply
+ 没有要应用的更改Delete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ 正在初始化比较。这可能需要一段时间。To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ 要比较两个架构,请先选择源架构和目标架构,然后按“比较”。No schema differences were found.
- No schema differences were found.
+ 未找到架构差异。Schema Compare failed: {0}
- Schema Compare failed: {0}
+ 架构比较失败: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ 当目标为数据库时启用生成脚本Apply is enabled when the target is a database
- Apply is enabled when the target is a database
+ 当目标为数据库时启用应用Compare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ 未能取消架构比较:“{0}”Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ 未能生成脚本:“{0}”Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ 架构比较应用失败:“{0}”Switch direction
- Switch direction
+ 切换方向Switch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ 打开 .scmp 文件Load source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ 加载保存在 .scmp 文件中的源、目标和选项Open
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ 未能打开 scmp:“{0}”Save .scmp file
- Save .scmp file
+ 保存 .scmp 文件Save source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ 保存源、目标、选项和所排除的元素Save
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ 未能保存 scmp:“{0}”
diff --git a/resources/xlf/zh-hant/admin-tool-ext-win.zh-Hant.xlf b/resources/xlf/zh-hant/admin-tool-ext-win.zh-Hant.xlf
index 04e648efd8..27c7019939 100644
--- a/resources/xlf/zh-hant/admin-tool-ext-win.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/admin-tool-ext-win.zh-Hant.xlf
@@ -4,11 +4,11 @@
Database Administration Tool Extensions for Windows
- Database Administration Tool Extensions for Windows
+ 適用於 Windows 的資料庫系統管理工具延伸模組Adds additional Windows-specific functionality to Azure Data Studio
- Adds additional Windows-specific functionality to Azure Data Studio
+ 將額外的 Windows 特定功能新增至 Azure Data StudioProperties
@@ -16,7 +16,7 @@
Generate Scripts...
- Generate Scripts...
+ 產生指令碼...
@@ -24,27 +24,27 @@
No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ 未為 handleLaunchSsmsMinPropertiesDialogCommand 提供任何 ConnectionContextCould not determine Object Explorer node from connectionContext : {0}
- Could not determine Object Explorer node from connectionContext : {0}
+ 無法判斷來自 connectionContext 的物件總管節點 : {0}No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
- No ConnectionContext provided for handleLaunchSsmsMinPropertiesDialogCommand
+ 未為 handleLaunchSsmsMinPropertiesDialogCommand 提供任何 ConnectionContextNo connectionProfile provided from connectionContext : {0}
- No connectionProfile provided from connectionContext : {0}
+ connectionContext 未提供任何 connectionProfile : {0}Launching dialog...
- Launching dialog...
+ 正在啟動對話方塊...Error calling SsmsMin with args '{0}' - {1}
- Error calling SsmsMin with args '{0}' - {1}
+ 以引數 '{0}' - {1} 呼叫 SsmsMin 時發生錯誤
diff --git a/resources/xlf/zh-hant/agent.zh-Hant.xlf b/resources/xlf/zh-hant/agent.zh-Hant.xlf
index 22e11dc921..fc6c07a3e4 100644
--- a/resources/xlf/zh-hant/agent.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/agent.zh-Hant.xlf
@@ -396,7 +396,7 @@
SQL Server Integration Service Package
- SQL Server Integration Service Package
+ SQL Server 整合服務套件SQL Server Agent Service Account
diff --git a/resources/xlf/zh-hant/azurecore.zh-Hant.xlf b/resources/xlf/zh-hant/azurecore.zh-Hant.xlf
index 063478820c..6f755bda92 100644
--- a/resources/xlf/zh-hant/azurecore.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/azurecore.zh-Hant.xlf
@@ -28,7 +28,7 @@
Azure: Refresh All Accounts
- Azure: Refresh All Accounts
+ Azure: 重新整理所有帳戶Refresh
@@ -36,7 +36,7 @@
Azure: Sign In
- Azure: Sign In
+ Azure: 登入Select Subscriptions
@@ -48,7 +48,7 @@
Add to Servers
- Add to Servers
+ 新增至伺服器Clear Azure Account Token Cache
@@ -136,7 +136,7 @@
No Resources found
- No Resources found
+ 找不到任何資源
diff --git a/resources/xlf/zh-hant/cms.zh-Hant.xlf b/resources/xlf/zh-hant/cms.zh-Hant.xlf
index cc5ea33e3c..6631075fac 100644
--- a/resources/xlf/zh-hant/cms.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/cms.zh-Hant.xlf
@@ -1,514 +1,536 @@
-
+
-
- SQL Server Central Management Servers
- SQL Server Central Management Servers
+
+ Summary
+ 摘要
-
- Support for managing SQL Server Central Management Servers
- Support for managing SQL Server Central Management Servers
+
+ Cluster type
+ 叢集類型
-
- Central Management Servers
- Central Management Servers
+
+ Cluster context
+ 叢集內容
-
- Microsoft SQL Server
- Microsoft SQL Server
+
+ Cluster name
+ 叢集名稱
-
- Central Management Servers
- Central Management Servers
+
+ Cluster Admin username
+ 叢集系統管理員使用者名稱
-
- Refresh
- 重新整理
+
+ Accept license agreement
+ 接受授權合約
-
- Refresh Server Group
- Refresh Server Group
+
+ Deployment profile
+ 部署設定檔
-
- Delete
- 刪除
+
+ SQL Server master scale
+ SQL Server 主機調整
-
- New Server Registration...
- New Server Registration...
+
+ Compute pool scale
+ 計算集區調整
-
- Delete
- 刪除
+
+ Data pool scale
+ 資料集區調整
-
- New Server Group...
- New Server Group...
+
+ Storage pool scale
+ 儲存體集區調整
-
- Add Central Management Server
- Add Central Management Server
+
+ Spark pool scale
+ Spark 集區調整
-
- Delete
- 刪除
+
+ TARGET CLUSTER
+ 目標叢集
-
- MSSQL configuration
- MSSQL configuration
+
+ SQL SERVER BIG DATA CLUSTER
+ SQL Server 巨量資料叢集
-
- Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
- BIT 欄位是否顯示為數值 (1或0)? 若為 false, BIT 欄位將會顯示為 'true' 或 'false'
-
-
- Should column definitions be aligned?
- 行定義是否一致?
-
-
- Should data types be formatted as UPPERCASE, lowercase, or none (not formatted)
- 是否轉換資料類型為大寫,小寫或無(不轉換)
-
-
- Should keywords be formatted as UPPERCASE, lowercase, or none (not formatted)
- 是否轉換關鍵字為大寫,小寫或無(不轉換)
-
-
- should commas be placed at the beginning of each statement in a list e.g. ', mycolumn2' instead of at the end e.g. 'mycolumn1,'
- 逗號是否放在 list 中每個語句的開頭,例如:", mycolumn2" 而非在結尾,例如:"mycolumn1,"
-
-
- Should references to objects in a select statements be split into separate lines? E.g. for 'SELECT C1, C2 FROM T1' both C1 and C2 will be on separate lines
- 在 select 敘述句中參考的物件是否分行? 如 'SELECT C1, C2 FROM T1' 中 C1 與C2 將會分行顯示
-
-
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
-
-
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
-
-
- Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
-
-
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
-
-
- [Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
-
-
- Recovery Model
- 復原模式
-
-
- Last Database Backup
- 上次資料庫備份
-
-
- Last Log Backup
- 上次日誌備份
-
-
- Compatibility Level
- 相容層級
-
-
- Owner
- 擁有者
-
-
- Version
- 版本
-
-
- Edition
- 版本
-
-
- Computer Name
- 電腦名稱
-
-
- OS Version
- 作業系統版本
-
-
- Edition
- 版本
-
-
- Pricing Tier
- Pricing Tier
-
-
- Compatibility Level
- 相容層級
-
-
- Owner
- 擁有者
-
-
- Version
- 版本
-
-
- Type
- 型別
-
-
- Microsoft SQL Server
- Microsoft SQL Server
-
-
- Name (optional)
- Name (optional)
-
-
- Custom name of the connection
- Custom name of the connection
-
-
- Server
- 伺服器
-
-
- Name of the SQL Server instance
- Name of the SQL Server instance
-
-
- Server Description
- Server Description
-
-
- Description of the SQL Server instance
- Description of the SQL Server instance
-
-
- Authentication type
- 驗證類型
-
-
- Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
-
-
- SQL Login
- SQL 登入
-
-
- Windows Authentication
- Windows 驗證
-
-
- Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
-
-
- User name
- 使用者名稱
-
-
- Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
-
-
- Password
- 密碼
-
-
- Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
-
-
- Application intent
- Application intent
-
-
- Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
-
-
- Asynchronous processing
- Asynchronous processing
-
-
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
-
-
- Connect timeout
- Connect timeout
-
-
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
-
-
- Current language
- Current language
-
-
- The SQL Server language record name
- The SQL Server language record name
-
-
- Column encryption
- Column encryption
-
-
- Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
-
-
- Encrypt
- Encrypt
-
-
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
-
-
- Persist security info
- Persist security info
-
-
- When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
-
-
- Trust server certificate
- Trust server certificate
-
-
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
-
-
- Attached DB file name
- Attached DB file name
-
-
- The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
-
-
- Context connection
- Context connection
-
-
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
-
-
- Port
- Port
-
-
- Connect retry count
- Connect retry count
-
-
- Number of attempts to restore connection
- Number of attempts to restore connection
-
-
- Connect retry interval
- Connect retry interval
-
-
- Delay between attempts to restore connection
- Delay between attempts to restore connection
-
-
- Application name
- Application name
-
-
- The name of the application
- The name of the application
-
-
- Workstation Id
- Workstation Id
-
-
- The name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
-
-
- Pooling
- Pooling
-
-
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
-
-
- Max pool size
- Max pool size
-
-
- The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
-
-
- Min pool size
- Min pool size
-
-
- The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
-
-
- Load balance timeout
- Load balance timeout
-
-
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
-
-
- Replication
- 複寫
-
-
- Used by SQL Server in Replication
- Used by SQL Server in Replication
-
-
- Attach DB filename
- Attach DB filename
-
-
- Failover partner
- Failover partner
-
-
- The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
-
-
- Multi subnet failover
- Multi subnet failover
-
-
- Multiple active result sets
- Multiple active result sets
-
-
- When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
-
-
- Packet size
- Packet size
-
-
- Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
-
-
- Type system version
- Type system version
-
-
- Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
-
-
-
-
-
-
- Loading ...
- 正在載入...
-
-
-
-
-
-
- The Central Management Server {0} could not be found or is offline
- The Central Management Server {0} could not be found or is offline
-
-
- No resources found
- No resources found
-
-
-
-
-
-
- Add Central Management Server...
- Add Central Management Server...
-
-
-
-
-
-
- Central Management Server Group already has a Registered Server with the name {0}
- Central Management Server Group already has a Registered Server with the name {0}
-
-
- Could not add the Registered Server {0}
- Could not add the Registered Server {0}
-
-
- Are you sure you want to delete
- Are you sure you want to delete
-
-
+ Yes
+ 是
-
- No
- 否
+
+
+
+
+
+ Restore Default Values
+ 還原預設值
-
- Add Server Group
- 新增伺服器群組
+
+ Settings
+ 設定
-
- OK
- 確定
+
+ Configure the settings required for deploying SQL Server big data cluster
+ 進行部署 SQL Server 巨量資料叢集所需的設定
-
- Cancel
- 取消
+
+ You need to accept the terms of services and privacy policy in order to proceed
+ 您需要接受服務條款和隱私權原則才能繼續
-
- Server Group Name
- 伺服器群組名稱
+
+ Cluster name
+ 叢集名稱
-
- Server Group Description
- Server Group Description
+
+ Admin username
+ 系統管理員使用者名稱
-
- {0} already has a Server Group with the name {1}
- {0} already has a Server Group with the name {1}
+
+ Password
+ 密碼
-
- Are you sure you want to delete
- Are you sure you want to delete
+
+ SQL Server master
+ SQL Server 主機
+
+
+ Knox
+ Knox
+
+
+ Controller
+ 控制器
+
+
+ Proxy
+ Proxy
+
+
+ Grafana dashboard
+ Grafana 儀表板
+
+
+ Kibana dashboard
+ Kibana 儀表板
+
+
+ only required for private registries
+ 只有私人登錄需要
+
+
+ Registry
+ 登錄
+
+
+ Repository
+ 儲存庫
+
+
+ Image tag
+ 映像標籤
+
+
+ Username
+ 使用者名稱
+
+
+ Password
+ 密碼
+
+
+ Basic Settings
+ 基本設定
+
+
+ Container Registry Settings
+ 容器登錄設定
+
+
+ Port Settings (Optional)
+ 連接埠設定 (選用)
+
+
+ license terms
+ 授權條款
+
+
+ privacy policy
+ 隱私權原則
+
+
+ I accept the {0} and {1}.
+ 我接受 {0} 和 {1}。
+ {0} is the place holder for license terms, {1} is the place holder for privacy policy
-
+
-
- You cannot add a shared registered server with the same name as the Configuration Server
- You cannot add a shared registered server with the same name as the Configuration Server
+
+ Install Tools
+ 安裝工具
+
+
+ Installing...
+ 正在安裝...
+
+
+ What is your target cluster environment?
+ 您的目標叢集環境為何?
+
+
+ Choose the target environment and then install the required tools for it.
+ 選擇目標環境,然後為其安裝所需的工具。
+
+
+ Refresh Status
+ 重新整理狀態
+
+
+ Tool
+ 工具
+
+
+ Description
+ 描述
+
+
+ Version
+ 版本
+
+
+ Status
+ 狀態
+
+
+ Pick target environment
+ 挑選目標環境
+
+
+ Please wait while the required tools status is being refreshed.
+ 請稍候,正在重新整理所需的工具狀態。
+
+
+ Please select a target cluster type.
+ 請選取目標叢集類型。
+
+
+ Please install the required tools.
+ 請安裝所需的工具。
+
+
+ (Coming Soon)
+ (即將推出)
+
+
+ Required tools
+ 所需工具
+
+
+ Installed
+ 已安裝
+
+
+ Not Installed
+ 未安裝
+
+
+ Installing...
+ 正在安裝...
+
+
+ Install Failed
+ 安裝失敗
+
+
+
+
+
+
+ Where do you want to deploy this SQL Server big data cluster?
+ 要在哪裡部署此 SQL Server 巨量資料叢集?
+
+
+ Select the kubeconfig file and then select a cluster context from the list
+ 選取 kubeconfig 檔案,然後從清單選取叢集內容
+
+
+ Please select a cluster context.
+ 請選取叢集內容。
+
+
+ Kube config file path
+ Kube 設定檔路徑
+
+
+ Browse
+ 瀏覽
+
+
+ Cluster Contexts
+ 叢集內容
+
+
+ No cluster information is found in the config file or an error ocurred while loading the config file
+ 在設定檔中找不到叢集資訊,或載入設定檔時發生錯誤
+
+
+ Select
+ 選擇
+
+
+
+
+
+
+ Select a cluster profile
+ 選取叢集設定檔
+
+
+ Select your requirement and we will provide you a pre-defined default scaling. You can later go to cluster configuration and customize it.
+ 選取您的需求,我們將會為您提供預先定義的預設規模調整。您稍後可前往叢集設定予以自訂。
+
+
+ Target cluster scale overview
+ 目標叢集調整概觀
+
+
+ Deployment profile
+ 部署設定檔
+
+
+ Hardware profile
+ 硬體設定檔
+
+
+ Label
+ 標籤
+
+
+ Nodes
+ 節點
+
+
+ Cores
+ 核心
+
+
+ Memory
+ 記憶體
+
+
+ Disks
+ 磁碟
+
+
+ Scale
+ 調整
+
+
+ Hardware profile label
+ 硬體設定檔標籤
+
+
+ Feature set
+ 功能集
+
+
+ Engine only
+ 僅限引擎
+
+
+ Engine with optional features
+ 具有選用功能的引擎
+
+
+ SQL Server master
+ SQL Server 主機
+
+
+ Compute pool
+ 計算集區
+
+
+ Data pool
+ 資料集區
+
+
+ Storage pool
+ 儲存體集區
+
+
+ Spark pool
+ Spark 集區
+
+
+ The SQL Server instance provides an externally accessible TDS endpoint for the cluster
+ SQL Server 執行個體為叢集提供了外部可存取的 TDS 端點
+
+
+ TODO: Add description
+ TODO: 新增描述
+
+
+ TODO: Add description
+ TODO: 新增描述
+
+
+ TODO: Add description
+ TODO: 新增描述
+
+
+ TODO: Add description
+ TODO: 新增描述
+
+
+ {0} ({1})
+ {0} ({1})
+ {0} is the pool name, {1} is the scale number
+
+
+
+
+
+
+ Create a big data cluster
+ 建立巨量資料叢集
+
+
+ Generate Scripts
+ 產生指令碼
+
+
+ Create
+ 建立
+
+
+
+
+
+
+ New AKS Cluster
+ 新 AKS 叢集
+
+
+ New Azure Kubernetes Service cluster
+ 新 Azure Kubernetes Service 叢集
+
+
+ This option configures new Azure Kubernetes Service (AKS) for SQL Server big data cluster deployments. AKS makes it simple to create, configure and manage a cluster of virutal machines that are preconfigured with a Kubernetes cluster to run containerized applications.
+ 此選項會針對 SQL Server 巨量資料叢集部署設定新的 Azure Kubernetes Service (AKS)。AKS 讓已預先設定 Kubernetes 叢集來執行容器化應用程式的虛擬機器叢集,變得易於建立、設定和管理。
+
+
+ Existing Cluster
+ 現有的叢集
+
+
+ Existing Kubernetes cluster
+ 現有的 Kubernetes 叢集
+
+
+ This option assumes you already have a Kubernetes cluster installed, Once a prerequisite check is done, ensure the correct cluster context is selected.
+ 此選項假設您已安裝 Kubernetes 叢集,必要條件檢查完成後,請確認已選取正確的叢集內容。
+
+
+
+
+
+
+ SQL Server big data cluster
+ SQL Server 巨量資料叢集
+
+
+
+
+
+
+ Unable to run kubectl
+ 無法執行 kubectl
+
+
+ Failed to set '{0}' as current cluster: {1}
+ 無法將 '{0}' 設定為目前的叢集: {1}
+
+
+
+
+
+
+ Could not find {0} binary. {1}
+ 找不到 {0} 二進位檔。{1}
+
+
+ {0} is not installed. {1}
+ 未安裝 {0}。{1}
+
+
+ SQL Server Big data cluster requires kubernetes.
+ SQL Server 巨量資料叢集需要 kubernetes。
+
+
+ Cannot execute command.
+ 無法執行命令。
+
+
+ kubectl version ${0} may be incompatible with cluster Kubernetes version {1}
+ kubectl 版本 ${0} 可能與叢集 Kubernetes 版本 {1} 不相容
+
+
+ Unable to run command ({0})
+ 無法執行命令 ({0})
+
+
+
+
+
+
+ Install dependencies
+ 安裝相依性
+
+
+ Learn more
+ 深入了解
+
+
+ Add {0} directory to path, or set "mssql-bdc.{0}-path" config to {0} binary.
+ 將 {0} 目錄新增至路徑,或將 "mssql-bdc.{0}-path" 設定設為 {0} 二進位檔。
+
+
+
+
+
+
+ Failed to download kubectl: {0}
+ 無法下載 kubectl: {0}
+
+
+ Failed to establish kubectl stable version: {0}
+ 無法建立 kubectl 穩定版本: {0}
+
+
+
+
+
+
+ Done
+ 完成
+
+
+ {0} already installed...
+ 已安裝 {0}...
+
+
+ Installing {0}...
+ 正在安裝 {0}...
+
+
+ Unable to install {0}: {1}
+ 無法安裝 {0}: {1}
diff --git a/resources/xlf/zh-hant/dacpac.zh-Hant.xlf b/resources/xlf/zh-hant/dacpac.zh-Hant.xlf
index 861fd6a2bb..05b167d484 100644
--- a/resources/xlf/zh-hant/dacpac.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/dacpac.zh-Hant.xlf
@@ -280,7 +280,7 @@
You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
- You can view the status of script generation in the Tasks View once the wizard is closed. The generated script will open when complete.
+ 精靈關閉後,您可於工作檢視中檢視指令碼產生的狀態。產生的指令碼將於完成時開啟。Generating deploy plan failed '{0}'
diff --git a/resources/xlf/zh-hant/import.zh-Hant.xlf b/resources/xlf/zh-hant/import.zh-Hant.xlf
index c35fc99630..f7445f07b3 100644
--- a/resources/xlf/zh-hant/import.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/import.zh-Hant.xlf
@@ -44,7 +44,7 @@
This operation was unsuccessful. Please try a different input file.
- This operation was unsuccessful. Please try a different input file.
+ 這項作業不成功。請嘗試其他輸入檔案。Refresh
diff --git a/resources/xlf/zh-hant/mssql.zh-Hant.xlf b/resources/xlf/zh-hant/mssql.zh-Hant.xlf
index 773e4611fd..29fd770812 100644
--- a/resources/xlf/zh-hant/mssql.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/mssql.zh-Hant.xlf
@@ -28,11 +28,11 @@
Upload files
- Upload files
+ 上傳檔案New directory
- New directory
+ 新增目錄Delete
@@ -52,15 +52,15 @@
New Notebook
- New Notebook
+ 新增 NotebookOpen Notebook
- Open Notebook
+ 開啟 NotebookTasks and information about your SQL Server Big Data Cluster
- Tasks and information about your SQL Server Big Data Cluster
+ SQL Server 巨量資料叢集的工作和資訊SQL Server Big Data Cluster
@@ -68,19 +68,19 @@
Submit Spark Job
- Submit Spark Job
+ 提交 Spark 作業New Spark Job
- New Spark Job
+ 新增 Spark 作業View Spark History
- View Spark History
+ 檢視 Spark 歷程記錄View Yarn History
- View Yarn History
+ 檢視 Yarn 歷程記錄Tasks
@@ -88,31 +88,31 @@
Install Packages
- Install Packages
+ 安裝套件Configure Python for Notebooks
- Configure Python for Notebooks
+ 為 Notebooks 設定 PythonCluster Status
- Cluster Status
+ 叢集狀態Search: Servers
- Search: Servers
+ 搜尋: 伺服器Search: Clear Search Server Results
- Search: Clear Search Server Results
+ 搜尋: 清除搜尋伺服器結果Service Endpoints
- Service Endpoints
+ 服務端點MSSQL configuration
- MSSQL configuration
+ MSSQL 設定Should BIT columns be displayed as numbers (1 or 0)? If false, BIT columns will be displayed as 'true' or 'false'
@@ -140,23 +140,23 @@
[Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
- [Optional] Log debug output to the console (View -> Output) and then select appropriate output channel from the dropdown
+ [選擇性] 將偵錯輸出記錄至主控台 ([檢視] -> [輸出]),並從下拉式清單選取適當的輸出通道[Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
- [Optional] Log level for backend services. Azure Data Studio generates a file name every time it starts and if the file already exists the logs entries are appended to that file. For cleanup of old log files see logRetentionMinutes and logFilesRemovalLimit settings. The default tracingLevel does not log much. Changing verbosity could lead to extensive logging and disk space requirements for the logs. Error includes Critical, Warning includes Error, Information includes Warning and Verbose includes Information
+ [選擇性] 後端服務的記錄層級。Azure Data Studio 每次啟動時都會產生檔案名稱;如果該檔案已存在,記錄項目會附加至該檔案。如需清除舊記錄檔,請參閱 logRetentionMinutes 和 logFilesRemovalLimit 設定。預設 tracingLevel 不會記錄太多項目。變更詳細程度可能會導致大量記錄,並增加記錄的磁碟空間需求。錯誤包含嚴重,警告包含錯誤,資訊包含警告,而詳細資訊包含資訊Number of minutes to retain log files for backend services. Default is 1 week.
- Number of minutes to retain log files for backend services. Default is 1 week.
+ 為後端服務保留記錄檔的分鐘數。預設為 1 週。Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
- Maximum number of old files to remove upon startup that have expired mssql.logRetentionMinutes. Files that do not get cleaned up due to this limitation get cleaned up next time Azure Data Studio starts up.
+ 已超過 mssql.logRetentionMinutes 而要於啟動時移除的舊檔案數上限。因為此限制而未清除的檔案,將於下次 Azure Data Studio 啟動時清除。[Optional] Do not show unsupported platform warnings
- [Optional] Do not show unsupported platform warnings
+ [選擇性] 不要顯示不支援的平台警告Recovery Model
@@ -200,7 +200,7 @@
Pricing Tier
- Pricing Tier
+ 定價層Compatibility Level
@@ -220,15 +220,15 @@
Microsoft SQL Server
- Microsoft SQL Server
+ Microsoft SQL ServerName (optional)
- Name (optional)
+ 名稱 (選擇性)Custom name of the connection
- Custom name of the connection
+ 連線的自訂名稱Server
@@ -236,7 +236,7 @@
Name of the SQL Server instance
- Name of the SQL Server instance
+ SQL Server 執行個體的名稱Database
@@ -244,7 +244,7 @@
The name of the initial catalog or database int the data source
- The name of the initial catalog or database int the data source
+ 資料來源中,初始類別目錄或資料庫的名稱。Authentication type
@@ -252,7 +252,7 @@
Specifies the method of authenticating with SQL Server
- Specifies the method of authenticating with SQL Server
+ 指定向 SQL Server 驗證的方法SQL Login
@@ -264,7 +264,7 @@
Azure Active Directory - Universal with MFA support
- Azure Active Directory - Universal with MFA support
+ 具 MFA 支援的 Azure Active Directory - 通用User name
@@ -272,7 +272,7 @@
Indicates the user ID to be used when connecting to the data source
- Indicates the user ID to be used when connecting to the data source
+ 代表要在連線至資料來源時使用的使用者識別碼Password
@@ -280,155 +280,155 @@
Indicates the password to be used when connecting to the data source
- Indicates the password to be used when connecting to the data source
+ 代表要在連線至資料來源時使用的密碼Application intent
- Application intent
+ 應用程式的意圖Declares the application workload type when connecting to a server
- Declares the application workload type when connecting to a server
+ 當連線至伺服器時宣告應用程式工作負載類型Asynchronous processing
- Asynchronous processing
+ 非同步處理When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
- When true, enables usage of the Asynchronous functionality in the .Net Framework Data Provider
+ 若為 true,則允許使用 .Net Framework Data Provider 中的非同步功能Connect timeout
- Connect timeout
+ 連線逾時The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
- The length of time (in seconds) to wait for a connection to the server before terminating the attempt and generating an error
+ 終止嘗試並產生錯誤前,要等待伺服器連線的時間長度 (秒)Current language
- Current language
+ 目前的語言The SQL Server language record name
- The SQL Server language record name
+ SQL Server 語言記錄名稱Column encryption
- Column encryption
+ 資料行加密Default column encryption setting for all the commands on the connection
- Default column encryption setting for all the commands on the connection
+ 連線上所有命令的預設資料行加密設定Encrypt
- Encrypt
+ 加密When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
- When true, SQL Server uses SSL encryption for all data sent between the client and server if the server has a certificate installed
+ 若為 true,則 SQL Server 會在伺服器已安裝憑證的情況下,對用戶端和伺服器間傳送的所有資料使用 SSL 加密Persist security info
- Persist security info
+ 持續安全性資訊When false, security-sensitive information, such as the password, is not returned as part of the connection
- When false, security-sensitive information, such as the password, is not returned as part of the connection
+ 若為 false,則不會於連線中傳回密碼等安全性敏感資訊Trust server certificate
- Trust server certificate
+ 信任伺服器憑證When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
- When true (and encrypt=true), SQL Server uses SSL encryption for all data sent between the client and server without validating the server certificate
+ 若為 true (且 encrypt=true),則 SQL Server 會對用戶端和伺服器間傳送的所有資料使用 SSL 加密,而不驗證伺服器憑證Attached DB file name
- Attached DB file name
+ 已附加 DB 檔案名稱The name of the primary file, including the full path name, of an attachable database
- The name of the primary file, including the full path name, of an attachable database
+ 主要檔案的名稱,包含可附加資料庫的完整路徑名稱Context connection
- Context connection
+ 內容連線When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
- When true, indicates the connection should be from the SQL server context. Available only when running in the SQL Server process
+ 若為 true,則代表連線應來自 SQL 伺服器內容。僅於在 SQL Server 處理序中執行時可用Port
- Port
+ 連接埠Connect retry count
- Connect retry count
+ 連線重試計數Number of attempts to restore connection
- Number of attempts to restore connection
+ 嘗試還原連線的次數Connect retry interval
- Connect retry interval
+ 連線重試間隔Delay between attempts to restore connection
- Delay between attempts to restore connection
+ 還原連線之嘗試之間的延遲Application name
- Application name
+ 應用程式名稱The name of the application
- The name of the application
+ 應用程式的名稱Workstation Id
- Workstation Id
+ 工作站識別碼The name of the workstation connecting to SQL Server
- The name of the workstation connecting to SQL Server
+ 正在連線至 SQL Server 的工作站名稱Pooling
- Pooling
+ 共用When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
- When true, the connection object is drawn from the appropriate pool, or if necessary, is created and added to the appropriate pool
+ 若為 true,則會從適當的集區提取連線物件,或在有需要時建立並新增至適當的集區Max pool size
- Max pool size
+ 集區大小上限The maximum number of connections allowed in the pool
- The maximum number of connections allowed in the pool
+ 集區中允許的連線數上限Min pool size
- Min pool size
+ 集區大小下限The minimum number of connections allowed in the pool
- The minimum number of connections allowed in the pool
+ 集區中允許的連線數下限Load balance timeout
- Load balance timeout
+ 負載平衡逾時The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
- The minimum amount of time (in seconds) for this connection to live in the pool before being destroyed
+ 此連線在終結前於集區中存留的時間下限 (秒)Replication
@@ -436,47 +436,47 @@
Used by SQL Server in Replication
- Used by SQL Server in Replication
+ 在複寫中由 SQL Server 使用Attach DB filename
- Attach DB filename
+ 附加 DB 檔案名稱Failover partner
- Failover partner
+ 容錯移轉合作夥伴The name or network address of the instance of SQL Server that acts as a failover partner
- The name or network address of the instance of SQL Server that acts as a failover partner
+ 充當容錯移轉合作夥伴之 SQL Server 執行個體的名稱或網路位址Multi subnet failover
- Multi subnet failover
+ 多重子網容錯移轉Multiple active result sets
- Multiple active result sets
+ 多個正在使用的結果集When true, multiple result sets can be returned and read from one connection
- When true, multiple result sets can be returned and read from one connection
+ 若為 true,則可傳回多個結果集並從一個連線讀取Packet size
- Packet size
+ 封包大小Size in bytes of the network packets used to communicate with an instance of SQL Server
- Size in bytes of the network packets used to communicate with an instance of SQL Server
+ 用於和 SQL Server 執行個體通訊的網路封包大小 (位元組)Type system version
- Type system version
+ 類型系統版本Indicates which server type system then provider will expose through the DataReader
- Indicates which server type system then provider will expose through the DataReader
+ 指出提供者將透過 DataReader 公開的伺服器類型系統
@@ -484,11 +484,11 @@
No Spark job batch id is returned from response.{0}[Error] {1}
- No Spark job batch id is returned from response.{0}[Error] {1}
+ 回應未傳回任何 Spark 作業批次識別碼。{0}[錯誤] {1}No log is returned within response.{0}[Error] {1}
- No log is returned within response.{0}[Error] {1}
+ 回應中未傳回任何記錄。{0}[錯誤] {1}
@@ -496,27 +496,27 @@
Parameters for SparkJobSubmissionModel is illegal
- Parameters for SparkJobSubmissionModel is illegal
+ SparkJobSubmissionModel 的參數不合法submissionArgs is invalid.
- submissionArgs is invalid.
+ submissionArgs 無效。livyBatchId is invalid.
- livyBatchId is invalid.
+ livyBatchId 無效。Get Application Id time out. {0}[Log] {1}
- Get Application Id time out. {0}[Log] {1}
+ 取得應用程式識別碼逾時。{0}[記錄] {1}Property localFilePath or hdfsFolderPath is not specified.
- Property localFilePath or hdfsFolderPath is not specified.
+ 未指定屬性 localFilePath 或 hdfsFolderPath。Property Path is not specified.
- Property Path is not specified.
+ 未指定屬性路徑。
@@ -524,7 +524,7 @@
Parameters for SparkJobSubmissionDialog is illegal
- Parameters for SparkJobSubmissionDialog is illegal
+ SparkJobSubmissionDialog 的參數不合法New Job
@@ -536,15 +536,15 @@
Submit
- Submit
+ 提交{0} Spark Job Submission:
- {0} Spark Job Submission:
+ {0} Spark 作業提交:.......................... Submit Spark Job Start ..........................
- .......................... Submit Spark Job Start ..........................
+ .......................... 提交 Spark 作業開始 ..........................
@@ -556,7 +556,7 @@
Enter a name ...
- Enter a name ...
+ 請輸入名稱...Job Name
@@ -564,23 +564,23 @@
Spark Cluster
- Spark Cluster
+ Spark 叢集Path to a .jar or .py file
- Path to a .jar or .py file
+ .jar 或 .py 檔案的路徑The selected local file will be uploaded to HDFS: {0}
- The selected local file will be uploaded to HDFS: {0}
+ 選取的本機檔案將會上傳至 HDFS: {0}JAR/py File
- JAR/py File
+ JAR/py 檔案Main Class
- Main Class
+ 主要類別Arguments
@@ -588,27 +588,27 @@
Command line arguments used in your main class, multiple arguments should be split by space.
- Command line arguments used in your main class, multiple arguments should be split by space.
+ 在您主要類別中使用的命令列引數,多個引數應以空格分隔。Property Job Name is not specified.
- Property Job Name is not specified.
+ 未指定屬性作業名稱。Property JAR/py File is not specified.
- Property JAR/py File is not specified.
+ 未指定屬性 JAR/py 檔案。Property Main Class is not specified.
- Property Main Class is not specified.
+ 未指定屬性主要類別。{0} does not exist in Cluster or exception thrown.
- {0} does not exist in Cluster or exception thrown.
+ {0} 不存在於叢集中,或已擲回例外狀況。The specified HDFS file does not exist.
- The specified HDFS file does not exist.
+ 指定的 HDFS 檔案不存在。Select
@@ -616,7 +616,7 @@
Error in locating the file due to Error: {0}
- Error in locating the file due to Error: {0}
+ 由於發生錯誤,所以無法找到檔案: {0}
@@ -628,27 +628,27 @@
Reference Jars
- Reference Jars
+ 參考 JarsJars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
- Jars to be placed in executor working directory. The Jar path needs to be an HDFS Path. Multiple paths should be split by semicolon (;)
+ 要放置在執行程式工作目錄中的 Jar。Jar 路徑必須為 HDFS 路徑。多個路徑應以分號 (;) 分隔Reference py Files
- Reference py Files
+ 參考 py 檔案Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Py Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 要放置於執行程式工作目錄的 Py 檔案。檔案路徑必須為 HDFS 路徑。多個路徑應以分號 (;) 分隔Reference Files
- Reference Files
+ 參考檔案Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
- Files to be placed in executor working directory. The file path needs to be an HDFS Path. Multiple paths should be split by semicolon(;)
+ 要放置於執行程式工作目錄的檔案。檔案路徑必須為 HDFS 路徑。多個路徑應以分號 (;) 分隔Please select SQL Server with Big Data Cluster.
- Please select SQL Server with Big Data Cluster.
+ 請選取具有巨量資料叢集的 SQL Server。No Sql Server is selected.
- No Sql Server is selected.
+ 未選取任何 SQL Server。Error Get File Path: {0}
- Error Get File Path: {0}
+ 取得檔案路徑時發生錯誤: {0}Invalid Data Structure
- Invalid Data Structure
+ 資料結構無效Unable to create WebHDFS client due to missing options: ${0}
- Unable to create WebHDFS client due to missing options: ${0}
+ 因為缺少選項,所以無法建立 WebHDFS 用戶端: ${0}'${0}' is undefined.
- '${0}' is undefined.
+ '${0}' 未定義。Bad Request
- Bad Request
+ 不正確的要求Unauthorized
- Unauthorized
+ 未經授權Forbidden
- Forbidden
+ 禁止Not Found
- Not Found
+ 找不到Internal Server Error
- Internal Server Error
+ 內部伺服器錯誤Unknown Error
@@ -732,7 +732,7 @@
Unexpected Redirect
- Unexpected Redirect
+ 未預期的重新導向Please provide the password to connect to HDFS:
- Please provide the password to connect to HDFS:
+ 請提供連線至 HDFS 的密碼:Session for node {0} does not exist
- Session for node {0} does not exist
+ 節點 {0} 的工作階段不存在Error notifying of node change: {0}
- Error notifying of node change: {0}
+ 通知節點變更時發生錯誤: {0}Root
- Root
+ 根HDFS
- HDFS
+ HdfsData Services
- Data Services
+ 資料服務NOTICE: This file has been truncated at {0} for preview.
- NOTICE: This file has been truncated at {0} for preview.
+ 注意: 此檔案已於 {0} 截斷,以供預覽。The file has been truncated at {0} for preview.
- The file has been truncated at {0} for preview.
+ 檔案已於 {0} 截斷,以供預覽。ConnectionInfo is undefined.
- ConnectionInfo is undefined.
+ 未定義 ConnectionInfo。ConnectionInfo.options is undefined.
- ConnectionInfo.options is undefined.
+ 未定義 ConnectionInfo.options。Some missing properties in connectionInfo.options: {0}
- Some missing properties in connectionInfo.options: {0}
+ connectionInfo.options 中缺少部分屬性: {0}Action {0} is not supported for this handler
- Action {0} is not supported for this handler
+ 這個處理常式不支援動作 {0}Cannot open link {0} as only HTTP and HTTPS links are supported
- Cannot open link {0} as only HTTP and HTTPS links are supported
+ 因為只支援 HTTP 和 HTTPS 連結,所以無法開啟連結 {0}Download and open '{0}'?
- Download and open '{0}'?
+ 要下載並開啟 '{0}' 嗎?Could not find the specified file
- Could not find the specified file
+ 找不到指定的檔案File open request failed with error: {0} {1}
- File open request failed with error: {0} {1}
+ 檔案開啟要求失敗。錯誤: {0} {1}Error stopping Notebook Server: {0}
- Error stopping Notebook Server: {0}
+ 停止 Notebook 伺服器時發生錯誤: {0}Notebook process exited prematurely with error: {0}, StdErr Output: {1}
- Notebook process exited prematurely with error: {0}, StdErr Output: {1}
+ Notebook 處理序提前結束。錯誤: {0},StdErr 輸出: {1}Error sent from Jupyter: {0}
- Error sent from Jupyter: {0}
+ 從 Jupyter 傳送的錯誤: {0}... Jupyter is running at {0}
- ... Jupyter is running at {0}
+ ... Jupyter 正在 {0} 中執行... Starting Notebook server
- ... Starting Notebook server
+ ... 正在啟動 Notebook 伺服器Unexpected setting type {0}
- Unexpected setting type {0}
+ 未預期的設定類型 {0}Cannot start a session, the manager is not yet initialized
- Cannot start a session, the manager is not yet initialized
+ 無法啟動工作階段,管理員尚未初始化Spark kernels require a connection to a SQL Server big data cluster master instance.
- Spark kernels require a connection to a SQL Server big data cluster master instance.
+ Spark 核心需要對 SQL Server 巨量資料叢集主要執行個體的連線。Shutdown of Notebook server failed: {0}
- Shutdown of Notebook server failed: {0}
+ 無法關閉 Notebook 伺服器: {0}Notebook dependencies installation is in progress
- Notebook dependencies installation is in progress
+ 正在安裝 Notebook 相依性Python download is complete
- Python download is complete
+ Python 下載完成Error while downloading python setup
- Error while downloading python setup
+ 下載 python 設定時發生錯誤Downloading python package
- Downloading python package
+ 正在下載 python 套件Unpacking python package
- Unpacking python package
+ 正在解壓縮 python 套件Error while creating python installation directory
- Error while creating python installation directory
+ 建立 python 安裝目錄時發生錯誤Error while unpacking python bundle
- Error while unpacking python bundle
+ 將 python 套件組合解壓縮時發生錯誤Installing Notebook dependencies
- Installing Notebook dependencies
+ 正在安裝 Notebook 相依性Installing Notebook dependencies, see Tasks view for more information
- Installing Notebook dependencies, see Tasks view for more information
+ 正在安裝 Notebook 相依性,請參閱 [工作] 檢視以取得詳細資訊Notebook dependencies installation is complete
- Notebook dependencies installation is complete
+ Notebook 相依性安裝完成Cannot overwrite existing Python installation while python is running.
- Cannot overwrite existing Python installation while python is running.
+ 無法於 python 執行時覆寫現有的 Python 安裝。Another Python installation is currently in progress.
- Another Python installation is currently in progress.
+ 另一個 Python 安裝正在進行。Python already exists at the specific location. Skipping install.
- Python already exists at the specific location. Skipping install.
+ Python 已經存在於特定位置。即將跳過安裝。Installing Notebook dependencies failed with error: {0}
- Installing Notebook dependencies failed with error: {0}
+ 安裝 Notebook 相依性失敗。錯誤: {0}Downloading local python for platform: {0} to {1}
- Downloading local python for platform: {0} to {1}
+ 正在為平台 {0} 下載本機 python 至 {1}Installing required packages to run Notebooks...
- Installing required packages to run Notebooks...
+ 正在安裝執行 Notebooks 所需的套件...... Jupyter installation complete.
- ... Jupyter installation complete.
+ ... Jupyter 安裝完成。Installing SparkMagic...
- Installing SparkMagic...
+ 正在安裝 SparkMagic...A notebook path is required
- A notebook path is required
+ 筆記本路徑為必要項Notebooks
- Notebooks
+ NotebooksOnly .ipynb Notebooks are supported
- Only .ipynb Notebooks are supported
+ 僅支援 .ipynb NotebooksAre you sure you want to reinstall?
- Are you sure you want to reinstall?
+ 確定要重新安裝嗎?Configure Python for Notebooks
- Configure Python for Notebooks
+ 為 Notebooks 設定 PythonInstall
@@ -424,7 +424,7 @@
Python Install Location
- Python Install Location
+ Python 安裝位置Select
@@ -432,31 +432,31 @@
This installation will take some time. It is recommended to not close the application until the installation is complete.
- This installation will take some time. It is recommended to not close the application until the installation is complete.
+ 這個安裝需要一些時間。建議不要在安裝完成前關閉應用程式。The specified install location is invalid.
- The specified install location is invalid.
+ 指定的安裝位置無效。No python installation was found at the specified location.
- No python installation was found at the specified location.
+ 未於指定的位置找到任何 python 安裝。Python installation was declined.
- Python installation was declined.
+ 已拒絕 Python 安裝。Installation Type
- Installation Type
+ 安裝類型New Python installation
- New Python installation
+ 新的 Python 安裝Use existing Python installation
- Use existing Python installation
+ 使用現有的 Python 安裝Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 無法開啟檔案 {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 無法開啟檔案 {0}: {1}Open file {0} failed: {1}
- Open file {0} failed: {1}
+ 無法開啟檔案 {0}: {1}Missing file : {0}
- Missing file : {0}
+ 缺少檔案 : {0}This sample code loads the file into a data frame and shows the first 10 results.
- This sample code loads the file into a data frame and shows the first 10 results.
+ 這個範例程式碼會將檔案載入資料框架,並顯示前 10 個結果。No notebook editor is active
- No notebook editor is active
+ 沒有任何正在使用的筆記本編輯器Code
@@ -528,11 +528,11 @@
What type of cell do you want to add?
- What type of cell do you want to add?
+ 您要新增什麼類型的儲存格?Notebooks
- Notebooks
+ NotebooksSQL Server Deployment extension for Azure Data Studio
- SQL Server Deployment extension for Azure Data Studio
+ 適用於 Azure Data Studio 的 SQL Server 部署延伸模組Provides a notebook-based experience to deploy Microsoft SQL Server
- Provides a notebook-based experience to deploy Microsoft SQL Server
+ 提供部署 Microsoft SQL Server 的筆記本式體驗Deploy SQL Server on Docker…
- Deploy SQL Server on Docker…
+ 在 Docker 上部署 SQL Server...Deploy SQL Server big data cluster…
- Deploy SQL Server big data cluster…
+ 部署 SQL Server 巨量資料叢集...Deploy SQL Server…
- Deploy SQL Server…
+ 部署 SQL Server…Deployment
@@ -28,11 +28,11 @@
SQL Server container image
- SQL Server container image
+ SQL Server 容器映像Run SQL Server container image with Docker
- Run SQL Server container image with Docker
+ 以 Docker 執行 SQL Server 容器映像SQL Server big data cluster
@@ -40,7 +40,7 @@
SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
- SQL Server big data cluster allows you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes
+ SQL Server 巨量資料叢集可讓您部署於 Kubernetes 上執行且可調整的 SQL Server、Spark 和 HDFS 容器叢集Version
@@ -48,27 +48,27 @@
SQL Server 2017
- SQL Server 2017
+ SQL Server 2017SQL Server 2019
- SQL Server 2019
+ SQL Server 2019./notebooks/docker/2017/deploy-sql2017-image.ipynb
- ./notebooks/docker/2017/deploy-sql2017-image.ipynb
+ ./notebooks/docker/2017/deploy-sql2017-image.ipynb./notebooks/docker/2019/deploy-sql2019-image.ipynb
- ./notebooks/docker/2019/deploy-sql2019-image.ipynb
+ ./notebooks/docker/2019/deploy-sql2019-image.ipynbSQL Server 2019 big data cluster CTP 3.1
- SQL Server 2019 big data cluster CTP 3.1
+ SQL Server 2019 巨量資料叢集 CTP 3.1Deployment target
- Deployment target
+ 部署目標New Azure Kubernetes Service Cluster
@@ -80,11 +80,11 @@
./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-aks.ipynb./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
- ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynb
+ ./notebooks/bdc/2019/ctp3-1/deploy-bdc-existing-cluster.ipynbA command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
- A command-line utility written in Python that enables cluster administrators to bootstrap and manage the big data cluster via REST APIs
+ 以 Python 編寫的命令列公用程式,可讓叢集系統管理員透過 REST API 進行巨量資料叢集的啟動程序和管理mssqlctl
- mssqlctl
+ mssqlctlA command-line tool allows you to run commands against Kubernetes clusters
- A command-line tool allows you to run commands against Kubernetes clusters
+ 可讓您對 Kubernetes 叢集執行命令的命令列工具kubectl
- kubectl
+ kubectlProvides the ability to package and run an application in isolated containers
- Provides the ability to package and run an application in isolated containers
+ 提供在隔離容器中封裝和執行應用程式的能力Docker
@@ -128,11 +128,11 @@
A command-line tool for managing Azure resources
- A command-line tool for managing Azure resources
+ 用於管理 Azure 資源的命令列工具Azure CLI
- Azure CLI
+ Azure CLI
@@ -140,7 +140,7 @@
Could not find package.json or the name/publisher is not set
- Could not find package.json or the name/publisher is not set
+ 找不到 package.json,或是未設定名稱/發佈者
@@ -148,7 +148,7 @@
The notebook {0} does not exist
- The notebook {0} does not exist
+ 筆記本 {0} 不存在
@@ -156,11 +156,11 @@
Select the deployment options
- Select the deployment options
+ 選取部署選項Open Notebook
- Open Notebook
+ 開啟 NotebookTool
@@ -184,11 +184,11 @@
Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
- Failed to load extension: {0}, Error detected in the resource type definition in package.json, check debug console for details.
+ 無法載入延伸模組: {0},在 package.json 的資源類型定義中偵測到錯誤,請查看偵錯主控台以取得詳細資料。The resource type: {0} is not defined
- The resource type: {0} is not defined
+ 未定義資源類型: {0}
diff --git a/resources/xlf/zh-hant/schema-compare.zh-Hant.xlf b/resources/xlf/zh-hant/schema-compare.zh-Hant.xlf
index 52d11b49b8..edd7f69a11 100644
--- a/resources/xlf/zh-hant/schema-compare.zh-Hant.xlf
+++ b/resources/xlf/zh-hant/schema-compare.zh-Hant.xlf
@@ -4,11 +4,11 @@
SQL Server Schema Compare
- SQL Server Schema Compare
+ SQL Server 結構描述比較SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
- SQL Server Schema Compare for Azure Data Studio supports comparing the schemas of databases and dacpacs.
+ Azure Data Studio 的 SQL Server 結構描述比較支援比較資料庫和 dacpac 的結構描述。Schema Compare
@@ -40,7 +40,7 @@
Options have changed. Recompare to see the comparison?
- Options have changed. Recompare to see the comparison?
+ 選項已變更。要重新比較以查看比較嗎?Schema Compare Options
@@ -48,315 +48,315 @@
General Options
- General Options
+ 一般選項Include Object Types
- Include Object Types
+ 包含物件類型Ignore Table Options
- Ignore Table Options
+ 忽略資料表選項Ignore Semicolon Between Statements
- Ignore Semicolon Between Statements
+ 忽略陳述式之間的分號Ignore Route Lifetime
- Ignore Route Lifetime
+ 忽略路由存留期Ignore Role Membership
- Ignore Role Membership
+ 忽略角色成員資格Ignore Quoted Identifiers
- Ignore Quoted Identifiers
+ 忽略引號識別碼Ignore Permissions
- Ignore Permissions
+ 忽略權限Ignore Partition Schemes
- Ignore Partition Schemes
+ 忽略資料分割配置Ignore Object Placement On Partition Scheme
- Ignore Object Placement On Partition Scheme
+ 忽略磁碟分割配置上的物件放置Ignore Not For Replication
- Ignore Not For Replication
+ 忽略不可複寫Ignore Login Sids
- Ignore Login Sids
+ 忽略登入 SidIgnore Lock Hints On Indexes
- Ignore Lock Hints On Indexes
+ 忽略索引的鎖定提示Ignore Keyword Casing
- Ignore Keyword Casing
+ 忽略關鍵字大小寫Ignore Index Padding
- Ignore Index Padding
+ 忽略索引填補Ignore Index Options
- Ignore Index Options
+ 忽略索引選項Ignore Increment
- Ignore Increment
+ 忽略增量Ignore Identity Seed
- Ignore Identity Seed
+ 忽略識別值種子Ignore User Settings Objects
- Ignore User Settings Objects
+ 忽略使用者設定物件Ignore Full Text Catalog FilePath
- Ignore Full Text Catalog FilePath
+ 忽略全文檢索目錄 FilePathIgnore Whitespace
- Ignore Whitespace
+ 忽略空白Ignore With Nocheck On ForeignKeys
- Ignore With Nocheck On ForeignKeys
+ 忽略 With Nocheck On ForeignKeysVerify Collation Compatibility
- Verify Collation Compatibility
+ 驗證定序相容性Unmodifiable Object Warnings
- Unmodifiable Object Warnings
+ 無法修改的物件警告Treat Verification Errors As Warnings
- Treat Verification Errors As Warnings
+ 將驗證錯誤視為警告Script Refresh Module
- Script Refresh Module
+ 指令碼重新整理模組Script New Constraint Validation
- Script New Constraint Validation
+ 指令碼新增條件約束驗證Script File Size
- Script File Size
+ 指令碼檔案大小Script Deploy StateChecks
- Script Deploy StateChecks
+ 指令碼部署 StateChecksScript Database Options
- Script Database Options
+ 指令碼資料庫選項Script Database Compatibility
- Script Database Compatibility
+ 指令碼資料庫相容性Script Database Collation
- Script Database Collation
+ 指令碼資料庫定序Run Deployment Plan Executors
- Run Deployment Plan Executors
+ 執行部署計劃執行程式Register DataTier Application
- Register DataTier Application
+ 註冊 DataTier 應用程式Populate Files On File Groups
- Populate Files On File Groups
+ 填入檔案群組上的檔案No Alter Statements To Change Clr Types
- No Alter Statements To Change Clr Types
+ 不使用 Alter 陳述式變更 Clr 類型Include Transactional Scripts
- Include Transactional Scripts
+ 包含交易指令碼Include Composite Objects
- Include Composite Objects
+ 包含複合物件Allow Unsafe Row Level Security Data Movement
- Allow Unsafe Row Level Security Data Movement
+ 允許不安全的資料列層級安全性資料移動Ignore With No check On Check Constraints
- Ignore With No check On Check Constraints
+ 忽略 With No check On Check 條件約束Ignore Fill Factor
- Ignore Fill Factor
+ 忽略填滿因數Ignore File Size
- Ignore File Size
+ 忽略檔案大小Ignore Filegroup Placement
- Ignore Filegroup Placement
+ 忽略檔案群組放置Do Not Alter Replicated Objects
- Do Not Alter Replicated Objects
+ 不要改變已複寫物件Do Not Alter Change Data Capture Objects
- Do Not Alter Change Data Capture Objects
+ 不要更改異動資料擷取物件Disable And Reenable Ddl Triggers
- Disable And Reenable Ddl Triggers
+ 停用並重新啟用 Ddl 觸發程序Deploy Database In Single User Mode
- Deploy Database In Single User Mode
+ 在單一使用者模式中部署資料庫Create New Database
- Create New Database
+ 建立新的資料庫Compare Using Target Collation
- Compare Using Target Collation
+ 使用目標定序進行比較Comment Out Set Var Declarations
- Comment Out Set Var Declarations
+ 將 Set Var 宣告標記為註解Block When Drift Detected
- Block When Drift Detected
+ 當偵測到漂移時封鎖Block On Possible Data Loss
- Block On Possible Data Loss
+ 於可能遺失資料時封鎖Backup Database Before Changes
- Backup Database Before Changes
+ 在變更前備份資料庫Allow Incompatible Platform
- Allow Incompatible Platform
+ 允許不相容的平台Allow Drop Blocking Assemblies
- Allow Drop Blocking Assemblies
+ 允許捨棄封鎖的組件Drop Constraints Not In Source
- Drop Constraints Not In Source
+ 捨棄不在來源中的條件約束Drop Dml Triggers Not In Source
- Drop Dml Triggers Not In Source
+ 捨棄不在來源中的 Dml 觸發程序Drop Extended Properties Not In Source
- Drop Extended Properties Not In Source
+ 捨棄不在來源中的擴充屬性Drop Indexes Not In Source
- Drop Indexes Not In Source
+ 捨棄不在來源中的索引Ignore File And Log File Path
- Ignore File And Log File Path
+ 忽略檔案和記錄檔路徑Ignore Extended Properties
- Ignore Extended Properties
+ 忽略擴充屬性Ignore Dml Trigger State
- Ignore Dml Trigger State
+ 忽略 Dml 觸發程序狀態Ignore Dml Trigger Order
- Ignore Dml Trigger Order
+ 忽略 Dml 觸發程序順序Ignore Default Schema
- Ignore Default Schema
+ 忽略預設結構描述Ignore Ddl Trigger State
- Ignore Ddl Trigger State
+ 忽略 Ddl 觸發程序狀態Ignore Ddl Trigger Order
- Ignore Ddl Trigger Order
+ 忽略 Ddl 觸發程序順序Ignore Cryptographic Provider FilePath
- Ignore Cryptographic Provider FilePath
+ 忽略密碼編譯提供者 FilePathVerify Deployment
- Verify Deployment
+ 驗證部署Ignore Comments
- Ignore Comments
+ 忽略註解Ignore Column Collation
- Ignore Column Collation
+ 忽略資料行定序Ignore Authorizer
- Ignore Authorizer
+ 忽略授權者Ignore AnsiNulls
- Ignore AnsiNulls
+ 忽略 AnsiNullsGenerate SmartDefaults
- Generate SmartDefaults
+ 產生 SmartDefaultsDrop Statistics Not In Source
- Drop Statistics Not In Source
+ 捨棄不在來源中的統計資料Drop Role Members Not In Source
- Drop Role Members Not In Source
+ 捨棄不在來源中的角色成員Drop Permissions Not In Source
- Drop Permissions Not In Source
+ 捨棄不在來源中的權限Drop Objects Not In Source
- Drop Objects Not In Source
+ 捨棄不在來源中的物件Ignore Column Order
- Ignore Column Order
+ 忽略資料行順序Aggregates
@@ -408,7 +408,7 @@
DatabaseTriggers
- DatabaseTriggers
+ DatabaseTriggersDefaults
@@ -436,11 +436,11 @@
File Tables
- File Tables
+ 檔案資料表Full Text Catalogs
- Full Text Catalogs
+ 全文檢索目錄Full Text Stoplists
@@ -480,7 +480,7 @@
Scalar Valued Functions
- Scalar Valued Functions
+ 純量值函式Search Property Lists
@@ -508,7 +508,7 @@
SymmetricKeys
- SymmetricKeys
+ SymmetricKeysSynonyms
@@ -520,19 +520,19 @@
Table Valued Functions
- Table Valued Functions
+ 資料表值函式User Defined Data Types
- User Defined Data Types
+ 使用者定義的資料類型User Defined Table Types
- User Defined Table Types
+ 使用者定義的資料表類型Clr User Defined Types
- Clr User Defined Types
+ Clr 使用者定義的類型Users
@@ -620,7 +620,7 @@
Server Triggers
- Server Triggers
+ 伺服器觸發程序Specifies whether differences in the table options will be ignored or updated when you publish to a database.
@@ -756,7 +756,7 @@
Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
- Specifies that publish should always drop and re-create an assembly if there is a difference instead of issuing an ALTER ASSEMBLY statement
+ 指定發佈應在有差異時一律捨棄並重新建立組件,而非發出 ALTER ASSEMBLY 陳述式Specifies whether transactional statements should be used where possible when you publish to a database.
@@ -800,7 +800,7 @@
If true, the database is set to Single User Mode before deploying.
- If true, the database is set to Single User Mode before deploying.
+ 若為 true,則會在部署前將資料庫設為單一使用者模式。Specifies whether the target database should be updated or whether it should be dropped and re-created when you publish to a database.
@@ -808,7 +808,7 @@
This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
- This setting dictates how the database's collation is handled during deployment; by default the target database's collation will be updated if it does not match the collation specified by the source. When this option is set, the target database's (or server's) collation should be used.
+ 這項設定會指出部署期間的資料庫定序處理方式; 根據預設,如果目標資料庫的定序與來源所指定的定序不相符,就會受到更新。設定此選項時,應使用目標資料庫 (或伺服器) 的定序。Specifies whether the declaration of SETVAR variables should be commented out in the generated publish script. You might choose to do this if you plan to specify the values on the command line when you publish by using a tool such as SQLCMD.EXE.
@@ -912,7 +912,7 @@
Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
- Specifies whether role members that are not defined in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.</
+ 指定是否要在您對資料庫發佈更新時,將未於資料庫快照集 (.dacpac) 檔案中定義的角色成員從目標資料庫捨棄。</Specifies whether permissions that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish updates to a database.
@@ -952,7 +952,7 @@
Data-tier Application File (.dacpac)
- Data-tier Application File (.dacpac)
+ 資料層應用程式檔案 (.dacpac)Database
@@ -980,15 +980,15 @@
A different source schema has been selected. Compare to see the comparison?
- A different source schema has been selected. Compare to see the comparison?
+ 已選取其他來源結構描述。要比較以查看比較嗎?A different target schema has been selected. Compare to see the comparison?
- A different target schema has been selected. Compare to see the comparison?
+ 已選取其他目標結構描述。要比較以查看比較嗎?Different source and target schemas have been selected. Compare to see the comparison?
- Different source and target schemas have been selected. Compare to see the comparison?
+ 已選取不同的來源和目標結構描述。要比較以查看比較嗎?Yes
@@ -1012,31 +1012,31 @@
Compare Details
- Compare Details
+ 比較詳細資料Are you sure you want to update the target?
- Are you sure you want to update the target?
+ 確定要更新目標嗎?Press Compare to refresh the comparison.
- Press Compare to refresh the comparison.
+ 按下 [比較] 即可重新整理比較。Generate script to deploy changes to target
- Generate script to deploy changes to target
+ 產生指令碼以將變更部署至目標No changes to script
- No changes to script
+ 指令碼沒有任何變更Apply changes to target
- Apply changes to target
+ 將變更套用至目標No changes to apply
- No changes to apply
+ 沒有任何要套用的變更Delete
@@ -1064,23 +1064,23 @@
➔
- ➔
+ ➔Initializing Comparison. This might take a moment.
- Initializing Comparison. This might take a moment.
+ 正在將比較初始化。這可能需要一些時間。To compare two schemas, first select a source schema and target schema, then press Compare.
- To compare two schemas, first select a source schema and target schema, then press Compare.
+ 若要比較兩個結構描述,請先選取來源結構描述和目標結構描述,然後按下 [比較]。No schema differences were found.
- No schema differences were found.
+ 找不到任何結構描述差異。Schema Compare failed: {0}
- Schema Compare failed: {0}
+ 結構描述比較失敗: {0}Type
@@ -1104,11 +1104,11 @@
Generate script is enabled when the target is a database
- Generate script is enabled when the target is a database
+ 當目標為資料庫時,會啟用產生指令碼Apply is enabled when the target is a database
- Apply is enabled when the target is a database
+ 當目標為資料庫時會啟用套用Compare
@@ -1128,7 +1128,7 @@
Cancel schema compare failed: '{0}'
- Cancel schema compare failed: '{0}'
+ 取消結構描述比較失敗: '{0}'Generate script
@@ -1136,7 +1136,7 @@
Generate script failed: '{0}'
- Generate script failed: '{0}'
+ 無法產生指令碼: '{0}'Options
@@ -1156,11 +1156,11 @@
Schema Compare Apply failed '{0}'
- Schema Compare Apply failed '{0}'
+ 結構描述比較套用失敗 '{0}'Switch direction
- Switch direction
+ 切換方向Switch source and target
@@ -1176,11 +1176,11 @@
Open .scmp file
- Open .scmp file
+ 開啟 .scmp 檔案Load source, target, and options saved in an .scmp file
- Load source, target, and options saved in an .scmp file
+ 載入儲存在 .scmp 檔案中的來源、目標和選項Open
@@ -1188,15 +1188,15 @@
Open scmp failed: '{0}'
- Open scmp failed: '{0}'
+ 無法開啟 scmp: '{0}'Save .scmp file
- Save .scmp file
+ 儲存 .scmp 檔案Save source and target, options, and excluded elements
- Save source and target, options, and excluded elements
+ 儲存來源和目標、選項及排除的元素Save
@@ -1204,7 +1204,7 @@
Save scmp failed: '{0}'
- Save scmp failed: '{0}'
+ 無法儲存 scmp: '{0}'