
Manage Amazon OpenSearch Service Visualizations, Alerts, and More with GitHub and Jenkins

AWS Big Data

It is advised to discourage contributors from making changes directly to the production OpenSearch Service domain and instead implement a gatekeeper process to validate and test the changes before moving them to OpenSearch Service. The configuration references the test domain endpoint (for example, my-test-domain.us-east-1.es.amazonaws.com); leave the remaining settings as default.
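
Below is a minimal, hypothetical gatekeeper check in Python for the kind of Jenkins stage the post describes: it verifies that the test OpenSearch Service domain is healthy before dashboard changes are promoted to it. The endpoint follows the example above, the health-based pass/fail rule is an assumption rather than the post's implementation, and authentication against the domain is omitted.

# Hypothetical gatekeeper check: confirm the test OpenSearch Service domain is healthy
# before a CI stage promotes exported visualizations and alerts to it.
# Assumes the caller is allowed to reach the domain; auth (IAM SigV4 or basic) is omitted.
import sys
import requests

DOMAIN_ENDPOINT = "my-test-domain.us-east-1.es.amazonaws.com"  # test domain, not production

def domain_is_healthy(endpoint: str) -> bool:
    # _cluster/health is a standard OpenSearch API; any status other than "red" passes here.
    resp = requests.get(f"https://{endpoint}/_cluster/health", timeout=10)
    resp.raise_for_status()
    return resp.json().get("status") in ("green", "yellow")

if __name__ == "__main__":
    if not domain_is_healthy(DOMAIN_ENDPOINT):
        sys.exit("Test domain unhealthy - blocking promotion of dashboard changes")
    print("Test domain healthy - safe to import saved objects")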


Introducing a new unified data connection experience with Amazon SageMaker Lakehouse unified data connectivity

AWS Big Data

For each service, you need to learn the supported authorization and authentication methods, data access APIs, and frameworks to onboard and test data sources. The SageMaker Lakehouse data connection testing capability boosts your confidence in established connections. On your project, in the navigation pane, choose Data. Choose Next.
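
As a rough companion to the console-based connection test, the sketch below uses boto3 to inspect an existing AWS Glue connection's type and endpoint before it is wired into a project. The connection name and Region are placeholders; this is illustrative only, not the SageMaker Lakehouse testing capability itself.

# Illustrative check of an existing AWS Glue connection definition.
# The connection name and Region are placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

conn = glue.get_connection(Name="my-postgres-connection", HidePassword=True)["Connection"]
props = conn.get("ConnectionProperties", {})

print("Type:    ", conn.get("ConnectionType"))
print("JDBC URL:", props.get("JDBC_CONNECTION_URL"))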


How REA Group approaches Amazon MSK cluster capacity planning

AWS Big Data

In each environment, Hydro manages a single MSK cluster that hosts multiple tenants with differing workload requirements. To plan capacity for this shared cluster, we used the AWS performance testing framework for Apache Kafka to evaluate its theoretical performance limits. The following figure shows an example of a test cluster’s performance metrics.
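
A minimal sketch (not the Hydro platform's tooling) of how such performance metrics could be pulled for review: it reads per-broker BytesInPerSec for an MSK cluster from CloudWatch. The cluster name, broker ID, and Region are placeholders.

# Pull per-broker BytesInPerSec for an MSK cluster from CloudWatch to compare observed
# throughput against the limits found during performance testing.
import boto3
from datetime import datetime, timedelta, timezone

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

end = datetime.now(timezone.utc)
resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/Kafka",
    MetricName="BytesInPerSec",
    Dimensions=[
        {"Name": "Cluster Name", "Value": "my-msk-cluster"},
        {"Name": "Broker ID", "Value": "1"},
    ],
    StartTime=end - timedelta(hours=1),
    EndTime=end,
    Period=300,
    Statistics=["Average"],
)
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2), "bytes/sec")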


Federate to Amazon Redshift Query Editor v2 with Microsoft Entra ID

AWS Big Data

Select the Consumption hosting plan and then choose Select. Save the federation metadata XML file; you use this file to configure the IAM IdP in a later step. In the Single sign-on section, under SAML Certificates, choose Download for Federation Metadata XML. Choose Test this application.
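
For the later step the excerpt mentions, the following is a hedged boto3 sketch of creating the IAM SAML identity provider from the downloaded federation metadata. The file path and provider name are placeholders, not values from the post.

# Create an IAM SAML identity provider from the Entra ID federation metadata XML.
# The file path and provider name are placeholders.
import boto3

iam = boto3.client("iam")

with open("federation-metadata.xml") as f:
    metadata = f.read()

resp = iam.create_saml_provider(
    SAMLMetadataDocument=metadata,
    Name="EntraIDRedshiftQEv2",
)
print("Created SAML provider:", resp["SAMLProviderArn"])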


Disaster recovery strategies for Amazon MWAA – Part 2

AWS Big Data

The solution for this post is hosted on GitHub. The backup and restore strategy involves periodically backing up Amazon MWAA metadata to Amazon Simple Storage Service (Amazon S3) buckets in the primary Region. This is the bucket where you host all of your DAGs for your environment.
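
The following is a minimal illustration of the backup half of that strategy: uploading an exported MWAA metadata snapshot to an S3 bucket in the primary Region. The local file name, bucket, and key are placeholders; the solution hosted on GitHub automates this on a schedule rather than running it by hand.

# Upload an exported MWAA metadata snapshot to the backup bucket in the primary Region.
# File, bucket, and key names are placeholders.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

s3.upload_file(
    Filename="mwaa_metadata_export.csv",        # locally exported metadata snapshot
    Bucket="my-mwaa-backup-bucket-primary",     # backup bucket in the primary Region
    Key="metadata-backups/mwaa_metadata_export.csv",
)
print("Metadata backup uploaded")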


Governing data in relational databases using Amazon DataZone

AWS Big Data

This post explains how you can extend the governance capabilities of Amazon DataZone to data assets hosted in relational databases based on MySQL, PostgreSQL, Oracle, or SQL Server engines. Among other steps, the data producer needs to consolidate the data asset’s metadata in the business catalog and enrich it with business metadata.
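
As an illustration of gathering the technical side of that metadata (not the Amazon DataZone API itself), the sketch below lists the tables and columns a relational source exposes through the AWS Glue Data Catalog; a producer could start from this inventory before enriching the asset with business metadata. The database name and Region are placeholders.

# Illustrative only: read technical metadata (tables and columns) from the Glue Data Catalog
# as raw material for a business catalog asset. Database name and Region are placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="mysql_sales_db"):
    for table in page["TableList"]:
        columns = [col["Name"] for col in table["StorageDescriptor"]["Columns"]]
        print(table["Name"], "->", columns)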


Configure a custom domain name for your Amazon MSK cluster

AWS Big Data

For the client to resolve DNS queries for the custom domain, an Amazon Route 53 private hosted zone is used to host the DNS records, and is associated with the client’s VPC to enable DNS resolution from the Route 53 VPC resolver. The Kafka client uses the custom domain bootstrap address to send a get metadata request to the NLB.
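
A hedged sketch of the DNS piece described above: upsert a CNAME record in the Route 53 private hosted zone so the custom bootstrap name resolves to the NLB. The hosted zone ID, record name, and NLB DNS name are all placeholders.

# Point the custom Kafka bootstrap name at the NLB in the private hosted zone.
# Hosted zone ID, record name, and NLB DNS name are placeholders.
import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z0123456789EXAMPLE",  # private hosted zone associated with the client VPC
    ChangeBatch={
        "Comment": "Custom bootstrap address for the MSK cluster",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "bootstrap.kafka.example.internal",
                "Type": "CNAME",
                "TTL": 300,
                "ResourceRecords": [
                    {"Value": "my-nlb-0123456789.elb.us-east-1.amazonaws.com"},
                ],
            },
        }],
    },
)
print("Bootstrap CNAME upserted")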