Jobvertise
Remote - Sr. Data Engineer | Location: US-Remote | Jobcode: 3596600
Role: Sr. Data Engineer
Duration: 1 year plus
Location: Remote
Project to start mid-October.

Requirements:
- A minimum of 10 years' experience in data engineering.
- A minimum of 4 years in Azure.
- Excellent communication skills.

We are looking for a talented and experienced Azure Data Engineer with expertise in Azure Databricks, Azure Synapse Analytics, Azure Data Factory, RDBMS, and NoSQL databases. The ideal candidate will play a crucial role in designing, implementing, and optimizing data solutions on the Azure platform to support our data-driven initiatives.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines using Azure Data Factory and Azure Databricks/Spark to efficiently extract, transform, and load (ETL) data from various sources into data lakes and data warehouses.
- Data Modeling: Create and manage data models and schemas within Azure Synapse Analytics and NoSQL databases (Cosmos DB, MongoDB) to ensure data accuracy, performance, and scalability.
- Data Integration: Collaborate with cross-functional teams to integrate data from diverse sources, including RDBMS (e.g., Oracle, SQL Server) and NoSQL databases (e.g., MongoDB, Cosmos DB).
- Data Migration: Lead data migration projects, including data extraction, transformation, and loading (ETL), from on-premises systems and other cloud platforms to Azure. Ensure data quality, accuracy, and consistency throughout the migration process.
- Data Quality and Profiling: Implement data quality checks, data profiling, and cleansing to maintain data integrity, security, and compliance with industry standards and regulations.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Proven experience as a data engineer with a strong focus on Azure data services.
- Proficiency in Azure Databricks/Spark, Azure Synapse Analytics, Azure Data Factory, and other Azure data-related tools.
- Strong SQL skills and experience with data modeling in both RDBMS and NoSQL databases.
- Familiarity with data warehousing concepts and best practices.
- Knowledge of data governance, security, and compliance standards.
- Programming skills in Python, including Python-based microservices, are desired.
- Excellent problem-solving and communication skills.
- Strong Databricks experience on AWS/GCP can also be considered.
- Ability to work independently and collaboratively within a team environment.
DKMRBH Inc.