Senior Data Engineer
Location: US-TX-San Antonio - 78258
RESPONSIBILITIES:
- Design and develop robust data architectures and data pipelines to support data ingestion, processing, storage, and retrieval using cloud data warehouses such as Snowflake and BigQuery.
- Lead development of internal products, including Snowflake cost optimization, a data engineering best-practices reference guide, an internal demonstration warehouse, and ETL tool environments.
- Develop and document database architectures.
- Design database applications such as interfaces, data transfer mechanisms, global temporary tables, data partitions, and function-based indexes to enable efficient access to the generic database structure.
- Transform raw data into highly structured, consistent data sets using SQL, Python, Snowflake, Matillion, dbt, Azure Synapse, Talend, Kafka, Hive, Sqoop, or similar tools to power business insights (see the sketch after this list).
- Design and develop solutions that can scale horizontally and vertically in cloud computing environments such as AWS, Azure, or GCP.
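For illustration only, a minimal PySpark sketch of the raw-to-structured transformation work described above; the source path, column names, and target location are hypothetical and not part of this posting.

# Minimal sketch: turn raw JSON events into a structured, deduplicated,
# partitioned data set with PySpark. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw_to_structured").getOrCreate()

# Read semi-structured raw events from a data lake location.
raw = spark.read.json("s3://example-bucket/raw/events/")

# Standardize types, derive a partition column, and remove duplicates
# to produce a consistent, analysis-ready data set.
structured = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_time"))
    .withColumn("event_date", F.to_date("event_ts"))
    .select("event_id", "user_id", "event_type", "event_ts", "event_date")
    .dropDuplicates(["event_id"])
)

# Write partitioned Parquet for efficient downstream storage and retrieval.
structured.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)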
- Identify and evaluate emerging technologies, tools, and trends that can drive innovation and improve the efficiency and effectiveness of our data engineering processes.
- Design, develop, and own robust, scalable data processing and data integration pipelines using Python, Airflow, PySpark, Spark SQL, and REST API endpoints to ingest data from external data sources into the data lake (see the orchestration sketch after this list).
- Design and develop batch and streaming data pipelines using Apache Kafka, Flink, etc.
- Design and develop dimensional data models (snowflake and star schemas, dimension and fact tables, slowly changing dimensions, periodic snapshot tables, etc.).
- Design and build data models for optimal storage and retrieval.
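As a rough sketch of the orchestration piece, the following minimal Airflow DAG ingests one day of data from a REST endpoint; it assumes Airflow 2.x-style imports, and the endpoint URL and payload handling are hypothetical placeholders rather than details from this posting.

# Minimal Airflow sketch: daily ingestion from a REST API toward a data lake.
# The endpoint and payload handling are hypothetical placeholders.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_events(ds, **_):
    """Fetch one logical day of events from a (hypothetical) REST endpoint."""
    resp = requests.get(
        "https://api.example.com/events", params={"date": ds}, timeout=60
    )
    resp.raise_for_status()
    # In a real pipeline the payload would be written to S3/ADLS/GCS;
    # printing keeps this sketch self-contained.
    print(f"Fetched {len(resp.json())} records for {ds}")


with DAG(
    dag_id="ingest_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="ingest_events",
        python_callable=ingest_events,
    )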
- Collaborate with cross-functional teams such as data scientists, software engineers, and business stakeholders to understand data requirements and deliver solutions that meet business needs.
- Enforce data governance policies and practices to maintain data integrity, security, and compliance with relevant regulations.
- Work with developers to implement software processes and tools such as Git, CI/CD pipelines, and orchestration tools like Airflow.
REQUIREMENTS: Bachelor's degree in Computer Science, Information Technology, or a related field, or foreign equivalent, plus 60 months of experience in the job offered. Employer will accept an Associate's degree in Computer Science (or foreign equivalent), or three years of academic studies (or foreign equivalent) toward a Bachelor of Science degree in Computer Science. Experience using Kafka, Hive, and Sqoop is required.
Scott Monteith
EtiVenture, Inc
401 Sonterra Blvd, Suite 375, San Antonio, TX 78258
Phone: 206-497-7876
Timestamp: Thu, 31 Oct 2024 22:27:54 -0500