Jobvertise
Data Engineer (day 1 onsite) - Need consultants who can share PP
Location: US-VA-McLean
Jobcode: 3614903
Job Title: Data Engineer (day 1 onsite)
Location: McLean, VA

Job Description: We are seeking an experienced Data Engineer with 7+ years of experience in Python, Snowflake, and ETL tools. This role is responsible for the development, testing, and maintenance of data engineering frameworks and large-scale data processing systems. The ideal candidate is a detail-oriented team player who will be the first line in creating innovative data solutions for our business.

Key Responsibilities:
1. Design, construct, install, test, and maintain highly scalable data management systems.
2. Work closely with IT architects, data analysts, and business stakeholders to design and implement data solutions that meet business requirements and goals.
3. Collaborate with business stakeholders to identify opportunities for data-driven improvements.
4. Develop and implement data standards, procedures, and guidelines to ensure data integrity, dependability, and regulatory compliance.
5. Develop and implement complex database queries, stored procedures, and scripts.
6. Anticipate future demands of data-related initiatives and recommend necessary upgrades or new systems to upper management.

Required Skills & Qualifications:
1) Proficiency in Python is a fundamental requirement, including a deep understanding of Python syntax, data structures, control flow, and error-handling mechanisms.
2) At least one project deploying a data ingestion framework in Python.
3) Extensive experience with Snowflake and its features, such as Snowpipe, Snowpark, tasks, procedures, etc.
4) Hands-on experience with ELT (Extract, Load, Transform) tools and processes.
5) Proficiency in SQL is essential, including the ability to write complex queries and stored procedures.
6) Experience working with legacy and modern sources and file formats such as JSON, CSV, and XML, and with classified data containing PII, PHI, etc.
7) Proficiency in Python for legacy and modern ingestion is essential; experience with data engineering libraries such as pandas, SQLAlchemy, Snowpark, etc. is required.
8) Familiarity with version control systems like Git for managing code and collaborating.
9) Familiarity with at least one cloud platform, such as AWS, Azure, or Google Cloud, and an understanding of how Snowflake integrates with these platforms. Proficiency in Azure cloud services such as Azure Data Factory and Azure DevOps is advantageous.
10) Exceptional analytical and problem-solving skills.
11) Strong communication and collaboration skills.
12) A degree in Computer Science, Engineering, or a related field.

Need resumes @ .
Tech20 Solutions