Data Engineer Big Data Hadoop
Location: US-FL-Fort Lauderdale
Jobcode: c422f9512e985a80c544d63480280fad-122020
CLIENT NAME: HealthCare.
JOB ID: PIT-DEBD-03232020
OPEN POSITIONS: 1
JOB LOCATION: Fort Lauderdale, FL
DURATION: 12 Plus Months
JOB TYPE: CTC (CORP TO CORP) WITH 3RD PARTY
WORK AUTHORIZATION: US Citizen or GC, GC-EAD, H4 & L2 EAD
NO H1B OR OPT/CPT ACCEPTED BY CLIENT
RATE: $OPEN - DOE
JOB TITLE: Data Engineer Big Data Hadoop
JOB DESCR:
Candidates will start out working remotely and will eventually be on-site in Fort Lauderdale, FL.
Candidates should be senior Data Engineers experienced with big data tools (Hadoop, Spark, Kafka), AWS cloud services (EC2, EMR, RDS, Redshift), and NoSQL databases.
This is a phone-and-Skype-to-hire position. Candidates in Florida with a LinkedIn profile are preferred but not required.
Essential Duties and Responsibilities:
• Past experience in executing and delivering solutions in an Agile scrum development environment is preferred
• Produce deliverables with little oversight or management assistance
• Assemble large, complex data sets that meet functional / non-functional business requirements
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud ‘big data’ technologies
• Build and optimize ‘big data’ data pipelines, data sets, data transformations, data structures, metadata, dependency management, and workload management
• Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
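For illustration only, the extract-transform-load duty described above can be sketched as a minimal batch pipeline in plain Python with SQLite standing in for a warehouse (all file, column, and table names here are hypothetical; a production pipeline would use the Spark/EMR stack listed under the qualifications):

```python
import csv
import io
import sqlite3

# Hypothetical sample extract; in practice this would come from S3, RDS, etc.
RAW = """patient_id,visit_date,charge
1001,2020-03-01,250.00
1002,2020-03-02,
1001,2020-03-05,125.50
"""

def extract(raw):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: drop rows with missing charges and cast types."""
    return [
        (int(r["patient_id"]), r["visit_date"], float(r["charge"]))
        for r in rows
        if r["charge"]
    ]

def load(rows, conn):
    """Load: write cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS visits (patient_id INT, visit_date TEXT, charge REAL)"
    )
    conn.executemany("INSERT INTO visits VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(charge) FROM visits").fetchone()[0]
print(total)  # 375.5
```

The same extract/transform/load shape carries over directly to Spark DataFrames reading from S3 and writing to Redshift.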
Non-Essential Duties and Responsibilities:
• Perform other duties as assigned
Minimum Qualifications:
Education/Licensing/Certification:
• Bachelor’s degree in Computer Science, Information Management, or related field preferred.
Experience:
• 5 years of experience, with a heavy emphasis on cloud and big data technologies
• Healthcare Knowledge and Experience heavily preferred
Knowledge and Skills:
• Knowledge of new and emerging data and analytics technologies (e.g., Hadoop, Spark, Elastic, AWS)
• Experience working and developing in Cloud platforms such as AWS, Azure, or Google Cloud
• Experience with big data tools: Hadoop, Spark, Kafka, etc.
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
• Experience with AWS cloud services: EC2, EMR, RDS, Redshift
• Experience with stream-processing systems: Storm, Spark Streaming, etc.
• Experience with object-oriented and functional scripting languages: Python, Java, C++, Scala, etc.
• Hands-on, results-oriented, and pragmatic, with high quality standards and the ability to multi-task and manage numerous projects
• Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
• Detail-oriented, thorough, and complete
• Strong analytic skills related to working with unstructured datasets
• Strong time management and organizational skills
• Experience supporting and working with cross-functional teams in a dynamic environment.
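The message-queuing and stream-processing pattern named in the skills above can be sketched with Python's standard library as a toy producer/consumer maintaining a running aggregate (a stand-in for a Kafka topic feeding Spark Streaming or Storm, not how those systems are actually deployed; the event keys are hypothetical):

```python
import queue
import threading

# Toy stand-in for a message-queue topic; events are simple string keys.
events = queue.Queue()
counts = {}
SENTINEL = None  # end-of-stream marker

def producer():
    # Emit a small hypothetical stream of keyed events.
    for key in ["claims", "claims", "visits", "claims"]:
        events.put(key)
    events.put(SENTINEL)

def consumer():
    # Maintain a running count per key, as a streaming aggregation would.
    while True:
        key = events.get()
        if key is SENTINEL:
            break
        counts[key] = counts.get(key, 0) + 1

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(counts)  # {'claims': 3, 'visits': 1}
```

The queue decouples the producer from the consumer, which is the same design property that makes Kafka useful between ingestion and processing tiers.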
RELOCATION: LOCALS & NON-LOCALS ARE OK; NO RELOCATION ASSISTANCE AVAILABLE
TRAVEL: NO TRAVEL RELATED TO WORK
Interview Mode: Phone and video/Skype interview screening.
Vegatron Systems