Data Warehouse Architect (AWS) - HHCJP00001833

Chicago, IL 60606

Employment Type: Contract
Job Category: Data Engineer
Job Number: TS221960518
Compensation: $75-100 / hour

Job Description

Title: AWS Data Warehouse Architect

Location: Chicago, IL / Remote
 
Hire Type: CONTRACT 
 
Overview:
Sterling has helped build careers for thousands of professionals like you. Our expert recruiters support you at every step of the process, and as a Best of Staffing company, Sterling provides exciting work with exceptional employers across the U.S.
 
As a contract employee of Sterling, you are eligible to receive a Full Employee Benefits Package that includes paid time off, paid holidays, a choice of 3 medical plans, dental and vision plans, a 401(k), and an Employee Stock Ownership Plan (ESOP).
 
Job Summary: 
We are seeking an AWS Data Warehouse Architect who will be an exceptional addition to our client's growing engineering team. We are looking for someone with experience in security, encryption, and role-based access control (RBAC).
 
Job Duties:
  1. Work on Snowflake DW design and implementation.
  2. Design and deploy services to AWS using Terraform and CloudFormation stacks.
  3. Design CI/CD pipelines for applications and deploy them.
  4. Design near real-time data pipelines using the Kafka streaming service.

     
Qualifications:
  1. 6+ years of experience in data engineering or related technical work, including business intelligence and analytics
  2. Bachelor’s degree in Engineering, Computer Science, or related field
  3. Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business
  4. Experience with AWS Cloud 
  5. Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
  6. Effective problem-solving and analytical skills; ability to manage multiple projects simultaneously and report across different stakeholders
  7. Experience with large-scale enterprise applications using big data open-source solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase
  8. Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)
Preferred:
  1. Background in Linux administration
 

Qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or genetic information.


Meet Your Recruiter

Sabika Sewani
