22 Apr
Cloud Data Engineer (AWS, Azure, Google Cloud Platform)
Gilbert, AZ 85233, USA

Vacancy expired!

Cloud Data Engineer - Solution Specialist - USDC

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with your colleagues daily - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center. Our US Delivery Centers have been growing since 2014, with significant, continued growth on the horizon. Interested? Read more about our opportunity below.

Work you'll do/Responsibilities
  • Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management.
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
  • Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution.
  • Generate design, development, and test plans, detailed functional specification documents, user interface designs, and process flow charts to guide programming.
  • Develop data pipelines and APIs using Python, SQL, and potentially Spark on AWS and Azure cloud platforms.
  • Use an analytical, data-driven approach to drive a deep understanding of a fast-changing business.
  • Build large-scale batch and real-time data pipelines with data processing frameworks on AWS and Azure cloud platforms (see the illustrative sketch after this list).
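
The following is a minimal, illustrative sketch of the kind of batch pipeline work described above, assuming PySpark running on AWS; the bucket paths, schema, and column names are hypothetical placeholders rather than details from this posting.

# Minimal sketch of an extract-transform-load batch job (assumed PySpark on AWS).
# Bucket names, paths, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-orders-batch")  # hypothetical job name
    .getOrCreate()
)

# Extract: read raw CSV files landed in S3 (placeholder path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/")
)

# Transform: drop incomplete rows and compute a daily revenue aggregate.
daily_revenue = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("revenue"),
            F.countDistinct("order_id").alias("orders"))
)

# Load: write partitioned Parquet to a curated zone that downstream
# reporting tools (for example Athena or Redshift Spectrum) can query.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/daily_revenue/")
)

spark.stop()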

The Team

From our centers, we work with Deloitte consultants to design, develop and build solutions to help clients reimagine, reshape and rewire the competitive fabric of entire industries. Our centers house a multitude of specialists, ranging from systems designers, architects and integrators, to creative digital experts, to cyber risk and human capital professionals. All work together on diverse projects from advanced preconfigured solutions and methodologies, to brand-building and campaign management. We are a unique blend of skills and experiences, yet we underline the value of each individual, providing customized career paths, fostering innovation and knowledge development with a focus on quality. The US Delivery Center supports a collaborative team culture where we work and live close to home with limited travel.

Qualifications

Required
  • 3+ years of experience in data engineering with an emphasis on data analytics and reporting.
  • 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others.
  • 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one Database Platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Warehouse, etc.).
  • 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
  • 3+ years of experience with one or more of the following languages and tools: Python, SQL, Kafka, and/or others.
  • 3+ years of experience designing and building solutions using various cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
  • Bachelor's degree or equivalent work experience.
  • Travel up to 20% (while up to 20% travel is a requirement of the role, non-essential travel has been suspended until further notice due to COVID-19).
  • Must live within a commutable distance of, or relocate to, one of the following cities: Austin, TX; Charlotte, NC; Cincinnati, OH; Gilbert, AZ; Lake Mary, FL; Phoenix, AZ; Pittsburgh, PA; Tampa, FL.
  • Must be willing to live and work in one of our Center locations:
    • Lake Mary, FL (Orlando area)
    • Mechanicsburg, PA (Harrisburg area)
    • Gilbert, AZ (Phoenix area)
  • Limited immigration sponsorship may be available.

Preferred
  • AWS, Azure and/or Google Cloud Platform Certification.
  • Master's degree or higher.
  • Expertise in one or more programming languages, preferably Scala, PySpark and/or Python.
  • Experience working with either a MapReduce or an MPP system at any size/scale.
  • Experience working with agile development methodologies such as Scrum and sprint-based delivery.

Deloitte's culture

Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives.

Corporate citizenship

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.

Recruiter tips

We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
