15 Jan
Cloud Data Solution Architect
Vacancy expired!
- Chicago Tribune’s Top Workplaces
- Inc 500 Fastest Growing Private Companies in the US
- Crain’s Fast 50 list of the fastest-growing companies in the Chicago area
- Talend Expert Partner recognition
- Microsoft Gold Data Platform competency
- Lead a team of engineers and provide advanced data engineering expertise for projects that enable analytics to optimize client decision-making.
- Design new methods and processes to ensure maximum effectiveness of client data.
- Partner with data analysts/scientists to provide solutions enabling statistical analysis tools and data visualization applications.
- Identify processes and tools that can be shifted towards automation to enable seamless development and self-service analytics workloads.
- Partner with various business units and data stewards to understand the business needs.
- Obtain and/or maintain technical expertise in available data manipulation and preparation tools (Talend, Informatica, Matillion, etc.) as well as programming languages and frameworks (Python, Spark, EMR, etc.).
- Ensure data is secure, relevant, and maintains high quality standards.
- Identify and implement industry best practices.
- Evaluate new data sets to determine appropriate ingestion techniques.
- Build, manage, and optimize data pipelines using a variety of ETL tools, including custom infrastructure and third-party tooling (AWS, Google Cloud Platform, Databricks, Snowflake).
- Work with internal engineering teams and vendors to understand business logic to ensure veracity in datasets.
- Document existing production data logic and the business processes that influence it, closing knowledge gaps between business, engineering, and data collection teams.
- 8-10 years of experience in delivering data engineering solutions that include batch and streaming capabilities.
- 5+ years of strategic/management consulting experience is highly preferred.
- Business development experience required (e.g., RFPs, proposal writing, opportunity assessment).
- Experience building, testing, automating and optimizing data pipelines.
- Experience using AWS, Databricks, Snowflake or similar products.
- Strong understanding and prior use of SQL, and high proficiency with the workings of data technologies (Hadoop, Hive, Spark, Kafka, low-latency data stores, Airflow, etc.).
- Deep understanding of data testing techniques, and a proven record of driving sustainable change to the software development practice to improve quality and performance.
- Proficiency with data querying languages (e.g., SQL) and programming languages and frameworks (e.g., Python, Spark, Java).
- Expertise in selecting context-appropriate data modeling techniques, including Kimball dimensional modeling, slowly changing dimensions, snowflake schemas, and others.
- Passion for software development and data, and strong skill in performing data extraction, transformation, and processing to optimize quantitative analyses across business functions.
- Familiarity with Scrum, DevOps, and DataOps methodologies, and supporting tools such as JIRA.
- Experience with AWS technologies such as Redshift, RDS, S3, Glacier, EC2, Lambda, API Gateway, Elastic Map Reduce, Kinesis, and Glue.
- Experience managing AWS infrastructure as code, including the use of CloudFormation, Git, and GitLab.
- Excellent oral and written communication skills.
- Strong presentation skills and the ability to communicate analytical and technical concepts with confidence and in an easy-to-understand fashion to technical and non-technical audiences.
- Bachelor's or Master's degree in Computer Science or a relevant field is required.
- Candidates can live anywhere in the U.S. but must be willing to travel up to 25% to the client location.
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Training & Development
- Work From Home
- College Tuition Benefit
- Bonus Program