About the Role
We are looking for a Solution Architect to drive mission-critical data engineering strategy, contributing to and leading the development of our Enterprise Data and Analytics Platforms. The ideal Architect is a passionate professional who can blend the ever-evolving technology landscape of Cloud and Advanced Analytics with the complex, high-impact space of E-Commerce and Direct Sales.
The Solution Architect will lead a team of talented engineers to develop and maintain the foundation of our next-generation data platforms. This role will be responsible for building, optimizing, and monitoring our growing data pipelines through meticulous architecture, intelligent business logic, consistent data governance, testing, and continuous delivery.
- Lead a team of engineers and provide advanced data engineering expertise on projects that enable analytics-driven decision-making for clients.
- Design new methods and processes to ensure maximum effectiveness of client data.
- Partner with data analysts/scientists to provide solutions enabling statistical analysis tools and data visualization applications.
- Identify processes and tools that can be shifted towards automation to enable seamless development and self-service analytics workloads.
- Partner with various business units and data stewards to understand the business needs.
- Obtain and/or maintain technical expertise in available data manipulation and preparation tools (Talend, Informatica, Matillion, etc.) as well as programming languages and frameworks (Python, Spark, EMR, etc.).
- Ensure data is secure, relevant, and maintains high quality standards.
- Identify and implement industry best practices.
- Evaluate new data sets to determine appropriate ingestion techniques.
- Build, manage, and optimize data pipelines using a variety of ETL tools, including custom infrastructure and third-party tooling (AWS, GCP, Databricks, Snowflake).
- Work with internal engineering teams and vendors to understand business logic and ensure the veracity of datasets.
- Document existing production data logic and the business processes that influence it, in order to reconcile knowledge gaps between the business, engineering, and data collection.
Required Knowledge and Level of Experience
- 8-10 years of experience in delivering data engineering solutions that include batch and streaming capabilities.
- Experience building, testing, automating and optimizing data pipelines.
- Experience using AWS, Databricks, Snowflake or similar products.
- Strong understanding and prior use of SQL, and high proficiency with data technologies (Hadoop, Hive, Spark, Kafka, low-latency data stores, Airflow, etc.).
- Deep understanding of data testing techniques, and a proven record of driving sustainable change to the software development practice to improve quality and performance.
- Proficiency with data querying languages (e.g. SQL), and programming languages (e.g. Python, Spark, Java, etc.).
- Expertise in selecting context-appropriate data modeling techniques, including Kimball dimensional modeling, slowly changing dimensions, snowflake schemas, and others.
- Passion for software development and data, with strong skills in data extraction, transformation, and processing to support quantitative analyses across business functions.
- Familiarity with Scrum, DevOps, and DataOps methodologies, and supporting tools such as JIRA.
- Experience with AWS technologies such as Redshift, RDS, S3, Glacier, EC2, Lambda, API Gateway, Elastic MapReduce (EMR), Kinesis, and Glue.
- Experience managing AWS infrastructure as code, including the use of CloudFormation, Git, and GitLab.
- Bachelor’s degree in Computer Science, Information Systems, or a related field preferred.
- Excellent oral and written communication skills.
- Strong presentation skills, with the ability to communicate analytical and technical concepts confidently and clearly to both technical and non-technical audiences.