Solutions Architect – Azure
Wavicle Data Solutions designs and delivers data and analytics solutions that reduce the time, cost, and risk of companies’ data projects, improving the quality of their analytics and decisions now and into the future. As a privately held consulting organization with popular, name-brand clients across multiple industries, Wavicle offers exciting opportunities for data scientists, solutions architects, developers, and consultants to jump right in and contribute to meaningful, innovative solutions.
Our 250+ local, nearshore and offshore consultants, data architects, cloud engineers, and developers build cost-effective, right-fit solutions leveraging our team’s deep business acumen and knowledge of cutting-edge data and analytics technology and frameworks.
At Wavicle, you’ll find a challenging and rewarding work environment where we enjoy working as a team to exceed client expectations. Employees appreciate being part of something meaningful, and Wavicle has been recognized by industry leaders with:
- Chicago Tribune’s Top Workplaces
- Inc 500 Fastest Growing Private Companies in the US
- Crain’s Fast 50 fastest growing companies in the Chicago area
- Talend Expert Partner recognition
- Microsoft Gold Data Platform competency
What You Will Get To Do:
- Ensure data is secure, relevant, and maintains high quality standards.
- Identify and implement industry best practices.
- Evaluate new data sets to determine appropriate ingestion techniques.
- Build, manage, and optimize data pipelines using a variety of ETL tools, including custom infrastructure and 3rd-party tooling (Azure, Databricks, Snowflake).
- Work with internal engineering teams and vendors to understand business logic and ensure the veracity of datasets.
- Generate documentation on existing production data logic and the business processes that influence it, closing knowledge gaps between business, engineering, and data collection.
- Sound knowledge of ETL concepts and tooling.
- Strong technical and system design skills.
- At least 3 years of experience with Azure.
- Experience working with real-time data.
- Identify processes and tools that can be shifted toward automation to enable seamless development.
- Partner with various business units and data stewards to understand the business needs.
- Obtain and maintain technical expertise in available data manipulation and preparation tools (ADF, Talend, Informatica, Matillion, etc.) as well as programming languages (Python, Spark, etc.).
What You Will Bring:
- 8-10 years of experience delivering data engineering solutions that include batch and streaming capabilities.
- 5+ years of strategic/management consulting experience, including pre-sales activities (RFP responses, proposal development, SOW development, etc.).
- 3+ years of subject matter expertise in one of the following industries is required: Retail/CPG, Financial Services, Insurance, Healthcare, Pharma, Travel/Hospitality, or another industry.
- Expert knowledge of key Azure services (VM, Storage, Azure SQL, Synapse, Cosmos DB, HDInsight, Azure Functions, containers, AKS, ADLS, Event Hubs, Azure Databricks, VNet, Redis Cache, Resource Manager, Azure Diagnostics/OMS, Traffic Manager, Azure CDN, Azure Notification Hubs, Azure Identity and Access Management, Infrastructure as Code, etc.).
- Experience leveraging the Azure Well-Architected Framework.
- Experience building CI/CD pipelines using Jenkins, Ansible, Python, and shell scripting, along with change management and production support activities.
- In-depth expertise in design, implementation, engineering, automation, DevOps, service operation, and service improvement initiatives.
- Experience migrating workloads from on-prem or other cloud providers to Azure using cloud migration tools such as ASR, CloudEndure, etc.
- Cloud strategy assessment and definition, including experience creating and delivering end-to-end roadmaps that address a business problem.
- Strong understanding of Unix operating systems and experience in a scripting language such as Python, shell, or Awk.
- Understanding of and experience administering web and application servers such as IIS, Tomcat, or Apache.
- Experience building, testing, automating, and optimizing data pipelines.
- Experience using emerging technologies (Snowflake, Databricks, Matillion, etc.).
- Strong understanding and prior use of SQL, and high proficiency in the workings of data technologies (Hadoop, Hive, Spark, Kafka, low-latency data stores, Airflow, etc.).
- Deep understanding of data testing techniques, and a proven record of driving sustainable change to the software development practice to improve quality and performance.
- Expertise in selecting context-appropriate data modeling techniques, including Kimball dimensional modeling, slowly changing dimensions, snowflake schemas, and others.
- Passion for software development and data, with strong skills in data extraction, transformation, and processing to optimize quantitative analyses across business functions.
- Familiarity with Scrum, DevOps, and DataOps methodologies, and supporting tools such as JIRA.
- Excellent oral and written communication skills.
- Strong presentation skills and the ability to communicate analytical and technical concepts with confidence and in an easy-to-understand fashion to technical and non-technical audiences.
- A Bachelor’s or Master’s degree in Computer Science or a relevant field is required.