You will help make sense of petabytes of data while building, supporting, and delivering state-of-the-art stream and batch data flows. The data teams sit at the core of Business, Analytics, and Data Science operations, facilitating decision-making by providing the right data, at the right time, to the right people or systems.
- You will design, develop, and model data structures to integrate and prepare data for reporting solutions and broader business intelligence applications.
- You will implement the tools and processes to handle performance, scale, availability, accuracy, and monitoring.
- You will identify performance and data challenges, suggesting code and architecture improvements.
- You will participate in a DevOps environment (you build it, you run it) by managing and operating real-time and batch data solutions at scale.
- You will develop and build scalable, high-performance data enrichment processes, applying the right development patterns and continuous integration/deployment practices.
- You will cooperate in an agile and dynamic environment to lead change and keep us current with the latest technologies.
- You will break down requirements, simplify architectures, and collaborate with other business teams to deliver intelligent data solutions in line with defined objectives and results.
- You will collaborate on a cross-functional team composed of BI Developers, Software Developers, Data Engineers, Technical Specialists, and Product Managers.
- You will seek opportunities to bring advanced analytical techniques and solutions to new data products.
Qualifications & Experience:
- You have a deep understanding of SQL in all its facets, and experience working with traditional (e.g. Teradata) or cloud-based data warehouses (e.g. Redshift, Snowflake).
- You have experience producing tested, secure, resilient, and well-documented applications.
- You have a proven background in a variety of data technologies, such as Teradata, Hadoop, Spark, HBase, Hive, Presto, and/or ETL frameworks.
- You develop elegant SQL queries while breaking down complexity and identifying modular parts.
- You know domain and business event patterns, and streaming pipelines such as Kafka or Kinesis.
- You know and use the best-fit data technologies and approaches to address performance, scalability, and governance challenges.
- You have knowledge of dimensional modeling concepts and other efficient data representations for optimizing SQL queries.
- You understand the AWS cloud ecosystem, including services such as EMR, Lambda, S3, EC2, CloudFormation, and VPC.
- You have excellent interpersonal skills and verbal and written communication skills when working with both business and technical teams.
- You have knowledge of cloud infrastructure automation tools such as CloudFormation and Docker.
Vacancy Type: Full Time
Job Functions: Information Technology, Engineering
Job Location: Chicago, IL, US
Application Deadline: N/A