Job Description
This is a fully remote job; the offer is available from: United States
Yelp engineering culture is driven by our values: we're a cooperative team that values individual authenticity and encourages creative solutions to problems. All new engineers deploy working code their first week, and we strive to broaden individual impact with support from managers, mentors, and teams. At the end of the day… we're all about helping our users, growing as engineers, and having fun in a collaborative environment. Our mission is to connect people with great local businesses, and our analytics engineers play a pivotal role in transforming raw data into actionable insights that drive our business forward.
In this role, you’ll design and build robust data pipelines to manage large volumes of data, ensuring seamless integration into our data warehouse. You will also develop and maintain our data models, data warehouse, and semantic layer to support analytics and lead data-driven solutions. Ensuring data integrity will be key, as you implement quality checks and processes across the organization. Additionally, you’ll continuously integrate cutting-edge technologies to enhance our data processing and wrangling capabilities.
You’ll collaborate closely with teams such as finance, marketing, people, business operations, and product to shape the data and analytics roadmap and develop scalable solutions. If you have a knack for ownership, a curiosity to understand and solve complex problems, and a passion for deriving insights from complex datasets, we’re looking for you!
This opportunity is fully remote and does not require you to be located in any particular area in Canada. We welcome applicants from throughout Canada. We'd love to have you apply, even if you don't feel you meet every single requirement in this posting. At Yelp, we're looking for great people, not just those who simply check off all the boxes.
What you'll do:
• Collaborate with business and engineering teams, product managers, and analysts to understand the organization's data needs.
• Develop and maintain the content and structure within data storage solutions, including data lakes and data warehouses, ensuring data is accurately organized, enriched, and ready for analysis.
• Architect and implement scalable, flexible data models and data processes.
• Design, build, and launch efficient, reliable data pipelines to move and transform data.
• Address complex data challenges and build analytical solutions for decision-making.
• Balance scalability, latency, and availability based on requirements.
• Build analytical solutions and data products that put data to work for decision-making and analytics.
• Mentor other engineers and share your skills and knowledge.
• Participate in the on-call rotation as needed to support the team.
What we're looking for:
• Expertise in architecting and implementing efficient data models and data warehousing solutions.
• A strong understanding of the data lifecycle and the ability to iteratively improve data processes and maturity.
• High proficiency in SQL and Python scripting, extracting large datasets, and designing ETL flows.
• Hands-on experience with orchestration tools such as Airflow or similar.
• A deep understanding of the systems you've worked on.
• Experience with more than one programming language.
• Experience with the following technologies: dbt, Python, AWS Redshift, AWS Athena, AWS S3, and data observability, data cataloging, and data integration tools.
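To give a concrete flavor of the pipeline and orchestration work described above, here is a minimal, illustrative sketch of an Airflow DAG that schedules a daily dbt build against the warehouse. The DAG id, project directory, and target name are hypothetical examples, not a description of Yelp's actual setup.

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG orchestrating a daily dbt build.
# The dag_id, project directory, and target name below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_analytics_build",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Transform raw warehouse tables into analytics-ready models and run dbt tests.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/analytics --target prod",
    )

    # Regenerate dbt docs so catalog and lineage metadata stay current for downstream users.
    dbt_docs = BashOperator(
        task_id="dbt_docs_generate",
        bash_command="dbt docs generate --project-dir /opt/dbt/analytics --target prod",
    )

    dbt_build >> dbt_docs
```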