BPM – where caring and community are in our company DNA; we are always striving to be our best selves; and we’re compelled to ask the questions that lead to innovation. As a Data Engineer, you will help drive BPM’s data-driven culture by building scalable, secure, and high-quality data pipelines that support insights and decision-making across the firm. You’ll play an integral role in delivering, maintaining, and optimizing our data infrastructure in a cutting-edge Azure environment.
Working with BPM means using your experiences, broadening your skills, and reaching your full potential in work and life—while also making a positive difference for your clients, colleagues, and communities. Our shared entrepreneurial spirit drives us to see and do things differently. Our passion for people makes BPM a place where everyone feels welcome, valued, and part of something bigger. Because People Matter.
What you get:
· Total rewards package: from flexible work arrangements to personalized benefit structures and financial compensation options that give you choice and flexibility
· Well-being resources: interactive wellness platform and incentives, employee assistance program and mental health resources, and Colleague Resource Groups (CRGs)
· Balance & flexibility: 14 Firm Holidays (including 2 floating), Flex PTO, paid family leave, winter break, summer hours, and remote work options
· Professional development opportunities: a learning culture with CPA exam resources and bonuses, tuition reimbursement, a coach program, and workshops through BPM University
About BPM:
BPM LLP is one of the 40 largest public accounting and advisory firms in the United States with a global team of over 1,200 colleagues. A Certified B Corp, the Firm works with clients in the agribusiness, consumer business, financial and professional services, life science, nonprofit, wine and craft beverage, real estate and technology industries. BPM’s diverse perspectives, expansive expertise, and progressive solutions come together to create exceptional experiences for individuals and businesses around the world. To learn more, visit our website.
For this position, you will have:
· Undergraduate degree in data or computer science, IT, statistics, or mathematics preferred
· Minimum of 2 years of experience as a Data Engineer in a Databricks environment
· Specific expertise in Databricks Delta Lake, notebooks, and clusters
· Data Vault Modeling experience
· Knowledge of big data technologies such as Hadoop, Spark, and Kafka
· Strong understanding of relational data structures, theories, principles, and practices
· Proficiency in Python and SQL programming languages
· Strong understanding of data modeling, algorithms, and data transformation strategies for data science consumption
· Experience with performance metric monitoring and improvement
· Experience analyzing and specifying technical and business requirements
· Ability to create consistent requirements documentation in both technical and user-friendly language
· Excellent critical thinking skills and understanding of relationships between data and business intelligence
· Strong communication skills with technical and non-technical audiences
· Ability to work remotely and collaborate with a geographically distributed team
In this position, you will:
· Support BPM's culture of data, representing the firm's approach to data management, stewardship, lineage, architecture, collection, storage, and utilization for delivering analytic results
· Deliver, maintain, and build trusted business relationships that contribute to BPM's data culture
· Stay current with the latest technologies and methodologies with a pragmatic mindset
· Participate in technology roadmaps and maintain data pipeline and tool documentation
Data Pipeline Development
· Build, maintain, and govern data pipelines in an Azure environment with best-of-breed technology
· Develop pipelines to the data Lakehouse, ensuring scalability, reliability, security, and usability for insights and decision-making
· Develop, deploy, and support high-quality, fault-tolerant data pipelines
· Build infrastructure for optimal extraction, loading, and transformation of data from various sources
· Support architecture for observing, cataloging, and governing data
ETL / ELT
· Build and optimize ELT functionality using Python, dbt, and SQL
· Monitor and troubleshoot ELT processes to ensure accuracy and reliability
· Implement development best practices, including technical design reviews, test plans, peer code reviews, and documentation
Data Governance & Security
· Implement data governance and access controls to ensure data security and compliance
· Collaborate with security to implement encryption, authentication, and authorization mechanisms
· Monitor and audit data access to maintain data privacy and integrity
Collaboration & Communication
· Collaborate with cross-functional stakeholders and IT to deliver meaningful outcomes
· Profile data sources and understand data relationships to support analytics