Data Engineer
- Job ID: 59902
As a Data Engineer for the WORKS team, you will play a critical role in the modernization and re-platforming of the WORKS Data Warehouse. This position is part of a strategic initiative to transition from legacy data structures (Oracle Database) to a modern Data Mart on Google Cloud infrastructure (BigQuery and Cloud Run). You will not just be "moving data"; you will be a key architect in reducing years of technical debt by refactoring complex table structures, eliminating redundancies, and building our first formal ETL processes to ensure a high-performance, cost-effective data environment.
Responsibilities:
- Reverse Engineering & Logic Extraction: Analyze legacy Oracle data structures and SAP Business Objects Universes to identify and document the "hidden" ETL logic, complex joins, and calculated measures currently used for business reporting.
- Data Mart Design: Design and implement a clean, high-performance Data Mart in BigQuery (Star/Snowflake schema) that eliminates the redundancies and "spaghetti joins" of the legacy environment.
- Source-to-Target Mapping: Map upstream interfaces and Oracle data structures to the new BigQuery environment, ensuring data integrity and consistency during the transition.
- ETL Development: Design, develop, and implement high-quality ETL pipelines that automate data movement and transformation, replacing manual or non-existent processes.
- Technical Debt Reduction: Actively simplify the data architecture by consolidating redundant tables and optimizing query paths for cloud-native performance.
- Data Models Enablement: Build and maintain the core data layer (tables, views, and curated datasets) that will enable other Software Engineers to successfully transition reports from SAP Business Objects to Power BI.
- Documentation: Create clear technical documentation of the new Data Mart schema and the logic used to transform legacy data.
- Collaboration: Work closely with a global team of developers (located in Mexico, the US, and India) and business stakeholders to simplify the WORKS data architecture and ensure data availability and integrity for reporting and analytics.
- Ensure on-time delivery using Agile practices such as pair programming and automated testing for data pipelines.
- Conduct code and design reviews to ensure adherence to data standards, patterns, and architecture principles.
- Perform and participate in load/volume testing to ensure the new platform can handle global scale.
Required Skills and Experience:
- Bachelor’s Degree in Computer Science, Computer Engineering or a related field.
- English proficiency (written and verbal).
- Data Engineering or Database Development experience (minimum 3-5 years).
- Excellent communication skills, with the ability to articulate complex technical concepts to global stakeholders.
- Advanced proficiency in SQL and PL/SQL, with a strong ability to read and interpret complex legacy stored procedures and view logic.
- Proven experience in Data Modeling, specifically designing Data Marts and optimizing schemas for analytical workloads.
- Familiarity with SAP Business Objects (Universes/Web Intelligence) with the ability to navigate and extract transformation logic from the semantic layer.
- Experience with Google Cloud Platform (GCP) and BigQuery.
- Experience with ETL tools (such as Cloud Data Fusion, dbt, or Dataform).
- Experience in relational database management systems (RDBMS) like Oracle or PostgreSQL.
- Willingness to challenge the "status quo" to eliminate redundant table structures and unnecessary joins.
- Ability to work in a dynamic environment, handling multiple assignments and prioritizing work appropriately.
- Strong collaboration skills and ability to work across regions (US, Mexico, India).
- Attention to detail and a strong "detective" mindset for solving data redundancy problems.
- Experience working with Agile methodologies (Scrum, Kanban).
Skills/Experience Preferred:
- Domain knowledge of the vehicle order-to-delivery process.
- Specific experience with BigQuery SQL and performance tuning (partitioning/clustering).
- Experience migrating from SAP Business Objects to Power BI (data layer focus).
- Knowledge of Unix shell scripting.
- Experience with GitHub or other version control tools.
- Knowledge of Unix/AutoSys to understand legacy job scheduling.
Built on one bold idea and the passion to define sustainable transportation for generations to come, Ford is a story about people with a vision that’s still being written.
What We Do
Ford’s culture fuels the kind of momentum where ideas flow, progress is unstoppable, and our people keep redefining what it means to innovate.
Our People and Culture
At Ford, your work matters, your life matters and we’re here to back the whole you—from growth to well-being—so you show up ready to realize your full potential.