Data Engineer Job

Date: Sep 7, 2023

Location: Mexicali, BCN, MX, CP 21385

Company: PACCAR

Company Information

PACCAR is a Fortune 500 company established in 1905 and a global leader in the commercial vehicle, financial, and customer service fields, with internationally recognized brands such as Kenworth, Peterbilt, and DAF. PACCAR is a global technology leader in the design, manufacture, and customer support of high-quality light-, medium-, and heavy-duty trucks under the Kenworth, Peterbilt, and DAF nameplates. PACCAR also designs and manufactures advanced diesel engines, provides financial services and information technology, and distributes truck parts related to its principal business.

Whether you want to design the transportation technology of tomorrow, support the staff functions of a dynamic, international leader, or build our excellent products and services — you can develop the career you desire with PACCAR. Get started!

Kenworth Truck Company

Kenworth Truck Company is the manufacturer of The World’s Best® heavy and medium duty trucks. Kenworth is an industry leader in providing fuel-saving technology solutions that help increase fuel efficiency and reduce emissions. The company’s dedication to the green fleet includes aerodynamic trucks, compressed and liquefied natural gas trucks, and medium duty diesel-electric hybrids. Kenworth is the first truck manufacturer to receive the Environmental Protection Agency’s Clean Air Excellence award in recognition of its environmentally friendly products.

Requisition Summary

We are looking for an experienced Data Engineer with a proven ability to integrate multiple heterogeneous data sources into efficient, flexible, and scalable data warehouse and reporting solutions. The ideal candidate is enthusiastic about learning new technologies and applying them to empower internal customers and scale the existing platform, demonstrates solid business and communication skills, and is able to work with Research Scientists and business owners across both technical and non-technical teams to define key business questions and then build the data sets that answer those questions. In this role, you will serve as the expert at designing, implementing, and operating stable, scalable, low-cost solutions that flow data from production systems into the data warehouse and into end-user-facing reporting applications. Above all, you should be excited about bringing large datasets together to answer business questions and drive data-driven decision making.

The Data Engineer will design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures for business intelligence analytics and deep learning, implementing them using best practices in data modeling, ETL/ELT processes, and OLAP technologies on platforms such as Azure Synapse, AWS Redshift, and Snowflake. He or she will write scalable queries and tune the performance of queries running over billions of rows of data. The person in this position should be analytical, with an extremely high level of customer focus and a passion for process improvement. The Data Engineer should be a motivated self-starter who can work independently in a fast-paced, ambiguous environment, with excellent business and communication skills for working with business owners to develop and define key business questions.
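
As a purely illustrative sketch of the kind of query scalability this role involves (the table, column, and index names below are hypothetical and do not refer to PACCAR systems), one common approach on SQL Server or Azure Synapse is to partition a large fact table by date and give it a clustered columnstore index, so that aggregate queries over billions of rows touch only the partitions and compressed column segments they need:

    -- Hypothetical example only: a date-partitioned fact table with a
    -- clustered columnstore index for large-scale analytic queries.
    CREATE PARTITION FUNCTION pf_event_date (DATE)
        AS RANGE RIGHT FOR VALUES ('2022-01-01', '2023-01-01');

    CREATE PARTITION SCHEME ps_event_date
        AS PARTITION pf_event_date ALL TO ([PRIMARY]);

    CREATE TABLE fact_vehicle_event (    -- hypothetical table name
        event_date DATE        NOT NULL,
        vin        VARCHAR(17) NOT NULL,
        sensor_id  INT         NOT NULL,
        reading    FLOAT       NOT NULL
    ) ON ps_event_date (event_date);

    CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_vehicle_event
        ON fact_vehicle_event;

    -- A reporting query that benefits from partition elimination (the WHERE
    -- clause) and batch-mode columnstore aggregation.
    SELECT sensor_id, AVG(reading) AS avg_reading
    FROM fact_vehicle_event
    WHERE event_date >= '2023-01-01'
    GROUP BY sensor_id;

Equivalent tuning on AWS Redshift or Snowflake relies on distribution and sort keys or clustering keys rather than explicit indexes.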

Job Functions / Responsibilities

  • Design, implement, and support data warehouse infrastructure using Azure Data Factory, SQL Server, and other RDBMS engines.
  • Create ELT/ETL procedures that take data from various operational systems and integrate it into a dimensional or star schema data model for analytics and reporting (a brief sketch follows this list).
  • Support Data Analysts and Research Scientists in analyzing usage data to derive new insights and fuel customer success.
  • Use business intelligence and visualization software (e.g., Power BI, Tableau Server, Jupyter Notebooks) to empower non-technical, internal customers to drive their own analytics and reporting.
  • Manage data models and related artifacts in a source control repository such as TFS or GitHub.
  • Provide ongoing support for Agile projects.
  • Perform expert-level data development and design work in cloud environments, including logical data topology design, cloud data architecture analysis and design, and integration with additional third-party data sources across multiple cloud platforms.
  • Ensure security is integrated into all data solutions to meet compliance standards.
  • Work with business customers to understand business requirements and implement solutions.
  • Tune the performance of databases and ETL transformations.
  • Improve foundational data procedures, guidelines, and standards.
  • Comply with the change control process.
  • Maintain audit compliance.
  • Support scheduled after-hours maintenance.
  • Be available on call for 24/7 support.
  • Perform additional data- and database-related tasks.
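
For illustration only, a minimal version of the dimensional/star-schema modeling mentioned above might look like the following T-SQL sketch; every table and column name here is hypothetical and does not refer to actual PACCAR or Kenworth systems.

    -- Star-schema sketch (hypothetical names): a fact table of measures keyed
    -- to surrogate keys in conformed dimension tables.
    CREATE TABLE dim_truck (
        truck_key INT IDENTITY(1,1) PRIMARY KEY,
        vin       VARCHAR(17) NOT NULL,
        model     VARCHAR(50),
        plant     VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,   -- e.g. 20230907
        calendar_date DATE NOT NULL
    );

    CREATE TABLE fact_warranty_claim (
        truck_key    INT NOT NULL REFERENCES dim_truck (truck_key),
        date_key     INT NOT NULL REFERENCES dim_date (date_key),
        claim_amount DECIMAL(12, 2),
        claim_count  INT
    );

    CREATE TABLE stg_claims (    -- hypothetical staging table filled by the extract step
        vin        VARCHAR(17),
        claim_date DATE,
        amount     DECIMAL(12, 2)
    );

    -- ELT step: load the fact table from staged operational data
    -- (dimension rows would be loaded before this step).
    INSERT INTO fact_warranty_claim (truck_key, date_key, claim_amount, claim_count)
    SELECT t.truck_key,
           CONVERT(INT, FORMAT(s.claim_date, 'yyyyMMdd')),
           s.amount,
           1
    FROM stg_claims s
    JOIN dim_truck t ON t.vin = s.vin;

Surrogate keys on the dimensions keep the fact table narrow and make it straightforward for Power BI or Tableau reports to slice the measures by truck or date attributes.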

Skills Required:

  • 2-5 years' relevant experience in data modeling and data architecture.
  • Expertise with SQL and relational database systems.
  • Knowledge of data warehousing concepts.
  • Experience with data mining, ETL, and the use of databases in a business environment with large-scale, complex datasets.
  • Experience generating scripts for physical database deployment.
  • Experience with Python, C#, and/or other programming languages.
  • Familiarity with Jira, Azure DevOps, or other project tracking tools.
  • Familiarity with GitHub or Azure DevOps continuous delivery pipelines.
  • Proficiency with ETL tools and techniques such as SSIS, Azure Data Factory, and AWS Glue.
  • Knowledge of more than one database platform, such as Azure Synapse, SQL Server, Snowflake, Oracle, and Teradata.
  • Knowledge of data normalization and denormalization techniques (a brief sketch follows this list).
  • Excellent interpersonal and organizational skills, including active listening, problem solving under pressure, and facilitation.
  • Strong troubleshooting and problem-solving skills.
  • Machine learning model experience is a plus.
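
As a brief, hypothetical illustration of that normalization/denormalization point (none of these tables refer to real PACCAR data), the same sales data can be kept normalized for transactional integrity and also copied into a denormalized flat table for reporting:

    -- Normalized (3NF-style) design: each customer attribute is stored once
    -- and referenced by key, which avoids update anomalies.
    CREATE TABLE customer (
        customer_id   INT PRIMARY KEY,
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    CREATE TABLE sales_order (
        order_id    INT PRIMARY KEY,
        customer_id INT NOT NULL REFERENCES customer (customer_id),
        order_date  DATE,
        amount      DECIMAL(12, 2)
    );

    -- Denormalized copy for reporting: customer attributes are repeated on
    -- every row so analysts can filter and aggregate without joins.
    CREATE TABLE rpt_sales_order_flat (
        order_id      INT PRIMARY KEY,
        order_date    DATE,
        amount        DECIMAL(12, 2),
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    INSERT INTO rpt_sales_order_flat (order_id, order_date, amount, customer_name, region)
    SELECT o.order_id, o.order_date, o.amount, c.customer_name, c.region
    FROM sales_order o
    JOIN customer c ON c.customer_id = o.customer_id;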

Education/Training:

  • Bachelor's degree in Computer Science or related field required.
  • MCSE or related SQL certification preferred.

