Data Engineer - Mexicali, México - PACCAR

Verified company
Mexicali, México

2 weeks ago

Posted by: Rodrigo Fernández, Talent Recruiter for beBee


Company Information

PACCAR is a Fortune 500 company established in 1905 and a global leader in the commercial vehicle, financial, and customer service fields. PACCAR is a global technology leader in the design, manufacture, and customer support of high-quality light-, medium-, and heavy-duty trucks under the internationally recognized Kenworth, Peterbilt, and DAF nameplates. PACCAR also designs and manufactures advanced diesel engines, provides financial services and information technology, and distributes truck parts related to its principal business.

Whether you want to design the transportation technology of tomorrow, support the staff functions of a dynamic, international leader, or build our excellent products and services — you can develop the career you desire with PACCAR.


Kenworth Truck Company
Kenworth Truck Company is the manufacturer of The World's Best heavy- and medium-duty trucks. Kenworth is an industry leader in providing fuel-saving technology solutions that help increase fuel efficiency and reduce emissions.

The company's dedication to the green fleet includes aerodynamic trucks, compressed and liquefied natural gas trucks, and medium-duty diesel-electric hybrids.

Kenworth is the first truck manufacturer to receive the Environmental Protection Agency's Clean Air Excellence award in recognition of its environmentally friendly products.

Requisition Summary

Job Functions / Responsibilities

  • Design, implement, and support data warehouse infrastructure using Azure Data Factory, SQL Server, and multiple other RDBMS engines.
  • Create ELT/ETL procedures that take data from various operational systems and integrate it into a dimensional or star-schema data model for analytics and reporting.
  • Support Data Analysts and Research Scientists in analyzing usage data to derive new insights and fuel customer success.
  • Use business intelligence and visualization software (e.g., Power BI, Tableau Server, Jupyter Notebooks) to empower non-technical internal customers to drive their own analytics and reporting.
  • Manage data models and related artifacts in a source control repository such as TFS or GitHub.
  • Provide ongoing support for Agile projects.
  • Perform expert-level data development and design work in cloud environments, including logical data topology design and cloud data architecture analysis and design, with integration of third-party data sources across multiple cloud platforms.
  • Ensure security is integrated into all data solutions to meet compliance standards.
  • Work with business customers to understand business requirements and implement solutions.
  • Performance-tune databases and ETL transformations.
  • Improve foundational data procedures, guidelines, and standards.
  • Comply with the change control process.
  • Maintain audit compliance.
  • Support scheduled after-hours maintenance.
  • Be available on call for 24/7 support.
  • Perform additional data- and database-related tasks.
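The star-schema responsibility above can be illustrated with a minimal sketch. This uses Python's built-in sqlite3 in place of the SQL Server / Azure Data Factory stack named in the posting, and all table and column names (dim_customer, fact_sales, etc.) are invented for illustration, not taken from the role:

```python
import sqlite3

# Minimal star-schema load: one dimension table plus one fact table.
# sqlite3 stands in for the RDBMS; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("""CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL)""")

# "Operational" source rows as they might arrive from an OLTP system
source_rows = [("Acme", 120.0), ("Acme", 75.5), ("Border Freight", 310.0)]

for name, amount in source_rows:
    # Upsert the dimension row, then resolve its surrogate key
    cur.execute("INSERT OR IGNORE INTO dim_customer (name) VALUES (?)", (name,))
    cur.execute("SELECT customer_key FROM dim_customer WHERE name = ?", (name,))
    key = cur.fetchone()[0]
    cur.execute("INSERT INTO fact_sales VALUES (?, ?)", (key, amount))

# Analytics query: aggregate the fact table joined back to the dimension
cur.execute("""SELECT d.name, SUM(f.amount)
               FROM fact_sales f JOIN dim_customer d USING (customer_key)
               GROUP BY d.name ORDER BY d.name""")
totals = cur.fetchall()
print(totals)  # → [('Acme', 195.5), ('Border Freight', 310.0)]
```

The surrogate-key lookup is the essential step: fact rows store a compact integer key rather than repeating customer attributes, which is what lets reporting queries slice one fact table by many dimensions.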

Skills Required:

  • Bachelor's degree in Computer Science or related field.
  • 2-5 years' relevant experience in data modeling and data architecture.
  • Expertise with SQL and relational database systems.
  • Knowledge of data warehousing concepts.
  • Experience with data mining, ETL, and using databases in a business environment with large-scale, complex datasets.
  • Experience generating scripts for physical database deployment.
  • Experience with Python, C#, and/or other programming languages.
  • Familiarity with Jira, DevOps, or other project tracking tools.
  • Familiarity with GitHub or DevOps continuous delivery pipelines.
  • Proficiency with ETL tools and techniques such as SSIS, Azure Data Factory, and AWS Glue.
  • Knowledge of more than one database platform, such as Azure, SQL Server, Snowflake, Oracle, or Teradata.
  • Knowledge of data normalization and denormalization techniques.
  • Excellent interpersonal and organizational skills, including active listening, problem solving under pressure, and facilitation.
  • Strong troubleshooting and problem-solving skills.
  • Machine learning model experience a plus.


  • MCSE or related SQL certification preferred.
