Data Engineer Intern

Hong Kong
Internship
Product and Technology
Student (College)

About the Team:

The Product & Technology department is composed of Product Managers, Engineers and Designers. Ownership, meritocracy and collaboration are at our core. We are not afraid to think differently, embrace new ideas and dream big. We empower ownership and share responsibility. We support each other to achieve and grow. Our goal is simple: to create products that delight our customers and readers.
 

Purpose of the position:

We are seeking a motivated Data Engineer Intern to contribute to our business strategy projects. In this role, you will report to the lead engineer and collaborate closely with our engineering and product squad within the organisation. This role supports pipeline development on an agile team, delivering key datasets that empower the product and go-to-market teams to grow our traffic and subscriber base.
 

In this role, you will:

  • Assist in the full data pipeline development lifecycle, including helping to migrate datasets across clouds, verify data integrity, and deploy and schedule pipelines in UAT and production.
  • Support the building and deployment of ETL pipelines in Airflow and data models in dbt.
  • Participate in code reviews to learn best practices, ensure code quality, and engage in knowledge sharing among team members.
  • Stay curious and up to date with technologies, trends, and best practices in applying AI to boost data engineering productivity and efficiency.
  • Assist across the application lifecycle, gaining exposure to development, testing, documentation, and ongoing support.


Skills and Experience that will lead to success:

  • Currently pursuing or recently graduated with a degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence or a related discipline.
  • Academic projects, coursework, or previous internship experience related to software development or data engineering.
  • Good command of Python and SQL for writing scripts, debugging data issues, and running automations.
  • Familiarity with or keen interest in learning data modeling and pipeline design (ETL, ELT, data warehousing).
  • Experience with Airflow and dbt is a plus.
  • Knowledge of Kubernetes, Docker, CI/CD pipelines, microservice architecture, and cloud concepts is an advantage.
  • Ability to solve problems through collaboration and a willingness to ask the right questions.
  • Good team player with a proactive willingness to learn.
 

Work location will be at our Causeway Bay office.

Our Privacy Notice aims to comply with all relevant data privacy and protection laws. You should read the Privacy Notice in full at corp.scmp.com/privacy-policy.
