Job title: (CEN) GLOBAL DATA SCIENCE INTERN
San Pedro Garza García, Nuevo León, 66215
Project Name: Global Data Science Intern Pipeline
Vice Presidency: Digital Enablement
Department: Global Data Science
Region: CENTRAL
Intern Location (Office): Campus NEORIS
Eligible Semesters:
5th to 8th semester
Careers of Interest:
Data Science, Mathematics, Actuarial Science, Computer Engineering, Software Engineering, Industrial Engineering, Systems Engineering, Physics, Mechatronics, Artificial Intelligence Engineering
Schedule Availability:
25–30 hours per week
Duration:
6 months, with the option to renew
Tentative Start Date:
March 2026
Project Description
Project Objective
Actively contribute to the development of end-to-end analytical solutions for real business use cases within the Supply Chain domain. The intern will participate in the full data science lifecycle—from data exploration and modeling to production deployment and consumption of results—while receiving technical mentorship and guidance from the Global Data Science team.
Key Challenges for the Intern
- Understanding and translating real business problems into data-driven solutions.
- Working with complex, large-scale datasets while ensuring data quality, consistency, and reliability.
- Generating tangible and measurable business value through analytical solutions.
- Developing solutions that go beyond experimentation, with a focus on scalability, maintainability, and usability.
- Collaborating effectively within a multidisciplinary, global team and adapting to agile ways of working.
- Applying modern data science best practices, including reproducibility, documentation, and code quality.
- Learning and leveraging an enterprise-grade data science technology stack.
Main Responsibilities and Activities
- Process, clean, and validate data to ensure high-quality and reliable inputs for analysis and modeling.
- Perform exploratory data analysis (EDA) to generate insights and support decision-making through analytical outputs and visualizations.
- Formulate hypotheses, design analytical approaches, and apply statistical techniques to validate findings.
- Conduct feature engineering and data preparation to support predictive and prescriptive modeling initiatives.
- Develop, evaluate, and iterate on predictive or prescriptive models using machine learning and advanced AI techniques.
- Contribute to the full lifecycle of data science solutions, including documentation, validation, and preparation for production deployment.
- Apply data science and software engineering best practices, including reproducibility, version control, and clear technical documentation.
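To give a concrete flavor of the first responsibilities above, here is a minimal, standard-library Python sketch of a cleaning-and-validation step followed by a simple summary statistic. The record fields and validation rules are hypothetical illustrations, not drawn from the actual project data:

```python
import statistics

# Hypothetical raw shipment records; field names are illustrative only.
raw_records = [
    {"shipment_id": "S1", "weight_kg": "120.5"},
    {"shipment_id": "S2", "weight_kg": ""},      # missing value
    {"shipment_id": "S3", "weight_kg": "98.0"},
    {"shipment_id": "S3", "weight_kg": "98.0"},  # duplicate record
    {"shipment_id": "S4", "weight_kg": "-5"},    # invalid negative weight
]

def clean(records):
    """Drop duplicates, missing values, and non-positive weights."""
    seen, cleaned = set(), []
    for rec in records:
        key = rec["shipment_id"]
        if key in seen or not rec["weight_kg"]:
            continue
        weight = float(rec["weight_kg"])
        if weight <= 0:
            continue
        seen.add(key)
        cleaned.append({"shipment_id": key, "weight_kg": weight})
    return cleaned

cleaned = clean(raw_records)
weights = [r["weight_kg"] for r in cleaned]
summary = {"n": len(weights), "mean_weight_kg": statistics.mean(weights)}
# summary -> {"n": 2, "mean_weight_kg": 109.25}
```

In practice this kind of logic would be written against the team's actual data platform and validated against documented data-quality rules; the point here is only the shape of the work.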
Qualifications and Requirements
Required Skills and Competencies
- Strong analytical and problem-solving skills, with the ability to break down complex business problems into structured analytical tasks.
- Proficiency in Python, with the ability to write clean, reusable, and well-structured code for data analysis and modeling.
- Solid knowledge of SQL for querying and transforming relational data.
- Ability to communicate insights effectively through data storytelling, supported by clear analysis and visualizations.
- Strong collaboration skills and the ability to work effectively in a team-oriented, agile environment.
- High attention to detail and a structured, methodical approach to problem-solving.
- Proficiency in English for technical and professional communication within a global team.
- Familiarity with version control tools (e.g., Git).
Nice to Have
- Experience with Snowflake or other analytical data warehouses.
- Knowledge of web application development (REST APIs, HTML, and/or CSS).
- Knowledge or hands-on experience with simulation techniques (e.g., discrete-event simulation, Monte Carlo simulation).
- Knowledge or practical experience in optimization methods, including linear, integer, or nonlinear optimization, heuristics, or metaheuristics.
- Familiarity with optimization or simulation libraries/tools (e.g., OR-Tools, PuLP, Gurobi, CPLEX, SimPy, or similar).
- Experience with data visualization tools such as Power BI or Tableau.
- Hands-on exposure to AI or machine learning models beyond academic coursework (e.g., personal projects, competitions, or prior internships).
- Experience working with enterprise-grade data platforms or cloud-based analytical environments (e.g., Azure, AWS).
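As one small example of the simulation techniques listed above, here is a standard-library Monte Carlo sketch estimating total demand over a replenishment lead time, a common supply chain question. The demand model and every parameter value are illustrative assumptions, not project data:

```python
import random
import statistics

def simulate_lead_time_demand(daily_mean: float, daily_sd: float,
                              lead_time_days: int, n_runs: int = 10_000,
                              seed: int = 42) -> float:
    """Monte Carlo estimate of mean total demand during a lead time.

    Daily demand is modeled as normal, truncated at zero; this is a
    hypothetical model chosen only to illustrate the technique.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        total = sum(max(0.0, rng.gauss(daily_mean, daily_sd))
                    for _ in range(lead_time_days))
        totals.append(total)
    return statistics.mean(totals)

estimate = simulate_lead_time_demand(daily_mean=100, daily_sd=20,
                                     lead_time_days=7)
# With these parameters the estimate lands close to 700 (7 days x ~100 units/day).
```

Libraries such as SimPy or OR-Tools, mentioned above, cover far richer simulation and optimization models; this sketch only shows the basic sampling idea.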