Internship – Data Science Developer
Alteia provides tremendous career opportunities to professionals willing to work hard on meaningful challenges alongside a talented team.
By joining Alteia, you’ll participate in the transformation of key industry sectors that are increasingly relying on imagery and artificial intelligence to drive their businesses. You’ll have a unique chance to shape and implement your ideas as part of a leading, fast-growing, cutting-edge company! In addition, you will be surrounded by professionals who have an exceptional background and amazing stories.
We encourage out-of-the-box ideas and incentivize our teams to develop their creativity. As a result, Alteia can give you a unique opportunity to gain valuable and challenging experience in a fast-growing business with passionate, easy-going, enthusiastic people.
Technical excellence and perpetual innovation are at the heart of who we are.
The Alteia Platform is the cloud-based solution that enables enterprises to rapidly and flexibly access and prepare gigabytes of visual data (images, point clouds, videos, etc.) with prebuilt annotation/labeling tools. It allows our customers to build and manage AI models without writing code, using an intuitive user interface. They can then deploy applications within weeks, with customizable validation processes and continuous improvement workflows. From there, they can drive company-wide results by seamlessly publishing predictive insights to enterprise systems or custom business applications.
Within our Toulouse offices, you will join a team of data scientists specializing in deep learning, image processing, and geomatics, in charge of R&D on the data analysis tools deployed on the Alteia platform. The team's expertise focuses on analyzing data from varied sources: drone images, LIDAR point clouds, CAD models, etc.
Different themes can be considered depending on your profile:
- 3D reconstruction by deep learning: Your main task will be to design algorithms for 3D reconstruction of electrical infrastructure (pylons and lattice towers) from drone survey images. First, you will study the reconstruction models developed internally (which use a computer vision approach) to familiarize yourself with the problem. You will then implement and test different neural network architectures to improve performance.
- Machine Learning Operations and Tools: Your main task will be to contribute to the development of the framework used internally for machine learning model lifecycle management. First, you will familiarize yourself with the framework and the steps required to produce a deep learning model. Next, you will implement and test a procedure for tracking the different versions of the models and its interaction with our web platform.
- Object detection and LIDAR data analysis: Point clouds from LIDAR sensors and/or photogrammetric processing are generally very large and require robust, efficient algorithms to analyze. Your main task will be to implement algorithms that extract shapes and critical information from LIDAR data. Initially, you will use algorithms already developed internally to familiarize yourself with the issues. You will then work on improving them and extending their range of application.
- Processing of video streams with artificial intelligence: Alteia is interested in AI-based processing of video streams in industrial production settings. Successive processing stages are applied to videos in real time: deep learning (detection, classification) and time-series or statistical processing. As part of a multidisciplinary project team, supported by our experts in artificial intelligence, your tasks may include:
- Review of the state of the art and evaluation of processing techniques and tools.
- Dataset annotation, data analysis.
- Evaluation of training results and tuning of parameters to improve performance.
- Adaptation of tools and algorithms.
- Development or improvement of software components to integrate or deploy algorithms or data.
- Design of user-oriented visualization tools.
- Participation in the system architecture; suggesting algorithms, techniques, and tools.
You are: Committed. Rigorous. Autonomous. Persistent in the pursuit of success.
Qualifications and skills:
- Education: currently pursuing a degree in programming (M2 level or equivalent)
- Proficiency in Python (in particular scientific libraries: NumPy, Pandas, OpenCV, scikit-learn)
- Knowledge of Unix/Linux environments and Git.
- Willingness to learn and improve.
- Fluency in English.
Apply to this job