Qualified Digital & IT Professionals

Data Scientist (Remote)

Reference: #40423

Location: EU


Plants of Tomorrow (PoT) is a strategic program from our client, a Swiss multinational, with the goal of empowering their business to increase productivity, reduce costs and gain insights for new innovation by using Artificial Intelligence and Machine Learning on site and in real time.

Best of all, it enables their plants to leverage their existing systems to host both plant-specific applications and PoT's new range of applications.
As part of the PoT MLOps team, you will have the opportunity to manage the complex challenges of scale that are unique to PoT, using your expertise in coding, algorithms, complexity analysis and large-scale system design while keeping a results-oriented approach.

The PoT MLOps culture of diversity, intellectual curiosity, problem solving, openness, and innovation oriented towards solving business problems is key to its success. Our team brings together people with a wide variety of backgrounds, experiences and perspectives.

We encourage them to collaborate, think big and take risks in a blame-free environment.
We promote self-direction to work on meaningful projects, while we also strive to create an environment that provides the support and mentorship needed to learn and grow.

Key Responsibilities:

  • Autonomously drive multiple developments and/or deployments across various products to meet the organization's objectives.
  • Work with business stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
  • Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
  • Assess the effectiveness and accuracy of new data sources and data gathering techniques.
  • Develop, deploy and maintain custom data models and algorithms to apply to data sets.
  • Use predictive modeling to improve and optimize manufacturing processes, product quality and other business processes.
  • Develop testing frameworks and test model quality.
  • Coordinate with different functional teams to implement models and monitor outcomes.
  • Develop processes and tools to monitor and analyze model performance and data accuracy.

Key Values:

  • Self-driven, autonomous, results-oriented person
  • Enjoys solving business and technical challenges
  • Positive and joyful attitude, including under stress
  • Analytical and practical mindset
  • Curiosity to explore, to learn new things and to challenge existing understandings
  • Design solutions considering the context, the end result and all the intermediate elements
  • Build solutions to be reliable, secure, sustainable and performant while remaining pragmatic in achieving the intermediate objectives
  • Courage to take risks, openness to admit errors, and the ability to move forward by learning from them
  • Perseverance in the face of setbacks

Key Skills:

  • Strong problem-solving skills with an emphasis on product development.
  • Knowledge of a variety of machine learning techniques (time series, regression, classification, clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
  • Knowledge of advanced statistical techniques and concepts (forecasting, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
  • Excellent written and verbal communication skills for coordinating across teams.
  • A drive to learn and master new technologies and techniques.
  • We’re looking for someone with 7-10 years of experience manipulating data sets and building statistical and machine learning models, who has a Master’s or PhD in Statistics, Mathematics, Computer Science or another quantitative field, and who is familiar with the following software/tools:
  • Expertise in coding, algorithms, and proficiency in any of these programming languages: Python, R, Scala, C++.
  • Experience querying databases and using statistical computer languages to manipulate data and draw insights from large data sets: SQL, Python, etc.
  • Experience using major cloud providers (for machine learning training / MLOps pipelines) like GCP, AWS, Azure, etc.
  • Experience using AutoML tools, such as: DataRobot, Vertex AI, etc.
  • Experience working in an MLOps setup, deploying and scaling multiple products.
  • Adequate expertise and experience in delivering solutions to production via Continuous Integration/Continuous Delivery (CI/CD).
  • Experience creating and using advanced machine learning algorithms and statistics: time series, regression, simulation, scenario analysis, modeling, clustering, anomaly detection, decision trees, neural networks, etc.
  • Experience visualizing/presenting data for stakeholders using: Google DataStudio, Tableau, Qlik, Plotly/ggplot/Matplotlib.
  • Advanced/bilingual English is essential; the interviews will be conducted in this language. Other languages such as French or German are valuable.

Nice to have:

  • Experience with developing and deploying machine learning models for the heavy/process manufacturing industries like cement, steel, paper, oil, mining…
  • Experience with distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, MySQL, etc.

We offer:

  • A professional career where you set the limits; we will give you the room and the encouragement to grow.
  • Gross remuneration to be negotiated based on the expertise provided; we will not rule out any professional if their assessment is justified.
  • 100% remote work and other social benefits that we will explain to you if you become our candidate.


APPLY FOR THIS OFFER