
Senior Machine Learning Engineer - AWS ML Pipelines

Experion Technologies (I) Pvt Ltd

Trivandrum


Brief Description

Role: Senior Machine Learning Engineer - AWS ML Pipelines
Notice period: Immediate joiners
Location: Remote

Job Overview

We are seeking a highly skilled and independent Senior Machine Learning Engineer (Contractor) to design, develop, and deploy advanced ML pipelines in an AWS environment. In this role, you will build cutting-edge solutions that automate entity matching for master data management, implement fraud detection systems, handle transaction matching, and integrate GenAI capabilities. The ideal candidate will have extensive hands-on experience with AWS services such as SageMaker, Bedrock, Lambda, Step Functions, and S3, as well as strong expertise in CI/CD practices to ensure a robust and scalable solution.

Key Responsibilities

ML Pipeline Design & Development:
- Architect, develop, and maintain end-to-end ML pipelines focused on entity matching, fraud detection, and transaction matching.
- Integrate generative AI (GenAI) solutions using AWS Bedrock to enhance data processing and decision-making (see the sketch after this group).
- Collaborate with cross-functional teams to refine business requirements and develop data-driven solutions tailored to master data management needs.
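
To make the Bedrock-based entity matching concrete, here is a minimal illustrative sketch of how a candidate might call a Bedrock-hosted model to decide whether two master-data records refer to the same entity. The model ID, prompt, record fields, and region are assumptions chosen for illustration, not details taken from this role.

```python
# Hypothetical sketch: using Amazon Bedrock to judge whether two customer
# records refer to the same entity. Model ID, prompt, and fields are assumed.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

def records_match(record_a: dict, record_b: dict) -> bool:
    """Ask a Bedrock-hosted model whether two records describe the same entity."""
    prompt = (
        "Do these two records refer to the same real-world entity? "
        "Answer with only YES or NO.\n"
        f"Record A: {json.dumps(record_a)}\n"
        f"Record B: {json.dumps(record_b)}"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 10,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    answer = json.loads(response["body"].read())["content"][0]["text"]
    return answer.strip().upper().startswith("YES")

if __name__ == "__main__":
    a = {"name": "Acme Corp.", "city": "Trivandrum"}
    b = {"name": "ACME Corporation", "city": "Thiruvananthapuram"}
    print(records_match(a, b))
```

In practice such a GenAI check would typically sit behind a cheaper blocking or fuzzy-matching stage so only ambiguous candidate pairs reach the model.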

AWS Ecosystem Expertise:
- Utilize AWS SageMaker for model training, deployment, and continuous improvement.
- Leverage AWS Lambda and Step Functions to orchestrate serverless workflows for data ingestion, preprocessing, and real-time processing (a sketch follows this group).
- Manage data storage, retrieval, and scalability concerns using AWS S3.
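
As one concrete reading of the serverless orchestration above, the sketch below shows a Lambda handler that reacts to a new object in S3 and starts a Step Functions execution; preprocessing steps and a SageMaker training or inference stage would live in the state machine itself, which Step Functions supports natively. The environment variable name, event shape, and bucket layout are assumptions.

```python
# Hypothetical sketch: Lambda handler that kicks off a Step Functions workflow
# whenever a new object lands in S3. ARNs and bucket layout are assumed.
import json
import os
import urllib.parse

import boto3

sfn = boto3.client("stepfunctions")

# Assumed environment variable pointing at the ingestion state machine.
STATE_MACHINE_ARN = os.environ["INGESTION_STATE_MACHINE_ARN"]

def handler(event, context):
    """Triggered by an S3 put event; starts one workflow execution per object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```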

CI/CD Implementation:
- Develop and integrate automated CI/CD pipelines (using tools such as GitLab) to streamline model testing, deployment, and version control.
- Ensure rapid iteration and robust deployment practices to maintain high availability and performance of ML solutions (see the smoke-test sketch below).
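
One common way a GitLab CI job can gate a model deployment is with a small smoke test run after the deploy stage, as in the hypothetical sketch below. The endpoint name, payload, and response schema are assumptions; the real pipeline stages would depend on the project.

```python
# Hypothetical sketch: a smoke test a CI job could run against a freshly
# deployed SageMaker endpoint before promoting it. Endpoint name, payload,
# and response format are assumptions.
import json
import sys

import boto3

runtime = boto3.client("sagemaker-runtime")

def smoke_test(endpoint_name: str) -> bool:
    """Send one known-good payload and check the endpoint answers sanely."""
    payload = {"features": [0.1, 0.2, 0.3]}  # assumed input schema
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps(payload),
    )
    prediction = json.loads(response["Body"].read())
    return "prediction" in prediction  # assumed response schema

if __name__ == "__main__":
    ok = smoke_test("fraud-detection-staging")  # hypothetical endpoint name
    sys.exit(0 if ok else 1)  # non-zero exit fails the CI job
```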

Data Security & Compliance:
- Implement security best practices to safeguard sensitive data, ensuring compliance with organizational and regulatory requirements.
- Incorporate monitoring and alerting mechanisms to maintain the integrity and performance of deployed ML models (an alerting sketch follows this group).
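
For the monitoring and alerting item, a typical pattern is a CloudWatch alarm on a SageMaker endpoint metric that notifies an SNS topic. The sketch below is illustrative only; the metric choice, thresholds, names, and ARNs are placeholders.

```python
# Hypothetical sketch: CloudWatch alarm on a SageMaker endpoint's error rate,
# notifying an SNS topic. Metric choice, thresholds, and ARNs are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="fraud-model-5xx-errors",          # hypothetical name
    Namespace="AWS/SageMaker",
    MetricName="Invocation5XXErrors",
    Dimensions=[
        {"Name": "EndpointName", "Value": "fraud-detection-prod"},  # assumed
        {"Name": "VariantName", "Value": "AllTraffic"},
    ],
    Statistic="Sum",
    Period=300,                                  # 5-minute windows
    EvaluationPeriods=2,
    Threshold=5,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ml-alerts"],  # placeholder
)
```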

Collaboration & Documentation:
- Work closely with business stakeholders, data engineers, and data scientists to ensure solutions align with evolving business needs.
- Document all technical designs, workflows, and deployment processes to support ongoing maintenance and future enhancements.
- Provide regular progress updates and adapt to changing priorities or business requirements in a dynamic environment.

Required Qualifications

Technical Expertise:
- 5+ years of professional experience in developing and deploying ML models and pipelines.
- Proven expertise in AWS services including SageMaker, Bedrock, Lambda, Step Functions, and S3.
- Strong proficiency in Python and/or PySpark for data manipulation, model development, and pipeline implementation (see the PySpark sketch after this list).
- Demonstrated experience with CI/CD tools and methodologies, preferably with GitLab or similar version control systems.
- Practical experience in building solutions for entity matching, fraud detection, and transaction matching within a master data management context.
- Familiarity with generative AI models and their application within data processing workflows.
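
The kind of PySpark data manipulation implied by these requirements might look like the sketch below, which pairs ledger transactions with bank-statement lines on amount and a date window. The paths, column names, and matching rule are assumptions chosen purely to illustrate the skill.

```python
# Hypothetical sketch: PySpark transaction matching by amount and date window.
# Table locations, column names, and the tolerance rule are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transaction-matching-sketch").getOrCreate()

ledger = spark.read.parquet("s3://example-bucket/ledger/")           # assumed path
statements = spark.read.parquet("s3://example-bucket/statements/")   # assumed path

# Candidate matches: same amount, statement date within 3 days of ledger date.
matches = (
    ledger.alias("l")
    .join(
        statements.alias("s"),
        (F.col("l.amount") == F.col("s.amount"))
        & (F.abs(F.datediff(F.col("l.txn_date"), F.col("s.txn_date"))) <= 3),
        "inner",
    )
    .select(
        F.col("l.txn_id"),
        F.col("s.statement_id"),
        F.col("l.amount"),
        F.col("l.txn_date").alias("ledger_date"),
        F.col("s.txn_date").alias("statement_date"),
    )
)

matches.write.mode("overwrite").parquet("s3://example-bucket/matched/")  # assumed
```

In a real pipeline the join keys and tolerance would come from business rules, and unmatched rows would typically be routed to a review or exception queue.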

Analytical & Problem-Solving Skills:
- Ability to transform complex business requirements into scalable technical solutions.
- Strong data analysis capabilities with a track record of developing models that provide actionable insights.

Communication & Collaboration:
- Excellent verbal and written communication skills.
- Demonstrated ability to work independently as a contractor while effectively collaborating with remote teams.
- Proven record of quickly adapting to new technologies and agile work environments.

Preferred Qualifications

- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Experience with additional AWS services such as Kinesis, Firehose, and SQS.
- Prior experience in a consulting or contracting role, demonstrating the ability to manage deliverables under tight deadlines.
- Experience within industries where data security and compliance are critical.
