
IT

Leading the way in the IT Recruitment sector with fantastic opportunities for you

Unrivalled IT Recruitment services for candidates and clients

We have led the IT recruitment sector for 20 years, and our broad IT expertise continues to grow. We work on exciting IT jobs, with new opportunities for our candidates every week.

We understand how important these roles are to a business, and that every candidate has different needs and IT career goals. The IT sector is our core recruitment strength, so we're confident we can help with your IT recruitment journey.

Latest IT Jobs


Senior Backend Engineer

London, Greater London, United Kingdom
Posted: 11/02/2026

Salary: Up to £400.00 per day
ID: 37113_BH



Senior Front-End Developer (React, GraphQL, Amazon Neptune)

Fully Remote (UK-based)

Initial 3-month contract

Up to £450 per day (Negotiable)

 

We're working with a major UK organisation building a next-generation knowledge platform that combines structured data, graph technology, and AI-enabled workflows.

This is a senior front-end role focused on developing high-performance interfaces that interact directly with a semantic knowledge graph platform.

 

Key Responsibilities

  • Build and maintain complex React/Next.js applications in a modern TypeScript codebase
  • Develop rich UI experiences powered by GraphQL APIs connected to Amazon Neptune
  • Create advanced Search & Discovery interfaces, combining keyword and AI/vector-based retrieval
  • Build dashboards supporting human-in-the-loop validation of AI-generated outputs
  • Ensure strict adherence to WCAG 2.2 AA accessibility and design system standards
  • Collaborate closely with backend engineers, data teams, and UX/service designers
  • Integrate front-end applications with AWS-hosted services (Lambda, API Gateway, ECS)

 

Core Skills Required

  • Strong front-end engineering experience with React or Next.js + TypeScript
  • Proven expertise consuming and delivering applications built on GraphQL APIs
  • Hands-on experience working with GraphQL backed by Amazon Neptune is essential
  • Strong state management skills (Redux, TanStack Query, or similar)
  • Deep practical accessibility experience (WCAG standards)
  • UK-based and able to meet security and governance requirements
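To give a flavour of the integration this role centres on, here is a minimal Python sketch of building and sending a GraphQL request to a Neptune-backed endpoint. The URL, query shape, and field names are illustrative assumptions, not details of the client's platform (and in the role itself this would be done from a React/TypeScript front end):

```python
import json
from urllib import request

# Hypothetical GraphQL endpoint fronting an Amazon Neptune graph --
# the URL, query, and field names below are illustrative only.
ENDPOINT = "https://api.example.org/graphql"

# A typical knowledge-graph query: fetch a concept and its related nodes.
QUERY = """
query Concept($id: ID!) {
  concept(id: $id) {
    label
    related { id label }
  }
}
"""

def build_payload(concept_id: str) -> bytes:
    """Serialise a GraphQL request body (query + variables) as JSON."""
    return json.dumps({"query": QUERY, "variables": {"id": concept_id}}).encode()

def fetch_concept(concept_id: str) -> dict:
    """POST the query to the endpoint and decode the JSON response."""
    req = request.Request(
        ENDPOINT,
        data=build_payload(concept_id),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The same payload shape is what a front-end GraphQL client (Apollo, TanStack Query, etc.) sends under the hood.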

 

Desirable Experience

  • Knowledge graph or semantic data awareness (RDF, JSON-LD, linked data concepts)
  • Experience building high-performance search UIs (OpenSearch, Elastic, vector search)
  • Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CDK)
  • Interest in AI-enabled workflows and validation tooling

 

Please note: GraphQL experience specifically with Amazon Neptune is an absolute minimum requirement. Applications without this will not be considered.


ML Engineer - Allergan

India - Chennai
Posted: 06/02/2026

Salary: Negotiable
ID: 37108_BH



Machine Learning Engineer - Contract (12 Months)

Location: Remote (India)
Contract Length: 12 months

We are looking for an experienced Machine Learning Engineer to join a global data and analytics programme, supporting the development of predictive automation solutions using large-scale commercial datasets. This role focuses on building, deploying, and optimising machine learning models on an enterprise data platform, with a strong emphasis on PySpark and distributed data processing.

This is a hands-on engineering role suited to someone who enjoys working with complex, multi-source data and translating advanced analytics into real-world commercial impact.

What you'll be doing:

  • Designing and implementing machine learning and deep learning models using PySpark on an enterprise data platform.

  • Solving complex analytical problems using large, distributed commercial datasets.

  • Building and maintaining end-to-end ML pipelines, from data ingestion through to deployment and monitoring.

  • Collaborating closely with commercial, analytics, and data engineering stakeholders to align technical solutions with business goals.

  • Identifying data drift, bias, and performance issues, and ensuring models generalise effectively in production.

  • Delivering actionable outputs such as segmentation, targeting, and optimisation recommendations.

  • Keeping up to date with advances in machine learning, big data, and automation technologies.

Key responsibilities:

  • Ingesting, cleaning, and transforming large-scale datasets using PySpark.

  • Developing, evaluating, documenting, and monitoring ML models in a production environment.

  • Designing and optimising ETL and data pipelines to support scalable ML operations.

  • Working with stakeholders to define requirements and translate business objectives into technical solutions.

  • Customising or extending ML libraries (e.g. PySpark MLlib) to maximise analytical value.
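The ingestion-to-monitoring flow described above can be sketched as a chain of stages. This is a minimal, framework-agnostic illustration in plain Python; in the role itself each stage would run as a PySpark job over distributed data, and all names and thresholds here are illustrative assumptions:

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative end-to-end pipeline: ingest -> clean -> train -> monitor.
# Plain Python stands in for what would be PySpark jobs in practice.

def ingest(raw_rows):
    """Parse raw records into (feature, label) pairs, dropping bad rows."""
    out = []
    for row in raw_rows:
        try:
            out.append((float(row["spend"]), int(row["converted"])))
        except (KeyError, ValueError):
            continue  # drop malformed records, as a cleaning step would
    return out

@dataclass
class ThresholdModel:
    """A toy classifier: predict 1 when the feature exceeds a threshold."""
    threshold: float

    def predict(self, x: float) -> int:
        return int(x > self.threshold)

def train(pairs):
    """Fit the threshold at the midpoint of the two class means."""
    pos = [x for x, y in pairs if y == 1]
    neg = [x for x, y in pairs if y == 0]
    return ThresholdModel(threshold=(mean(pos) + mean(neg)) / 2)

def monitor(train_pairs, live_xs, tolerance=0.5):
    """Flag data drift when the live feature mean shifts too far."""
    baseline = mean(x for x, _ in train_pairs)
    return abs(mean(live_xs) - baseline) / abs(baseline) > tolerance
```

In production, the same ingest/train/monitor boundaries map naturally onto PySpark DataFrame transformations, MLlib estimators, and scheduled drift checks.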

Skills & experience required:

  • Strong experience working with large-scale, distributed data sets.

  • Advanced knowledge of statistics, probability, algorithms, and machine learning concepts.

  • Hands-on experience with PySpark and big data processing.

  • Strong Python development skills.

  • Experience with enterprise data platforms and data modelling.

  • Excellent communication and collaboration skills.

  • Proactive mindset with a willingness to learn new tools and technologies.

Desirable experience:

  • Background in machine learning within pharmaceutical, life sciences, or commercial analytics environments.

  • Practical experience deploying ML solutions on enterprise data platforms.

  • Exposure to additional ML frameworks such as PyTorch or Keras.

  • Bachelor's or higher degree in computer science, mathematics, engineering, or a related quantitative field.


Tech Lead - Data & Python Engineering

India - Pune
Posted: 04/02/2026

Salary: Up to $150.00 per day
ID: 37105_BH



Tech Lead - Data & Python Engineering
Location: India | Fully Remote
Type: Contract | 6 - 12 months
Day Rate: Up to $150 Per Day (Negotiable)

We're looking for a hands-on Tech Lead to design, build, and run modern data platforms in a cloud-native environment. This role combines deep technical ownership with people leadership, guiding a small team while remaining actively involved in delivery.

You'll play a key role in shaping how data is ingested, transformed, governed, and served across the organisation, ensuring systems are reliable, secure, and scalable.

What you'll be doing
  • Leading a team of data engineers while staying hands-on with design and coding
  • Owning the technical direction and architecture of data pipelines and platforms
  • Building and maintaining production-grade data workflows using Python, SQL, Snowflake, and dbt
  • Designing cloud-based data solutions on AWS using serverless and event-driven patterns
  • Setting engineering standards around code quality, testing, CI/CD, and documentation
  • Ensuring data platforms are reliable, observable, and cost-efficient in production
  • Working closely with business and technical stakeholders to turn requirements into working solutions
  • Mentoring engineers and raising the overall engineering bar
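As a rough illustration of the serverless, event-driven pattern mentioned above, here is a hedged Python sketch of an AWS Lambda handler that reacts to S3 object-created events and queues warehouse load tasks. The bucket, key filter, target table, and event shape are illustrative assumptions, not details of the actual platform:

```python
import json

# Sketch of an event-driven ingestion step: a Lambda handler that turns
# S3 object-created events into load instructions for a downstream
# warehouse job (e.g. a Snowflake COPY). Names are illustrative only.

def handler(event, context=None):
    """Map each S3 record in the event to a warehouse load task."""
    tasks = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if not (bucket and key) or not key.endswith(".parquet"):
            continue  # ignore malformed events and non-data objects
        tasks.append({"source": f"s3://{bucket}/{key}", "target": "RAW.EVENTS"})
    return {
        "statusCode": 200,
        "body": json.dumps({"queued": len(tasks), "tasks": tasks}),
    }
```

Keeping the handler a pure function of its event makes it easy to unit-test locally, one of the engineering standards this role would be expected to set.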

What we're looking for
  • Strong experience in Python and SQL, with a software engineering mindset
  • Proven hands-on experience with Snowflake and modern data modelling practices
  • Solid background in data engineering (ETL/ELT, orchestration, data quality)
  • Practical experience building data systems on AWS
  • Experience defining architecture and making technical trade-offs
  • Confidence leading technical initiatives and supporting other engineers
  • Strong communication skills and comfort working with non-technical stakeholders

Nice to have
  • Experience with workflow orchestration tools (e.g. Airflow)
  • Exposure to APIs, streaming data, or real-time processing
  • Familiarity with regulated or compliance-heavy environments
  • Experience improving platform reliability, performance, or cloud costs

Why this role
  • High-impact position with real ownership and influence
  • A balance of leadership and hands-on technical work
  • Opportunity to shape long-term data architecture and standards
  • Collaborative environment with complex, real-world data challenges