
GCP FinOps Engineer
Key responsibilities:
* Optimise large-scale data analytics workloads through partitioning, clustering, query rewrites, storage format improvements, and lifecycle policies.
* Tune containerised microservices by recalibrating CPU/memory requests, improving autoscaling efficiency, and restructuring workload placement on cost-efficient compute.
* Redesign workflow orchestration pipelines to enable parallel execution and increased concurrency, and to offload heavy tasks to lower-cost execution environments.
* Analyse distributed data processing pipelines to right-size worker types, adjust scaling thresholds, and adopt low-cost compute for batch workloads.
* Reduce log processing and storage overhead through log-level standardisation, routing rules, exclusion filters, and retention optimisation.
* Implement storage tiering strategies based on access patterns and enforce lifecycle rules to minimise cold data retention costs.
* Improve relational database performance through index tuning, connection optimisation, and instance right-sizing.
* Enhance horizontally scalable database performance via autoscaling policies, index improvements, and mitigation of read/write hotspots.
* Build dashboards, budgets, alerts, and guardrails to drive ongoing cost governance and financial accountability.
* Collaborate with engineering teams to embed cost-efficient architecture patterns and operational best practices.
* Supervisory / Managerial responsibilities (please specify if the position will have persons reporting to it): NA
* Other responsibilities - Budgets, targets, equipment etc (please specify): NA
Key Skills/Knowledge:
* 5+ years of hands-on experience in Google Cloud
* Strong understanding of GCP data services (BigQuery slots, indexing, partition pruning, partitioning, clustering)
* Expert-level Kubernetes & GKE resource tuning
* Hands-on experience with Dataflow job pipelines and worker optimisation
* Strong Airflow/Composer knowledge (DAG design, scheduling, KubernetesPodOperator)
* Strong knowledge of Dataflow pipeline development and scheduling
* Deep understanding of Cloud Logging routing, sinks, and exclusion filters
* Experience with Cloud Spanner autoscaling, indexing, and schema optimisation
* Cloud SQL performance tuning and indexing
* Ability to analyse billing data & resource consumption
* Experience using GCP Cost Explorer, Recommender API, Billing Export
* Ability to quantify cost savings and present ROI to leadership
* Ability to build dashboards, alerts, and budget guardrails
* Strong communication and stakeholder management
* Ability to collaborate across engineering, data, and product teams
* Structured problem-solving mindset
* Ownership-driven, proactive and independent
LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK as an IT Consultancy or as an Employment Business & Agency, depending upon the precise nature of the work. For security-cleared jobs or non-clearance vacancies, LA International welcomes applications from all sections of the community and from people with diverse experience and backgrounds.
Award-winning LA International, winner of the Recruiter Awards for Excellence (Best IT Recruitment Company, Best Public Sector Recruitment Company, and overall Gold Award winner), has now secured the most prestigious award any business can receive, The Queen's Award for Enterprise: International Trade, for the second consecutive period.