WE ARE
SoftServe is a dynamic digital service and consulting company, founded in 1993, with a global presence across the USA, Europe, APAC, and LATAM regions. Our team of skilled professionals collaborates on over 2,000 projects, driving transformation and optimizing business strategies for ISVs and Fortune 500 companies.
Our client is a leading managed hosting platform, empowering businesses with the performance, security, and reliability they need to grow online. Their platform automates the complexities of site management, helping teams focus on content and user engagement instead of infrastructure.
The team operates a central data platform that serves as the analytical backbone of the company, enabling other data teams to deliver reliable, secure, and governed data.
IF YOU ARE
Having 3+ years of professional experience as a DevOps Engineer
Experienced with Python or Go for building infrastructure and tooling
Strong in Docker and Kubernetes (K8s) for managing containerized applications
Experienced in building and maintaining CI/CD pipelines
Proficient in Infrastructure as Code (IaC) tools, specifically Terraform
Having strong SQL debugging skills and query performance optimization knowledge
Familiar with working directly on data models and pipelines, in a Data Engineering role or similar
Experienced with GCP (Google Cloud Platform) or another public cloud provider
Accustomed to data warehouse technologies, particularly BigQuery
Capable of working in distributed Agile teams across multiple time zones
An effective communicator with an upper-intermediate English level or higher
Knowledgeable of DBT (will be a plus)
AND YOU WANT TO
Build, maintain, and improve the core data infrastructure that supports data replication and data modeling (K8s runners, CI/CD, Git repos, BigQuery)
Develop and maintain supporting tooling that enables data development, such as custom container images and internal tools
Manage infrastructure as code (IaC) using tools like Terraform
Implement and maintain high security standards
Define and maintain coding standards and best practices across Data Engineering teams through reviews, linters, and frameworks
Set up, maintain, and improve the centralized alerting infrastructure using platforms like GCP Alerts and Monte Carlo
Define, document, and promote standardization and best practices for data development, including coding standards, CI/CD processes, and data lifecycles
Implement and manage data access policies for the data ecosystem in accordance with governance guidance
Enhance the developer environment by creating onboarding guides and process runbooks and by improving the contribution process for other teams
Provide guidance and support to other teams by mentoring them on effective use of the data platform technologies
Monitor and manage the operational costs of the data ecosystem, building tools to control spend and assign ownership
TOGETHER WE WILL
Gain certifications from leading providers (Google, AWS, and others)
Empower you to scale your expertise with others by joining the Mentoring Program
Advance the business of our clients, from startups and ISVs to Enterprise and Fortune 500 companies
Create an exceptional customer experience, impact the company's global success, and be recognized through the Customer Hero Program
Care for your wellness with a health insurance package
Code! The project is not overwhelmed with meetings and other non-development-related activities
Support hundreds of thousands of people every day by helping them avoid wasting precious time on maintaining healthy nutrition
Help you with your individual initiatives: we are open to them, so come and share your ideas!