Key Responsibilities:
- Design, implement, and manage cloud-based infrastructure on GCP, with a good understanding of AWS services.
- Automate infrastructure provisioning, scaling, and management using tools such as Terraform, Ansible, and Kubernetes.
- Implement and manage CI/CD pipelines for seamless code deployment and updates.
- Monitor system performance, troubleshoot issues, and ensure high availability and reliability.
- Collaborate with development teams to optimize application performance on Apache/PHP stacks.
- Implement security best practices and compliance policies to safeguard data and infrastructure.
- Contribute to the development and integration of BigQuery and ETL processes for data analytics.
- Explore and integrate new cloud technologies and tools to enhance operational efficiency.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in DevOps roles with a strong focus on GCP; knowledge of AWS is highly beneficial.
- Proficiency in scripting languages (e.g., Python, Bash).
- Experience with containerization and orchestration technologies (Docker, Kubernetes).
- Solid understanding of CI/CD tools (Jenkins, GitLab CI/CD, etc.).
- Experience with the Apache/PHP stack is a significant advantage.
- Familiarity with Google Looker, BigQuery, and ETL processes is a plus.
- Strong problem-solving skills, with a proactive approach to identifying and resolving issues.
- Excellent communication and teamwork abilities.
We are looking for a Senior Data DevOps to join EPAM and contribute to a project for a large customer. As a Senior Data DevOps on the Data Platform team, you will focus on maintaining and adding new features to the data transformation architecture, which is the backbone of the customer's analytical data platform. As a key figure in our team, you'll implement and deliver high-performance data processing solutions that are efficient and reliable at scale.

Responsibilities:
- Design, build, and maintain highly available production systems using Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics.
- Design and implement build, deployment, and configuration management systems, along with CI/CD improvements, based on Terraform and Azure DevOps pipeline solutions across multiple subscriptions and environments.
- Improve user experience with the Databricks platform based on best practices for Databricks cluster management, cost-effective setups, data security models, etc.
- Design, implement, and improve monitoring and alerting systems.
- Collaborate with Architecture teams to ensure platform architecture and design standards align with support model requirements.
- Identify opportunities to optimize platform activities and processes, and implement automation mechanisms to streamline operations.

Requirements:
- 4 years of professional experience.
- 2 years of hands-on experience with a variety of Azure services.
- Proficiency in Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics.
- Solid Linux/Unix systems administration background.
- Advanced skill in configuring, managing, and maintaining networking on Azure cloud.
- Solid experience in managing production infrastructure with Terraform.
- Hands-on experience with one of Azure DevOps, GitLab CI, or GitHub Actions pipelines for infrastructure management and automation.
- Hands-on experience with the Databricks platform.
- Practical knowledge of Python combined with SQL knowledge.
Take the next leap in your career as a Data DevOps Engineer with EPAM! As a key figure in our team, you'll implement and support advanced Terraform and Azure DevOps pipeline solutions across multiple environments. If you have hands-on experience with Terraform and Azure DevOps, plus a knack for communication, optimization, and automation, we are excited to get acquainted with you.

Responsibilities:
- Implement and support advanced Terraform and Azure DevOps pipeline solutions across multiple subscriptions and environments.
- Communicate effectively with multiple stakeholders to gather requirements and provide updates on platform activities.
- Utilize Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics.
- Collaborate with Architecture teams to ensure platform architecture and design standards align with support model requirements.
- Identify opportunities to optimize platform activities and processes, and implement automation mechanisms to streamline operations.

Requirements:
- Hands-on experience with Terraform and Azure DevOps pipelines for infrastructure management and automation.
- Proficiency in Azure data solutions, including Data Lake Storage, Databricks, ADF, and Synapse Analytics.
- Strong communication skills with the ability to interact with stakeholders at various levels.
- Experience collaborating with Architecture teams to ensure alignment with design standards.
- Ability to troubleshoot and prevent issues related to data workloads and platform infrastructure.

Technologies:
- Terraform
- Azure DevOps
- Azure Data Lake Storage
- Azure Databricks
- Azure Data Factory (ADF)
- Azure Synapse Analytics

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn Learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours

About EPAM:
EPAM is a leading global provider of digital platform engineering and development services. We are committed to positivel
We are looking for a Senior DevOps Engineer to make the team even stronger. Do you have a strong technical background across modern technology stacks, with a focus on cloud deployments, specifically Azure? Are you a highly motivated, self-directed technical person with practical experience delivering digital platforms? Do you understand fundamental DevOps practices and culture, and are you ideally also experienced in Agile methodology and popular Agile frameworks such as Scrum and Kanban? Join EPAM's growing team in Greece to work on digital transformation projects with cutting-edge technologies across a variety of sectors: we look forward to receiving your application! The remote option applies only to candidates who will be working from any location in Ukraine.

Responsibilities:
- Develop and manage CI/CD processes for different types of applications, automating as many processes as possible under the guidance of seniors.
- Help advise customers on technical aspects of installation, sizing, and scaling.
- Manage and support customer cloud environments.
- Support development teams with technical questions of continuous integration and continuous delivery.
- Set up monitoring and optimize for cost and performance under the guidance of seniors.
- Manage the full release and support process following defined processes.

Requirements:
- 4+ years of DevOps experience.
- Experience in configuring, maintaining, and troubleshooting cloud-based production systems on Windows or Linux OS.
- Microsoft Azure Cloud: writing ARM templates, setting up automation for resource provisioning (Azure PowerShell).
- Azure services (Azure App Service, Azure SQL, Azure Service Fabric, Azure Storage Account).
- Some understanding of load balancers, DNS, virtual networks, and firewalls in cloud environments.
- Configuring monitoring (performance metrics, OMS).
- Azure Active Directory basics: Azure AD authentication.
- Implementation of Rol
Are you looking for a challenging yet rewarding project that will leverage the best of your decision-making and engineering excellence? At DIAL we strive to maintain a delicate balance between the constantly evolving LLM landscape, the demands and scale of enterprise customers, and the restrictions that come with developing an open-source multimodal LLM and Application Orchestration platform under the Apache 2 license, all while ensuring ease of use and deployment and constantly delivering new features. Using DIAL as a foundation, we build a variety of practical solutions that can be customized for specific needs, such as StatGPT, a talk-to-your-data platform adopted by the International Monetary Fund and the World Bank.

Imagine a future where accessing and understanding complex datasets is as easy as asking a question. With Project StatGPT, that future is now. We're on a mission to democratize data access through the power of Large Language Models (LLMs) built on Natural Language Processing (NLP). StatGPT is designed to bridge the gap between vast data repositories and the business people who need insights from them, using intuitive, conversational interfaces.

We welcome DevOps engineers to help us keep delivering superior services to our clients. Below is the scope of responsibilities we expect of our DevOps engineers, along with the skills and technologies we are interested in. See for yourself whether any of them fall within your expertise or interest. You do not have to master all of them: tell us what your strong side is, and we will find the right task for you. We value the drive for knowledge and self-development within our team.

Responsibilities:
- Automation of continuous integration and continuous delivery for multiple customized installations
- Management and utilization of cloud providers (AWS/Azure/GCP)
- Operational infrastructure maintenance (monitoring, logs, and alerts)
- Work in close cooperation with the
DESCRIPTION
Our client is a Canadian retail company operating in the automotive, hardware, sports, leisure, and housewares sectors.

Responsibilities:
- Troubleshooting environment incidents
- Supporting the development team and customers
- Deployment support across multiple environments
- Working with a centralized DevOps team
- Participating in regular internal and client syncs

Requirements:
- 3+ years of experience as a DevOps Engineer with Azure expertise
- Experience with Terraform
- Azure Cloud
- Azure DevOps Pipelines
- Azure Networking
- Azure Security
- Azure Kubernetes Service

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Sick leave and regular vacation
- English classes with certified English teachers
- Unlimited access to LinkedIn Learning solutions
- Flexible work hours