Key Responsibilities:
Design, implement, and manage cloud-based infrastructure on GCP with a good understanding of AWS services.
Automate infrastructure provisioning, scaling, and management using tools like Terraform, Ansible, and Kubernetes.
Implement and manage CI/CD pipelines for seamless code deployment and updates.
Monitor system performance, troubleshoot issues, and ensure high availability and reliability.
Collaborate with development teams to optimize application performance on Apache/PHP stacks.
Implement security best practices and compliance policies to safeguard data and infrastructure.
Contribute to the development and integration of BigQuery and ETL processes for data analytics.
Explore and integrate new cloud technologies and tools to enhance operational efficiency.

Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
3-5 years of experience in DevOps roles with a strong focus on GCP; knowledge of AWS is highly beneficial.
Proficiency in scripting languages (e.g., Python, Bash).
Experience with containerization and orchestration technologies (Docker, Kubernetes).
Solid understanding of CI/CD tools (Jenkins, GitLab CI/CD, etc.).
Experience with the Apache/PHP stack is a significant advantage.
Familiarity with Google Looker, BigQuery, and ETL processes is a plus.
Strong problem-solving skills, with a proactive approach to identifying and resolving issues.
Excellent communication and teamwork abilities.
We invite a Full-Stack developer to join our ITRDev team and take part in an exciting sports project. The project involves organizing volleyball competitions across the United States, with more than 1,500 teams participating. The competition season runs year-round, and a single event lasts 3 days.
We are looking for a developer ready to take ownership of the growth and improvement of this project. The main tasks are developing new functionality that will help raise the project's quality and efficiency.
If you are ready to take on the challenge and work on an interesting, constantly evolving project in the sports industry, join our team, where new opportunities for professional growth await you.

What you will do:
— write clean, high-quality code :)
— cover it with unit tests;
— develop new features;
— work on optimization;
— write complex SQL queries;
— write documentation;
— estimate tasks;
— communicate actively with the team, including at daily meetings (your involvement matters to us).

Skills and knowledge you will need:
— 4+ years of work experience;
— 3+ years with Node.js, Nest.js, React.js, PostgreSQL (RabbitMQ, Next, and other tools);
— Experience writing raw SQL queries and optimizing them (procedures, functions, triggers);
— Experience with CI/CD;
— 1.5+ years of experience on a product.

It would also be great if you have:
— Database architecture design skills;
— Mentoring and leadership skills;
— Experience with Stripe;
— Express, Sails;
— Experience with DDD.

Project team:
— Lead/Architect;
— Full-stack developers;
— 2 QA;
— DevOps;
— Designer;
— PM.

We offer:
— work without tasks due yesterday or "burning" deadlines;
— work-life balance;
— the opportunity to bring your ideas to life — every specialist's opinion matters;
— flexible working hours without trackers;
— corporate English courses for various levels;
— reimbursement for training and courses;
— an internal library and knowledge sharing from colleagues;
— an individual development plan;
— 20 working days of vacation;
— 7
Come join a project at an industry-leading financial intelligence and analytics powerhouse! Our customer is a renowned financial information and analytics company that provides essential intelligence to businesses, governments, and individuals worldwide. The client offers a wide range of services, including credit ratings, market intelligence, data analysis, and investment research. The company is best known for its credit ratings division, which assesses the creditworthiness of companies and governments. Join the project and be at the forefront of shaping the future of financial information and analytics!

Responsibilities
Drive the development of the data platform
Oversee a team of developers and QAs
Take part in architecture decisions and implementation
Design and develop solutions utilizing integration best practices
Manage complex data workflows
Provide day-to-day support and technical expertise to other engineers
Work across the full software development cycle
Help improve the end-to-end development process, from data collection tools to deployment in a production environment
Handle direct communication with the customer

Requirements
Data architecture modeling experience
Strong Python development skills
Expertise in SQL
Strong understanding of data pipeline and data storage tooling (AWS S3, AWS Glue, Apache Airflow, Apache Kafka, Apache Spark/PySpark, CloudFormation/Terraform, Delta Lake)
Understanding of RESTful APIs and web technologies
Microservice architecture
Experience with the full development lifecycle
Experience with the AWS cloud platform
Solid understanding of containers and orchestration tools (Docker, CI/CD)

We Offer
Competitive compensation depending on experience and skills
Individual career path
Unlimited access to LinkedIn learning solutions
Sick leave and regular vacation
English classes with certified English teachers
Flexible work hours

About EPAM
EPAM is a leading global provider of digital platform engineering and development services. We are commi
Responsibilities:
Perform manual testing of software to verify its functionality and find bugs.
Create test scenarios and test cases based on product requirements.
Log and document discovered defects in the bug tracking system.
Work with developers to fix discovered bugs and verify the fixes.
Set up and maintain the test environment.
Requirements:
A desire to learn and grow in software testing.
Basic analytical skills and the ability to spot bugs in software.
Attention to detail and a systematic approach to work.
Good communication skills and the ability to work in a team.
A basic understanding of the software development life cycle is a plus.
We offer:
The opportunity to gain valuable experience in software testing.
A 5/2 schedule, 08:00 - 18:00.
A cozy office in the city center.
Corporate events and team competitions.
Support and training from experienced QA specialists.
A friendly and supportive work atmosphere.
A competitive salary and bonuses for achieving goals.
If you are interested in this position and ready to take on the challenge, send us your resume.
We are looking for a Unified Communication Engineer who specializes in creating, deploying, and maintaining integrated communication systems involving voice, video, data, and mobile applications. In this role, you will be responsible for system design, implementation, regular maintenance, managing collaboration tools, and ensuring security and compliance with industry standards.
Working schedule: 11:00/12:00 – 20:00/21:00 (UTC+2).
The remote option applies only to candidates who will be working from any location in Ukraine. #LI-IK2

Responsibilities
Proactively manage the collaboration systems and keep them operational
Identify new use cases, gleaned from various channels including trouble tickets and other end-user interactions
Create and implement new capabilities to support increased overall productivity
Collaborate with UX/service-desk teams to ensure proactive communications, training, and accurate documentation are available to the user community
Optimize users' collaboration experience, which involves computation and network enhancements
Develop the tool set and telemetry
Develop a knowledge base of self-serve steps for end users to identify the cause(s) of performance issues
Work on escalated support tickets within the UCC domain
Stay abreast of feature releases and investigate their relevance/impact to the client's user base
Research and implement alternative solutions and workarounds, including bots, to fill and work around feature gaps within standard Webex products
Parse through application and system logs to troubleshoot issues
Write standard operating procedures and knowledge articles, as needed, for support staff
Operate with a focus on the telephony function

Requirements
3+ years of professional experience in the same field
Proficiency across the Webex suite of products, with emphasis on the telephony function
An intermediate level of IP networking knowledge
Ability to troubleshoot Cisco UC systems, including Webex Control Hub
Candidate has demonstrated the ability to
We are looking for a technology champion to join a dynamic project team as a .NET Tech Lead. The project is cloud-native platform development for a global consulting company. This is a technology leadership role focused on the Azure platform, which includes guiding a team of full-stack engineers and driving the success of new and ongoing projects. The role requires a strong hands-on approach and deep expertise in .NET, Azure, and Angular, alongside a passion for client-centric development and cloud-native architectures. Join our team and lead the way in Azure technology solutions! If you meet the requirements and are passionate about technology leadership, apply now to make a significant impact.
The remote option applies only to candidates who will be working from any location in Ukraine, with the possibility of visiting the Kyiv or Lviv office for onboarding. #LI-IRINABENKO

Responsibilities
Lead a team of full-stack engineers, providing hands-on technical guidance and mentorship
Oversee the development and maintenance of cloud-native applications on the Azure platform
Implement and manage Infrastructure as Code (IaC) for efficient cloud resource management
Collaborate with clients to understand their needs and ensure successful project delivery
Maintain and improve engineering practices within the team
Stay current with the latest developments in Azure and .NET technologies, and apply them to enhance project outcomes

Requirements
Minimum of 5 years of experience in similar roles
Strong hands-on experience with:
Azure Serverless (Azure Container Apps (ACA), Azure Functions (AZFN))
Azure Service Bus (ASB)
EventHub
Azure SQL
.NET
Infrastructure as Code (IaC) tools such as Terraform, ARM, AWS CDK, or similar
Angular
Experience in cloud infrastructure automation
Ability to start and lead new projects independently
Client-centric approach
Self-motivated and proactive in driving projects forward
Excellent understanding of engineering practices and clo
We are seeking passionate and experienced technology leadership champions to join our team. If you are an innovative leader with a strong background in Azure and .NET technologies, we want to hear from you. The successful candidate will lead technology initiatives, drive the development of high-quality software solutions, and mentor team members. Our client provides audit, management consulting, tax, and legal services.

Responsibilities
Lead and oversee technology initiatives on the Azure platform
Design, develop, and maintain applications using Azure Serverless (ACA, AZFN), ASB, EventHub, and Azure SQL
Develop robust and efficient solutions using .NET
Implement Infrastructure as Code (IaC) using tools like Terraform, ARM, AWS CDK, or similar
Develop and maintain user interfaces with Angular
Conduct code reviews, ensure code quality, and write unit tests to maintain high standards
Collaborate with cross-functional teams to define, design, and implement new features
Mentor and guide developers, promoting best practices and development techniques
Engage with clients to understand their requirements and deliver technology solutions that meet their needs

Requirements
Minimum of 5 years of experience in similar roles, with a focus on Azure and .NET technologies
Extensive hands-on experience with Azure Serverless (ACA, AZFN), ASB, EventHub, and Azure SQL
Proficiency in .NET and Angular
Experience with Infrastructure as Code (IaC) tools such as Terraform, ARM, AWS CDK, or similar
Self-motivated and able to initiate and lead new projects independently
Strong client-centric approach with excellent communication and interpersonal skills

Nice to have
Experience with Bicep

We Offer
Competitive compensation depending on experience and skills
Individual career path
Unlimited access to LinkedIn learning solutions
Sick leave and regular vacation
English classes with certified English teachers
Flexible work hours
We are looking for a Senior Data DevOps to join EPAM and contribute to a project for a large customer. As a Senior Data DevOps on the Data Platform, you will focus on maintaining and implementing new features for the data transformation architecture, which is the backbone of the customer's analytical data platform. As a key figure in our team, you'll implement and deliver high-performance data processing solutions that are efficient and reliable at scale.

Responsibilities
Design, build, and maintain highly available production systems utilizing Azure data solutions including Data Lake Storage, Databricks, ADF, and Synapse Analytics
Design and implement build, deployment, and configuration management systems, along with CI/CD improvements, based on Terraform and Azure DevOps pipeline solutions across multiple subscriptions and environments
Improve the user experience with the Databricks platform based on best practices of Databricks cluster management, cost-effective setups, data security models, etc.
Design, implement, and improve monitoring and alerting systems
Collaborate with Architecture teams to ensure platform architecture and design standards align with support model requirements
Identify opportunities to optimize platform activities and processes; implement automation mechanisms to streamline operations

Requirements
4 years of professional experience
2 years of hands-on experience with a variety of Azure services
Proficiency in Azure data solutions including Data Lake Storage, Databricks, ADF, and Synapse Analytics
Solid Linux/Unix systems administration background
Advanced skill in configuring, managing, and maintaining networking on the Azure cloud
Solid experience in managing production infrastructure with Terraform
Hands-on experience with one of Azure DevOps/GitLab CI/GitHub Actions pipelines for infrastructure management and automation
Hands-on experience with the Databricks platform
Practical knowledge of Python combined with SQL knowledge
Take the next leap in your career as a Data DevOps Engineer with EPAM! As a key figure in our team, you'll implement and support advanced Terraform and Azure DevOps pipeline solutions across multiple environments. If you have hands-on experience with Terraform and Azure DevOps, and possess a knack for communication, optimization, and automation, we are excited to get acquainted with you.

Responsibilities
Implement and support advanced Terraform and Azure DevOps pipeline solutions across multiple subscriptions and environments
Communicate effectively with multiple stakeholders to gather requirements and provide updates on platform activities
Utilize Azure data solutions including Data Lake Storage, Databricks, ADF, and Synapse Analytics
Collaborate with Architecture teams to ensure platform architecture and design standards align with support model requirements
Identify opportunities to optimize platform activities and processes; implement automation mechanisms to streamline operations

Requirements
Hands-on experience with Terraform and Azure DevOps pipelines for infrastructure management and automation
Proficiency in Azure data solutions including Data Lake Storage, Databricks, ADF, and Synapse Analytics
Strong communication skills with the ability to interact with stakeholders at various levels
Experience collaborating with Architecture teams to ensure alignment with design standards
Ability to troubleshoot and prevent issues related to data workloads and platform infrastructure

Technologies
Terraform
Azure DevOps
Azure Data Lake Storage
Azure Databricks
Azure Data Factory (ADF)
Azure Synapse Analytics

We Offer
Competitive compensation depending on experience and skills
Individual career path
Unlimited access to LinkedIn learning solutions
Sick leave and regular vacation
English classes with certified English teachers
Flexible work hours
We are currently looking for a Senior Big Data Engineer to join our team of professionals. The remote option is open only to candidates located in Ukraine. #LI-YK3

Skills
3+ years of experience in Data Engineering
Knowledge of Java
Experience with cloud platforms such as AWS or Azure
Upper-Intermediate level of spoken and written English (B2+)

Responsibilities
Analyze our client's existing practices and solutions
Diagnose and resolve post-implementation issues with the new and existing environments provided
Respond to and advise end users during working hours
Develop unit and component tests and support end-to-end CDP testing
Add performance metrics to monitoring solutions and update code
Prepare documentation and hand over knowledge to the client's staff

Nice to have
Experience with Databricks

With us you can
Work on a flexible schedule, remotely or in any of our comfortable offices or coworking spaces in Ukraine
Receive the equipment you need for your work
Change projects and technology stacks within EPAM
Gain experience in various business domains (Insurance, E-commerce, Healthcare, Finance, Travelling, Media, Artificial Intelligence, and others)
Consider relocation options to more than 30 countries
Join volunteer and charity programs and communities (both technical and interest-based)

We will support your professional development
Plan your individual career path together with your manager
Receive regular feedback from colleagues
Improve your English for free with certified teachers (Speaking Clubs, client interview preparation courses, etc.)
Get the opportunity to complete free training and certification in AWS, GCP, or Azure Clouds
Use the internal E-learn training program (18,200+ specialized trainings and mentoring programs)
Get access to corporate accounts on LinkedIn Learning, Get Abstract, and other partner resources
Our client is a German multinational pharmaceutical and biotechnology company, one of the largest pharmaceutical and biomedical companies in the world.

Responsibilities
Create data models and related data pipelines in Azure Databricks and Data Factory for analytical dashboards, integrating multiple data assets
Support architectural decisions and participate in the elaboration of new implementation proposals for our customers, e.g. providing high-level estimations and helping establish the right assumptions
Drive, lead, and coach other BE engineers to implement data pipelines following best practices and influencing customer requirements
Strive to understand the problems to solve and proactively make suggestions on the best way to address them (performance, data volume, data discrepancies or mismatches, operational costs, etc.)
Understand, from experience with sales and consumer goods analytics, the most important metrics and aggregations to provide and the challenges they often entail
Work with and support several analytical teams (frontend developers, product owners, QAs, solution architect)

Requirements
5+ years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
Expert hands-on experience with Databricks
Expert-level knowledge of SQL
Experience with Azure Data Factory
Production coding experience with Python
Delta Lake or other similar technology
Upper-Intermediate level of English, both spoken and written (B2+)

We Offer
Competitive compensation depending on experience and skills
Individual career path
Unlimited access to LinkedIn learning solutions
Sick leave and regular vacation
English classes with certified English teachers
Flexible work hours
What we would like to see in you:
2+ years of experience with SEO tasks
Communication skills
An understanding of search engine algorithms
Knowledge of Ahrefs and other SEO platforms
A desire to develop your skills in contextual advertising

Your future tasks:
Helping launch new campaigns and supporting existing ones;
Analyzing our niche and monitoring competitors;
Collecting articles by keywords and structure;
Analyzing current articles and making the edits needed for optimization;
Reviewing articles written by the copywriter for correct keyword integration;
Finding and adopting new promotion technologies and methods;
Analyzing the effectiveness of external links on various sites.

What we offer:
A competitive salary and performance-based bonuses;
A friendly work environment;
Paid vacations and sick leave;
Remote work;
The opportunity to learn and improve your skills at the company's expense.

If you have any questions, feel free to write to my Telegram: recruiter_proxy. Please include your salary expectations in your cover letter. We look forward to meeting you and working together :)
DESCRIPTION
The 3rd Line / Software Maintenance Engineer will be responsible for vulnerability management and ensuring the security of the software systems. They will utilize tools such as Qualys, Wireshark, Docker, and Kubernetes, as well as work with various cloud platforms including AWS, Azure, and GCP.

Responsibilities
Manage and remediate vulnerabilities in software systems in real time
Utilize tools such as Qualys, Wireshark, Docker, and Kubernetes to effectively identify and address vulnerabilities
Work with cloud platforms such as AWS, Azure, and GCP to ensure the security of software systems
Collaborate with IT Security, IT, and asset owners to provide real-time alerts and recommended remediation solutions
Stay updated on the latest security practices and vulnerabilities to proactively protect software systems

Requirements
Strong experience in security engineering, with a primary skillset in security
Knowledge of and experience with vulnerability management tools such as Qualys is a plus
Familiarity with network security concepts and technologies
Experience with cloud platforms such as AWS, Azure, and GCP
Ability to work remotely and independently, with no work from the office required
Good-to-have skills include Amazon Web Services, DevOps, Google Cloud Platform, Microsoft Azure, Security Monitoring, and experience with security technologies like Wireshark, Docker, and Kubernetes

We Offer
Competitive compensation depending on experience and skills
Individual career path
Sick leave and regular vacation
English classes with certified English teachers
Unlimited access to LinkedIn learning solutions
Flexible work hours
DESCRIPTION
The Systems Engineer will be responsible for providing technical leadership, hands-on implementation, and ongoing day-to-day support across Datacenter operations and vendor management. This role will ensure effective development, maintenance, support, and optimization of key functional areas, including Windows server operations, monitoring, virtualization, and cloud-based technologies.

Responsibilities
Provide support for Endurance's global data center environment, which includes Windows server technologies, VMware, VEEAM, and general server/datacenter maintenance
Provide immediate 3rd-level server support for problems escalated by the Datacenter Operations Team, Service Desk, App/Dev, or business users
Provide general server support and maintenance; implement patching and proactive maintenance plans; develop best-practice maintenance plans
Proactively review monitoring systems, take action on alerts, and review/revise thresholds/alerts to ensure infrastructure stability
Perform thorough root cause analysis using best-practice troubleshooting techniques, processes, and procedures as defined by hardware/software vendors and personal experience
Proactively support, maintain, create, and update associated build and standard operating procedure documentation; plan and implement necessary updates as required to keep support documentation relevant
Identify, document, publish, and uphold systems policies, standards, procedures, checklists, agreements, diagrams, inventory, etc.
Identify, plan, and lead projects and tasks necessary to assess, optimize, proactively manage, and maintain enterprise and client systems and infrastructure
Ensure all support requests, projects, and other tasks are reviewed, prioritized, addressed, and completed in a timely and proficient manner

Requirements
7+ years of equivalent work experience
5+ years of infrastructure support
DESCRIPTION
This is an exciting opportunity to join a rapidly growing healthcare information technology business in the role of Senior Software Test Engineer. Reporting to the Clinical Effectiveness Software Test Engineering Manager, you will play an important role in our transition to a new consolidated suite of business systems across all of Clinical Effectiveness. This is a tremendous opportunity for someone with a passion for quality and testing to help transform the business systems of a leading healthcare information technology company.

Responsibilities
Collaborate with the onsite QA lead on a day-to-day basis to ensure all tasks and statuses are covered
Own the development and execution of test plans and test cases for multiple systems and integrations, based on requirements and throughput from design and specification reviews
Collaborate with product owners, business analysts, and stakeholders to clarify business rules, refine acceptance criteria, and ensure the overall quality of coverage for our manual test solutions
Promote QA productivity through automation, tools, processes, and other best practices to foster a culture of quality throughout the organization
Report project status to the QA manager
Perform functional, integration, regression, end-to-end, and performance testing
Troubleshoot, analyze, and isolate defects and report them in JIRA
Manage defect logs and track issue resolution with business system analysts and development staff
Collaborate with solution architecture/development teams to determine release testing requirements throughout the development cycle
Participate in all sprint activities, including daily scrums, grooming/sizing, planning, requirements gathering, story writing, and solution design sessions
Analyze all project documents from a QA testing perspective, verifying accuracy and completeness; must fully understand the business need/problem and the functional/technical solutions identified
Coordinate and commu