We are currently looking for a Senior Big Data Engineer to join our team of professionals. Remote work is available only to candidates located in Ukraine. #LI-YK3

Skills
- 3+ years of experience in Data Engineering
- Knowledge of Java
- Experience with cloud platforms such as AWS or Azure
- Upper-Intermediate level of spoken and written English (B2+)

Responsibilities
- Analyze the client's existing practices and solutions
- Diagnose and resolve post-implementation issues in the new and existing environments provided
- Respond to and advise end users during business hours
- Develop unit and component tests and support end-to-end CDP testing
- Add performance metrics to solutions for monitoring and code updates
- Prepare documentation and hand over knowledge to the client's staff

Nice to have
- Experience with Databricks

With us you can
- Work on a flexible schedule, remotely or from any of our comfortable offices or co-working spaces in Ukraine
- Receive the equipment needed to perform your work tasks
- Change projects and technology stacks within EPAM
- Gain experience in various business domains (Insurance, E-commerce, Healthcare, Finance, Travelling, Media, Artificial Intelligence, and others)
- Consider relocation options to more than 30 countries
- Join volunteer and charity programs and communities (both technical and interest-based)

We will support your professional development
- Plan your individual career path together with your manager
- Receive regular feedback from colleagues
- Improve your English for free with certified teachers (Speaking Clubs, client interview preparation courses, etc.)
- Take free training and certification in AWS, GCP, or Azure Clouds
- Use the internal E-learn program (18,200+ specialized trainings and mentoring programs)
- Get access to corporate accounts on LinkedIn Learning, Get Abstract, and other partner resources
Our client is a German multinational pharmaceutical and biotechnology company and one of the largest pharmaceutical and biomedical companies in the world.

Responsibilities
- Create data models and related data pipelines in Azure Databricks and Data Factory for analytical dashboards, integrating multiple data assets
- Support architectural decisions and participate in the elaboration of new implementation proposals for our customers, e.g. providing high-level estimations and helping establish the right assumptions
- Drive, lead, and coach other BE engineers to implement data pipelines following best practices and influencing the customer requirements
- Strive to understand the problems to solve and proactively make suggestions on the best way to address them (performance, data volume, data discrepancies or mismatches, operational costs, etc.)
- Understand, from experience with sales and consumer goods analytics, the most important metrics and aggregations to provide and the challenges such teams often face
- Work with and support several analytical teams (frontend developers, product owners, QAs, solution architects)

Requirements
- 5+ years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
- Expert hands-on experience with Databricks
- Expert-level knowledge of SQL
- Experience with Azure Data Factory
- Production coding experience with Python
- Delta Lake or another similar technology
- Upper-Intermediate level of English, both spoken and written (B2+)

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours

About EPAM
EPAM is a leading global provider of digital platform engineering and development services. We are committed to positively impacting our customers, our employees, and our communities.
DESCRIPTION
The 3rd Line / Software Maintenance Engineer will be responsible for vulnerability management and ensuring the security of the software systems. They will utilize tools such as Qualys, Wireshark, Docker, and Kubernetes, as well as work with various cloud platforms including AWS, Azure, and GCP.

Responsibilities
- Manage and remediate vulnerabilities in software systems in real time
- Utilize tools such as Qualys, Wireshark, Docker, and Kubernetes to effectively identify and address vulnerabilities
- Work with cloud platforms such as AWS, Azure, and GCP to ensure the security of software systems
- Collaborate with IT Security, IT, and asset owners to provide real-time alerts and recommended remediation solutions
- Stay updated on the latest security practices and vulnerabilities to proactively protect software systems

Requirements
- Strong experience in security engineering, with a primary skill set in security
- Knowledge and experience with vulnerability management tools such as Qualys is a plus
- Familiarity with network security concepts and technologies
- Experience with cloud platforms such as AWS, Azure, and GCP
- Ability to work remotely and independently, with no work from the office required
- Good-to-have skills include Amazon Web Services, DevOps, Google Cloud Platform, Microsoft Azure, Security Monitoring, and experience with security technologies like Wireshark, Docker, and Kubernetes

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Sick leave and regular vacation
- English classes with certified English teachers
- Unlimited access to LinkedIn learning solutions
- Flexible work hours
DESCRIPTION
This is an exciting opportunity to join a rapidly growing healthcare information technology business in the role of Senior Software Test Engineer. Reporting to the Clinical Effectiveness Software Test Engineering Manager, you will play an important role in our transition to a new consolidated suite of business systems across all of Clinical Effectiveness. This is a tremendous opportunity for someone with a passion for quality and testing to help transform the business systems of a leading healthcare information technology company.

Responsibilities
- Collaborate with the onsite QA lead on a day-to-day basis to ensure all tasks and statuses are covered
- Own the development and execution of test plans and test cases for multiple systems and integrations, based on requirements and throughput from design and specification reviews
- Collaborate with product owners, business analysts, and stakeholders to clarify business rules, refine acceptance criteria, and ensure the overall quality of coverage for our manual test solutions
- Promote QA productivity through automation, tools, processes, and other best practices to foster a culture of quality throughout the organization (see the test sketch after this posting)
- Report the project status to the QA manager
- Perform functional, integration, regression, end-to-end, and performance testing
- Troubleshoot, analyze, and isolate defects and report them in JIRA
- Manage defect logs and track issue resolution with business system analysts and development staff
- Collaborate with solution architecture/development teams to determine release testing requirements throughout the development cycle
- Participate in all sprint activities, including daily scrums, grooming/sizing, planning, requirements gathering, story writing, and solution design sessions
- Analyze all project documents from a QA testing perspective, verifying accuracy and completeness; fully understand the business need/problem and the functional/technical solutions identified
- Coordinate and communicate
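The posting above emphasizes test automation alongside manual testing but does not name a framework or a system under test. The following is a minimal, hypothetical sketch only, assuming pytest and the requests library; the base URL, endpoint, and expected fields are placeholders invented for illustration.

```python
# Minimal illustrative sketch: framework (pytest), library (requests), and all
# URLs/fields below are assumptions, not details from the job posting.
import pytest
import requests

BASE_URL = "https://example-clinical-systems.test/api"  # hypothetical test environment


@pytest.fixture
def session():
    """Shared HTTP session for the tests in this module."""
    with requests.Session() as s:
        s.headers.update({"Accept": "application/json"})
        yield s


def test_record_endpoint_returns_expected_fields(session):
    """Functional check: the hypothetical endpoint answers 200 and exposes required fields."""
    resp = session.get(f"{BASE_URL}/records/12345", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # In practice these fields would come from the acceptance criteria.
    for field in ("id", "status", "updated_at"):
        assert field in body
```

A test like this would typically be wired into the sprint's regression suite so failures surface before release testing.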
Do you have a passion for the ever-evolving beauty industry and a drive to push boundaries? Join a team of professionals working for one of the beauty industry's leaders! Our customer is not only a best-in-class product and service provider but a pioneer that is constantly seeking new ways to redefine beauty and create unforgettable experiences for its customers. Their "try then buy" philosophy is exemplified through in-store beauty stations where customers can test products and receive personalized advice from knowledgeable staff members. We are looking for an experienced senior developer who will be deeply involved in aggregating commercial purchases, marketing campaigns, catalog data, etc. from various sources into the data lake.

Responsibilities
- Architect and maintain our code base for ETL and ELT pipelines, large batch/micro-batch processing, and streaming systems
- Build out the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using ADF, Spark, Kafka, or similar technologies (see the sketch after this posting)
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- For top management and stakeholders, act as a single point of responsibility for any delivery-related matters, including escalations, upsells, ramp-downs, etc.
- Be accountable for technical leadership regarding the delivery, ensuring a sound and future-proof architecture is planned and the implementation meets technical quality standards
- Be comfortable writing stories and associated acceptance criteria for agile/scrum workflows
- Coordinate between multiple disciplines and stakeholders
- Ensure that projects are delivered in line with our client's processes and methodologies; be well-versed in different delivery models, with a focus on agile approaches
- Establish a strategy of continuous delivery risk management that enables proactive decision-making
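To make the Spark/Kafka responsibility above concrete, here is a hedged sketch of a streaming ingestion job into a data lake. It is not the client's actual pipeline: the broker address, topic name, event schema, and storage paths are hypothetical, and a Spark runtime with the Kafka connector is assumed.

```python
# Illustrative sketch only; broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("purchases-ingest").getOrCreate()

# Hypothetical schema for commercial purchase events.
purchase_schema = StructType([
    StructField("order_id", StringType()),
    StructField("sku", StringType()),
    StructField("amount", DoubleType()),
    StructField("purchased_at", TimestampType()),
])

# Read the raw event stream from Kafka (micro-batch streaming).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "purchases")                   # placeholder topic
    .load()
)

# Parse the JSON payload and keep only the typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), purchase_schema).alias("e"))
       .select("e.*")
)

# Land the stream in the data lake with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
    .option("path", "/mnt/datalake/raw/purchases")            # placeholder path
    .option("checkpointLocation", "/mnt/datalake/_chk/purchases")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```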
DESCRIPTION
Our client is a global, privately owned company that connects people with ideas, data with insights, supply with demand, restaurants with deliveries and ultimately, people with the products they love.

Responsibilities
- Implementation of DWH and Data Hubs, including the full ETL process
- Implementation of data models
- Unit testing

Requirements
- Experience building data ingestion pipelines with tools like Databricks, SSIS, Talend, Informatica, etc.
- Experience implementing data models that have been designed by someone else
- Strong SQL – the plan is to have the data engineers create Databricks notebooks using either SQL or Python, but SQL is preferable (a small sketch follows this posting)
- Experience unit testing code
- Upper-Intermediate level of English, both spoken and written (B2+)

Nice to have
- Azure/other cloud storage technologies, SSAS Tabular, Azure Data Factory, Azure DevOps

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
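As a rough illustration of the SQL-first Databricks notebook work described above, here is a minimal sketch. It is not the client's codebase: table and column names are hypothetical, and `spark` is assumed to be the SparkSession provided by the Databricks runtime.

```python
# Illustrative sketch; all table/column names are placeholders.
from pyspark.sql import DataFrame


def build_daily_orders(spark) -> DataFrame:
    """SQL-first transform: aggregate a raw orders table into a daily data model."""
    return spark.sql("""
        SELECT order_date,
               restaurant_id,
               COUNT(*)        AS orders,
               SUM(total_eur)  AS revenue_eur
        FROM   raw.orders            -- hypothetical source table
        GROUP BY order_date, restaurant_id
    """)


def write_to_hub(df: DataFrame, target: str = "hub.daily_orders") -> None:
    """Load step of the ETL: overwrite the modeled table in the data hub."""
    df.write.mode("overwrite").saveAsTable(target)
```

For the unit-testing requirement, a small pytest could register a tiny in-memory DataFrame as a temporary view that stands in for the source table, call `build_daily_orders`, and assert on the aggregated counts.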
DESCRIPTION
Our client is a German multinational pharmaceutical and biotechnology company and one of the largest pharmaceutical and biomedical companies in the world.

Responsibilities
- Create data models and related data pipelines in Azure Databricks and Data Factory for analytical dashboards, integrating multiple data assets (see the sketch after this posting)
- Support architectural decisions and participate in the elaboration of new implementation proposals for our customers, e.g. providing high-level estimations and helping establish the right assumptions
- Drive, lead, and coach other BE engineers to implement data pipelines following best practices and influencing the customer requirements
- Strive to understand the problems to solve and proactively make suggestions on the best way to address them (performance, data volume, data discrepancies or mismatches, operational costs, etc.)
- Understand, from experience with sales and consumer goods analytics, the most important metrics and aggregations to provide and the challenges such teams often face
- Work with and support several analytical teams (frontend developers, product owners, QAs, solution architects)

Requirements
- 5+ years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
- Expert hands-on experience with Databricks
- Expert-level knowledge of SQL
- Experience with Azure Data Factory
- Production coding experience with one of the data-oriented programming languages
- Delta Lake
- Upper-Intermediate level of English, both spoken and written (B2+)

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
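Here is a hedged sketch of the Databricks/Delta Lake pipeline work named above, shown as an incremental upsert into a modeled table that feeds dashboards. It is not the client's pipeline: the staging path, table names, and join key are hypothetical, and a Databricks runtime with Delta Lake available is assumed.

```python
# Illustrative sketch; paths, table names, and keys are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental extract landed by an Azure Data Factory copy activity (placeholder path).
updates = spark.read.format("parquet").load("/mnt/staging/product_sales_increment")

# Target table of the analytical data model (hypothetical name).
target = DeltaTable.forName(spark, "analytics.product_sales")

# Upsert: update matched rows, insert new ones, keeping the dashboard table current.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.sale_id = s.sale_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

In a real setup, Azure Data Factory would typically orchestrate this notebook or job as one activity in the end-to-end pipeline.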
DESCRIPTION
Our client is the world's leading manufacturer of advanced control and automation systems, innovating technology and reinventing the way people live and work, offering integrated solutions to control audio & video systems. The client streamlines technology, improving the quality of life for people in corporate boardrooms, conference rooms, classrooms, auditoriums, and in their homes.

Responsibilities
- Search for, integrate, and modify audio/video streaming components, middleware, and interfaces to user-space applications for the video/audio conferencing/streaming solutions
- Drive architecture and execution of software and hardware with cross-functional teams
- Integrate software components into a fully functional software system
- Write well-structured, testable, efficient, and maintainable code
- Board bring-up and hardware design validation related to video/audio

Requirements
- 8+ years of software/firmware development experience
- 3+ years of experience with Embedded Android/Android BSP, specifically in video/audio streaming
- Proficiency in Java and C++
- Understanding of and ability to modify code in C
- Experience with the GStreamer framework at a developer level (see the sketch after this posting)
- Experience with video or graphics software technologies (V4L, VAAPI, ffmpeg, OpenMAX, OpenGL, etc.)
- Understanding of video and audio streaming protocols (RTP, RTSP, etc.), codecs (H.264/H.265, VP9, etc.), and related technologies is highly desirable
- Experience with camera integration or camera subsystem customizations in AOSP
- Excellent problem-solving, critical thinking, and communication skills; self-motivated and independent contributor
- Understanding of networking: Ethernet, IP, TCP, UDP
- Upper-Intermediate level of English, both spoken and written (B2+)

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
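The role itself targets C/C++ and embedded Android, but purely as an illustration of the kind of GStreamer pipeline involved, here is a minimal sketch using GStreamer's Python (PyGObject) bindings. The pipeline string, RTP target, and encoder choice are placeholders, not details from the posting.

```python
# Illustrative only: Python bindings are used here for brevity; the production code
# on the project would be C/C++. Elements and the UDP target are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Encode a test video source with H.264 and stream it over RTP/UDP.
pipeline = Gst.parse_launch(
    "videotestsrc is-live=true ! videoconvert ! x264enc tune=zerolatency "
    "! rtph264pay ! udpsink host=127.0.0.1 port=5000"
)

pipeline.set_state(Gst.State.PLAYING)
try:
    GLib.MainLoop().run()          # keep streaming until interrupted
except KeyboardInterrupt:
    pipeline.set_state(Gst.State.NULL)
```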
DESCRIPTION
Our client provides property, agricultural, cyber risk, business, casualty insurance, and reinsurance products. We are seeking a Senior Data Engineer to join our team. As a Senior Data Engineer, you will play a crucial role in building and optimizing data pipelines and architectures and integrating data from various sources. You will be responsible for designing, developing, and maintaining data integration solutions using Azure Analytics, Azure Data Factory, Azure Databricks, Azure Pipelines, and other Data Integration (ETL/ELT) platforms. Our ideal candidate has a strong background in data engineering and experience with the Profisee Master Data Management tool.

Responsibilities
- Designing and implementing data integration solutions using Azure Analytics, Azure Data Factory, Azure Databricks, and Azure Pipelines
- Developing and maintaining ETL/ELT processes to collect, transform, and load data from various sources into data warehouses
- Optimizing data pipelines and architectures to improve efficiency and scalability
- Collaborating with cross-functional teams to understand data requirements and provide technical solutions
- Troubleshooting and resolving data integration issues
- Performing data validation and quality checks to ensure data accuracy and integrity (see the sketch after this posting)
- Keeping up to date with industry trends and best practices in data engineering

Requirements
- 3+ years of experience as a Data Engineer with data integration expertise
- Azure Analytics, Azure Data Factory, Azure Databricks, Azure Pipelines, Data Integration (ETL/ELT) platforms, PySpark, Azure Synapse Analytics, Python
- Experience developing and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions
- Able to design and build end-to-end data engineering solutions using Azure Synapse Analytics
- Solid understanding of data modeling, lakehouse, data warehousing, and data governance principles
- Excellent communication and collaboration skills to work with cross-functional teams
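As a rough sketch of the data-validation responsibility named above, the snippet below runs a few PySpark quality checks before a load step. It is illustrative only: the table name, column names, and rules are hypothetical, and a Databricks or Synapse Spark session is assumed.

```python
# Illustrative data-quality gate; table, columns, and rules are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("curated.policies")  # hypothetical curated table

checks = {
    "policy_id_not_null":   df.filter(F.col("policy_id").isNull()).count() == 0,
    "premium_non_negative": df.filter(F.col("premium") < 0).count() == 0,
    "no_duplicate_policies": df.count() == df.select("policy_id").distinct().count(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # In a real pipeline this would fail the ADF/Databricks activity and alert the team.
    raise ValueError(f"Data quality checks failed: {failed}")
```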
DESCRIPTION
Our client is one of the world's leading manufacturers and marketers of quality skin care, makeup, fragrance, and hair care products.

Responsibilities
- Develop scripts in the Databricks workspace to transform data from the feeds selected for the PoC (see the sketch after this posting)
- Develop the data product per the requirements and register it in the data mart
- Gain a detailed understanding of the requirements to ensure the delivered data product meets them
- Collaborate with the architect and vendor developers to enable connectivity and integration of the data product
- Ensure the quality and on-time delivery of project deliverables
- Gain an understanding of the overall planning processes and data flows to support full-scale migration as a follow-up to the PoC

Requirements
- 3+ years of experience as a Data Software Engineer
- Proficiency in Databricks
- Intermediate level of English, both spoken and written (B1+)

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
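The following is a tiny, hypothetical sketch of the kind of Databricks transform script described for the PoC. The feed path, columns, and target table name are placeholders, and `spark` is assumed to be the session provided by the Databricks runtime.

```python
# Illustrative only; feed location, schema, and target name are placeholders.
from pyspark.sql import functions as F

# Read one of the feeds selected for the PoC (placeholder location).
shipments = spark.read.option("header", True).csv("/mnt/poc/feeds/shipments.csv")

# Transform: type the columns and aggregate to the grain the data product requires.
data_product = (
    shipments
    .withColumn("units", F.col("units").cast("int"))
    .groupBy("brand", "market", "week")
    .agg(F.sum("units").alias("units_shipped"))
)

# Register the result so downstream consumers can query it (hypothetical target).
data_product.write.mode("overwrite").saveAsTable("poc.shipments_weekly")
```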
DESCRIPTION
Our client is an investment company that focuses on investments in hospitality and leisure.

Responsibilities
- Collaborate with cross-functional teams to understand data requirements and design efficient and scalable data solutions
- Develop and implement ETL/ELT solutions using Databricks / Spark and other relevant technologies (see the sketch after this posting)
- Integrate data from multiple sources, ensuring data quality and accuracy
- Design and optimize data models to support business intelligence and analytics needs
- Work with stakeholders to identify and prioritize data-related initiatives
- Maintain and support existing data pipelines and workflows
- Stay up-to-date with industry trends and best practices in data engineering

Requirements
- 3+ years of experience as a Data Engineer
- Databricks
- Python
- ETL/ELT solutions
- Microsoft Azure
- Spark
- Delta Lake
- Upper-Intermediate level of English, both spoken and written (B2+)
- Apache Iceberg would be a plus

We Offer
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
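As a hedged sketch of the ETL work this posting describes, the snippet below integrates two hypothetical sources and writes a modeled Delta table. Source paths, join keys, and the target table are placeholders, and a Spark runtime with Delta Lake is assumed.

```python
# Illustrative Databricks/Spark ETL sketch; all names and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bookings = spark.read.format("delta").load("/mnt/lake/raw/bookings")   # placeholder
properties = spark.read.json("/mnt/lake/raw/properties")               # placeholder

# Integrate the sources and derive a nightly revenue measure for analytics.
enriched = (
    bookings.join(properties, "property_id", "left")
            .withColumn("revenue", F.col("nights") * F.col("nightly_rate"))
)

# Write the modeled table as Delta, partitioned for downstream BI queries.
(
    enriched.write.format("delta")
    .mode("overwrite")
    .partitionBy("checkin_date")
    .saveAsTable("analytics.booking_revenue")
)
```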
Are you a highly skilled and motivated Data Software Engineer with a passion for Machine Learning? We have an exciting opportunity for you to be a key player in our Machine Learning Engineering (MLE) team based in Ukraine. At EPAM, we are committed to driving innovation and evolving cutting-edge data solutions.

Why Join Us?
- Cutting-Edge Technology: As a Big Data Software Engineer, you will be at the forefront of developing and optimizing data-driven solutions with a primary focus on Machine Learning.
- Training and Certification: We invest in your growth. The selected candidate will undergo a two-month training period as part of the probation, with the opportunity to obtain a Databricks certification. The associated costs will be covered by EPAM.
- Dynamic Team Environment: Collaborate with a dynamic MLE team, working together to design and implement groundbreaking data-driven solutions.
#LI-IK2

Responsibilities
- Innovative Collaboration: Collaborate with the MLE team to design and implement cutting-edge data-driven solutions
- Data Pipeline Development: Develop and maintain data pipelines for machine learning applications (see the sketch after this posting)
- Python and SQL Mastery: Utilize Python and SQL to manipulate and analyze large datasets
- Cloud Expertise: Work with cloud platforms, with a priority on Azure, AWS, or GCP
- Advanced Techniques: Explore and implement advanced data processing techniques using Databricks and Spark
- Quality Assurance: Participate in code reviews, debugging, and troubleshooting to ensure high-quality deliverables
- Continuous Learning: Engage in continuous learning and stay updated on emerging trends in data engineering and machine learning

Requirements
- 3+ years of hands-on experience in Software Engineering roles
- Education: Bachelor's degree in Computer Science, Data Science, or a related field
- Technical Skills: Proficient in Python, strong SQL skills, and experience with at least one cloud platform (Azure, AWS, GCP); DevOps toolset – any of Docker/Kubernetes (K8s)/Terraform/AWS CloudFormation
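To illustrate the "data pipelines for machine learning" responsibility, here is a minimal, hypothetical feature-engineering step mixing SQL extraction with PySpark logic. Table names, columns, and the output location are placeholders, and `spark` is assumed to be the Databricks-provided session.

```python
# Illustrative ML feature pipeline sketch; all names are placeholders.
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# SQL handles the extraction of raw events (hypothetical table).
events = spark.sql("SELECT user_id, event_ts, amount FROM raw.transactions")

# PySpark handles the feature logic: 30-day rolling spend and count per user.
w = (
    Window.partitionBy("user_id")
          .orderBy(F.col("event_ts").cast("long"))
          .rangeBetween(-30 * 24 * 3600, 0)
)

features = (
    events.withColumn("spend_30d", F.sum("amount").over(w))
          .withColumn("txn_count_30d", F.count("amount").over(w))
)

# Persist the feature table for model-training jobs downstream (hypothetical name).
features.write.mode("overwrite").saveAsTable("features.user_spend")
```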
Your responsibilities will include:
▪ Monitoring dashboards and handling alerts from the security information and event management (SIEM) system to detect incidents of unusual user/host behavior on the network
▪ Building analytical queries in Splunk Enterprise to detect incidents
▪ Documenting incidents, assessing the damage caused and the scope of impact
▪ Administering the Microsoft update service (WSUS)
▪ Eliminating technical debt according to metrics
▪ Participating in incident remediation: installing patches, restoring information from backups, updating antivirus databases, etc.
▪ Ensuring the execution and improvement of incident monitoring processes
▪ Incident management and incident response

Job requirements:
▪ Understanding of the basic aspects of information security (confidentiality, integrity, availability)
▪ Knowledge of information security standards and best practices (ISO 27000, CIS Controls, SANS Top 20 Critical Security Controls, OWASP Top 10, etc.)
▪ Experience administering operating systems: Microsoft Windows/Linux
▪ Experience managing operating system services (AD, Exchange, DNS, DHCP, web servers, etc.)
▪ Experience with the command-line utilities tcpdump, iptables (nftables), firewalld
▪ Understanding of the incident handling lifecycle (analysis, impact assessment, remediation)
▪ Understanding of the purpose of a security information and event management (SIEM) system
▪ Ability to manage priorities flexibly
▪ Understanding of the functions and purpose of cybersecurity tools (IDS/IPS, FW, WAF, DAF, SIEM, etc.)
▪ Experience with an endpoint protection solution from one of the leading vendors (Cisco, Symantec, TrendMicro, ESET, etc.)
▪ 2+ years of experience as a system administrator or in 2nd-line support
▪ Higher education in computer science, mathematics, or engineering

We offer:
• Salary depending on your level
• 8-hour working day
• In-house training
• Opportunities for career and professional growth
We look forward to your resume!