Are you looking for a challenging yet rewarding project that will leverage the best of your decision-making and engineering excellence? At DIAL we strive to maintain a delicate balance between the constantly evolving LLM landscape, the demands and scale of enterprise customers, and the restrictions that come with developing an open-source multimodal LLM and Application Orchestration platform under the Apache 2 license, all while ensuring ease of use and deployment and constantly delivering new features. Using DIAL as a foundation, we build a variety of practical solutions that can be customized for specific needs, such as StatGPT, a talk-to-your-data platform adopted by the International Monetary Fund and the World Bank.

Imagine a future where accessing and understanding complex datasets is as easy as asking a question. With Project StatGPT, that future is now. We're on a mission to democratize data access through the power of Large Language Models (LLMs) and Natural Language Processing (NLP). StatGPT is designed to bridge the gap between vast data repositories and the business people who need insights from them, using intuitive, conversational interfaces.

Your role: We see the candidate for this position as a mix of mathematician, algorithm specialist, analyst, and programmer, capable of developing domain expertise and implementing production-ready solutions in the most scientifically challenging areas. You will do so either by conducting comprehensive research or by finding an efficient way to implement ideas drawn from research, analytics, or scientific articles. Below is a list of skills and technologies we are interested in. See for yourself whether any of them are in your scope of expertise or interest; you do not have to master all of them. Tell us what your strong side is and we will find the right task for you. We value the drive for knowledge and self-development within our team.

Requirements:
- Strong knowledge in common Co
Are you looking for a challenging yet rewarding project that will leverage the best of your decision-making and engineering excellence? At DIAL we strive to maintain a delicate balance between the constantly evolving LLM landscape, the demands and scale of enterprise customers, and the restrictions that come with developing an open-source multimodal LLM and Application Orchestration platform under the Apache 2 license, all while ensuring ease of use and deployment and constantly delivering new features. Using DIAL as a foundation, we build a variety of practical solutions that can be customized for specific needs, such as StatGPT, a talk-to-your-data platform adopted by the International Monetary Fund and the World Bank.

Imagine a future where accessing and understanding complex datasets is as easy as asking a question. With Project StatGPT, that future is now. We're on a mission to democratize data access through the power of Large Language Models (LLMs) and Natural Language Processing (NLP). StatGPT is designed to bridge the gap between vast data repositories and the business people who need insights from them, using intuitive, conversational interfaces.

We welcome DevOps engineers to help us keep delivering superior services to our clients. We expect our DevOps engineers to take on the following scope of responsibilities. Below is a list of skills and technologies we are interested in. See for yourself whether any of them are in your scope of expertise or interest; you do not have to master all of them. Tell us what your strong side is and we will find the right task for you. We value the drive for knowledge and self-development within our team.

Responsibilities:
- Automation of continuous integration and continuous delivery for multiple customized installations
- Management and utilization of cloud providers (AWS/Azure/GCP)
- Operational infrastructure maintenance (monitoring, logs, and alerts)
- Work in close cooperation with the
MDConsult is currently looking for a Business Development Manager for one of our clients. Our client is an innovative cloud platform that provides easy access to computing power and lets users run a wide variety of high-quality games and applications on any device.

Rate: $3,000-3,500

Our expectations:
- Higher education
- At least 3 years of experience in the role
- English at B1/B2 level or higher
- Ability to work in a team and handle multitasking
- Personal qualities: strong leadership, communication, negotiation, managerial, and analytical skills

Responsibilities:
- Understand the product thoroughly
- Define the target audience
- Apply strong analytical skills
- Build relationships and reach decision-makers at partner companies
- Use effective negotiation techniques
- Handle correspondence competently; develop and maintain partner relationships
- Collect feedback on how the product functions and continuously initiate improvements
- Take part in conferences
- Speak publicly at industry events

What we offer:
- Office in Kyiv
- Working hours: Mon-Fri, 10:00 to 19:00
- Salary: $3,000-3,500
- Fast career growth and professional development
- Competitive salary
- Creative and energetic working atmosphere
- Cozy office a 5-minute walk from the metro
- Free lunch
- Paid vacation and sick leave

If you are interested in this vacancy, send your resume to @MDConsultRecruiter!
DESCRIPTION
Our client is the world's leading manufacturer of advanced control and automation systems, innovating technology and reinventing the way people live and work, offering integrated solutions to control audio and video systems. The client streamlines technology, improving the quality of life for people in corporate boardrooms, conference rooms, classrooms, auditoriums, and in their homes.

Responsibilities:
- Search for, integrate, and modify audio/video streaming components, middleware, and interfaces to user-space applications for the video/audio conferencing/streaming solutions
- Drive architecture and execution of software and hardware with cross-functional teams
- Integrate software components into a fully functional software system
- Write well-structured, testable, efficient, and maintainable code
- Board bring-up and hardware design validation related to video/audio

Requirements:
- 8+ years of software/firmware development experience
- 3+ years of experience with Embedded Android/Android BSP, specifically in video/audio streaming
- Proficiency in Java and C++
- Understanding of and ability to modify code in C
- Developer-level experience with the GStreamer framework
- Experience with video or graphics software technologies (V4L, VAAPI, ffmpeg, OpenMAX, OpenGL, etc.)
- Understanding of video and audio streaming protocols (RTP, RTSP, etc.), codecs (H.264/H.265, VP9, etc.), and related technologies is highly desirable
- Experience with camera integration or camera subsystem customizations in AOSP
- Excellent problem-solving, critical thinking, and communication skills; self-motivated and independent contributor
- Understanding of networking: Ethernet, IP, TCP, UDP
- Upper-Intermediate level of English, both spoken and written (B2+)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
DESCRIPTION
Our client provides property, agricultural, cyber risk, business, casualty insurance, and reinsurance products. We are seeking a Senior Data Engineer to join our team. As a Senior Data Engineer, you will play a crucial role in building and optimizing data pipelines and architectures and integrating data from various sources. You will be responsible for designing, developing, and maintaining data integration solutions using Azure Analytics, Azure Data Factory, Azure Databricks, Azure Pipelines, and other data integration (ETL/ELT) platforms. Our ideal candidate has a strong background in data engineering and experience with the Profisee Master Data Management tool.

Responsibilities:
- Designing and implementing data integration solutions using Azure Analytics, Azure Data Factory, Azure Databricks, and Azure Pipelines
- Developing and maintaining ETL/ELT processes to collect, transform, and load data from various sources into data warehouses
- Optimizing data pipelines and architectures to improve efficiency and scalability
- Collaborating with cross-functional teams to understand data requirements and provide technical solutions
- Troubleshooting and resolving data integration issues
- Performing data validation and quality checks to ensure data accuracy and integrity
- Keeping up-to-date with industry trends and best practices in data engineering

Requirements:
- 3+ years of experience as a Data Engineer with data integration expertise
- Azure Analytics, Azure Data Factory, Azure Databricks, Azure Pipelines, data integration (ETL/ELT) platforms, PySpark, Azure Synapse Analytics, Python
- Experience developing and managing end-to-end data pipelines, ETL (Extract, Transform, Load), and ELT (Extract, Load, Transform) solutions
- Able to design and build end-to-end data engineering solutions using Azure Synapse Analytics
- Solid understanding of data modeling, lakehouse, data warehousing, and data governance principles
- Excellent communication and collaboration skills to w
DESCRIPTION
Our client is a multinational technology company that specializes in Internet-related services and products which you use every day: a web search engine, a web browser, email/video/map services, online advertising technologies, cloud computing, software, and hardware. It is the author of various operating platforms, programming languages, tools, and frameworks without which current software development cannot be imagined, and is considered one of the biggest Internet stocks.

Responsibilities:
- Understand and analyze client needs
- Design and professionally articulate the solution to the client
- Be responsible for implementation and open-source contribution
- Communicate effectively using official Kubernetes SIG channels and teams
- Be creative; thinking outside the box is key to success here

Requirements:
- 3+ years of experience as a GoLang Engineer
- Excellent Kubernetes knowledge
- Good cloud and container architecture knowledge
- Knowledge of batch scheduling technologies such as Ray (ray.io), Spark, or similar is a significant advantage
- Experience with Python and ML engineering is an advantage
- Proficient in systems engineering (Linux)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Sick leave and regular vacation
- English classes with certified English teachers
- Unlimited access to LinkedIn learning solutions
- Flexible work hours
DESCRIPTION
Our client is one of the world's leading manufacturers and marketers of quality skin care, makeup, fragrance, and hair care products.

Responsibilities:
- Develop scripts in the Databricks workspace to transform data from feeds selected for the PoC
- Develop the data product per the requirements and register it in the data mart
- Gain a detailed understanding of the requirements to ensure the delivered data product meets them
- Collaborate with the architect and vendor developers to enable connectivity and integration of the data product
- Ensure the quality and on-time delivery of project deliverables
- Gain an understanding of the overall planning processes and data flows to support full-scale migration as a follow-up to the PoC

Requirements:
- 3+ years of experience as a Data Software Engineer
- Proficiency in Databricks
- Intermediate level of English, both spoken and written (B1+)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
DESCRIPTION
Our client is a Danish multinational company, based in Denmark, with more than 40,000 employees globally.

Responsibilities:
- Suggest test approach and methodology: adjust the test strategy / test plan
- Educate other QAs/Devs to implement automated tests
- Create automated test cases
- Identify what can be automated and how
- Participate in requirements gathering and clarification

Requirements:
- 5+ years of experience as a Test Automation Engineer with JavaScript expertise
- Deep understanding of the main test automation approaches and practical experience with at least 2 of them in production projects
- Solid theoretical knowledge and wide practical experience in test automation
- TypeScript
- React
- Node.js
- .NET
- Headless CMS (Contentstack)
- Upper-Intermediate level of English, both spoken and written (B2+)

Nice to have:
- GUI testing
- Playwright
- SQL
- Web service / API test automation tools in JS

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
DESCRIPTION
Our client is a world-leading food company built on business lines such as Essential Dairy and Plant-based Products, Waters, and Early Life Nutrition.

Responsibilities:
- Lead a team of engineers in the development of data products for Danone's commercial operations
- Collaborate with stakeholders to understand requirements and design solutions
- Write clean, efficient, and maintainable code
- Mentor and coach junior team members
- Ensure the quality and on-time delivery of project deliverables

Requirements:
- 5+ years of experience as a Data Software Engineer
- Experience with Azure Data Factory and Azure Data Services
- Proficiency in Databricks, PySpark, and Python basics
- Intermediate level of English, both spoken and written (B1+)

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
DESCRIPTION
Our client is a Canadian retail company that operates in the automotive, hardware, sports, leisure, and housewares sectors.

Responsibilities:
- Troubleshooting environment incidents
- Supporting the development team and the customer
- Deployment support across multiple environments
- Working with a centralized DevOps team
- Participating in regular internal and client syncs

Requirements:
- 3+ years of experience as a DevOps Engineer with Azure expertise
- Experience with Terraform
- Azure Cloud
- Azure DevOps Pipelines
- Azure Networking
- Azure Security
- Azure Kubernetes Service

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Sick leave and regular vacation
- English classes with certified English teachers
- Unlimited access to LinkedIn learning solutions
- Flexible work hours
DESCRIPTION
Our client is an investment company that focuses on investments in hospitality and leisure.

Responsibilities:
- Collaborate with cross-functional teams to understand data requirements and design efficient and scalable data solutions
- Develop and implement ETL/ELT solutions using Databricks / Spark and other relevant technologies
- Integrate data from multiple sources, ensuring data quality and accuracy
- Design and optimize data models to support business intelligence and analytics needs
- Work with stakeholders to identify and prioritize data-related initiatives
- Maintain and support existing data pipelines and workflows
- Stay up-to-date with industry trends and best practices in data engineering

Requirements:
- 3+ years of experience as a Data Engineer
- Databricks
- Python
- ETL/ELT solutions
- Microsoft Azure
- Spark
- Delta Lake
- Upper-Intermediate level of English, both spoken and written (B2+)
- Apache Iceberg would be a plus

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours
Are you a highly skilled and motivated Data Software Engineer with a passion for Machine Learning? We have an exciting opportunity for you to be a key player in our Machine Learning Engineering (MLE) team based in Ukraine. At EPAM, we are committed to driving innovation and evolving cutting-edge data solutions.

Why join us?
- Cutting-edge technology: As a Big Data Software Engineer, you will be at the forefront of developing and optimizing data-driven solutions with a primary focus on Machine Learning.
- Training and certification: We invest in your growth. The selected candidate will undergo a two-month training period as part of the probation, with the opportunity to obtain a Databricks certification. The associated costs will be covered by EPAM.
- Dynamic team environment: Collaborate with a dynamic MLE team, working together to design and implement groundbreaking data-driven solutions.

Responsibilities:
- Innovative collaboration: Collaborate with the MLE team to design and implement cutting-edge data-driven solutions
- Data pipeline development: Develop and maintain data pipelines for machine learning applications
- Python and SQL mastery: Utilize Python and SQL to manipulate and analyze large datasets
- Cloud expertise: Work with cloud platforms, with a priority on Azure, AWS, or GCP
- Advanced techniques: Explore and implement advanced data processing techniques using Databricks and Spark
- Quality assurance: Participate in code reviews, debugging, and troubleshooting to ensure high-quality deliverables
- Continuous learning: Engage in continuous learning and stay updated on emerging trends in data engineering and machine learning

Requirements:
- 3+ years of hands-on experience in Software Engineering roles
- Education: Bachelor's degree in Computer Science, Data Science, or a related field
- Technical skills: Proficient in Python, strong SQL skills, and experience with at least one cloud platform (Azure, AWS, GCP); DevOps toolset: any of Docker/Kubernetes (K8s)/Terraform/AWS CloudFormatio
We are looking for a responsible administrator for a cosmetics store. Experience is not required, but it is an advantage.

Candidate requirements:
- Responsibility
- Confidence
- Neatness
- Good memory

Conditions:
- Schedule: Monday-Friday (Sat-Sun off), 8:00 to 17:00
- Salary: base of UAH 16,000 + % of sales

Contact us by phone: 0962236102
Our customer is one of the largest global biopharmaceutical firms. The company is known for its significant investment in research and development and its commitment to sustainability, high ethical standards, and improving access to healthcare worldwide.

The Power Automate Engineer will be responsible for designing, implementing, and managing workflows and task automation using Microsoft's Power Automate Desktop, Power Automate Cloud Flows, SQL, and potentially other technologies as the role evolves. This role requires an analytical mindset, problem-solving skills, and the ability to work and communicate directly with project stakeholders.

Requirements:
- Extensive RPA experience
- Experience working with SQL, Microsoft Power Automate Desktop, and Power Automate Cloud Flows, and an understanding of automation concepts
- Familiarity with Excel
- Prior analytical experience or knowledge is a bonus
- Willingness for continuous self-improvement, particularly in analytical and technical trends
- Experience working in Agile teams
- Intermediate or above level of English proficiency

Nice to have:
- Familiarity with the UiPath RPA platform
- SharePoint automations
- Attention to detail and a conceptual mindset
- Strong business analysis skills and the ability to formulate requirements
- Ability to work with large volumes of data
- Good communication skills
- Ability to work both independently and within a team setup

We Offer:
- Competitive compensation depending on experience and skills
- Individual career path
- Unlimited access to LinkedIn learning solutions
- Sick leave and regular vacation
- English classes with certified English teachers
- Flexible work hours

About EPAM:
EPAM is a leading global provider of digital platform engineering and development services. We are committed to positively impacting our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver
We are looking for Solution Architects for data-driven projects to join our Data Practice team in Ukraine. Together we design and drive solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.

Responsibilities:
- Design data analytics solutions by utilizing the big data technology stack
- Create and present solution architecture documents with deep technical details
- Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions
- Conduct solution architecture reviews/audits; calculate and present ROI
- Lead implementation of solutions, from establishing project requirements and goals to solution go-live
- Participate in the full cycle of pre-sale activities: direct communication with customers, RFP processing, developing proposals for the implementation and design of the solution, presenting the proposed solution architecture to the customer, and participating in technical meetings with customer representatives
- Create and follow a personal education plan in the technology stack and solution architecture
- Maintain a strong understanding of industry trends and best practices
- Get involved in engaging new clients to further drive EPAM business in the big data space

Requirements:
- A minimum of 7+ years of experience
- Strong hands-on experience as a Big Data Architect with a solid design/development background in Java, Scala, or Python
- Experience delivering data analytics projects and architecture guidelines
- Experience with big data solutions on premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud) and other clouds
- Production project experience in at least one of the big data technologies
- Batch processing: Hadoop and MapReduce/Spark/Hive
- NoSQ