Hadoop sales roles require strong programming skills in Java, Python, and Scala. Candidates should also have strong verbal and written communication skills for dealing with customers and partners, along with a good understanding of analysis, design, coding, and testing. Crafting enterprise data solutions for large organizations is also part of the required skill set.



Hire Hadoop Consultants

    6 jobs found, prices in USD

    You are required to prepare technical assessment questions about Data Science for a pre-assessment platform. The questions should take the form of multiple choice, free text, and coding, and cover the different areas a Data Scientist is expected to know at the hiring stage. Multiple-choice questions should include answer options, and free-text questions should have 5 paraphrased answers. Each question's title should be generated from the specific topic it covers. The questions should be about the following topics: Databases / MongoDB / SQL / R / Python / Algorithms / Machine Learning Techniques / Model Training. You can find example questions in the attached document.

    $41 (Avg Bid)
    13 bids
    Observium Specialist 3 days left
    VERIFIED

    I need an Observium specialist to build a custom dashboard for:
    1. Port status monitoring for servers
    2. Database health monitoring (MariaDB, MongoDB, Hadoop)
    3. Server up/down status
    4. Memory, hard disk, and resource utilisation

    $53 (Avg Bid)
    3 bids

    We have opportunities for qualified Data Engineering Specialists to work for a leading telecommunications company in Sydney, Australia. We're seeking an experienced data engineering specialist with skills and experience in implementing large-scale big data platforms. Our big data / data analytics technologies: PostgreSQL, Apache Spark, Apache Kafka, Apache HDFS, Apache NiFi, Apache Hive, Apache Flink, Apache Druid, the Scala and Python programming languages, Kubernetes (K8s)/Rancher/Docker, ELK, Internet of Things, Data Science, and AI and ML platforms.

    $40 / hr (Avg Bid)
    12 bids

    Content-based recommendation system using MapReduce, i.e. given a job description, you should be able to suggest a set of applicable courses.

    $172 (Avg Bid)
    5 bids
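The recommendation task in the posting above can be sketched as an in-memory map/reduce pipeline in plain Python. This is only a hedged illustration of the approach, not the poster's implementation: the job text, course catalogue, and course IDs below are invented, and a real system would run the same two phases as a Hadoop MapReduce job over a large corpus.

```python
import math
from collections import Counter

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters
    cleaned = "".join(c if c.isalnum() else " " for c in text.lower())
    return cleaned.split()

def cosine(a, b):
    # Cosine similarity between two term-count vectors
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def map_phase(job_vector, courses):
    # Map: emit one (course_id, similarity) pair per course description
    for course_id, description in courses.items():
        yield course_id, cosine(job_vector, Counter(tokenize(description)))

def reduce_phase(pairs, top_n=2):
    # Reduce: keep the highest-scoring courses
    return sorted(pairs, key=lambda kv: kv[1], reverse=True)[:top_n]

# Hypothetical job description and course catalogue, for illustration only
job = "Big data engineer with Hadoop, Spark and Python experience"
courses = {
    "hadoop-101": "Introduction to Hadoop and the MapReduce programming model",
    "spark-basics": "Processing big data with Apache Spark and Python",
    "web-design": "HTML and CSS fundamentals for web design",
}

job_vec = Counter(tokenize(job))
recommended = reduce_phase(map_phase(job_vec, courses))
```

Because both phases operate on independent (key, value) pairs, the same logic drops straight into Hadoop Streaming or a Java MapReduce job once the course corpus no longer fits in memory.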

    Create a POC for a data lakehouse solution using HDFS for storage and Delta Lake, with Dremio to query the Delta Lake and a Dremio-Superset integration. It should have the following features:
    1. ACID support
    2. Data governance features
    3. Data/metadata discovery using Amundsen
    4. A data catalog with Apache Atlas, with Amundsen integrated with Atlas
    5. Provision for masking/encrypting data based on the role accessing it
    6. Security controls and audit
    Deliverables:
    1. A demo showcasing all of the above requirements
    2. Setup instructions/scripts that help install the POC
    3. Any scripts used to configure the products in the solution
    4. An explanation of each solution component and its configuration
    5. The Spark job code used to showcase one end-to-end scenario

    $178 (Avg Bid)
    4 bids
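The role-based masking requirement in the lakehouse posting above can be sketched in plain Python. This is a minimal illustration only: the column names, roles, and `COLUMN_POLICY` mapping are invented, and in the real stack a policy engine enforcing catalog metadata would take their place.

```python
import hashlib

# Hypothetical column policy: which roles may see each column in the clear.
# Columns and roles are invented for illustration.
COLUMN_POLICY = {
    "customer_name": {"admin", "analyst"},
    "email": {"admin"},
    "purchase_total": {"admin", "analyst", "viewer"},
}

def mask(value):
    # Irreversible mask: a stable fingerprint preserves joinability
    # while hiding the raw value (a common masking strategy)
    return "MASKED-" + hashlib.sha256(str(value).encode()).hexdigest()[:8]

def apply_row_policy(row, role):
    # Return a copy of the row with disallowed columns masked for this role
    return {
        col: (val if role in COLUMN_POLICY.get(col, set()) else mask(val))
        for col, val in row.items()
    }

row = {"customer_name": "Ada", "email": "ada@example.com", "purchase_total": 42.5}
as_viewer = apply_row_policy(row, "viewer")
as_admin = apply_row_policy(row, "admin")
```

Hashing rather than redacting keeps masked columns usable as join keys, which matters when downstream Spark jobs still need to correlate records they are not allowed to read.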

    My project is about Scala and Spark optimization.

    $11 / hr (Avg Bid)
    8 bids

    Top Hadoop community articles