- Acxiom (Albany, NY)
- …assessments, roadmap creation, and execution of legacy-to-cloud migrations (on-prem Hadoop, EDWs, etc.). + Define end-to-end architecture for ETL/ELT pipelines, ... at least 5+ years on the Databricks platform. + Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Unity Catalog, and MLflow. + Demonstrated experience in… more
- Meta (Albany, NY)
- …customers or have started a new line of product 18. Experience with Hadoop/HBase/Pig or MapReduce/Sawzall/Bigtable/Hive/Spark 19. Experience in one or more of the following areas: machine learning, recommendation systems, pattern recognition, NLP, data mining or artificial intelligence **Public Compensation:** $213,000/year to $293,000/year + bonus + equity + benefits **Industry:** Internet **Equal… more
- Bloomberg (New York, NY)
- …logs. Our teams extensively use open source technologies such as OCI registry, Spark, Iceberg, Flink, Kafka, S3, Hadoop, Kubernetes, Argo, Buildpacks, and other ... integrity. **We'd love to see:** + Prior experience with data technologies such as Spark, Iceberg, Redis, or Flink. + Experience working with ML Feature Stores such… more
- Molina Healthcare (Albany, NY)
- …Storage/Data Lake, Azure Databricks, etc.) + Expert level experience on Hadoop cluster components and services (like HDFS, YARN, ZOOKEEPER, AMBARI/CLOUDERA MANAGER, ... stream-processing systems, using solutions such as Kafka, Storm or Spark Streaming + Proven experience on Big Data tools such as Spark, Hive, Impala, Polybase, Phoenix, Presto, Kylin, etc. +… more
- Mastercard (Harrison, NY)
- …building and modernizing distributed data platforms using technologies such as Apache Spark, Kafka, Flink, NiFi, and Cloudera/Hadoop. * Hands-on proficiency with ... modern data platforms and tools including Databricks, Snowflake, Delta Lake, and Redshift. * Proven experience designing and operating real-time and batch data pipelines in hybrid and cloud-native environments (AWS, Azure, GCP). * Understanding of Data Mesh… more
- Citigroup (New York, NY)
- …engineering with the Hadoop ecosystem (HDFS, Hive, Impala, Ranger) and Apache Spark. Working knowledge of big data file formats (Parquet, ORC) + Good understanding ... of the Linux OS (Bash and basic scripting, Unix groups, host groups, etc.) + Understanding of network infrastructure (HTTP/S, TCP/IP, TLS, DNS, Load Balancers, Firewalls, Proxies) **Education:** + Bachelor's degree/University degree or equivalent experience +… more
- Eliassen Group (Albany, NY)
- …Senior Data Engineers to join a high-impact project migrating workloads from an on-prem Hadoop cluster to Google Cloud Platform (GCP). _We can facilitate w2 and ... and life insurance._ _Rate: $65 - $75/hr. w2_ **Responsibilities:** + Build Spark/BigQuery data pipelines on GCP Dataproc + Design data models and schemas to meet… more
- House of Blues (NY)
- …are highly performant, reliable and scalable + Solid expertise in ANSI SQL and Spark SQL is key + Workflow automation and orchestration using Airflow or ... stack + Hands-on working knowledge of at least one of these: Databricks, Hadoop and related stacks + Working experience in at least one of the cloud services from… more
- Robert Half Technology (New York, NY)
- …(HIPAA, GDPR, etc.) Requirements Apache Kafka, Apache Pig, Apache Spark, Cloud Technologies, Data Visualization, Algorithm Implementation, Analytics, Apache Hadoop, API Development, AWS Technologies Technology Doesn't Change the World, People Do.(R) Robert Half is the world's first and largest specialized talent solutions… more