Enterprise Data Architect
$70 - $75 an hour
- Experience with the advanced Apache Spark processing framework and Spark programming languages such as Scala, Python, or advanced Java, with sound knowledge of shell scripting.
- Should have experience in both functional programming and Spark SQL programming, processing terabytes of data.
- Specifically, this experience must include writing big-data engineering jobs for large-scale data integration in AWS.
- Prior experience writing machine-learning data pipelines in a Spark programming language is an added advantage.
- Advanced SQL experience, including SQL performance tuning, is a must.
- Should have worked with other big-data frameworks such as MapReduce, HDFS, Hive/Impala, and AWS Athena.
- Experience in logical and physical table design in a big-data environment to suit processing frameworks.
- Knowledge of using, setting up, and tuning resource-management frameworks such as YARN, Mesos, or standalone Spark.
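The functional-programming style of Spark work mentioned above follows a filter/map/reduce shape. A minimal sketch in plain Python (hypothetical data, no Spark dependency; a real job would chain the same operations on an RDD or DataFrame):

```python
from functools import reduce
from operator import add

# Hypothetical records: (customer_id, amount) pairs standing in for a large dataset.
records = [(1, 10.0), (2, 5.5), (1, 7.25), (3, 2.0), (2, 4.5)]

# Functional pipeline: filter -> map -> reduce, the same shape as a Spark
# chain like rdd.filter(...).map(...).reduce(add).
large = filter(lambda r: r[1] >= 4.0, records)   # keep amounts of at least 4.0
amounts = map(lambda r: r[1], large)             # project to the amount column
total = reduce(add, amounts)                     # aggregate to a single sum

print(total)  # 27.25
```

On a cluster, each stage would run distributed across partitions; the local version only illustrates the composition of transformations.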
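SQL performance tuning, also listed above, often starts with reading a query plan and adding an index. A small illustration using Python's built-in SQLite (hypothetical `orders` table; the same idea applies to Hive, Impala, or Athena partitioning and indexing strategies):

```python
import sqlite3

# Hypothetical schema: an orders table queried by customer_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT * FROM orders WHERE customer_id = 7"

# Without an index the planner must scan every row.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()
print(before[3])  # a full scan of orders

# An index on the filtered column lets the planner seek directly to matches.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()
print(after[3])  # a search using idx_orders_customer
```

The detail column of `EXPLAIN QUERY PLAN` changes from a scan to an index search, which is the kind of evidence-driven tuning the role calls for.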
Active Job
Updated 1 month ago