In progress

Run 10,000 jobs in parallel on a Spark cluster, out-of-memory error
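A common cause of OOM when firing thousands of Spark jobs from one driver is submitting them all at once. A minimal sketch of one usual fix, bounding how many jobs are in flight with a thread pool on the driver (here `run_job` is a hypothetical stand-in for a real Spark action, and `MAX_CONCURRENT` is an assumed cap you would tune to your cluster):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one Spark action (e.g. a write or collect).
# The real body would use your SparkSession; the point here is only
# that concurrency is bounded, not what each job does.
def run_job(i):
    return i * i  # placeholder work

MAX_CONCURRENT = 16  # assumed cap; tune to executor memory and cluster size

# Never more than MAX_CONCURRENT jobs run at the same time, so the
# driver is not asked to hold state for all 10,000 jobs at once.
with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
    results = list(pool.map(run_job, range(10_000)))

print(len(results))
```

This only addresses driver-side pressure; executor-side OOM usually needs `spark.executor.memory` / `spark.executor.memoryOverhead` tuning as well.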

Awarded to:

vinnzy

Hi, there are 3 types of errors you might get while working with Spark: 1. YARN errors, 2. stack errors, 3. memory errors. I will help you make changes to your code and get it up and running in no time. Thanks.

₹1750 INR in 4 days
(0 reviews)
0.0

4 freelancers are bidding on average ₹1150 for this job

rainakarn

Any specific reason to run 10k jobs? I think the design is wrong. If you want, we can set it up and discuss further.

₹600 INR in 1 day
(0 reviews)
0.0
dilipkumarkhand4

My name is Dilip Kumar Khandelwal. I'm the CEO of Finetech Analytics, and I have a very good team with a lot of experience in analytics: data science, Databricks, Spark, Scala, Python, Java, machine learning, web development More

₹950 INR in 1 day
(0 reviews)
0.0
djadhav16692

I have 2 years of experience with Spark and have worked on Spark performance optimization, so I think I can do it.

₹1300 INR in 2 days
(0 reviews)
0.0