Showing posts with the label Rdd

How To Flatten Nested Lists In Pyspark?

I have an RDD structure like: rdd = [[[1],[2],[3]], [[4],[5]], [[6]], [[7],[8],[9],[10]]] and I wa… Read more How To Flatten Nested Lists In Pyspark?
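The usual approach here is `flatMap`, which emits every item of an iterable returned per element and concatenates the results. The sketch below is a minimal illustration, not the post's actual answer; it assumes a `SparkContext` named `sc` and shows the same flattening logic run locally:

```python
from itertools import chain

def flatten_once(nested):
    """Flatten one level of nesting: [[1], [2], [3]] -> [1, 2, 3]."""
    return list(chain.from_iterable(nested))

nested_rdd_data = [[[1], [2], [3]], [[4], [5]], [[6]], [[7], [8], [9], [10]]]

# In PySpark (assuming a SparkContext `sc` is available):
#   rdd = sc.parallelize(nested_rdd_data)
#   rdd.flatMap(flatten_once).collect()
# flatMap concatenates the per-element lists into one flat RDD.

# The same logic applied locally, without Spark:
flat = list(chain.from_iterable(flatten_once(x) for x in nested_rdd_data))
```

For deeper or irregular nesting, a recursive flatten function can be passed to `flatMap` instead.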

Spark: How To "reducebykey" When The Keys Are Numpy Arrays Which Are Not Hashable?

I have an RDD of (key,value) elements. The keys are NumPy arrays. NumPy arrays are not hashable, an… Read more Spark: How To "reducebykey" When The Keys Are Numpy Arrays Which Are Not Hashable?
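A common workaround (a sketch under assumptions, not necessarily the post's accepted answer) is to map each NumPy-array key to a hashable tuple before the shuffle, since `reduceByKey` needs hashable keys; the array can be rebuilt from the tuple afterward if needed:

```python
import numpy as np
from collections import defaultdict

def to_hashable(kv):
    key, value = kv
    # tuple(...) makes the NumPy array usable as a shuffle/dict key
    return (tuple(key.tolist()), value)

pairs = [(np.array([1, 2]), 10), (np.array([1, 2]), 5), (np.array([3, 4]), 1)]

# In PySpark (assuming a SparkContext `sc` is available):
#   sc.parallelize(pairs).map(to_hashable).reduceByKey(lambda a, b: a + b).collect()

# Local equivalent of the reduceByKey step:
acc = defaultdict(int)
for k, v in map(to_hashable, pairs):
    acc[k] += v
```

To recover array keys afterward, map `(k, v)` back with `np.array(k)`.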

Rdd Collect Issue

I configured a new system, spark 2.3.0, python 3.6.0, dataframe read and other operations working a… Read more Rdd Collect Issue