AROM: Processing Big Data with Data Flow Graphs and Functional Programming

Advances in computational processing have driven a move towards distributed frameworks that execute tasks in parallel, a trend to which recent developments in Cloud Computing have largely contributed. The MapReduce model proposed by Google is among the most popular, despite well-known limitations inherent to the model that constrain the types of jobs it can express. Models based on Data Flow Graphs (DFG) for defining and processing jobs, while more complex to express, are more general and suitable for a wider range of tasks, including iterative and pipelined ones. In this paper we present AROM, a framework for large-scale distributed processing in which jobs are expressed as DFGs and operators are defined using paradigms from functional programming. The former leads to a more natural handling of pipelined tasks, while the latter enhances the genericity and reusability of the operators, as shown by our tests on a parallel and pipelined job computing PageRank.
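To make the DFG and functional-operator ideas concrete, here is a minimal sketch in Scala. It is not AROM's actual API; the Operator type and the word-count operators are hypothetical, and the point is only to illustrate a job expressed as a small graph of pure-function operators that a distributed engine could schedule across workers.

    // Minimal illustrative sketch, not AROM's actual API: a job expressed as a
    // tiny data flow graph whose vertices are operators defined as pure functions.
    object DfgSketch {
      // An operator maps a sequence of input records to a sequence of output records.
      type Operator[A, B] = Seq[A] => Seq[B]

      // Hypothetical operators for a word-count style pipeline.
      val tokenize: Operator[String, String] =
        lines => lines.flatMap(_.split("\\s+"))

      val count: Operator[String, (String, Int)] =
        words => words.groupBy(identity).map { case (w, ws) => (w, ws.size) }.toSeq

      def main(args: Array[String]): Unit = {
        // The "graph" here is a two-vertex chain: tokenize -> count.
        // A real DFG engine would place each operator on a different worker and
        // stream records along the edges instead of composing the calls locally.
        val job: Operator[String, (String, Int)] = tokenize.andThen(count)

        job(Seq("to be or not to be")).foreach(println)
      }
    }

Because each operator is an ordinary function value, it can be reused in other graphs or composed into longer pipelines, which is the reusability argument the abstract refers to.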

Nam-Luc Tran, Sabri Skhiri, Esteban Zimányi, and Arthur Lesuisse. AROM: Processing Big Data with Data Flow Graphs and Functional Programming. In Proceedings of the 4th IEEE International Conference on Cloud Computing Technology and Science (IEEE CloudCom 2012), IEEE Computer Society Press, Taipei, Taiwan, December 2012.
