
The captivating AROM of distributed processing

Last month we had the opportunity to present AROM at the 4th IEEE International Conference on Cloud Computing Technology and Science (CLOUDCOM) in Taipei, the beautiful capital of Taiwan. For those just getting on board, let me quickly recall what AROM is about, where you can find it, and what its flavour is.

AROM is a distributed processing framework that aims to provide a playground for research on distributed processing. Its main objective is to offer a convenient environment for quickly implementing and prototyping distributed algorithms and jobs. This is achieved first through the DataFlow Graph processing model, in which jobs are expressed as Directed Acyclic Graphs whose vertices express operations and whose edges express data dependencies. By remaining simple and general, this processing model allows for greater flexibility and optimization. Furthermore, being written entirely in Scala on top of the Akka actors framework ensures scalability, and the reuse of paradigms borrowed from the functional programming world (such as higher-order functions and anonymous constructs) enforces the reusability and genericity of job implementations.
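To give a feel for the DataFlow Graph model, here is a minimal Scala sketch of a job expressed as a DAG: vertices are operators (plain higher-order functions) and edges are data dependencies. All names here (`Vertex`, `Graph`, `run`, and the demo operators) are purely illustrative, not AROM's actual API.

```scala
// Illustrative sketch of the DataFlow Graph processing model:
// a job is a Directed Acyclic Graph whose vertices are operators
// and whose edges express which vertex feeds which.
object DagSketch {
  // An operator transforms its input records into output records.
  type Operator = List[Int] => List[Int]

  case class Vertex(id: String, op: Operator)
  case class Graph(vertices: Map[String, Vertex], edges: List[(String, String)])

  // Evaluate the DAG: each vertex receives the concatenated outputs of its
  // predecessors; vertices with no predecessor receive the job input.
  def run(g: Graph, input: List[Int]): Map[String, List[Int]] = {
    val preds: Map[String, List[String]] =
      g.vertices.keys
        .map(v => v -> g.edges.collect { case (src, dst) if dst == v => src })
        .toMap
    var results = Map.empty[String, List[Int]]
    var remaining = g.vertices.keySet
    while (remaining.nonEmpty) {
      // A vertex is ready once all of its predecessors have produced output.
      val ready = remaining.filter(v => preds(v).forall(results.contains))
      ready.foreach { v =>
        val in = if (preds(v).isEmpty) input else preds(v).flatMap(results)
        results += v -> g.vertices(v).op(in)
      }
      remaining --= ready
    }
    results
  }

  // Demo job: a two-stage pipeline, doubling then filtering.
  val demo: Graph = Graph(
    Map(
      "double"  -> Vertex("double", xs => xs.map(_ * 2)),
      "keepBig" -> Vertex("keepBig", xs => xs.filter(_ > 2))
    ),
    List("double" -> "keepBig")
  )
}
```

Because the operators are ordinary functions, generic building blocks (maps, filters, folds) can be reused across jobs simply by wiring them into different graphs, which is the reusability argument made above.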

Oh, and did I mention that it is fully released as open source?

References

– N.-L. Tran, S. Skhiri, A. Lesuisse, E. Zimanyi, "AROM: Processing Big Data with DataFlow Graphs and Functional Programming." Proceedings of the International Workshop on Cloud Computing for Research Collaborations, 4th IEEE International Conference on Cloud Computing Technology and Science (CLOUDCOM): http://code.ulb.ac.be/dbfiles/TraSkhZimLes2012incollection.pdf

– AROM distributed processing: http://arom-processing.org/

– AROM on GitHub: https://github.com/nltran/arom


 
