
Atmosphere & Asynchronous Servlet

Last week, I was at Devoxx and attended a talk by JF Arcand, the lead architect of the Atmosphere framework [1]. While the upcoming Servlet 3.0 Async API provides:

  • A method to suspend a response: HttpServletRequest.startAsync()
  • A method to resume a response: AsyncContext.complete()
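
The two Servlet 3.0 calls above can be sketched as follows. This is a minimal sketch only: threading and error handling are simplified, and the servlet must be declared with async support enabled (e.g. asyncSupported=true):

```java
import java.io.IOException;
import javax.servlet.AsyncContext;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal sketch of the Servlet 3.0 async API: the response is suspended
// with startAsync() and resumed later, from another thread, with complete().
public class LongPollServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        final AsyncContext ctx = req.startAsync(); // suspend the response
        new Thread(new Runnable() {
            public void run() {
                try {
                    // ... do the long-running work here ...
                    ctx.getResponse().getWriter().println("event received");
                } catch (IOException ignored) {
                    // simplified for the sketch
                } finally {
                    ctx.complete(); // resume and complete the response
                }
            }
        }).start();
    }
}
```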

Atmosphere offers:

  • Annotation to suspend: @Suspend
  • Annotation to resume: @Resume
  • Annotation to broadcast (or push) events to the set of suspended responses: @Broadcast
  • An interface to filter and serialize broadcast events: BroadcastFilter
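
As a rough sketch of how these annotations are used with Atmosphere's Jersey integration (class and package names as in Atmosphere 0.x; exact details may differ by version):

```java
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import org.atmosphere.annotation.Broadcast;
import org.atmosphere.annotation.Suspend;

// Sketch of an Atmosphere/Jersey resource: GET subscribers are suspended,
// and every POSTed message is broadcast to all of them.
@Path("/events")
public class EventResource {

    @GET
    @Suspend
    public String subscribe() {
        // the response stays open (suspended) until something is broadcast
        return "";
    }

    @POST
    @Broadcast
    public String publish(String message) {
        // the returned value is pushed to every suspended response
        return message;
    }
}
```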

In [2], JF Arcand demonstrates how Atmosphere can be used to suspend a request and also to broadcast a message to all suspended threads. Atmosphere defines the notions of broadcastables and topics.

A topic is a kind of message channel on which broadcastable servlets can listen. Later on, any other client can broadcast messages on this topic, reaching all the suspended threads.

The SipServlet Grail?

In the Telco world, and more specifically in the IMS world, the SipServlet specification [4] has gained popularity. Indeed, building Web/SIP applications has never been as easy as it is with SipServlet. However, in the Telco world we speak about blended applications, meaning applications that blend different protocols such as SIP, INAP and DIAMETER. Typically, an IM-SSF encounters this kind of issue.

Here is a picture from the Oracle documentation showing the SIP servlet's place in the Presence architecture.

[Figure: ocms_components — Oracle Communication Mobility Server [5]]

The problem with SipServlet is that it is a synchronous servlet approach: nothing is foreseen in the specification to support a "suspend", do something else, and come back asynchronously into the same context to continue the call.

Let's take a simple example: on an INVITE request, you would like to start a SCUR (Session Charging with User Reservation) towards a DIAMETER server. How would you suspend the INVITE, send the first request, and later on, upon receiving the answer, continue the call, for instance by starting a B2B call?
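
To make the gap concrete, here is what such a flow could look like if a Servlet-3.0-style suspend existed in SipServlet. Note that this is purely hypothetical: startAsync() on a SIP request, the DiameterClient, and the callback types below do not exist in JSR 116 and are illustrative assumptions only.

```java
// Purely hypothetical sketch: none of these async APIs exist in JSR 116.
protected void doInvite(final SipServletRequest invite) {
    final AsyncContext ctx = invite.startAsync();   // hypothetical: suspend the INVITE
    // hypothetical DIAMETER client sending the first SCUR request (CCR Initial)
    diameterClient.sendCreditControlInitial(invite.getSession(),
        new AnswerCallback() {
            public void onAnswer(CreditControlAnswer cca) {
                // credit reserved: continue the call, e.g. start a B2B leg
                startB2bCall(invite);
                ctx.complete();                      // hypothetical: resume
            }
        });
}
```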

That's why, today, a JAIN SLEE [3] container is better suited for blended applications. Indeed, JSLEE is by nature an asynchronous, event-driven container. JSR 240 provides a complete description of how a context can be retrieved and how the event router can find the right instance to deliver the answer to, through the Initial Event Selector approach and the convergence name definition.

However, an approach similar to the Atmosphere framework could solve this challenge for the SipServlet API. The broadcastable can be seen as the JSLEE Activity Context Interface and the topic as the convergence name. This kind of framework could easily be adapted onto SipServlet implementations and could enable them to reach the blending level quite easily.

Is it a good way?

The questions must be asked: "Do we really need blended services in SipServlet? Is it made for that?"

Those are good questions. Yes, we do need them, and no, that is definitely not what SipServlet was made for. Do we have a topic naming directory? Do we have any mechanism to store the context while the processing is suspended, to prevent memory from being overloaded? Does it mean that we need some kind of CMP fields (Container-Managed Persistence)? Do we really have control over the topic life cycle?

Let's imagine that the answer to all of these were yes: would it still be the SipServlet specification?

References

[1] The Atmosphere Framework, https://atmosphere.dev.java.net/

[2] http://weblogs.java.net/blog/jfarcand/archive/2009/11/06/servlet-30-async-or-atmosphere-you-decide

[3] The JSLEE Specification, http://jcp.org/en/jsr/detail?id=240

[4] The Sip Servlet Specification, http://jcp.org/en/jsr/detail?id=116

[5] The Oracle Communication and Mobility Server, http://download.oracle.com/docs/cd/E12752_01/doc.1013/e12656/img_text/ocms_components.htm

