In recent years there has been an explosion of data all around us. The data comes in from a variety of sources, such as real-time financial systems, cell phone networks, sensor networks (RFID and IoT), and GPS. Commensurate with this dramatic increase in data is a corresponding unquenchable thirst for analysis and insights. The natural question arises: how do we build systems that process and make sense of this vast amount of data, in as close to real time as possible? What patterns of software and systems should we look at? Michael Stonebraker, of database fame, and his co-authors offer some advice on what to consider in their paper "The 8 requirements of real-time stream processing", published a decade ago. In the paper, the authors list eight guiding principles that high-volume, low-latency systems should follow to process vast amounts of data in near real time. First, such systems have to keep the data moving and do straight-through processing with minimal to no writ...
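To make the first principle concrete, here is a minimal sketch (not from the paper) of straight-through processing in Python: each event is handled the moment it arrives, with only a small in-memory window retained and no intermediate writes to storage. The `moving_average` function and the simulated readings are illustrative assumptions, not part of any particular system.

```python
from collections import deque
from typing import Iterable, Iterator

def moving_average(events: Iterable[float], window: int = 3) -> Iterator[float]:
    """Process each event in-stream as it arrives (straight-through),
    keeping only a bounded in-memory window -- no writes to disk."""
    buf = deque(maxlen=window)  # bounded buffer: old readings fall off
    for value in events:
        buf.append(value)
        yield sum(buf) / len(buf)  # emit a result per event, immediately

# Simulated real-time feed: results are produced as readings flow through.
readings = [10.0, 12.0, 11.0, 13.0]
averages = list(moving_average(readings))
# averages == [10.0, 11.0, 11.0, 12.0]
```

Because the pipeline is a generator, an output is available as soon as each input is seen; contrast this with a store-then-process design that first lands the data in a database and queries it later.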