Thu, 26 January 2017
We discuss how data streams, or real-time stream processing, are the next fundamental architectural and organizational revolution in IT. The capability is available today, though it is complex to implement at scale; James predicts that within ten years data streams will be mainstream. Traditionally, an application owns a set of data that comes to rest and is later moved elsewhere to be correlated with another data set through batch processing. This is expensive and can miss key events, which is costly in industries like banking and finance. What is available now for data streams builds directly on open-source software and is becoming a utility.
With data streams, you start with one stream and decide which functions to apply to it. You can feed one stream into another, then combine and recombine new streams in a relatively agile way, evolving them as you go, for example by adding new compliance requirements or regulations as they emerge. The application is no longer the focus of how data gets managed; the data becomes the center of attention. There is still a lot to build around data streams: how to operate them, how to optimize their cost, and what baseline services are required for them to be most useful.
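The composition idea can be sketched with plain Python generators (illustrative only; the record format, field names, and thresholds here are invented, and real deployments would use a streaming platform rather than in-process generators):

```python
def parse(lines):
    """Turn raw lines (hypothetical 'user,amount' format) into records."""
    for line in lines:
        user, amount = line.split(",")
        yield {"user": user, "amount": float(amount)}

def flag_large(records, threshold=100.0):
    """Annotate each record; a new compliance rule added later
    slots into the pipeline the same way, without touching other stages."""
    for rec in records:
        rec["flagged"] = rec["amount"] > threshold
        yield rec

# Stages compose and recombine: each consumes one stream, produces another.
raw = ["alice,42.0", "bob,250.0"]
for rec in flag_large(parse(raw)):
    print(rec)
```

Because each stage only consumes and produces a stream, stages can be added, removed, or reordered independently, which is the agility the discussion points to.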
The use of data streams does not eliminate the ability to analyze historical data stores; rather, it lets you bring AI, machine learning, and pattern recognition into that process. For instance, you can compare two historical online campaigns and make a reasonable prediction of what to expect from a third. This enables event processing and anomaly detection with machine learning.
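As a minimal stand-in for the anomaly detection described above, here is a rolling z-score check over a stream (the window size, threshold, and sample data are assumptions for illustration; production systems would use learned models rather than this simple statistic):

```python
from collections import deque
from statistics import mean, stdev

def anomalies(stream, window=20, threshold=3.0):
    """Yield values that sit far outside the recent rolling window:
    a simple statistical proxy for ML-based pattern recognition."""
    recent = deque(maxlen=window)
    for x in stream:
        if len(recent) >= 2:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) > threshold * sigma:
                yield x
        recent.append(x)

data = [10, 11, 9, 10, 12, 10, 11, 95, 10, 9]
print(list(anomalies(data, window=5)))  # → [95]
```

The key property is that the check runs as events arrive, rather than in a batch job after the data has come to rest.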
Data stream technology is available now, but you would need a team of very strong developers to get it working at scale. The question is whether the average enterprise can adopt this quickly enough. If adopted, data streams will drive how organizations work and make real-time decisions in the future. It is an exciting trend and we look forward to seeing how real-time stream processing evolves.
Wed, 18 January 2017
Our guest on the podcast this week is Zohar Alon, CEO and Co-Founder at Dome9 Security. We discuss the shift from the information security of the 1990s to the cloud security of today, and how Dome9 protects companies at the network level by being proactive: continuously monitoring and updating security practices to mirror the constantly changing cloud environment. Dome9 is a SaaS solution that helps security teams streamline their operations, management, and compliance across their cloud infrastructure. Traditional internal security teams at large enterprises are well equipped for traditional networking, but they are not ready to keep public cloud infrastructure secure on their own.

The rise in public security breaches reflects the fact that hacking is now a business: people around the world are monetizing stolen knowledge, which is why there have been so many public data breaches, from Yahoo to Target, Home Depot to the DNC. What we have learned from these breaches is the importance of leadership being transparent with customers when they happen and taking responsibility. Most instances we hear of come from cloud-related hacking of the hosted environments these brands used. Hackers are trying every door and window of a brand's systems every second, and the only way to stop them is continuous, proactive security.

We also talk about serverless computing, such as AWS Lambda, and what it means for security. With serverless computing, functions are triggered by changes: a function observes until it sees something it can act on, and it is given permissions to act on its own, making changes to an API or a VPC. This presents security challenges that customers are not fully aware of, because it is easy to grant more permissions than are actually required.
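The over-provisioning risk can be illustrated with a small sketch. The permission names below follow IAM-style action strings, but the specific grants and the notion of a "required" set are hypothetical; a real audit would compare a function's granted actions against the calls it actually makes:

```python
def excess_permissions(granted, required):
    """Return permissions a function holds but never needs.
    Each unused grant widens the blast radius if the function
    is compromised, which is the serverless risk discussed above."""
    return sorted(set(granted) - set(required))

# Hypothetical example: the function only reads one bucket,
# but was also granted write and delete.
granted = {"s3:GetObject", "s3:PutObject", "s3:DeleteObject"}
required = {"s3:GetObject"}
print(excess_permissions(granted, required))  # → ['s3:DeleteObject', 's3:PutObject']
```

Shrinking the granted set to match the required set is the least-privilege discipline that the conversation argues customers often skip.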