Cloud Computing Weekly Podcast

Our guest on the podcast this week is Randy Bias, VP, Technology & Strategy at Juniper Networks.

We discuss Juniper Networks and their acquisition of Contrail, Oracle’s vendor lock-in and new pricing changes, and why Kubernetes has pulled ahead of Docker in the container world. Juniper Networks is expanding its cloud capability with two recent acquisitions: Contrail, a software-defined networking play, and AppFormix, a cloud monitoring platform.

For Randy, containers are the most exciting part of cloud computing right now. The OpenStack movement is slowing down, and there is fatigue because people are starting to realize that public cloud is going to win. The private infrastructures that still exist (such as VMware) come with too much complexity, from set-up to organizing and managing. People are deciding to leapfrog past those and go straight to containers. They want to pair containers with technologies like Contrail because the combination allows for a robust infrastructure that can run on both private and public cloud. A recent Riot Games blog post about streaming video games is an example: it shows how they combine container orchestration systems with Contrail to give themselves a true hybrid cloud solution at the container level, which means developers do not need to worry about what infrastructure they are running on.

In the news recently, Oracle doubled its license fees to run its software in AWS. As enterprises migrate to the cloud, they have a lot of Oracle software running in-house and are looking to bring those licenses along and run Oracle in the cloud, so now they are being asked to pay more for that migration. This could accelerate companies leaving Oracle because of the higher prices. People are fed up with proprietary software, licensing, and vendor lock-in. Ten years ago, there were not many alternatives to Oracle for relational databases. Now there are plenty, including Aurora, RDS, and Redshift on Amazon. The problem is that a typical enterprise built a lot of stored procedures and triggers into the applications running on its Oracle databases ten years ago. Now it either has to ditch or rewrite all of that, which can be risky, or pay more for Oracle licenses and stay put.

Oracle does not have a good cloud play yet, though they are trying. They still need to maintain enterprises’ legacy databases as the transition occurs. Perhaps if Oracle remained reasonable on pricing, they would have a better chance of surviving, but the recent news makes it easier for customers to decide to leave. They may need to start skating further ahead of the puck soon if they want to weather the cloud disruption.

We also discuss Docker and why Kubernetes has taken the lead in container software in recent years. Docker took off because it was the “Easy Button” for application developers who did not want to learn Chef or Puppet. Then Docker tried to reach infrastructure teams who did not know what containers were, adding complexity to make containers look like next-gen virtual machines. That is not how app developers viewed them, and now Kubernetes has become the preferred choice; most OpenStack startups have bet on Kubernetes. Docker has built a big platform that has not found its killer app yet, and it can no longer take advantage of the dominance it once had. Docker acquired a lot of companies and linked together a lot of different technologies along the way, always adding layers of complexity. Kubernetes is in front because it has simplified everything, and that is what developers want right now.

For an enterprise that is moving to Docker, Randy provides tips and warnings.

  1. Work with a container vendor that is thinking about the future and making it easy on their app developers.
  2. View containers more as application-centric than infrastructure-centric.
  3. Containers by themselves are not going to solve the problem; they need to work with a set of other related services.

It won’t be easy, and be wary of anyone who tells you all the problems with containers are completely solved. Use DevOps and have an application-centric model, so you can focus on velocity.

News Covered

The Register: Oracle effectively doubles licence fees to run its stuff in AWS

Direct download: Randy_Bias_-_Juniper.mp3
Category:Technology News -- posted at: 9:33am EDT

Our guest on the podcast this week is Friederike Schüür, Data Science, Research, and Technical Advisor at Fast Forward Labs.

We discuss how Fast Forward Labs applies the machine learning algorithms of academia to the business world to impact industries. They help clients leverage what they’ve uncovered in their research to build prototypes and demonstrate the potential of new algorithms. Companies are starting to use neural networks for things like image classification, natural language, text summarization, and more. Though it is not a new technique, it is new to the business world and is now available for companies to use. One example of a neural network use-case is to take a long article and automatically cut it down to the five sentences that capture the article’s essence. Artificial Intelligence has been around for more than 30 years, but what has changed now is that it is much easier to store data, and deep learning requires a large amount of data to be effective. Also, the cloud infrastructure makes machine learning possible today because it opens up the use of these algorithms to large companies and even small startups. AI used to require more horsepower, physical space, and cost more to implement than it does today.
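The article-to-five-sentences use case can be illustrated with a few lines of code. The sketch below is a deliberately naive frequency-based approach, not the neural technique Fast Forward Labs researches: it scores each sentence by the average frequency of its words across the article and keeps the top five in their original order.

```python
import re
from collections import Counter

def summarize(text, n_sentences=5):
    """Keep the n highest-scoring sentences, preserving original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        # Average word frequency rewards sentences full of the article's
        # most common topic words.
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    top = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    return [s for s in sentences if s in top]
```

A neural summarizer learns which sentences matter from data rather than relying on raw counts, but the contract is the same: a long article in, a handful of representative sentences out.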

Fast Forward Labs clients often struggle to identify the right problem for machine learning. For example, there is a lot of hype today around conversational agents, or chat bots, replacing human agents with AI. But early entrants have found that these bots are not used frequently by customers, which turns out to be a user experience problem and not something that can be fixed with machine learning.

The real promise of machine learning is that it can help us with repetitive tasks inside the company. Look in your organization for places where a similar decision or action needs to be made over and over and that is where machine learning could be used.

We discuss the different options for tapping into machine learning algorithms, from AWS to Google TensorFlow, and the open-source tools that combine with them. The most important first step is to define what the business is trying to achieve. Next, determine exactly what data is available, and only then develop a machine learning strategy. Start simple, and add complexity only when it is necessary.


Direct download: Friederike_Schuur.mp3
Category:Technology News -- posted at: 3:01pm EDT

Our guest on the podcast this week is James Urquhart, cloud computing thought leader and creator of Digital Anatomy. 

We discuss how data streams, or real-time stream processing, are the next fundamental architectural and organizational revolution in IT. The capability is available today, though it is complex to implement at scale; James predicts that within ten years data streams will become mainstream. Traditionally, an application owns a set of data, which comes to rest and is taken somewhere else to be correlated with another data set through batch processing. This is an expensive process and can miss key events, a serious problem in industries like banking and finance. What is available now for data streams builds directly on open-source software and is becoming a utility.

Data streams work like this: you start with one stream and decide which functions you want to apply to it. You can feed one stream into another, then combine and recombine streams in a relatively agile way, evolving them as you go; you could add new data compliance requirements or regulations as they emerge. The application is no longer the focus of how data gets managed; the data becomes the center of attention. There is still a lot about data streams that needs to be built, from how to operate them, to how to manage their cost, to what baseline services are required for them to be most useful.
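The compose-and-recombine idea can be sketched with ordinary Python generators. This illustrates the programming model only, not any particular streaming engine (real systems such as Kafka Streams or Flink add distribution, state, and fault tolerance on top), and the event feeds below are hypothetical: each stream is an iterator, and new streams are derived by filtering, transforming, and merging existing ones.

```python
def filter_stream(stream, predicate):
    """Derive a new stream containing only the matching events."""
    return (event for event in stream if predicate(event))

def map_stream(stream, transform):
    """Derive a new stream by transforming every event."""
    return (transform(event) for event in stream)

def merge_streams(*streams):
    """Interleave several streams into one (round-robin)."""
    iterators = [iter(s) for s in streams]
    while iterators:
        for it in list(iterators):
            try:
                yield next(it)
            except StopIteration:
                iterators.remove(it)

# Hypothetical event feeds standing in for live sources.
payments = iter([{"type": "payment", "amount": 50},
                 {"type": "payment", "amount": 5000}])
logins = iter([{"type": "login", "user": "alice"}])

# Compose: flag large payments, then recombine with the login stream.
large = filter_stream(payments, lambda e: e["amount"] >= 1000)
flagged = map_stream(large, lambda e: {**e, "flagged": True})
combined = list(merge_streams(flagged, logins))
```

A new compliance rule becomes just another derived stream inserted into the pipeline; nothing upstream has to change.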

The use of data streams does not eliminate the ability to analyze historical data stores; rather, it allows you to bring AI, machine learning, and pattern recognition into that process. For instance, you can compare two historical online campaigns and make a reasonable prediction of what to expect from a third. This enables event processing and anomaly detection with machine learning.

Data stream technology is available now, but you would need a team of very strong developers to get it working at scale. The question is whether the average enterprise can adopt this quickly enough. If adopted, data streams will drive how organizations work and make real-time decisions in the future. It is an exciting trend and we look forward to seeing how real-time stream processing evolves.

Direct download: James_Urquhart.mp3
Category:Technology News -- posted at: 9:07am EDT

Our guest on the podcast this week is Zohar Alon, CEO and Co-Founder at Dome9 Security. We discuss the shift from the information security of the 1990s to the cloud security of today, and look at how Dome9 protects company security at the network level by being proactive and by continuously monitoring and updating security practices to mirror the constantly changing cloud environment. Dome9 is a SaaS solution that helps security teams streamline their operations, management, and compliance across their cloud infrastructure. Traditional internal security teams at large enterprises are well equipped for traditional networking, but they are not ready to keep public cloud infrastructures secure on their own.

The rise in public security breaches is a result of hacking becoming a business. People around the world are monetizing this knowledge, which is why there have been so many public data breaches, from Yahoo to Target, Home Depot to the DNC. What we have learned from these breaches is the importance of leadership being transparent with customers when they happen and taking responsibility for the breach. Most instances we hear of come from cloud-related hacking of the hosted environments the brands used. Hackers are trying every door and window to get into a brand’s system every second, and the only way to stop them is through continuous, proactive security.

We also talk about serverless computing, such as AWS Lambda, and what it means for security. With serverless computing, functions are triggered by a change: a function observes until it sees something it can act on, and it is given permissions to act on its own and make a change through the API or in the VPC. This presents security challenges that customers are not fully aware of, because it is easy to grant more permissions than are required.
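The trigger-and-act pattern, and the permissions risk that comes with it, can be sketched as a minimal event handler. The function below follows the standard AWS Lambda handler signature, but the bucket name, key prefix, and event contents are illustrative only; the point is that the function should act solely on what it needs, and its execution role should be scoped just as narrowly.

```python
def handler(event, context):
    """React to storage events; touch only the objects this function needs."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Least privilege in code: ignore anything outside our prefix.
        # The execution role should mirror this, e.g. read access limited
        # to this prefix rather than broad account-wide permissions.
        if key.startswith("uploads/"):
            processed.append(f"s3://{bucket}/{key}")
    return processed
```

Locally, the same handler can be exercised with a hand-built event, which is also a reasonable way to audit what a function will and will not touch before granting it any real permissions.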

Direct download: Zohar_Alon.mp3
Category:Technology News -- posted at: 8:56am EDT

Our guest on the podcast this week is Lori MacVittie, Principal Technical Evangelist at F5 Networks. 

We discuss upcoming enterprise cloud trends for 2017. We look at how collaboration is advancing with platforms like Slack and ask whether the trend is here to stay. Deep learning is another important 2017 focus, especially because the economics have become favorable and analytics no longer cost as much; we can now build things like neural networks and advanced predictive algorithms at a much lower cost. We also look at the return of SQL, and discuss whether it ever left at all. It seems SQL will be important in 2017, from MongoDB and beyond. Containers are here to stay, and Kubernetes will be the top player for managing clusters in 2017. Serverless computing makes development easier, especially looking at AWS Lambda: instead of deploying apps and microservices, we now deploy functions that each do only one thing. This gives large enterprises a lot more to manage because of how granular functional programming can become. We also look at custom cloud processors as an alternative to the major chip manufacturers’ processors. IoT will continue to forge ahead in 2017, and the keyword will be interoperability: with so many new devices, standards for security and governance will be crucial next year. IT spending as we know it may change in 2017 as more and more enterprises look into PC-as-a-service models instead of purchasing expensive equipment. Last, we discuss the rise of the Python programming language and where it will go in 2017, from data science to teaching it to kids.

News Covered

InfoWorld: 9 enterprise tech trends for 2017 and beyond

Direct download: Lori_MacVittie2.mp3
Category:Technology News -- posted at: 11:08pm EDT

Our guest on the podcast this week is JP Morgenthal, CTO, Digital Applications, Americas at CSC. We discuss new product releases, improvements, and announcements from AWS re:Invent 2016. It is clear Amazon is still the leading cloud provider, and this year proved that they still know how to innovate and that their lead has not made them sit still. They extended many services already in play, from Amazon EC2 to AWS Lambda, and announced new services like Amazon Athena, Lightsail, and AWS Snowmobile and Glue. For the first time, Amazon began to release services aimed directly at business use cases instead of features targeted only at IT. We also look at next-wave proficiencies and what will matter for those seeking careers in cloud computing. These proficiencies used to be defined by certifications (Cisco CCIE, NetApp). With the rise of cloud computing, some general cloud certifications emerged, but none caught on. Now the most sought-after credential is AWS Certification, and it’s likely Azure and Google will follow soon with their own certifications.

News Covered

AWS: re:Invent 2016 Product Announcements

Direct download: JP_Morgenthal_reInvent.mp3
Category:Technology News -- posted at: 2:16pm EDT

Our guest on the podcast this week is Mat Keep, Director of Product & Market Analysis at MongoDB. We discuss the differences between open-source database systems like MongoDB and relational databases that use SQL. In the early days, open-source software was built to replicate what proprietary software did well. Today, open source is the innovator, because that is what developers expect and need to be successful. In many ways MongoDB builds on the success of relational databases, replicating their best features while allowing for more rapid innovation and faster time to market. Modern applications deal with data that is both structured and unstructured, which means the database needs to be able to evolve rapidly. At its core, MongoDB is built to improve developer productivity: it lets developers build applications much faster, including things that had previously not been possible on relational databases.

Direct download: Mat_Keep.mp3
Category:Technology News -- posted at: 9:22am EDT

Our guest on the podcast this week is Michael Crandell, CEO and Founder at RightScale. We discuss the early years of cloud computing, before it had a name, when it was often referred to as “elastic compute cloud” or “utility computing.” In that time there was a need for a universal cloud management platform, which led Michael to create RightScale. RightScale started as an Infrastructure-as-a-Service model for AWS that provided compute, storage, and networking services. Today, they have expanded to work with all major providers, including container platforms. With multiple tools for clients to customize, cloud deployments are becoming more and more complex. We look at the State of the Cloud for 2017 and see that for many years people found security to be the single biggest challenge of the cloud, but in 2016 that changed. Today, the biggest cloud challenge is the skills and expertise gap for enterprises implementing a cloud strategy.

Direct download: Michael_Crandell.mp3
Category:Technology News -- posted at: 4:26pm EDT

Our guest on the podcast this week is John Treadway, SVP at Cloud Technology Partners. We discuss best practices for leveraging the cloud for innovative digital solutions. We look at blockchain as a powerful new reality that helps reduce transaction costs for the financial industry on everything from mortgages and record-keeping to stock allocation and even car records. Blockchain is disrupting the 15th-century double-entry accounting system still in use today, and it relies on the cloud for its future. The cloud acts as the backbone for many other innovative solutions, from AI to machine learning and big data insights. These include algorithms to improve the reliability of autonomous driving, and even to build financial portfolios more effectively than a stock broker. The Cloud Technology Partners Digital Innovation initiative launched on October 25 and will lead many of these innovative cloud-based solutions and more.


News Covered

Wall Street Journal: How Blockchain Will Change Your Life

Direct download: John_Treadway.mp3
Category:Technology News -- posted at: 5:51pm EDT

Our guest on the podcast this week is Robert Christiansen, Vice President, Cloud Adoption Practice Lead at Cloud Technology Partners. We discuss a basic framework for cloud adoption in large enterprises, based on lessons from over 400 client engagements at Cloud Technology Partners. Through these engagements, patterns have emerged that can be used to build strategies for engaging large enterprises in cloud adoption despite early fear and uncertainty. A key step is to put together an internal cloud team, or Cloud Business Office (CBO), with buy-in from all departments, including legal, so that adoption can move forward with full support. The beginning of enterprise cloud migration can sometimes be more about changing psychology than changing technology.

Direct download: Robert_Christiansen.mp3
Category:Technology News -- posted at: 2:28pm EDT