
Machine Learning, Microservices, and Kubernetes – InApps Technology



Monitoring The Ghost In Machine Learning

Listen to all TNS podcasts on Simplecast.

Artificial intelligence and machine learning are expected to have a profound effect on DevOps, as a way to harness the equivalent brain power of hundreds or even thousands of humans within a single system across the development and deployment pipeline. But, of course, computer and data scientists are only just beginning to take advantage of the power of AI/ML, which remains largely in the testing phase.

As AI/ML sees further development and begins to play a role in commercial software development at scale, Kubernetes and microservices will almost certainly form the underlying architecture as machines, for lack of a better phrase, “take over” many roles within DevOps teams.

Monitoring and observability will also play a major role in this brave new AI/ML landscape built on Kubernetes and microservices. This was the main theme of a podcast hosted by Alex Williams, founder and editor-in-chief of InApps Technology, with Janakiram MSV, InApps Technology correspondent and principal of Janakiram & Associates, as co-host.

Irshad Raihan, director of product marketing at Red Hat, was the guest who spoke about the role of data and observability in AI/ML, in addition to how DevOps is changing for AI/ML, especially with the increasing availability of direct data and data streaming.

Raihan described how AI/ML is not evolving so much as an abstraction ecosystem on top of Kubernetes, but as something completely embedded and integrated into the Kubernetes layers. This, of course, will have a major impact on monitoring.
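
One way to picture what “embedded in the Kubernetes layers” means for monitoring is a model-serving workload that exposes metrics in the same cluster-native way as any other service. The sketch below is illustrative only, not something discussed in the podcast: it assumes a Prometheus-based monitoring stack and the Python prometheus_client library, and the metric names and predict() stub are hypothetical.

    # Minimal sketch (assumed setup): an ML inference microservice exposing
    # Prometheus-style metrics so it is observed with the same Kubernetes-native
    # tooling as the rest of the cluster. Names here are hypothetical.
    import random
    import time

    from prometheus_client import Counter, Histogram, start_http_server

    PREDICTIONS = Counter("model_predictions_total", "Total predictions served")
    LATENCY = Histogram("model_inference_seconds", "Inference latency in seconds")

    @LATENCY.time()
    def predict(features):
        # Placeholder for a real model call (e.g., a loaded scikit-learn or ONNX model).
        time.sleep(random.uniform(0.01, 0.05))
        PREDICTIONS.inc()
        return {"score": random.random()}

    if __name__ == "__main__":
        start_http_server(8000)  # serves a /metrics endpoint for Prometheus to scrape
        while True:
            predict({"example": 1.0})

In a setup like this, Prometheus would scrape the pod’s /metrics endpoint just as it does for any other workload, which is the sense in which the model stops being a separate module sitting on top of the platform.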


“In the future, across the Kubernetes infrastructure, AI intelligence will be so embedded in every piece of the Kubernetes ecosystem, the AI functionality will be indistinguishable from Kubernetes in the logic,” Raihan said. “We will not talk about AI apps as a separate module that sits on top, but it will be an assumed feature of what Kubernetes has to offer.”

Monitoring and observability will thus play a major role as AI/ML becomes fully embedded in Kubernetes and as applications and deployments reach enterprise-scale maturity.

“Using all of these individual AI models together into a system that has thousands of moving parts, requires experienced data scientists. Typically, there are hundreds and thousands of smaller logs, from the control plane all the way into the workloads,” Raihan said. “Those that are sitting on the top are actually not just modern workloads but are also traditional workloads as well. [The merging] of these two worlds is huge from a logging perspective.”
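
To make the scope of that logging concrete, here is a minimal sketch, assuming the official Kubernetes Python client and cluster access via a kubeconfig. It simply tails recent log lines from control-plane pods in kube-system and from application pods — the raw material any AI-assisted monitoring layer would have to digest. The namespace choices and helper name are illustrative.

    # Minimal sketch (assumed setup): reading logs from both control-plane pods
    # and application workloads through the Kubernetes API.
    from kubernetes import client, config

    def tail_namespace_logs(namespace, lines=20):
        v1 = client.CoreV1Api()
        for pod in v1.list_namespaced_pod(namespace).items:
            log = v1.read_namespaced_pod_log(
                name=pod.metadata.name, namespace=namespace, tail_lines=lines
            )
            print(f"--- {namespace}/{pod.metadata.name} ---")
            print(log)

    if __name__ == "__main__":
        config.load_kube_config()  # or config.load_incluster_config() inside a pod
        tail_namespace_logs("kube-system")  # control-plane components
        tail_namespace_logs("default")      # application workloads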

At present, much of the AI/ML work is being done at a pure R&D level. Concerns about “siloing” remain largely moot for the moment, ahead of commercial applications and deployments, since the technology is still in the development phase and, in many cases, the pure research phase. But when the time comes to incorporate AI/ML into working commercial production environments, AI/ML will, of course, become integrated with DevOps. Kubernetes, as mentioned above, should also play an integral role.

“You have on one end AI and ML engineers doing really cool stuff as they build modules, algorithms, recommendation engines and things like that, and on the other hand, you have AI/ML technologies themselves sort of infiltrating the very bones of Kubernetes,” Raihan said. “In our industry with something as especially new as Kubernetes and containers, but when I use the term ‘fad,’ developers and engineers tend to object.”


In this Edition:

1:10: Monitoring technologies and tools, and exploring data and observability in this context.
10:53: Why does Kubernetes matter so much?
13:30: Where does that take you in the context of machine intelligence?
17:52: Discussing the trend of AI Ops.
24:39: How code can help solve problems, and why transparency is important.
28:40: How do you see the abstractions sitting on top of Kubernetes, and how do you think that will get us to the point where the AI/ML ecosystem benefits from Kubernetes?

Feature image via Pixabay.



Source: InApps.net
