All That Machine Learning Hullabaloo


These days, everything is powered by machine learning. Or at least, that’s the image being portrayed. Chances are, if you wrote a quick script to parse the last three months of technology-related press releases, an overwhelming majority would contain some assortment of the words “AI,” “machine learning,” “deep learning,” and — a particular favorite — “neural network.” In no time at all, we’ve moved from a world where number crunching was a cumbersome task performed by big machines for big companies, to one where every $2.99 app in the app store can supposedly think for itself and everyone is touting themselves as an artificial intelligence expert.

But, as one author on the data science blog KDnuggets points out in a post titled “Neural network AI is simple. So… Stop pretending you are a genius,” “most of these experts only seem expert-y because so few people know how to call them on their bullshit.”

The author, the CEO of a “natural language processing engine” company, goes on to dismantle a series of boasts common in today’s vernacular of AI hype. On neural networks, he writes, “so you converted 11 lines of python that would fit on a t-shirt to Java or C or C++. You have mastered what a cross compiler can do in 3 seconds.” Or, perhaps, you’ve trained a neural network? “Congrats, you are a data wrangler. While that sounds impressive you are a dog trainer. Only your dog has the brains of a slug, and the only thing it has going for it, is that you can make lots of them.”
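For the curious, the “11 lines of python” jab refers to the kind of toy network you can build with nothing but NumPy: a tiny two-layer net trained by backpropagation. Here is a minimal sketch along those lines; the inputs, labels, and layer sizes are made up purely for illustration and are not taken from the post.

    # A toy two-layer neural network in plain NumPy, roughly the sort of
    # "11 lines of python" being poked fun at. Data and sizes are illustrative.
    import numpy as np

    X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # 4 examples, 3 features
    y = np.array([[0, 1, 1, 0]]).T                               # target outputs

    np.random.seed(1)
    w0 = 2 * np.random.random((3, 4)) - 1   # weights: input -> hidden
    w1 = 2 * np.random.random((4, 1)) - 1   # weights: hidden -> output

    for _ in range(10000):
        l1 = 1 / (1 + np.exp(-X @ w0))                # hidden layer (sigmoid)
        l2 = 1 / (1 + np.exp(-l1 @ w1))               # output layer (sigmoid)
        l2_delta = (y - l2) * l2 * (1 - l2)           # output error * gradient
        l1_delta = (l2_delta @ w1.T) * l1 * (1 - l1)  # backpropagated error
        w1 += l1.T @ l2_delta
        w0 += X.T @ l1_delta

    print(l2.round(3))  # predictions approach [0, 1, 1, 0]

Training this “dog with the brains of a slug” takes a few seconds on a laptop, which is rather the author’s point.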


KDnuggets founder Gregory Piatetsky notes, however, that the blog post should be considered tongue-in-cheek, since “saying that neural networks are simple because someone can write a neural net algorithm in 11 lines of code is like saying physics is simple because you can write E=mc^2 or the Schroedinger equation.”

Thankfully, your task as a developer isn’t really to discern bullshit; you can leave that to the PR jockeys while you actually create these things. To that end, perhaps it’s time to look at how to get started with machine learning. Despite all the hullabaloo, it is very much a worthwhile endeavor. After all, we’re simply at that part of the hype cycle called the “peak of inflated expectations,” also known as “use these words and everyone will buy your product, whether or not your claims are really true!”

It’s just your job to make the claims true. And thankfully, there’s new technology coming out all the time to make that easier. Let’s take a look at what’s new this week.

This Week in Less Hullabaloo, More Programming

  • Google Releases TPUs in Beta: With all that said about machine learning, we may as well start off this week’s news roundup with a few ML-related announcements, right? First off, Google has announced that its Tensor Processing Units (TPUs), the custom chips for running machine learning workloads first announced in 2016, are now available to developers in beta. According to Frederic Lardinois at TechCrunch, “the promise of these Google-designed chips is that they can run specific machine learning workflows significantly faster than the standard GPUs that most developers use today.” In addition, the chips run at significantly lower power, allowing Google to offer the service at lower cost and further corner the market on machine learning with TensorFlow. Existing TensorFlow code will run on TPUs without modification; a minimal sketch of what such code looks like follows this list. As Lardinois also notes, “with the combination of TensorFlow and TPUs, Google can now offer a service that few will be able to match in the short term.”
  • As Well As GPUs…: TechCrunch’s Lardinois also writes that GPUs on Google’s Kubernetes Engine are now available in open beta, giving developers easier access to these processing units – something that may also come in handy for the intense processing power often needed for big data and ML number crunching. “The advantages of the container/GPU combo is that you can easily scale your workloads up and down as needed,” writes Lardinois. “Most GPU workloads probably aren’t all that spikey, but in case yours are, this looks like a good option to evaluate.”
  • Facebook Simplifies Machine Learning with Tensor Comprehensions: Despite everything said above, or maybe in proof of it, writing real, efficient and effective machine learning code can be a difficult task. This week, the Facebook research team announced the release of Tensor Comprehensions, “a C++ library and mathematical language that helps bridge the gap between researchers, who communicate in terms of mathematical operations, and engineers focusing on the practical needs of running large-scale models on various hardware backends.” The announcement explains that “the deep learning community has grown to rely on high-performance libraries such as CuBLAS, MKL, and CuDNN to get high-performance code on GPUs and CPUs,” and that Tensor Comprehensions “shortens this process from days or weeks to minutes.”
  • Visual Studio Code and Anaconda: One final announcement, semi-related to this week’s introduction: Anaconda, the popular Python data science platform, is now shipping with Visual Studio Code. Starting immediately, Microsoft’s free and cross-platform code editor will be included in the Anaconda distribution. If you recall from last week, a new version of Visual Studio Code was just released with a Python extension, alongside the company’s “strong support for Python in Azure Machine Learning Studio and SQL Server, and Azure Notebooks.”
  • What a Week for Rust: A couple weeks back, we looked at the Rust programming language’s march toward an epoch release later this year, and it looks like the team is plodding onward rather quickly, according to a blog post by Aaron Turon, a Mozilla developer on the Rust team. According to Turon, it was an incredible week in Rust, with several breakthroughs, including a “eureka moment” on making specialization sound, and “a brilliant way to make ‘context arguments’ more ergonomic, which lets us make a long-desired change to the futures crate without regressing ergonomics.” Turon writes that these particular topics were ones that “loomed large” for him personally and that he’s impressed by the overall growth of the team working on the language. “It’s now simply impossible to drink from the full firehose,” he writes, “but even a sip from the firehose, like the list above, can blow you away.”
  • And Then There’s Rust 1.24: While we’re here talking Rust, there’s also the release of Rust 1.24, which contains two big new features: rustfmt and incremental compilation. The first is a tool that can automatically reformat your Rust code to a “standard style,” while incremental compilation is “basically this: when you’re working on a project, you often compile it, then change something small, then compile again. Historically, the compiler has compiled your entire project, no matter how little you’ve changed the code. The idea with incremental compilation is that you only need to compile the code you’ve actually changed, which means that that second build is faster.” With Rust 1.24, incremental compilation is turned on by default.
  • Don’t Forget Kotlin/Native! Finally, for you Kotlin fans out there, Kotlin/Native v0.6 has arrived, which JetBrains calls “a major update.” The release includes support for multiplatform projects in the compiler and Gradle plugin, transparent interoperability between Objective-C and Kotlin container classes, smaller WebAssembly binaries, and support for Kotlin 1.2.20, Gradle 4.5 and Java 9, among a few other features and bug fixes.
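As promised in the TPU item above, here is a minimal sketch of ordinary TensorFlow code, in this case a small Keras model trained on synthetic data, of the kind Google says can be pointed at Cloud TPUs. The layer sizes, dataset, and hyperparameters are placeholders chosen purely for illustration; nothing here comes from Google’s announcement.

    # A minimal, illustrative TensorFlow/Keras training script. The layer sizes
    # and the synthetic dataset are placeholders; this just shows the kind of
    # standard TensorFlow code the TPU beta is aimed at.
    import numpy as np
    import tensorflow as tf

    # Synthetic data: 1,000 examples with 20 features, 10 possible classes.
    x_train = np.random.rand(1000, 20).astype("float32")
    y_train = np.random.randint(0, 10, size=(1000,))

    # A small fully-connected classifier.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.fit(x_train, y_train, epochs=5, batch_size=32)

The interesting part of Google’s pitch is what isn’t in the script: no TPU-specific kernels or hand-tuned device code, just the same TensorFlow graph you would run on a laptop, scheduled onto Google’s accelerators.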

Feature image via Pixabay.
