Neural Magic gets $15M seed to run machine learning models on commodity CPUs – TechCrunch


      Telegram SmartBoT (@tgsmartbot), Moderator

        #News(IoTStack) [ via IoTGroup ]


        Neural Magic, a startup founded by a couple of MIT professors who figured out a way to run machine learning models on commodity CPUs, announced a $15 million seed investment today.
        The company also announced early access to its first product, an inference engine that data scientists can run on ordinary CPUs rather than on specialized chips like GPUs or TPUs. That means it could greatly reduce the cost of machine learning projects by letting data scientists use commodity hardware.
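        Neural Magic's engine is still in early access and its API isn't described in this excerpt, but the basic idea of CPU-only inference is easy to picture. Below is a minimal, illustrative sketch using the open-source ONNX Runtime pinned to its CPU execution provider; the model file name and input shape are placeholder assumptions, and this is not Neural Magic's product or API.

```python
# Illustrative sketch of CPU-only deep learning inference using ONNX Runtime.
# This is NOT Neural Magic's engine or API; the model path and input shape
# are placeholder assumptions for the example.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"  # hypothetical exported model file

# Restrict execution to the CPU provider: no GPU or TPU is involved.
session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)  # dummy image batch

outputs = session.run(None, {input_name: batch})
print("output shape:", outputs[0].shape)
```

        A script like this runs unchanged inside a container image on any commodity server, which is the kind of deployment flexibility the article alludes to.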
        As co-founder Nir Shavit tells it, he and his colleagues were working on neurobiology data in their lab and found a way to use the commodity hardware they had in place.
        “I discovered that with the right algorithms we could run these machine learning algorithms on commodity hardware, and that’s where the company started,” Shavit told TechCrunch.
        He says there is a false notion that you need specialized chips or hardware accelerators to have the resources to run these jobs, but it doesn’t have to be that way.
        His company not only lets you use that commodity hardware, he says, it also works with more modern development approaches like containers and microservices.
        “Our vision is to enable data science teams to take advantage of the ubiquitous computing platforms they already own to run deep learning models at GPU speeds — in a flexible and containerized way that only commodity CPUs can deliver,” Shavit explained.
        He says this also eliminates the memory limitations of accelerator-based approaches, because CPUs have access to much larger amounts of memory; beyond the cost savings, that is a key advantage of his company’s approach.
        “Yes, running on a commodity processor you get the cost savings of running on a CPU, but more importantly, it eliminates all of these huge commercialization problems and essentially this big limitation of the whole field of machine learning of having to work on small models and small data sets because the accelerators are kind of limited.”
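        To make the memory point concrete, here is a rough back-of-the-envelope sketch; the parameter count and memory sizes are assumptions chosen for illustration, not figures from Neural Magic or TechCrunch.

```python
# Back-of-the-envelope arithmetic: weight footprint of a large FP32 model
# versus typical accelerator memory and commodity server RAM.
# All numbers below are illustrative assumptions, not vendor figures.

params = 10_000_000_000        # hypothetical 10-billion-parameter model
bytes_per_param = 4            # 32-bit floating-point weights

weights_gb = params * bytes_per_param / 1e9   # ~40 GB of weights alone

typical_gpu_vram_gb = 16       # assumed accelerator memory
server_ram_gb = 256            # assumed commodity server RAM

print(f"weights: ~{weights_gb:.0f} GB")
print("fits in 16 GB of accelerator memory?", weights_gb <= typical_gpu_vram_gb)
print("fits in 256 GB of system RAM?       ", weights_gb <= server_ram_gb)
```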


        Read More..
        AutoTextExtraction by Working BoT using SmartNews 1.03 Build 26 Aug 2019
