Items tagged with deep learning

Multiple facets of technology are trending towards artificial intelligence these days, in applications both big and small. As that's been happening, graphics processing units (GPUs) have taken on the heavy lifting, though researchers at Rice University have cooked up a new machine learning scheme they say is more efficient when run on central processing units (CPUs). It should come as no surprise that the Rice computer scientists are being supported by collaborators from Intel, which has a vested interest in anything that can tap into its CPUs. There's a lot of money to be made by leveraging AI—NVIDIA, for example, reported a 41 percent increase in its fourth quarter revenue,... Read more...
You have probably heard at some point in your life, 'Don't believe everything you read!', especially in this day and age of the Internet. That is sage advice, and of particular interest in the era of fake news stories permeating social media. However, with advances in artificial intelligence technology, don't believe everything you see, either. Just check out those mugshots above. They look real, especially the one on the left. However, those are not actually real people. They are computer generated. A software engineer at Uber created a website that generates new facial images on demand, and rather quickly, whenever the site is refreshed. "Recently a talented group of researchers at NVIDIA released... Read more...
It's said that seeing is believing, but can you always believe what you see? That's a question that will come up more and more as companies like NVIDIA push the envelope with graphics rendering and, interestingly enough, artificial intelligence. As it relates to that, in addition to launching a new Titan RTX graphics card today, NVIDIA also announced a new deep learning-based model for generating 3D environments based on real-world data. A team of NVIDIA researchers led by Bryan Catanzaro, Vice President of Applied Deep Learning at NVIDIA, used a conditional generative neural network as a starting point. From that real-world data, they trained a neural network to render new 3D environments, after it was... Read more...
The initial focus with NVIDIA's recently launched GeForce RTX 2080 Ti and GeForce RTX 2080 graphics cards has been on how well they perform in games, especially when cranking up the resolution to 4K (3840x2160). That will continue to be a point of interest, though it's not the only one. A fresh set of benchmarks making the rounds highlights how the new cards perform in deep learning workloads. Before we get to the numbers, let's talk about why this matters. As you might already know, the GeForce RTX series pushes consumer graphics cards into new territory. Typically with each new generation of graphics cards, consumers benefit from faster rasterization rendering and sometimes better power efficiency.... Read more...
Do you need any more proof that your ball handling skills suck? Well, look no further than Carnegie Mellon University and DeepMotion, which together trained AI how to dribble a basketball and pull off increasingly advanced moves as it learned. It can often be hard to rationalize in our minds that professional basketball players who make dribbling, crossovers, and pump fakes seem so effortless have honed those skills over many years of practice. However, researchers at Carnegie Mellon and DeepMotion were able to teach AI how to pull off similar feats in a matter of hours (via training). We're not talking about canned animations here that you might see in NBA 2K19, but actual... Read more...
One of the most annoying things about Windows 10 is that it often decides to apply updates right when you are in the middle of working. This typically happens halfway through a long document or during a meeting. A new Windows 10 build that Microsoft has released to some Insiders, from the 19H1 branch that follows Redstone 5, has new tech to avoid this issue. Windows 10 now has changes specifically to the update system that take advantage of new cloud-based logic. The goal of this cloud-based logic is to avoid unexpected updates when you are trying to work. The tech uses a predictive model that aims to improve over time to better understand when the user will be using their device. The goal is to make sure Windows isn't... Read more...
It’s happened to us all at some point in time — you capture an image in less-than-ideal light conditions and the end result is a grainy photo filled with digital noise. While you may still be able to make out many of the details in the photograph, wouldn’t it be nice if you could somehow magically restore it to near perfect condition, the way it was meant to be seen? That’s exactly what researchers from NVIDIA, MIT and Aalto University have been able to achieve using a deep learning artificial intelligence called Noise2Noise. NVIDIA used a team of Tesla P100 GPUs along with the cuDNN-accelerated TensorFlow deep learning framework to train Noise2Noise using over 50,000 images... Read more...
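The teaser above doesn't get into how Noise2Noise is actually trained, but the central idea from the underlying research is that a denoising network can learn from pairs of noisy images alone, with no clean reference photos ever shown to it. Below is a minimal, hypothetical TensorFlow/Keras sketch of that idea; the toy three-layer model, MNIST stand-in data, and synthetic Gaussian noise are illustrative assumptions, not NVIDIA's actual training setup.

```python
# Minimal Noise2Noise-style sketch (hypothetical, not NVIDIA's code): the
# denoiser never sees a clean target, because both the input and the "label"
# are independently corrupted versions of the same underlying image.
import tensorflow as tf

def add_gaussian_noise(images, stddev=0.1):
    images = tf.convert_to_tensor(images, dtype=tf.float32)
    return images + tf.random.normal(tf.shape(images), stddev=stddev)

# Toy convolutional denoiser; the published work uses a much deeper network
# and tens of thousands of high-resolution photos.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu",
                           input_shape=(None, None, 1)),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(1, 3, padding="same"),
])
model.compile(optimizer="adam", loss="mse")

# Stand-in dataset: grayscale MNIST digits scaled to [0, 1].
(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0

noisy_inputs = add_gaussian_noise(x_train)   # noisy realization #1
noisy_targets = add_gaussian_noise(x_train)  # noisy realization #2 (the target)
model.fit(noisy_inputs, noisy_targets, batch_size=128, epochs=1)

# At inference time, the trained model maps a noisy photo to a cleaner estimate.
denoised = model.predict(add_gaussian_noise(x_train[:1]))
```

Because the noise in the input and the noise in the target are statistically independent, minimizing the mean-squared error still pushes the network toward predicting the underlying clean image, which is the insight the Noise2Noise work builds on.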
It's not quite the Skynet that Tesla CEO Elon Musk has warned us about, but researchers from the Musk-backed OpenAI initiative have made a breakthrough in AI algorithms using Dota 2 as a testbed. OpenAI's achievement is remarkable due in part to its scope. Most AI versus human matches -- be it Go or a computer game -- involve a single computer against a single human (as was the case with OpenAI’s victory last year). But OpenAI has managed to train its AI to master competing against humans on a five-player team. The team of five neural networks that was developed is collectively known as the OpenAI Five and is capable of playing [at an accelerated rate] 180 years with... Read more...
Anyone who has lived through the 1980s knows how maddeningly difficult it is to solve a Rubik's Cube, and to accomplish the feat without peeling the stickers off and rearranging them. It's not just challenging for humans, either. Apparently the six-sided contraption presents a special kind of challenge to modern deep learning techniques that makes it more difficult than, say, learning to play chess or Go. That used to be the case, anyway. Researchers from the University of California, Irvine, have developed a new deep learning technique that can teach itself to solve the Rubik's Cube. What they came up with is very different from an algorithm designed to solve the toy from any position. It's... Read more...
Gigabyte today announced a couple of new 4U GPU servers for the datacenter, both packed with multiple NVIDIA Tesla GPUs to bring massive parallel computing capabilities to the sector. According to Gigabyte, its new G481-S80 and G481-HA0 offer some of the highest GPU density available in the 4U form factor—the former accommodates eight SXM2 form factor GPUs, such as NVIDIA's Volta-based Tesla V100 or Pascal-based Tesla P100, and the latter packs 10 GPUs. These new servers also feature NVIDIA's NVLink technology supporting bi-directional communication between GPUs, allowing for higher bandwidth data transfers, more links, and improved scalability. To give an example, a single Tesla V100 GPU... Read more...
Hot on the heels of the debut of its 8th gen Core series, and also its brand-new top-end Core X chips, Intel just announced Loihi. With this new chip, Intel is going all-in on artificial intelligence (AI) and self-learning. It also drops a term you may have heard recently: neuromorphic computing; in effect, neural systems simulation. In his blog post, Intel's Corporate VP and Managing Director of Intel Labs Dr. Michael Mayberry lays down a couple of great examples of how AI could benefit our lives in the future. Picture, for example, stoplight-mounted cameras being tied to an AI backend that adjusts the timing of the light changes based on the flow of traffic. Then there are streetlight-mounted... Read more...
Microsoft was on hand at the Hot Chips 2017 show and rolled out a new deep learning acceleration platform dubbed Project Brainwave. Microsoft's Doug Burger says that the platform is a "major leap forward in both performance and flexibility for cloud-based serving of deep learning models." The company designed the system for real-time AI, meaning it is able to process requests as fast as they are received, with ultra-low latency. Microsoft says that real-time AI is becoming increasingly important for processing live data streams in a cloud infrastructure. That sort of data includes things like search queries, videos, sensor streams, and interactions with users. Project Brainwave uses... Read more...
Intel is expanding its reach into the deep learning field today with the launch of the Neural Compute Stick (NCS), which was developed by its Movidius subsidiary. The Movidius NCS is aimed at democratizing deep learning and artificial intelligence, with Intel billing it as “the world’s first self-contained AI accelerator in a USB format.” The Movidius NCS is powered by the Myriad 2 vision processing unit (VPU), which promises 100 gigaflops of performance all while operating within a 1-watt power envelope. Given its low-power requirements, Intel is aiming the Movidius NCS at developers, research engineers and of course the do-it-yourself crowd. When it comes to applications for the deep learning... Read more...
We talked yesterday about an example of how deep learning and artificial intelligence can be used to put words in people's mouths, creating video proof of something someone said, even if they didn't really say it. Prospects like that are downright scary, but so too are the realities of the jobs AI will be able to take away from humans. Case in point: professional photography editing. This is a bit of an odd one, as most photographers will edit their own photos, so maybe we should consider this an example of how AI could help someone get through their workflow more efficiently. And perhaps even deliver a better result in the end. Using AI to apply a variety of filters to a photo and calling... Read more...
Many of us have had to heed the warning of, "Don't put words in my mouth" at some point over the course of our lives, but generally speaking, no one actually means it in a literal sense. In time, though, thanks to deep learning and artificial intelligence, putting words in someone's mouth could become a legitimate reality. Ahead of the upcoming SIGGRAPH conference in Los Angeles, researchers from the University of Washington have worked their magic to put words into former President Barack Obama's mouth. At least in this case, the words said actually did come from Obama's mouth, but the footage seen is not from the same time he said it. Think this sounds freaky? It absolutely is. Check... Read more...
A board game like "Go" might not look complicated on the surface to the untrained eye, which could lead the uninformed to believe that it wouldn't be all that difficult for a computer to best a human player in a head-to-head match. We've seen many examples in the past where that hasn't been the case (IBM's Watson is a good start), and that's because, despite the game's simple nature, the number of possible moves at any given time is sometimes astronomical. Last month, we wrote about Google's DeepMind and its challenge of going up against the world's best Go player, Ke Jie. Fast-forward to now, and we learn that DeepMind's AlphaGo helped it secure a win against the Grandmaster on the very first... Read more...
NVIDIA's push into deep learning and artificial intelligence markets is paying off in a big way. As a result of its ongoing investments into data center GPUs, the company was able to rake in $1.94 billion in revenue during the first quarter of 2017, an increase of 48 percent from $1.3 billion in the same quarter a year ago. After paying the bills, NVIDIA was left with a profit of $507 million, up a whopping 144 percent from last year. "The AI revolution is moving fast and continuing to accelerate," said Jensen Huang, founder and chief executive officer of NVIDIA. "NVIDIA's GPU deep learning platform is the instrument of choice for researchers, internet giants and startups as they invent the future.... Read more...
We are starting to sense a recurring theme from NVIDIA at the annual GPU Technology Conference (GTC). Last year at GTC, NVIDIA unveiled its DGX-1, the world's first deep learning supercomputer. Featuring Tesla P100 GPUs based on its Pascal architecture, the DGX-1 made clear NVIDIA's commitment to deep learning and artificial intelligence. Building on top of that, NVIDIA announced at this year's GTC plans to train 100,000 developers through its Deep Learning Institute in 2017. That is a tenfold increase over the number of developers it trained last year. NVIDIA is making such a big commitment because AI is a fast-growing field that is snaking into different product categories, everything... Read more...
You don't need to look very far to see examples of how machine- and deep-learning could enrich our lives. You might be surprised, given how often we hear about deep-learning nowadays, that most companies have only been utilizing the technology to a notable degree for the past couple of years - including Amazon. Despite that, we've seen some amazing things accomplished in a relatively short time. With Amazon specifically, we've seen the company sell the fruits of its deep-learning labor to those who need reliable predictive services, use it to improve its product reviews sections, and of course, improve the usefulness of its Alexa AI, found in products like Amazon Echo. Today, Amazon is... Read more...
Over the years, we've seen numerous examples of how AI can enrich our lives. It could brew us an awesome beer, help solve serious health diagnosis cases, and, of course, handle everyday problem-solving tasks in a number of other ways. We can also use AI and deep learning to simulate real-world scenarios, such as what would happen if resources became scarce, among other growing environmental concerns. Google's DeepMind team has just detailed one of the projects it's working on that revolves around that exact subject. They have also created two demos (seen in the videos below) to showcase how the AI behaves in different scenarios. In the above demo, called "Gathering," two AI entities... Read more...
These days, machine learning and artificial intelligence are becoming increasingly capable, and we're seeing their use applied to things we might not have immediately expected. Case in point: last month, we talked about how AI was being used to craft the most perfect beer. But as wondrous as AI beer may sound, researchers at Carnegie Mellon University are going in a completely different direction, proving that a poker face is no match for advanced AI. By design, poker is a simple game, but it's not at all simple on a competitive level. Even if you believe body language says your opponent isn't bluffing, they very well could still be, and there's a chance an AI opponent might be able... Read more...
AMD is announcing a new series of Radeon-branded products today, targeted at machine intelligence (AI) and deep learning enterprise applications, called Radeon Instinct. As its name suggests, the new Radeon Instinct line of products comprises GPU-based solutions for deep learning inference and training. The new GPUs are also complemented by a free, open-source library and framework for GPU accelerators, dubbed MIOpen. MIOpen is architected for high-performance machine intelligence applications, and is optimized for the deep learning frameworks in AMD’s ROCm software suite, which we recently talked about here. During a recent visit with AMD, it was noted that we generate roughly... Read more...