Items tagged with deep learning

Visual imagination works differently for every person. Some people, with a condition called "aphantasia," aren't able to generate mental pictures at all. Others have the imagery come to mind first, and then describe it with words. Still others, like this HH contributor, think... Read more...
The human brain is enormously complex, and cracking the code of its intricacies in its entirety might never be accomplished. However, there has been plenty of interesting research related to brain activity. Most recently, a team of researchers say they developed a deep learning framework that is able to decode sensory... Read more...
Recently, Microsoft patented the creation of an AI chatbot for a specific person, whether they were alive or not. Now, a genealogy company called MyHeritage has partnered with deep learning and image processing company D-ID to create something called “Deep Nostalgia.” This technology can bring a person's ancestors... Read more...
Multiple facets of technology are trending towards artificial intelligence these days, in applications both big and small. As that's been happening, graphics processing units (GPUs) have taken on the heavy lifting, though researchers at Rice University have cooked up a new machine learning scheme they say is more... Read more...
You have probably heard at some point in your life, 'Don't believe everything you read!', especially in this day and age of the Internet. That is sage advice, and of particular interest in the era of fake news stories permeating social media. However, with advances in artificial intelligence technology, don't believe... Read more...
It's said that seeing is believing, but can you always believe what you see? That's a question that will come up more and more as companies like NVIDIA push the envelope with graphics rendering and, interestingly enough, artificial intelligence. As it relates to that, in addition to launching a new Titan RTX graphics... Read more...
The initial focus on NVIDIA's recently launched GeForce RTX 2080 Ti and GeForce RTX 2080 graphics cards has been on how well they perform in games, especially when cranking up the resolution to 4K (3840x2160). That will continue to be a point of interest, though it's not the only one. A fresh set of benchmarks making... Read more...
Do you need any more proof that your ball handling skills suck? Well, look no further than Carnegie Mellon University and DeepMotion, which together trained an AI to dribble a basketball and pull off increasingly advanced moves as it learned. It can often be hard to rationalize in our minds that professional... Read more...
One of the most annoying things about Windows 10 is that it often decides to apply updates right when you are in the middle of working. This is typically halfway through a long document or during a meeting. A new Windows 10 Redstone 5 build that Microsoft has released to some Insiders, dubbed 19H1, has new tech to avoid... Read more...
It’s happened to us all at some point in time — you capture an image in less-than-ideal light conditions and the end result is a grainy photo filled with digital noise. While you may still be able to make out many of the details in the photograph, wouldn’t it be nice if you could somehow magically restore it to... Read more...
It's not quite the Skynet that Tesla CEO Elon Musk has warned us about, but researchers from the Musk-backed OpenAI initiative have made a breakthrough in AI algorithms using Dota 2 as a testbed. OpenAI's achievement is remarkable due in part to its scope. Most AI versus human matches -- be it Go or a computer game --... Read more...
Anyone who has lived through the 1980s knows how maddeningly difficult it is to solve a Rubik's Cube, and to accomplish the feat without peeling the stickers off and rearranging them. It's not just challenging for humans, either. Apparently the six-sided contraption presents a special kind of challenge to modern deep... Read more...
Gigabyte today announced a couple of new 4U GPU servers for the datacenter, both packed with multiple NVIDIA Tesla GPUs to bring massive parallel computing capabilities to the sector. According to Gigabyte, its new G481-S80 and G481-HA0 offer some of the highest GPU density available in the 4U form factor—the former... Read more...
Hot on the heels of the debut of its 8th gen Core series, and also its brand-new top-end Core X chips, Intel just announced Loihi. With this new chip, Intel is going all-in on artificial intelligence (AI) and self-learning. It also drops a term you may have heard recently: neuromorphic computing; in effect, neural... Read more...
Microsoft was on hand at the Hot Chips 2017 show and rolled out a new deep learning acceleration platform dubbed Project Brainwave. Microsoft's Doug Burger says that the platform is a "major leap forward in both performance and flexibility for cloud-based serving of deep learning models." Microsoft says that it... Read more...
Intel is expanding its reach into the deep learning field today with the launch of the Neural Compute Stick (NCS), which was developed by its Movidius subsidiary. The Movidius NCS is aimed at democratizing deep learning and artificial intelligence, with Intel billing it as “the world’s first self-contained AI... Read more...
We talked yesterday of an example of how deep learning and artificial intelligence can be used to put words in people's mouths, creating video proof of something someone said, even if they didn't really say it. Prospects like that are downright scary, but so too are the realities of the jobs AI will be able to take... Read more...
Many of us have had to heed the warning of, "Don't put words in my mouth" at some point over the course of our lives, but generally speaking, no one actually means it in a literal sense. In time, though, thanks to deep learning and artificial intelligence, putting words in someone's mouth could become a... Read more...
A board game like "Go" might not look complicated on the surface to the untrained eye, which could lead the uninformed to believe that it wouldn't be all that difficult for a computer to best a human player in a head-to-head match. We've seen many examples in the past where that hasn't been the case (IBM's Watson is a... Read more...
NVIDIA's push into deep learning and artificial intelligence markets is paying off in a big way. As a result of its ongoing investments into data center GPUs, the company was able to rake in $1.94 billion in revenue during the first quarter of 2017, an increase of 48 percent from $1.3 billion in the same quarter a year... Read more...
We are starting to sense a recurring theme from NVIDIA at the annual GPU Technology Conference (GTC). Last year at GTC NVIDIA unveiled its DGX-1, the world's first deep learning supercomputer. Featuring Tesla P100 GPUs based on its Pascal architecture, the DGX-1 made clear NVIDIA's commitment to deep learning and... Read more...
You don't need to look very far to see examples of how machine- and deep-learning could enrich our lives. You might be surprised, given how often we hear about deep-learning nowadays, that most companies have only been utilizing the technology to a notable degree for the past couple of years - including Amazon... Read more...