- How AI will bring 2D to 3D in seconds
- AIs that are learning to “pay attention”
- Machine learning gets better at predicting and avoiding traffic jams
The Conference on Neural Information Processing Systems (NeurIPS) took place in Vancouver last week. This has become the event of the year in the world of artificial intelligence (AI).
I was invited to attend, and I bought my ticket a couple of months ago… but sadly, I couldn’t make the trip. I hated to miss it, but urgent shoulder surgery last week interrupted my plans.
In recovery mode… (photo: Jeff after surgery)
The surgery did not stop me from following along, however. The conference turnout was incredible. There were more than 13,000 people in attendance… for an AI event. That’s never happened before.
Anytime we see this level of activity in a specific area of technology, we know something big is happening. No doubt we will see some amazing AI breakthroughs in 2020.
As for specifics, more than 1,400 research papers were presented at the conference. No human being could possibly read through all of those papers in one week… or even in one year. But I did read through a number of them, and I tried to pick out the topics that were most widely discussed at the event.
Today, we will talk about three of the most interesting papers.
An AI that can generate 3D graphics…
A team from NVIDIA and the University of Toronto presented the first paper we’ll discuss. It described an AI that can turn two-dimensional (2D) images into three-dimensional (3D) graphics.
For context, one of the key elements of graphics processing today is the ability to generate 2D images from 3D models. This is called rendering. It’s a common process that’s been used for decades.
Well, this AI can basically reverse that process. It can take a 2D image and create a 3D model of it. And it can do this on the spot. It doesn’t need access to a 3D training set.
I’ll explain why this is so incredible with an example…
Let’s think about an image we have of a historical figure. Maybe one of America’s founders, for example. This AI could take that simple photographic image or portrait and create a 3D graphical model of the person as well as the setting of the picture.
And we’re talking about a detailed graphical model. The AI would predict the person’s size based on the image. And it would recreate the texture, colors, and even the lighting of both the person and the setting.
In this way, the AI could re-create a “real” historical setting. It could make history come to life. Imagine being able to witness a re-creation of the signing of the Declaration of Independence. Or perhaps we could listen to the Gettysburg Address as it was originally given.
There are plenty of creative applications for this…
This tech could create educational videos, make movies or video games, and even create a virtual reality simulation of historical events. I’m excited to think about the possibilities.
Plus, this AI will make the production of 3D graphics much easier, far cheaper, and ridiculously fast.
Right now, producing 3D graphics for any application is a time-consuming, largely manual process. This AI could automate it. And that means this will likely become the new industry standard for graphics generation.
Google’s Brain Team is developing “attention” in AIs…
One of Google’s top research and development teams focused on AI and machine learning (ML) is the “Brain Team,” and it gave a presentation on “attention” in AIs.
This refers to the ability of an AI to understand context and meaning with a much smaller, more focused data set. Another way to explain it is that the AI can give its “attention” to very specific tasks rather than digesting massive data sets and learning from the process.
This is very relevant for natural language processing. That’s the technology that can understand human speech and the written word. It can also translate from one language to another.
For example, attention would allow an AI to take two sentences in two different languages that have the same meaning and compare them to each other.
The AI could figure out how one sentence was relevant to the other. The better the AI is at understanding context like this, the better it will be at translating one language to the other.
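For readers who like to see the nuts and bolts, the mechanism behind this is often implemented as “scaled dot-product attention”: each word scores its relevance against every other word, and those scores become weights. This is a generic illustrative sketch in Python/NumPy, not the Brain Team’s actual code, and the toy inputs are made-up numbers:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query scores every key; the scores become weights over the values."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # relevance of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax: each row sums to 1
    return weights @ V                                   # weighted blend of the values

# Toy example: 2 "words" attending over 3 "words", with 4-dim embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))   # queries
K = rng.standard_normal((3, 4))   # keys
V = rng.standard_normal((3, 4))   # values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4) — one context-aware vector per query word
```

The key design idea is that the weights are computed from the data itself, so the model learns *where* to look rather than processing everything equally.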
And attention isn’t just for language translation…
Human communication is complex. We often imply certain things without explicitly saying them. Attention could help an AI pick up on this. It could help the AI discover nuance and implied meaning in verbal or written sentences.
This is referred to as self-attention. In a way, it’s the next step to self-aware AI. And this is a huge step toward developing AI digital assistants that can perform more complex tasks for us.
Right now, AI assistants like Amazon’s Alexa or Apple’s Siri can perform basic tasks like playing music or checking the weather. In the near future, AI assistants will make us dinner reservations. They’ll manage our appointments. They’ll even file our taxes.
But in order to do that, these assistants need to understand all the nuances of human communication.
So the work being done on this attention approach to AI right now will lead to some breakthroughs in 2020.
What’s more, the technology behind this is a more efficient method for AI than what we’ve used up to this point. Google’s Brain Team demonstrated an example of attention using 39% less processing power. That translates into lower electricity costs, which in turn makes it cheaper to run.
So expect to see some big things come out of this space next year. Personally, I look forward to having my own AI assistant.
AI can now predict traffic patterns…
If you’re tired of getting caught in traffic jams, read this last insight carefully…
Our last topic today comes from the Institute of Advanced Research in Artificial Intelligence (IARAI). The IARAI conducted a research competition to see what kinds of machine learning systems could predict traffic patterns. It announced the results at the conference last week. And they were impressive…
For backstory, a company called HERE Technologies backed the competition. This is a company that has compiled geolocation and traffic data for many years.
That data feeds into navigation systems employed by a group of Europe’s top carmakers including Audi, Daimler, Porsche, and BMW. This group of carmakers actually acquired HERE Technologies for $3.1 billion back in 2015.
So HERE Technologies provided each competing team of researchers with 285 full days of traffic data in three diverse cities. It was a massive data set for the machine learning systems to analyze.
HERE also gave each team numerous days with incomplete traffic data. The AI’s task was to analyze the full days of data and use that to predict what happened on the days with incomplete data.
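To make the setup concrete, here is a deliberately simplified sketch of the task in Python. It is not the competition code — the data, the 96 time slots per day, and the simple “historical average” baseline are all my own assumptions for illustration. Real entries used far more sophisticated models:

```python
import numpy as np

# Hypothetical setup: rows are days, columns are time slots within a day;
# values represent traffic volume. NaN marks missing readings.
rng = np.random.default_rng(1)
full_days = 20 + 5 * rng.standard_normal((285, 96))  # 285 complete days, 96 slots/day

# Baseline "model": learn the historical average traffic for each time slot
slot_mean = full_days.mean(axis=0)

# An incomplete day with a gap the model must fill in
day = 20 + 5 * rng.standard_normal(96)
day[30:40] = np.nan                                  # missing readings

# Predict the missing slots using the historical pattern for that slot
prediction = np.where(np.isnan(day), slot_mean, day)
print(np.isnan(prediction).any())  # False — every slot now has a value
```

The winning teams effectively replaced that crude per-slot average with deep learning models that capture how congestion spreads across a city over time.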
Obviously, traffic patterns in metropolitan areas are complex. The smallest incident can create jams that disrupt traffic for miles. So predicting what’s going to happen on a given day is an immense task. It’s certainly not something a human can do. But it turns out AI is up to the task…
The IARAI announced three winning teams. They were from South Korea, Oxford/Zurich, and Toronto. These teams used machine learning to accurately predict traffic patterns and congestion. And they found ways to optimize traffic to help avoid problems. Information like this will be immensely valuable to every city in the world.
What I especially liked about this was the fact that the IARAI and HERE Technologies crowdsourced solutions. They incentivized some of the brightest minds working on AI/ML to participate, and everyone benefits as a result.
In fact, HERE Technologies will likely plug its favorite algorithm into its navigation software to improve its service.
So the next time you avoid bumper-to-bumper congestion on your commute home, you might have an AI to thank.
Editor, The Bleeding Edge
P.S. And as a reminder, I had the good fortune to speak at length about AI with Glenn Beck when I was a guest on his podcast last month. Glenn is one of the few people who understand the ramifications – good and bad – that AI will have on the world. It was a great conversation. If any readers haven’t viewed the episode, be sure you watch it right here.
Like what you’re reading? Send your thoughts to [email protected].