- NASA’s telescope continues to deliver…
- It can read your mind…
- Uber Eats is going autonomous…
It was the one thing that gave me comfort about Apple’s business model.
Unlike Alphabet (Google) and Meta (Facebook), Apple – for a long time – didn’t generate revenues from advertising. Apple stood firm on privacy and let us all do what we wanted with Apple’s products without worrying about whether or not Apple was “watching” us.
It developed fantastic products, both hardware and software, and transformed markets with something that was simply better than anything else that existed. Sounds easy, but extremely hard to do.
Apple completely disrupted the music industry with iTunes, transformed the mobile phone industry with the iPhone, redefined tablet computing with the iPad, made wireless earbuds cool with its AirPods, and became the number one watch company on the planet.
And there’s more to come. Apple will surely set the pace for the next generation of computing interfaces with its augmented reality and virtual reality technology set to be released next year. I can’t wait.
It didn’t need to cross the line and surveil us. It didn’t need to collect data on our usage, preferences, likes, and dislikes in order to have a fabulous business. After all, today Apple is worth $2.4 trillion with annual revenues at almost $400 billion this fiscal year.
Even more impressive is that the company will throw off $112 billion in free cash flow during that time.
The company sits on $179 billion in cash and yet pays a puny 0.58% dividend. It clearly doesn’t need any more money, and it doesn’t intend to pay it all out to its shareholders. And yet the company wants more.
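To put those figures in perspective, here's a quick back-of-the-envelope calculation using the (rounded) numbers above – the 0.58% yield applied to the $2.4 trillion market capitalization, compared against the $112 billion of free cash flow:

```python
# Back-of-the-envelope check on Apple's dividend vs. its free cash flow,
# using the rounded figures cited above. Illustrative only.
market_cap = 2.4e12        # $2.4 trillion market capitalization
dividend_yield = 0.0058    # 0.58% annual dividend yield
free_cash_flow = 112e9     # $112 billion annual free cash flow

annual_dividend = market_cap * dividend_yield    # total paid to shareholders
payout_ratio = annual_dividend / free_cash_flow  # share of FCF paid out

print(f"annual dividend outlay: ${annual_dividend / 1e9:.1f} billion")
print(f"share of free cash flow paid out: {payout_ratio:.0%}")
```

That works out to roughly $14 billion a year in dividends against $112 billion of free cash flow – barely an eighth of what the business generates.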
Apparently, it just couldn’t resist the lure of generating billions more in revenue. So, it did cross the line. And it did it in a way that hurt the digital advertising industry and gave itself a home field advantage.
Apple implemented software that enables consumers to block apps like Facebook from tracking usage. In general, that’s a good thing.
That change has cost software companies with smartphone applications tens of billions of dollars in lost revenue. Companies large and small have suffered, given the wide adoption of Apple's iPhones.
But it turned around and employed the same tactics for its own business. Apple is now aggressively expanding its own advertising business. And it’s doing something that it said it would never do – it is allowing advertisements within its own services.
Apple is using data from its services and our Apple accounts to generate advertising revenue. It’s no different than what Facebook and Google continue to do.
And Apple’s advertising business is already a massive business that generates around $4 billion a year. It will eclipse $10 billion in no time.
And while that’s just a tiny portion of Apple’s expected $400 billion annual revenues, it’s a large enough number to be a wildly successful stand-alone business worth more than $50 billion.
The reality is that most consumers simply won’t care. The convenience and utility of Apple’s products is simply too great.
And the reality is, Apple’s operating system is head-and-shoulders more secure than what Google puts out with its Android operating system. There’s no comparison between the two.
I just wish it hadn’t sullied itself with data surveillance and advertising. It wasn’t necessary to continue to grow and have a spectacular business. The lack of data surveillance made Apple stand out compared to the rest of big tech.
But it just couldn’t resist. And that’s the problem. Once you’ve crossed the line, it’s almost certain more transgressions will follow.
During the last couple of years, we saw Apple censor applications in its app store that didn’t fit the political narrative. And as much as I like Apple, it pains me to ask the next question: Which line will it cross next?
The James Webb Telescope now has its sights set on exoplanets…
Amazing images keep coming from the James Webb Space Telescope (JWST).
For the sake of newer readers, the JWST is the most complex telescope ever constructed. NASA launched it to a destination about one million miles from Earth back in December of last year.
After a journey of a few months to its final home at Lagrange Point 2 (L2), and a couple more months of calibration, the telescope began producing incredible images. L2 sits on the opposite side of the Earth from the Sun. That makes it incredibly cold and somewhat shielded from the Sun’s light, which is great for the telescope’s performance.
And in the latest development, the JWST just sent back its very first image of an exoplanet – that is, a planetary body outside our own solar system.
Here it is:
Our First Exoplanet Image
As we can see, this particular exoplanet is called HIP 65426 b.
It was a relatively easy target because it’s a young, hot planet that gives off a lot of infrared light – exactly what the JWST is designed to detect.
We can see from the primary image at the top that this exoplanet orbits a star identified by the same name. The “b” is for the specific planet. And those bottom images are close-ups of the exoplanet at different resolutions.
What’s remarkable here is that this planet is roughly 400 light-years away from Earth. That’s a ridiculous distance – light itself takes about 400 years to make the trip.
Yet, the JWST can capture incredibly high-resolution images at that distance.
It’s too far away for us to be able to have a clear visual image of the planet, but with the infrared light, we are able to “see” the planet. The image in the lower right is the highest resolution image produced.
So, the JWST continues to overdeliver with amazing images of our universe. NASA even said that the telescope is turning out to be 10 times better than expected.
And get this – the JWST is scheduled to image 79 other exoplanets between now and the end of the year. Some of those will be much closer to Earth than this one.
With this kind of imaging, we’re able to determine the atmospheric composition of these planets. We’ll be able to estimate both the oxygen and carbon dioxide levels.
That will give us great insight into which exoplanets could potentially sustain life, or for that matter are likely to already support life.
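To give a sense of how subtle these measurements are, here’s a rough, back-of-the-envelope sketch of the atmospheric signal in a transit (“transmission spectroscopy”) measurement – the main way atmospheric composition is determined for transiting planets. All the planet and star numbers below are illustrative assumptions for a generic hot gas giant, not measured values for HIP 65426 b (which was directly imaged and does not transit its star):

```python
# Rough scale of the atmospheric signal in a transmission spectroscopy
# measurement. All planet/star values are illustrative assumptions for a
# generic hot gas giant -- not measurements of HIP 65426 b.
K_B = 1.380649e-23      # Boltzmann constant, J/K
M_H = 1.6735e-27        # mass of a hydrogen atom, kg
R_JUP = 7.1492e7        # Jupiter radius, m
R_SUN = 6.957e8         # solar radius, m

T = 1500.0              # assumed atmospheric temperature, K
mu = 2.3 * M_H          # mean molecular mass (H2/He-dominated atmosphere)
g = 20.0                # assumed surface gravity, m/s^2
r_planet = 1.4 * R_JUP  # assumed planet radius
r_star = 1.2 * R_SUN    # assumed host-star radius

# Transit depth: the fraction of starlight blocked by the planet's disk.
depth = (r_planet / r_star) ** 2

# Atmospheric scale height, and the extra absorption from roughly five
# scale heights of atmosphere at wavelengths where molecules absorb.
scale_height = K_B * T / (mu * g)
atm_signal = 2 * r_planet * 5 * scale_height / r_star ** 2

print(f"transit depth: {depth:.2%}")
print(f"atmospheric signal: {atm_signal * 1e6:.0f} ppm")
```

The planet blocks around 1.4% of the star’s light, but the fingerprint of its atmosphere is only a few hundred parts per million on top of that – which is exactly the kind of precision the JWST was built to deliver.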
For the record, this particular exoplanet isn’t likely to support life. It’s too young and hot. But many other exoplanets already discovered in habitable zones may well prove to have the conditions for life.
Either way, we have a lot to look forward to from the JWST between now and New Year’s Day.
Facebook wants to read our minds…
Meta (formerly Facebook) just released some wild research around artificial intelligence (AI). And the company’s goal could be sinister.
Here’s what it’s up to…
Meta started off with a form of AI known as convolutional neural networks. Then it fed the AI about 150 hours of recordings of neurological activity from human brains. This was data Meta collected by presenting human subjects with words and language.
The brain’s electrical and magnetic activity was recorded as it processed each word or phrase. The goal was to use this data to teach the neural network which patterns of brain activity correspond to specific words and phrases.
And it worked.
The end result is an AI that can ingest a person’s neurological activity and infer, with good accuracy, what they are thinking at a given moment.
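To make the idea concrete, here’s a toy sketch of the supervised mapping being described: learn which word a burst of brain activity corresponds to. This is not Meta’s actual architecture or data – the vocabulary, feature dimensions, and synthetic “neural” signals below are all made up for illustration – just a minimal classifier showing the signal-to-word idea:

```python
import numpy as np

# Toy stand-in for the mapping described above: learn which word a
# (simulated) burst of brain activity corresponds to. This is NOT
# Meta's pipeline -- just a minimal supervised classifier trained on
# synthetic "neural" feature vectors.
rng = np.random.default_rng(0)

WORDS = ["hello", "world", "apple"]   # hypothetical vocabulary
DIM = 32                              # features per recording window
N_PER_WORD = 200                      # simulated recordings per word

# Each word gets its own characteristic activity pattern plus noise.
centers = rng.normal(0, 1, size=(len(WORDS), DIM))
X = np.vstack([centers[i] + 0.3 * rng.normal(size=(N_PER_WORD, DIM))
               for i in range(len(WORDS))])
y = np.repeat(np.arange(len(WORDS)), N_PER_WORD)

def train(X, y, n_classes, lr=0.5, steps=300):
    """Multinomial logistic regression trained by gradient descent."""
    W = np.zeros((X.shape[1], n_classes))
    onehot = np.eye(n_classes)[y]
    for _ in range(steps):
        logits = X @ W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * X.T @ (p - onehot) / len(X)
    return W

W = train(X, y, len(WORDS))
pred = (X @ W).argmax(axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Real brain signals are far noisier and far higher-dimensional than this, which is why Meta’s work required hundreds of hours of data and much more sophisticated models.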
That’s right – Meta developed an AI that can literally read our minds.
So that raises the big question – what is Meta up to here? Isn’t this a social media company that now wants to be a player in the metaverse space?
Well, there are some great applications of this technology.
If we think about the tens of millions of people out there who have suffered a traumatic brain injury – this tech could help them communicate with the outside world once again.
That’s certainly a noble cause. And no surprise, Meta is touting this kind of application.
But if we think about Meta’s core business model, it’s easy to see how there may be an ulterior motive here.
As regular readers know, Meta makes money by collecting personal data from consumers, generating somewhat of a dossier on each consumer, and then selling access to that information to advertisers. The more data Meta can collect, the more money it makes.
Well, if we are cynical, it’s easy to see how this technology could be a huge boon to Meta’s business.
Of course, Meta would have to convince consumers to wear some kind of headset to monitor their brain activity for this to work. It might even be designed as somewhat of a fashion item.
But if it does that, Meta could collect data on what we’re thinking whether or not we’re using a computer or software app. Even random thoughts would be captured.
We wouldn’t even have to say or do anything. Meta could record our private thoughts and use them to enhance our profile.
So this is something we should very much be aware of. I would caution anyone against using such a product.
That said, the tech’s potential is fantastic, especially for those who might have a disability.
If Meta combines this with the electromyography (EMG) technology it got from its acquisition of CTRL-labs, the company could come up with some absolutely wild applications.
Regular readers may remember that CTRL-labs developed a wristband that could monitor electrical signals traveling from the brain to the fingers.
By wearing a forearm band connected to a computer, users could control a computer with just their thoughts. It’s an exciting approach to a brain-computer interface (BCI).
We’re getting very close to the realm of what was once considered to be science fiction. Within the next twelve months we’re going to see transformative breakthroughs in both BCI and artificial intelligence that will ultimately change the way that we work with computing systems.
Nuro inks a landmark deal…
We’ll wrap up today with big news from one of the early-stage startups we’ve been tracking for a few years now in these pages.
Nuro just inked a 10-year deal with Uber Eats. Uber Eats will deploy Nuro’s third-generation autonomous delivery vehicles in certain cities across the U.S., starting with Mountain View, CA, and Houston, TX.
We just talked about Nuro’s latest delivery vehicle back in January.
As a reminder, it’s an autonomous vehicle (AV) about 20% smaller than the average car. And it’s fully optimized for food and grocery deliveries.
In fact, the vehicle can hold up to 24 bags of groceries. And it can carry up to 500 pounds.
Here it is:
Nuro’s Delivery Vehicle
As we can see, Nuro’s AV is fully capable of navigating on normal roads.
And it’s big enough that Nuro can load it with advanced sensors. Here we can see radar and lidar devices affixed to the top of the vehicle.
This makes Nuro’s vehicle unique in the autonomous delivery space. Most other companies are working on smaller vehicles that travel on sidewalks and bike paths.
And that’s why Uber Eats entered into a long-term arrangement with Nuro. Uber had been testing autonomous delivery with other companies. But Nuro clearly stood out.
I’m sure a big reason for this is Nuro’s cargo space. It’s capable of hauling several deliveries at the same time. This makes it incredibly efficient compared to the smaller vehicles that can only fit a single delivery at a time.
So, I’m very excited to see how the services in Mountain View and Houston go. And I suspect we’ll see Uber Eats and Nuro expand into several additional cities before the year is out.
The end result is that delivery costs will drop for consumers. That’s because Uber Eats won’t have to pay a delivery driver and because customers won’t need to add any tips.
So, it’s a win for both Uber and for consumers. And somewhat sadly, I suspect the AVs will prove to be more dependable than humans.
Regardless, this is a great development. And if anyone who lives in either city experiences what it’s like to receive an Uber Eats delivery from a Nuro vehicle, we’d love to hear about it here.
Editor, The Bleeding Edge