Peak Wireless and the data paradox

Have we reached the limits of wireless internet?

Australia’s government research agency, the CSIRO, released a somewhat alarming media alert this morning warning that our cities are approaching Peak Data.

Peak Data, a term that borrows from the ‘Peak Oil’ concept popularised in the 1970s to describe the point where oil production reaches its maximum, is the point where we run out of available bandwidth on our wireless networks.

The release accompanies the agency’s new report, A World Without Wires, which lays out its view of the future of cellular and radio communications.

“In the future, how spectrum is allocated may change and we can expect innovation to find new ways to make it more efficient but the underlying position is that spectrum is an increasingly rare resource,” says the CSIRO’s Director of Digital Productivity and Services Flagship, Dr Ian Oppermann.

“With more and more essential services, including medical, education and government services, being delivered digitally and on mobile devices, finding a solution to ‘peak data’ will become ever more important into the future.”

The wireless data paradox

It’s a paradox that just as we’re entering a world of unlimited data, we face limits on what we can broadcast wirelessly as radio spectrum becomes scarce and contested.

With fixed line communications, particularly fibre optics, capacity can be increased relatively simply by laying more cables. Wireless has only one environment to broadcast in, so finding ways of pushing more data through the airwaves is what much of the CSIRO’s paper addresses.
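
To put some numbers on why spectrum is the binding constraint, the Shannon-Hartley theorem sets a hard ceiling on how much data any radio channel can carry. The back-of-envelope sketch below is my own illustration, not from the CSIRO report; the 20 MHz channel and 20 dB signal-to-noise ratio are simply assumed figures.

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: the maximum error-free bit rate of a radio channel."""
    return bandwidth_hz * log2(1 + snr_linear)

# Assumed figures: a 20 MHz channel at a signal-to-noise ratio of 100 (20 dB).
bandwidth = 20e6   # Hz
snr = 100          # linear ratio, equivalent to 20 dB
print(f"Theoretical ceiling: {shannon_capacity_bps(bandwidth, snr) / 1e6:.0f} Mbps")

# Every extra user shares that ceiling; once the bandwidth is fixed, only a
# better signal-to-noise ratio or denser frequency reuse can lift throughput.
```

However clever the encoding, that ceiling is shared by everyone on the channel, which is why the report treats spectrum as a scarce resource rather than a problem that engineering alone can solve.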

For telecommunications companies, this presents both a challenge and an opportunity; the challenge being squeezing more data into limited spectrum while the opportunity lies in charging more for guaranteed connectivity.

The latter raises questions about network neutrality and whether different types of traffic across wireless networks can be charged differently or given differing levels of priority.

Distributing the load

This also gives credence to distributed processing strategies like Cisco’s Fog Computing concept, which takes load off public networks and can potentially hand traffic over to fixed networks or point-to-point microwave services.
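
As a rough illustration of that approach (a hypothetical sketch, not Cisco’s actual Fog Computing products), an edge gateway can aggregate raw sensor readings locally and push only compact summaries upstream, sparing the congested wireless link:

```python
from statistics import mean
from typing import Iterable

def summarise_readings(readings: Iterable[float]) -> dict:
    """Reduce a window of raw sensor readings to a compact summary record."""
    values = list(readings)
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
    }

# One minute of one-per-second temperature readings collected at the edge...
raw_window = [21.0 + 0.1 * i for i in range(60)]

# ...becomes a single small record sent over the constrained wireless network.
upstream_payload = summarise_readings(raw_window)
print(upstream_payload)
```

The full-resolution data can still travel later over a fixed line or a point-to-point link; only the time-critical summary competes for scarce mobile spectrum.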

While M2M data is tiny compared to voice and domestic users’ needs, it does mean business-critical services will have to compete with other users, whether on private Wi-Fi frequencies or public mobile network spectrum.

Overall, though, the situation isn’t quite as dire as it seems; technological advances will find new ways of squeezing data into the available spectrum, and aggressively priced data plans will discourage customers from using data-intensive applications.

A key lesson from this, though, is that those designing M2M, Internet of Things or smart city applications can’t assume bandwidth will always be available for communicating with their devices.

For the Internet of Things, robust design will require considering security, latency and quality of service.


Cloud computing’s walled gardens

Things are getting interesting for cloud computing as vendors try to lock users into their walled gardens

I’ve spent the last few days playing with Microsoft’s Office 365 and its iOS Apps for a review for tomorrow’s Business Spectator.

One thing that’s clear when comparing the various competitors in the online space is how all of them are trying to lock users into their own walled gardens.

That the various web empires are trying to lock us into their worlds isn’t surprising – it’s been going on for some time – but it’s becoming harder to avoid committing to one empire or another.

For the next generation of computing this is going to be a challenge; the Internet of Things will be crippled if your brand of smart car won’t talk to your phone or intelligent garage door opener, let alone if logistics chains break down because of an incompatibility somewhere in the process.

The cloud computing industry has entered an interesting period where the big players are hoping to carve up the market for themselves; what the market thinks about this remains to be seen.


Evangelism and the makers’ movement

Salesforce’s Reid Carlberg talks tech evangelism, the Internet of Things and the makers’ movement

The latest Decoding the New Economy interview is with Salesforce’s Reid Carlberg.

During the interview we cover how the Internet of Things and big data are changing business and society, along with Reid’s journey to becoming a software company’s evangelist.

Reid has a fascinating story to tell about how the makers’ movement is evolving as big data and the internet of things develops.

The interview is an insight into a winding career path and how Big Data and the Internet of Things are changing business and society.


Moving from an industrial era to a data age

Cisco Vice President Wim Elfrink describes the opportunities of the Internet of Everything

The last two weeks have been pretty hectic with Cisco, Salesforce and Microsoft events in Melbourne; as a result there’s a huge backlog of posts to put up.

One of the interviews to come out of those events is with Cisco’s Vice President for Globalisation, Wim Elfrink, and it’s now up on the Decoding the New Economy YouTube channel.

In it, Wim covers how the next wave of emerging nations – the TIPSS of Turkey, Indonesia, Poland, Saudi Arabia and South Africa – threaten to leapfrog the developed world, and the opportunities for businesses in a world where everything is connected.


Garbage In and Garbage Out

The success of using Big Data depends upon the quality of both the data and the algorithm

UK tech site The Register reports that Google Flu Trends has been a dismal failure, with the service over-reporting the incidence of influenza by a factor of nearly 12.

The problem is that the algorithm used to detect a flu outbreak relies on people searching for the terms ‘flu’ or ‘influenza’, and it turns out we tend to over-react to a dose of the sniffles.

Google Flu Trends’ failure illustrates two important things about big data – the veracity of the data coming into the system and the validity of the assumptions underlying the algorithms processing the information.

In the case of Google Flu Trends both were flawed; the algorithm was based on incorrect assumptions while the incoming data was at best dubious.

The latter point is an important factor for the Internet of Machines. Instead of humans entering search terms, millions of sensors are pumping data into the system, so bad data from one sensor can have catastrophic effects on the rest of the network.
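
A minimal defence, sketched here with invented thresholds rather than any particular platform’s rules, is to sanity-check every reading before it enters the wider system so a single faulty sensor can’t poison the aggregate figures downstream.

```python
def is_plausible(reading: float, lower: float, upper: float) -> bool:
    """Reject readings outside the physically plausible range for the sensor."""
    return lower <= reading <= upper

# Hypothetical ambient temperature sensor: anything outside -40..60 °C is junk.
incoming = [18.2, 18.4, 999.0, 18.3, -273.0, 18.5]
clean = [r for r in incoming if is_plausible(r, -40.0, 60.0)]

print(f"kept {len(clean)} of {len(incoming)} readings")
print(f"average of clean readings: {sum(clean) / len(clean):.1f} °C")

# Without the filter, the two faulty readings would drag the 'average'
# up to roughly 133 °C - exactly the garbage-in, garbage-out problem.
```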

As managing data becomes a greater task for businesses and governments, making sure that data is trustworthy will be essential and the rules that govern how the information is used will have to be robust.

Hopefully the lessons of Google Flu Trends will save us from more serious mistakes as we come to depend on what algorithms tell us about the data.


Eliminating the donkey work

Ross Mason, founder of Mulesoft, sees Big Data as one of the challenges facing business

Mulesoft founder and CTO Ross Mason worries about how companies are going to manage the data generated by the Internet of Things.

“I don’t think we’re ready for the amount of data that these devices are designed to build up,” Ross observes in the latest Decoding the New Economy video.

Ross’ aim in founding Mulesoft was to eliminate the donkey work in connecting IT systems, and he sees the data moving between enterprise applications as a growing challenge for organisations.

“We have energy companies that have connected their smart grid systems to their back end systems and most of them delete almost all the data because of the cost of storing that much data without doing anything with it.”

“Big data is still in the realm of we’re figuring out the questions to ask,” Ross says, echoing the views expressed by Tableau Software founder Pat Hanrahan a few weeks ago.

“There’s a little bit of hype around big data right now, but it’s a very real trend,” Hanrahan said. “Just look at the increase in the amount of data that’s been going up exponentially and that’s just the natural result of technology; we have more sensors, we collect more data, we have faster computers and bigger disks.”

The interview with Ross covers his journey from setting up Mulesoft to the future of big data and software. It was recorded a few days before the company announced a major capital raising.

Mulesoft’s elimination of software ‘donkey work’ is another example of how the IT industry is changing as the inefficiencies are worked out of the way developers and programmers work.

In many ways, Ross Mason’s story illustrates how the software industry itself is being disrupted as much as any other sector.


The Internet of Racing Machines

Formula One racing gives us a glimpse of the technologies that will be commonplace in businesses in the near future.

For the Formula One racing circuit, the financial crisis of six years ago was an opportunity to reinvent the sport; today the teams use a combination of technologies to gain an advantage over their competitors.

“A few years ago you wouldn’t have been here today,” Francois Puentes, Head of Account Management at Team Lotus, told a group of journalists ahead of this week’s Melbourne Grand Prix. “F1 was a completely different sport.”

The 2009 financial crisis was the catalyst for the changes, Puentes says. “We all sat down as teams at the same table to make the sport more sustainable; this obliged us to run the sport as a business.”

“Before we didn’t know what the unit cost was for a part. We would very often produce two of the same parts without even knowing what was going on.”

To tighten its management systems, Lotus brought in a range of cloud-based business software such as Microsoft Dynamics and accelerated its adoption of computerised manufacturing techniques.

Speeding up development

Lotus employs over 500 people to keep its two cars on the road and most of the vehicles’ parts are designed and manufactured at its headquarters in Oxfordshire, England. During the season the team’s workshop may produce up to five hundred replacement or redesigned components each week.

This brings together a number of technologies including Computer Aided Design, 3D Printing and cloud computing.

The internet of racing machines

Massive rule changes have also accelerated Formula One’s adoption of in-car technology, with information being gathered from sensors throughout the vehicles.

During races, data is transmitted from the vehicles’ sensors by radio for the teams’ crews to analyse performance. This includes information like gearbox temperature, tyre condition and aerodynamic performance data.

Following the race, larger volumes of data are downloaded from the vehicle for engineers to tune the car for the next event.
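
As a rough picture of what that telemetry looks like (the channel names and alert thresholds below are invented for illustration, not Lotus’s actual systems), a pit-wall script might scan each incoming sample and flag any channel drifting out of range:

```python
# Hypothetical telemetry sample as it might arrive from the car by radio.
sample = {
    "lap": 23,
    "gearbox_temp_c": 128.4,
    "tyre_pressure_fl_psi": 21.7,
    "brake_temp_rear_c": 810.0,
}

# Invented alert thresholds per channel (upper limits only, for simplicity).
limits = {
    "gearbox_temp_c": 140.0,
    "tyre_pressure_fl_psi": 24.0,
    "brake_temp_rear_c": 750.0,
}

for name, limit in limits.items():
    if sample.get(name, 0.0) > limit:
        print(f"lap {sample['lap']}: {name} = {sample[name]} exceeds limit of {limit}")
```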

While Lotus has teamed with technology companies like Microsoft and EMC, rival team Caterham partnered with GE whose Global Research team worked to integrate the technologies demanded by the new F1 rules.

Global technology

Caterham’s cars use intercoolers developed in Germany, carbon fibre composites and fibre optic sensors from the United States, and big data analysis techniques developed in India.

Key to gathering that data are sensors throughout the vehicle that capture a constant stream of information about the forces acting on the car during the race, transmitting it far more efficiently than traditional methods, which relied on load sensors attached to the suspension.

The result is massive volumes of raw data. On the track, Caterham cars generate 1,000 points of data a second from more than 2,000 data channels. Up to 500 different sensors constantly capture and relay data back to the team’s command centre for urgent analysis.
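
Taken at face value, those figures add up to a substantial stream over a race distance. The arithmetic below is a back-of-envelope illustration only; the eight bytes per data point is an assumption, not a figure from Caterham or GE.

```python
points_per_second = 1_000        # figure quoted for the Caterham cars
race_duration_s = 2 * 60 * 60    # a roughly two-hour Grand Prix
bytes_per_point = 8              # assumed size of a single raw sample

points_per_race = points_per_second * race_duration_s
volume_mb = points_per_race * bytes_per_point / 1e6

print(f"{points_per_race:,} data points per car per race")
print(f"roughly {volume_mb:.0f} MB of raw samples, before any channel metadata")
```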

Learning from Big Data

By applying what the company has learned from its Industrial Internet projects, GE was able to help Caterham cut its data processing time in half, leaving the team in a stronger strategic and tactical position.

Thanks to these analysis techniques, the Caterham team can look at slices of its data across an entire season, pinpoint setups that were particularly effective, and identify reliability issues earlier.

Inside the vehicle, GE has also found a way to replace metal pipes with carbon fibre, reducing the overall weight of the vehicle.

These technology developments will continue to find applications beyond the 2014 Grand Prix season.

Carbon composites are being used extensively in the aviation industry and big data analysis is playing an important role in the renewable energy sector.

Lewis Butler, Caterham’s chief designer, says working with GE is helping the team deepen its skills base.

“GE are working with Caterham to help with the manufacturing process and knowledge transfer, and giving Caterham F1 Team the capability to manufacture its own parts,” he says.

All the Formula One teams are using Internet of Things technologies to gather information on their vehicles, Big Data tools to manage that information, and 3D printing to accelerate their research and manufacturing processes.

The Formula One world is a glimpse into the future of business as various technologies come together to change the way industries operate.

Paul travelled to the Melbourne Grand Prix as a guest of Microsoft and Team Lotus.
