Apr 18 2014

“After four decades the smarthome comes of age,” proclaims Michael Wolf in Forbes Magazine.

Wolf is right up to a point, but he misses the key reason why the smarthome, or the entire internet of things, has become accessible – the technology has simply become affordable.

It was possible to build a smarthome two decades ago, but it was fiendishly expensive and only a few rich people could afford the technology. Today that technology is cheap and easy to install.

This is the common factor across all aspects of the Internet of Things: connecting devices has been possible since before the internet became common, but it was expensive and cumbersome, so only the highest-value equipment – such as oil rigs – was connected.

Now that it’s inexpensive and simple to connect things, people are doing it more, and that is why there’s a range of security and privacy issues which weren’t so pressing when only a few obscure industrial devices were wired up.

We aren’t reinventing the wheel with technologies like the internet of things or big data; they already existed – they are just more accessible, and that’s what’s changing business.

Apr 12 2014
steam train and inefficient business
A strange piece by author Jeremy Rifkin in The Guardian argues the internet of things will facilitate an economic shift from markets to a collaborative commons, which threatens capitalism as marginal costs fall to zero.

Rifkin argues that the rise of the ‘prosumer’, who contributes content and adds economic value for free, is undermining the basic tenets of capitalism.

A telling blow to capitalism in Rifkin’s eyes is the abundant data generated by the Internet of Things:

Siemens, IBM, Cisco and General Electric are among the firms erecting an internet-of-things infrastructure, connecting the world in a global neural network.

There are now 11 billion sensors connecting devices to the internet of things. By 2030, 100 trillion sensors will be attached to natural resources, production lines, warehouses, transportation networks, the electricity grid and recycling flows, and be implanted in homes, offices, stores, and vehicles – continually sending big data to the communications, energy and logistics internets.

Anyone will be able to access the internet of things and use big data and analytics to develop predictive algorithms that can speed efficiency, dramatically increase productivity and lower the marginal cost of producing and distributing physical things, including energy, products and services, to near zero, just as we now do with information goods.

That Rifkin mentions large corporations like Cisco, Siemens, IBM and General Electric illustrates the flaw in his idea — these companies are profiting from the Internet of Things and the data it’s generating.

Rather than being killed, capitalism is evolving into the new marketplaces.

Nowhere is this truer than in the sharing economy, where the new lords of the digital manor are profiting from the work and free content generated by unpaid ‘prosumers’.

How long the free business models can survive is open to question. In many respects the age of the digital sharecropper is a transition phase that isn’t sustainable, and it’s more likely we’re seeing a move to an economy where information is far more abundant than it was previously.

Such a change is not unprecedented; consider the far more basic human needs of food and energy. In Western economies, we have been living in a time of unimagined abundance of both for the last century.

In subsistence economies, food and the energy to grow or hunt it are scarce, and that’s why living standards are low and life expectancies are short. Agricultural societies started to solve the food scarcity problem, and industrial societies automated farming and increased living standards through abundant energy.

During the pre-industrial era, the basic unit of energy was the horse – hence the term horsepower – and it was rare to have more than four horses driving a coach or piece of machinery.

Today, we have locomotive engines that provide 6,000 horsepower, a basic farm tractor delivers around 100 HP and a typical family car around 200. We live in an age of abundant energy and our living standards reflect it.

We’re moving into an era of abundant information that will change our societies in a similar way to how the age of abundant power has changed economies over the past 300 years.

Open source, the sharing economy and the internet of things will all change aspects of our economies and society but people will still be making a living one way or another so they can buy a meal and pay their rent.

The age of abundant information means massive change to the way we work, but it no more means the end of capitalism than the steam engine did.

Apr 02 2014

“This is the most difficult time in history to be a wine maker,” declares Paul Mabray, Chief Strategy Officer and founder of Vintank.

“Never has the wine industry been as competitive as it is today.”

Update: The Wine Communicators of Australia, who sponsored Mabray’s visit, have posted Paul’s presentation that covers this post’s theme in more detail.

Mabray’s business monitors social media for wineries and collects information on wine enthusiasts. Since Vintank’s founding in 2008 the service has collected information on over thirteen million people and their tastes in wine.

Rewriting the rule book

Social media, or social Customer Relationship Management (sCRM), is what Mabray sees as part of the future of a wine industry that’s evolving away from a model developed in the 1970s, one which started to break down with the financial crisis of 2009.

“In the old days there was a playbook originating with Robert Mondavi in the 1970s which is create amazing wine, you get amazing reviews and you go find wholesalers who bring this wine to the market.”

“As a result of the global proliferation of brands the increase of awareness and consumption patterns where people like wine more, those playbooks didn’t work in 2009 when the crisis started.”

With the old marketing playbook not working, wineries had to find other methods to connect to their markets and social media has become one of the key channels.

Now the challenge in the wine industry, like all sectors, is dealing with the massive amount of data coming in through social media and other channels.

The cacophony of data

“If you rewind to when social media came out, everyone had these stream based things and the noise factor was so heavy,” says Mabray.

“For small businesses this creates an ‘analysis to paralysis’ where they’d rather not do anything.”

Mabray sees paralysis as a problem for all organisations, particularly for big brands who are being overwhelmed by data.

“The cacophony of data at a brand level is just too much,” he says.

“It’s as noisy as all get go and I think the transition is to break Big Data down into small bite size pieces for businesses to digest is the future, it shouldn’t be the businesses problem, it should be the software companies’.”

A growing digital divide

Mabray sees a divide developing between the producers who are embracing technology and those who aren’t, “the efficiencies attributed to technology are obvious whether they’re using CRM, business intelligence or other components.”

“The people who are doing this are recognising the growth and saying ‘hey, this stuff actually works! If I feed the horse it runs.’”

While Mabray is focused on digital media and the wine industry, similar factors are at work in other industries and technology sectors, whether it’s data collected by farm sensors or posts on Instagram or Facebook.

Facebook blues

Mabray is less than impressed with Facebook and sees businesses concentrating on the social media service as making a mistake.

“I think that every social media platform that’s been developed had such a strong emphasis on consumer to consumer interaction that they’ve left the business behind, despite thinking that business will pay the bills.”

“As a result almost every single business application that’s come from these social media companies has met with hiccups. That’s because it wasn’t part of the original plan.”

Facebook in particular is problematic in his view, “it’s like setting up a kiosk in the supermall of the world.”

The business anger towards Facebook’s recent changes is due to the effort companies have put into the platform, Mabray believes; “everyone’s angry about Facebook because we put so much into getting the data there.”

“We said ‘go meet us on Facebook’, we spent money collecting the items and manufacturing the content to attract people and now we have to spend money to get the attention of the people we attracted to the service in the first place.”

Despite the downsides of social media, Mabray sees customer support as one of the key uses for the services. “It’s easy to do in 140 characters.”

Context is king

“Everything comes back to context. There’s this phrase that ‘content is king’,” Mabray says. “Context is king.”

“Anyone can produce content. It’s a bull market for free content. We have content pollution – there’s so much junk to wade through.”

Mabray’s advice to business is to listen to the market: “Customers are in control more than they have ever been in human history: Google flattens the world and social media amplifies it.”

For wineries, like most other industries, the opportunity is to deal with that flat, amplified world.

Mar 24 2014
Smart rubbish bins in Barcelona

UK tech site The Register reports that Google Flu Trends has been a dismal failure, with the service over-reporting the incidence of influenza by a factor of nearly 12.

The reason for this problem is that the algorithm used to detect a flu outbreak relies on people searching for the terms ‘flu’ or ‘influenza’, and it turns out we tend to over-react to a dose of the sniffles.

Google Flu Trends’ failure illustrates two important things about big data – the veracity of the data coming into the system and the validity of the assumptions underlying the algorithms processing the information.

In the case of Google Flu Trends both were flawed; the algorithm was based on incorrect assumptions while the incoming data was at best dubious.
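To make the failure mode concrete, here is a toy sketch of a proxy-based estimator. All the numbers are invented for illustration – this is not Google’s actual model – but it shows how a scale factor fitted in a season when searches tracked real cases silently breaks when a media scare drives searches by healthy people:

```python
# Toy proxy-based estimator. All figures are invented for illustration.

# Calibration season: search volume tracked real cases well.
calib_searches = 50_000
calib_cases = 5_000

# A later flu scare: media coverage drives searches by healthy people,
# but actual cases barely rise.
later_searches = 600_000
later_actual_cases = 6_000

# Apply the calibrated ratio of cases to searches to the new season.
estimated_cases = later_searches * calib_cases // calib_searches

overshoot = estimated_cases / later_actual_cases
print(estimated_cases)  # 60000
print(overshoot)        # 10.0 – over-reporting by an order of magnitude
```

The estimator’s underlying assumption – that search volume is proportional to illness – held during calibration and failed afterwards, which is exactly the kind of flawed assumption described above.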

The latter point is an important factor for the Internet of Machines. Instead of humans entering search terms, millions of sensors are pumping data into the system, so bad data from one sensor can have catastrophic effects on the rest of the network.

As managing data becomes a greater task for businesses and governments, making sure that data is trustworthy will be essential and the rules that govern how the information is used will have to be robust.

Hopefully the lessons of Google Flu Trends will save us from more serious mistakes as we come to depend on what algorithms tell us about the data.

Mar 19 2014

Today was the main day of the Melbourne Cisco Live conference, the company’s annual Australian event.

Much of the talk was around the Internet of Everything — which will be the basis of subsequent posts — with a constant theme around the explosion of data.

A favourite statistic was that of Cisco’s Executive Vice President, who pointed out that US department store chain Walmart collects 2.5 petabytes of customer data every hour.

The reason for this was explained by GE’s Australia and New Zealand CIO, Mark Sheppard, who pointed out that twenty years ago jet engines had few sensors while today they have hundreds – a point also made by Team Lotus’ Engineering Director Nick Chester to Networked Globe.

Chester observes that when he started in Formula One racing two decades ago, there were four or five sensors on a racing car; today Lotus’ vehicles have over two hundred.
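A rough back-of-envelope calculation shows why a couple of hundred sensors add up so quickly. The sampling rate and sample size below are assumptions for illustration, not Lotus’ actual figures:

```python
# Back-of-envelope data volume for a modern racing car's sensors.
# Sampling rate and sample size are assumed, purely for illustration.
sensors = 200
samples_per_second = 1_000  # 1 kHz logging, assumed
bytes_per_sample = 8        # one timestamped reading, assumed

bytes_per_second = sensors * samples_per_second * bytes_per_sample
gigabytes_per_hour = bytes_per_second * 3600 / 1e9

print(bytes_per_second)              # 1600000
print(round(gigabytes_per_hour, 2))  # 5.76
```

Even on these modest assumptions a single car generates gigabytes per hour; multiply that across a fleet of engines, stores or turbines and the scale of the data management challenge is clear.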

All of these sensors are creating massive amounts of data and the big challenge for businesses is to manage all of this information, something we’ll be exploring over the next few weeks.

Mar 16 2014

Last year car rental giant AvisBudget acquired the vehicle sharing service Zipcar; at the time it looked like the established player was buying in the tech smarts of a younger startup.

Citing ‘synergies’ at the time of a takeover is always a warning sign that a corporate acquisition may not go well, and so it has proved with Avis’ efforts with Zipcar, as travel news site Skift reports:

Speaking at the J.P. Morgan Gaming, Lodging, Restaurant & Leisure Management Access Forum in Las Vegas earlier this week, AvisBudget CEO Ron Nelson said fleet-sharing has turned out to be more complicated than the company thought because there’s a cost tied to moving the vehicles from one location to another.

That’s a strange statement as a casual observer would be forgiven for thinking that if any organisation understood the costs of moving vehicles around it would be a car hire company.

Apparently that’s not the case, and the ‘synergies’ from the acquisition will be pushed back to 2015.

Synergies are elusive things and it may well prove that Ron Nelson would be better served by examining how Zipcar’s technology, algorithms and flat management structures can be applied to a more staid organisation like Avis.

The real value in companies like Zipcar and Uber is the way they are applying technology to moving physical goods around – it’s no surprise that Uber’s Travis Kalanick describes his ambition for the future of his company as being the Amazon for logistics.

For Avis, Zipcar’s opportunities lie in more than just enhancing the company’s fleet utilisation; understanding the marketplace and predicting demand is where the real gains could be made.

Mar 15 2014

MuleSoft founder and CTO Ross Mason worries about how companies are going to manage the data generated by the Internet of Things.

“I don’t think we’re ready for the amount of data that these devices are designed to build up,” Ross observes in the latest Decoding the New Economy video.

Ross’ aim in founding MuleSoft was to eliminate the donkey work in connecting IT systems, and he sees the data moving between enterprise applications as a challenge for organisations.

“We have energy companies that have connected their smart grid systems to their back end systems and most of them delete almost all the data because of the cost of storing that much data without doing anything with it.”

“Big data is still in the realm of we’re figuring out the questions to ask,” Ross states, echoing the views expressed by Tableau Software founder Pat Hanrahan a few weeks ago.

“There’s a little bit of hype around big data right now, but it’s a very real trend,” Hanrahan said. “Just look at the increase in the amount of data that’s been going up exponentially and that’s just the natural result of technology; we have more sensors, we collect more data, we have faster computers and bigger disks.”

The interview with Ross covers his journey from setting up Mulesoft to the future of big data and software. It was recorded a few days before the company announced a major capital raising.

MuleSoft’s elimination of software ‘donkey work’ is another example of how the IT industry is changing as many of the inefficiencies are being worked out of the way developers and programmers work.

In many ways, Ross Mason’s story illustrates how the software industry itself is being disrupted as much as any other sector.