Data and the modern movie producer

Weta Digital shows how other businesses will have to manage data in the coming years.

Dealing with the massive wave of data flowing into businesses will be one of the defining management issues of the next decade. One company already grappling with this is New Zealand’s Weta Digital.

Wellington-based Weta, best known for its work on The Lord of the Rings and part-owned by director Peter Jackson, employs 1,400 staff on its movie special effects work and has won five visual effects Academy Awards over its 23 years of operation.

Kathy Gruzas, Weta Digital’s CIO, spoke to Decoding the New Economy at the Oracle OpenWorld forum in San Francisco this week about some of the challenges of dealing with the massive amount of data generated by the movie effects industry.

“We have some very heavy loads,” Kathy states. “We push our systems to the limit.”

Applying powerful systems

One challenge is the sheer computing power required. “The render farm processes one frame per server until you have four seconds of footage. Sometimes that takes overnight or even longer, and for that we use a lot of storage,” Kathy says. “The render farm, being six thousand servers, will write 60 to 100 terabytes of data a day and read a quarter to half a petabyte each day.”

“We need systems that will be very large to handle the volume of data we generate but also be very quick to handle those read and writes.”

“One render could use a thousand computers, sometimes more, and all of those will be reading and writing against the same block of storage, so we have our own software layer that directs those loads. We try to minimise the load on our storage, but we have the worst workload you can imagine: lots of servers, lots of small reads and writes, many of them random and concurrent, with pockets of hot files.”
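
A rough, illustrative calculation, using only the daily volumes Gruzas quotes above, shows what those figures mean as sustained throughput across the farm:

```python
# Back-of-the-envelope arithmetic from the quoted figures:
# 60-100 TB written and 250-500 TB (a quarter to half a petabyte)
# read across the render farm each day.
SECONDS_PER_DAY = 24 * 60 * 60

def sustained_rate_gb_per_s(tb_per_day: float) -> float:
    """Convert a daily volume in terabytes to a sustained rate in GB/s."""
    return tb_per_day * 1_000 / SECONDS_PER_DAY

for label, low, high in [("write", 60, 100), ("read", 250, 500)]:
    print(f"Sustained {label} rate: "
          f"{sustained_rate_gb_per_s(low):.2f}-{sustained_rate_gb_per_s(high):.2f} GB/s")
```

Even at the low end that works out to roughly 0.7 GB/s of writes and nearly 3 GB/s of reads sustained around the clock, which helps explain why a custom software layer is needed to direct those loads.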

Despite the automation, the business is still extremely labour intensive. “In visual effects you probably need at least three hundred artists to work on one film; it’s a very labour-intensive process to do the artistry, much like a production line.”

Going mobile

The nature of modern movie production means the effects teams are now part of the shoot which adds another level of complexity for Weta. “Although we are visual effects which is largely post-production we do go out with crews when they’re shooting the movie so we can do reference photography,” says Kathy.

“We do 3D scans so if we need to do something digitally and we do motion and facial capture as well,” she says. “There are 240 muscles that we tweak individually to get the expression. That’s a huge amount of data to capture.”

To do this, Weta created their own ‘road case’ that contains everything they need to grab the shots and store the data they need. “You can’t ask the director to retake the shot because we missed something.”

Into the forest

“We have to take the case into the forest and into the rain and everywhere. It’s good having that roadcase that has storage, networking and servers in it.” The case, which was self-assembled by Weta’s team and handles “lots of data capture and sub-rendering,” is “probably the most travelled Oracle system on the planet,” laughs Kathy.

Weta’s story illustrates just how managing data is becoming a critical issue for companies. While movie special effects is a specialised field well ahead of most businesses in its technology use, Weta shows the importance of managing and securing data.

For other businesses, the lesson from Weta is that understanding your company’s needs – including those of staff and customers – and then investing in the right tools to deliver them is essential.

One important difference between technology-intensive businesses like Weta and most other organisations is that the New Zealand company does most of its processing and storage in house. Those without the same needs will almost certainly be shifting these tasks to the cloud.

Thinking about networked thinking

In a world awash with data, managers may have to start thinking about networked thinking.

“We want to be the Waze of enterprise software” is the line being repeated by executives at the Inforum 2016 conference in New York today.

This is an interesting strategy for Infor, which provides a range of enterprise software tools to help companies track what is going on in their business, as Waze is built on aggregating user data to identify traffic problems and improve commuting times. It’s no surprise that Google bought the company a few years ago.

Infor’s position though is slightly different, as it’s aggregating individual clients’ data for them. In a world where organisations are struggling not to be overwhelmed by information, Infor is in a good position, even if its executives do overdo the buzzwords.

Which leads us to another buzzphrase – design thinking – which has been drifting in and out of fashion over recent years. During the opening keynotes one of the comments was about the rise of “network thinking.”

“Eighty percent of what most companies do deals with data from outside of their organisation,” says Kurt Cavano, Infor’s General Manager of their commerce cloud division. “We’ve seen the power of networks with sites like Facebook, LinkedIn and Waze.”

“Nobody wants to be on a network but everyone’s on a network. It takes a long time to build but once you have one it’s magical. That’s what we’re thinking for business, they need to evolve.”

In one respect this is another take on the ecosystem idea – that one vital corporate asset in the connected world is an ecosystem of partners, suppliers and users. However, the Infor view articulated by Cavano is much more about the flow of data than the goodwill of a community.

So we may well be entering a world of ‘networked thinking’ where thinking about the effects of data flows and being able to understand them – if not manage them – becomes a key executive skill.

Paul travelled to New York as a guest of Infor.

Evolving into a data centric company

The newly demerged HP Enterprise is dealing with a shifting market and a change in product focus.

I’m currently at the HP Enterprise Seize the Data roadshow in Singapore where the recently split company is showing off its range of data analytics tools.

Like companies such as IBM and Google, HPE are looking to make money out of data feeds and analytics with a key part being a platform for developers to create applications.

In launching their Haven OnDemand service, HPE are entering a crowded field with IBM, Salesforce, AWS and Splunk – among others – offering similar products. What compelling difference HPE will bring to the field is something I’ll be asking the company’s executives later.

One of the other services, HP Vertica, handles data analytics against structured and ‘semi-structured’ sources. Again, this is a field where other companies are well established and have the advantage of being able to examine unstructured data.

The overwhelming question though is how big, and how lucrative, the market for these data products is. It’s not clear exactly how all of these companies are going to monetize these services and, should they manage to, how profitable they will be.

As a company finding its feet less than a year after being split in two, with the added problem of seeing its core server hardware business eroded, HP Enterprise is realigning its business around data analytics and cloud services.

The challenge for the company is differentiating itself and providing competitive products in these markets – a tough task.

Volkswagen shows the IoT’s data weakness

The Volkswagen emissions scandal shows the data weakness in the internet of things

The Volkswagen emissions scandal has rocked the company and cost its CEO his job, but the company’s falsifying of data to pass regulators’ tests has serious implications for the Internet of Things.

As the Los Angeles Times explains, Volkswagen designed software to detect when its cars were being tested. During tests, the software would modify the car’s performance to give a false result.

This is similar to the Stuxnet worm, which sent Iranian operators false information indicating the uranium enrichment centrifuges were operating normally when in truth they were running at speeds well outside their design limits.

Both the Volkswagen fraud and the Stuxnet worm show how software can be used to tell lies about data. For processes and businesses relying on that data, it’s critical to know that information is reliable and correct.

Data is the raw material of the internet of things and all the value derived comes from analysing that information. If the information is false, then there’s no value in the IoT. Designing systems that guarantee the integrity of data is going to be essential as devices become more connected.
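
One common building block for making tampering detectable – a minimal sketch, not anything Volkswagen or any IoT vendor mentioned here actually uses – is for a device to sign each reading with a keyed hash, so any alteration of the data invalidates the signature. The key and field names below are illustrative assumptions:

```python
import hashlib
import hmac
import json

SECRET = b"shared-device-key"  # hypothetical key provisioned to the sensor

def sign(reading: dict) -> str:
    """Produce an HMAC-SHA256 tag over a canonical form of the reading."""
    payload = json.dumps(reading, sort_keys=True).encode()
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(reading: dict, tag: str) -> bool:
    """Recompute the tag; any change to the data makes this fail."""
    return hmac.compare_digest(sign(reading), tag)

reading = {"sensor": "no2", "value": 41.7}
tag = sign(reading)
print(verify(reading, tag))                     # True: data untampered
print(verify({**reading, "value": 12.0}, tag))  # False: value altered
```

This only proves the data was not changed after signing; it cannot stop a compromised device lying at the source, which is why end-to-end system design matters.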

Programming the Internet’s advertising

AppNexus President Michael Rubenstein tells us how the advertising industry is evolving with the internet

Michael Rubenstein, President of AppNexus, is the subject of the first interview for a while on the Decoding the New Economy channel.

Rubenstein joined AppNexus as employee number 18 in 2009 and has been part of the company’s growth from a small startup to a global technology company with a workforce of 1,000 professionals.

AppNexus is one of the new wave of companies managing and programming online advertising, helping advertisers and publishers target their products better while giving ad tech companies deeper insights and data.

In this interview, Rubenstein discusses some of the forces changing global advertising along with the challenges of dealing with a high growth business.

Apologies for the bad hair on my part.

Management in an age of information abundance

How do managers and business owners deal with an age of abundant information?

The Twentieth Century was defined by abundant and cheap energy while this century will be shaped by our access to massive amounts of data.

How do managers deal with the information age along with the changes brought about by technologies like the Internet of Things, 3D printing, automation and social media?

Management in the Data Age looks at some of the opportunities and risks that face those running businesses. It was originally prepared for a private corporate briefing in June 2015.

Some further background reading on the topic is included in the following links.


Business in an age of data abundance

The economics of cheap data change industries the same way abundant energy defined the Twentieth Century

I’m preparing a corporate talk for next week on the changing economy and one theme that sticks out is how the Twentieth Century was defined by cheap energy and physical mobility as mains electricity and the internal combustion engine became ubiquitous and affordable.

The picture accompanying this post illustrates that shift: Sydney’s Circular Quay a hundred years ago was just at the beginning of the automobile era. The previous fifty years had brought trams, the telegraph and reliable shipping, but the great strides of the Twentieth Century were still to happen.

At that stage the steam engine and advances in electrical transmission had brought reliable power to the masses, although it was still expensive. What was to come over the next fifty years was that energy was about to become cheap and abundant. That drove the suburbanisation of western societies and the development of industries around the availability of cheap power and a mobile workforce.

At the time though information was still expensive, the control of broadcast networks by a few license holders and print operations by those who could afford the massive costs of producing and distributing magazines or newspapers made data difficult to get and worth paying for.

Today we’re at the start of a similar shift in information; it’s no longer expensive or difficult to obtain.

The question for the next thirty years is what industries will develop in an economy where information is basically free and ubiquitous. Just as cheap energy created the consumerist economy, we’re going to see a very different environment in an age of cheap data.

Twitter’s discordant note

Twitter’s decision to restrict access to its data has cost them dearly

It’s been a bad week for the social media service Twitter, with its stock pounded after the leak of poorer-than-expected results.

Writer Mathew Ingram says Twitter lost its way five years ago when it started closing down access to third-party developers, a move that hurt the service’s growth and user adoption.

Twitter’s move was greeted with disappointment at the time and many developers gave up working on the company’s APIs.

With the growth of third party applications stunted, there was little reason for new users to come on board and so Twitter is now disappointing the market with its results.

Basically Twitter CEO Dick Costolo and his team reaped what they sowed in restricting access; they kept control of their data but it’s cost them users and hurt their share value.

Twitter’s woes show that the economics of cloud and social media services reward businesses that share data. While there may be some commercial and legal limits to what information can be shared, the default position should be to make data available.

In an information rich society, those who contribute the most get the rewards. This is the point Twitter’s management missed.

The high cost of distrust

A lack of trust in data is going to cost the world’s economy over a trillion dollars, forecasts a Cisco panel

A lack of trust in technology’s security could be costing the global economy over a trillion dollars a panel at the Australian Cisco Live in Melbourne heard yesterday.

The panel, “How do we create trust?”, featured some of Cisco’s executives including John Stewart, the company’s Security and Trust lead, along with Mike Burgess, Telstra’s Chief Information Security Officer, and Gary Blair, the CEO of the Australian Cyber Security Research Institute.

Blair sees trust in technology as split into two aspects. “Do I as an individual trust an organisation to keep my data secure; safe from harm, safe from breaches and so forth?” he asks. “The second is: will they be transparent in using my data and will I have control of my data?”

In turn, Stewart sees security as a big data problem rather than one of rules, patches and security software. “Data-driven security is the way forward,” he states. “We are constantly studying data to find out what our current risk profile is, what situations we are facing and what hacks we are facing.”

This was the thrust of last year’s Splunk conference where the CISO of NASDAQ, Mark Graff, described how data analytics were now the front line of information security as threats are so diverse and systems so complex that it’s necessary to watch for abnormal activity rather than try to build fortresses.
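
A minimal sketch of that “watch for abnormal activity” approach is to flag any reading that deviates sharply from its recent baseline rather than matching it against fixed rules. The failed-login counts below are invented for illustration:

```python
import statistics

def is_anomalous(history, latest, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from
    its recent history -- watching for abnormal activity rather than
    relying on a fixed rule or signature."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# Hypothetical hourly counts of failed logins on one system
baseline = [12, 9, 14, 11, 10, 13, 12, 11]
print(is_anomalous(baseline, 12))  # False: a normal hour
print(is_anomalous(baseline, 90))  # True: a suspicious spike
```

Real security analytics platforms apply far richer models across many signals at once, but the principle is the same: the baseline is learned from the data, not written as a rule.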

The stakes are high for both individual businesses and the economy as technology is now embedded in almost every activity.

“If you suddenly lack confidence in going to online sites, what would happen?” asks Stewart. “You start using the phone, you go into the bank branch to check your account.”

“We have to get many of these things correct, because going backwards takes us to a place where we don’t know how to get back to.”

Gary Blair described how the Boston Consulting Group forecast the digital economy would be worth between 1.5 and 2.5 trillion dollars across the G20 economies by 2016.

“The difference between the two numbers was trust. That’s how large the problem is in economic terms.”

As we move into the internet of things, that trust is going to extend to the integrity of the sensors telling us the state of our crops, transport and energy systems.

The stakes are only going to get higher and the issues more complex, which in turn is going to demand well-designed, robust systems to retain the trust of businesses and users.

Video and the internet of things

High resolution video coupled with the IoT are part of the Big Data explosion

A few days ago we discussed how 4k video cameras are going to change the sports broadcasting industry.

Yesterday executives from modular data center supplier VCE held a media lunch where they discussed some of their industrial applications. One of the areas they discussed was the monitoring of power stations with high-resolution cameras.

The 4k cameras are trained on machine rooms with software watching for irregular conditions such as excessive vibrations, leaks or smoke. Should something out of the ordinary be detected, warnings can be triggered and potentially affected equipment spun down.

With the 4k resolution the cameras are able to watch large areas and like the sports coverage can zoom in for a detailed view of an affected area.

The use of 4k video cameras shows how the internet of things won’t just be about the data gathered from smart devices but also matching the information coming from IoT equipment with that of other environmental factors.

For companies like VCE these sorts of applications are an opportunity, as they need large amounts of data storage and processing power in local centres.

In many respects these small scale data centers are a large scale example of the fog computing being touted by companies like Cisco where most of the operational tasks are carried out by local equipment with only reports and exceptions being transmitted to the cloud.
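
The fog-computing pattern described above can be sketched in a few lines: the local node processes every reading, and only a summary plus the exceptions travel upstream to the cloud. The vibration threshold and record layout here are illustrative assumptions, not any vendor’s API:

```python
# Minimal sketch of fog computing: handle all readings at the edge,
# send only summaries and exceptions to the cloud.
VIBRATION_LIMIT = 5.0  # hypothetical alarm threshold

def process_locally(readings):
    """Process every reading locally; return only what the cloud needs."""
    exceptions = [r for r in readings if r["vibration"] > VIBRATION_LIMIT]
    return {
        "count": len(readings),
        "max_vibration": max(r["vibration"] for r in readings),
        "exceptions": exceptions,  # the only raw records sent upstream
    }

readings = [
    {"sensor": "turbine-1", "vibration": 1.2},
    {"sensor": "turbine-2", "vibration": 7.9},  # out of range
    {"sensor": "turbine-3", "vibration": 0.8},
]
print(process_locally(readings))
```

The design choice is bandwidth: raw 4k video and sensor streams stay on site, while the cloud sees only the report and the anomalies worth acting on.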

This sort of application also shows the demands different industries are going to have for local data processing and storage with the VCE executives suggesting hospitals, mines and sports stadiums are also going to need these facilities.

For VCE – a troubled joint venture between Cisco, storage company EMC and computer virtualisation firm VMware – these are the sort of clients they are hoping to find to keep their business running.

Regardless of VCE’s prospects, the need for equipment to manage the data being collected by devices on the Internet of Things and 4k video is going to grow. That could give us a clue as to where the jobs of the future will come from.

Reading the golden records – can we avoid a digital dark age?

Changing computer formats mean we risk a ‘digital dark ages’ industry experts warn.

In 1977 NASA’s Voyager mission launched from Cape Canaveral to explore the outer solar system. Included on the vessel, in case it encountered other civilisations, were a plaque and a golden record describing life on Earth.

The record is “a 12-inch gold-plated copper disk containing sounds and images selected to portray the diversity of life and culture on Earth.” It contains images, a variety of natural sounds, musical selections from different cultures and spoken greetings in fifty-five languages.

Most American households in 1977 could have listened to the sounds on Voyager’s golden disk, but were the spaceship to return today it would be difficult to find the technology to read the record.

This is the concern of Google Fellow and internet pioneer Vint Cerf, who told the American Association for the Advancement of Science’s annual meeting in San Jose this week that we are “facing a forgotten century” as today’s technologies are superseded, rendering documents unreadable.

A good example of ‘bit rot’ is the floppy disk – the icon used by most programs to illustrate saving files is long redundant and few organisations, let alone households, have the ability to read a floppy disk.

For corporations, the problem of data stored on tape is even greater, as proprietary hardware and software from long-vanished companies become harder to find or re-engineer.

As the Internet of Things rolls out and data becomes more critical to business operations, the need for compatible and readable formats will become even more important for companies and historical information may well become a valuable asset.

With libraries, museums and government archives having digitised historic information, this issue of accessing data in superseded formats becomes even more pressing.

It may be that important documents need to be kept on paper – although there’s still the problem of paper deteriorating – to make sure the 21st Century doesn’t become the digital dark ages and our golden records remain unread.

The what and the why

SurveyMonkey CEO David Goldberg believes we’re still in the early days of understanding the new economy

“People are drowning in big data,” SurveyMonkey’s CEO Dave Goldberg says in the latest Decoding The New Economy video.

Goldberg sees SurveyMonkey as bringing order to the world of big data in allowing organisations to put their information in context, “We want people to ask the right questions so we can get better data.”

“Here’s a question I need to answer: how happy are my employees? What do customers think of my new product? What are my students doing at school this year?”

Growing the survey industry

One group that’s uncomfortable with the rise of SurveyMonkey, a privately held company that’s worth $1.3 billion after a capital raising last year, are traditional market research firms who see the service as putting a powerful tool in inexperienced hands. Goldberg, though, sees it as an opportunity for the market research industry.

“We’re not replacing market researchers,” says Goldberg, “most people who come to SurveyMonkey haven’t used a market researcher before. It actually probably creates more demand for more sophisticated research down the line.”

Goldberg himself isn’t from a market research background; instead he hails from the tech sector, having set up LAUNCH, one of the early music streaming companies, in 1994. He sold it to Yahoo! in 2001 and became the company’s Director of Music.

He left Yahoo! in 2007 and spent two years in the venture capital industry before joining SurveyMonkey as CEO in 2009.

Understanding the data

From his experience, Goldberg sees understanding data as the key business skill for today’s workers, firmly believing that kids should be taught statistics rather than coding.

“Everyone is going to have to learn how to use data,” says Goldberg. “Someone was asking me the other day about what sort of skills we should teach our kids to prepare them for the future, and I think the thing we’re not doing enough of is teaching them how to use and analyze data.”

To Goldberg we’re still in the early days of understanding how mobile and social media are going to change business with understanding data being one of the great opportunities.

“Implicit data is really interesting but it tells you the ‘what’, it doesn’t tell you the ‘why’,” believes Goldberg. “We think what we do is the explicit side; we gotta ask people to get the ‘why’.”