Tag: big data

  • Goodbye to the media buyers’ long lunch

    Goodbye to the media buyers’ long lunch

    Yesterday, Decoding The New Economy posted an interview with Michael Rubenstein of AppNexus about the world of programmatic advertising and what it’s like to be part of a rapidly growing startup.

    The whole concept of programmatic advertising is a good example of a business, and a set of jobs, being disrupted.

    Media buying has been a cushy job for a generation of well-fed advertising executives. David Sarnoff’s invention of the broadcast media model in the 1930s meant salespeople and brokers were needed to fill the constant supply of advertising spots.

    Today the rise of the internet has disrupted the once-safe world of broadcast media, where incumbents were protected by government licences, and the long-lunching media buyers are finding their own jobs displaced by algorithms like those of AppNexus.

    A thought worth dwelling on, though, is that media buyers are part of a wider group of white-collar roles being disrupted by technology – the same big data algorithms driving AppNexus and other services are also being used to write and select news stories, and increasingly we’ll see executive decisions being made by computers.

    It’s highly likely the biggest casualties of the current data-analytics-driven wave won’t be truck drivers, shelf pickers or baristas but managers. The promise of a flat organisation may be coming sooner than many middle managers – and salespeople – think.

  • Dealing with the biggest of data

    Dealing with the biggest of data

    How do you deal with the biggest data sets of all? Bob Jones, a project leader for the European Organization for Nuclear Research – commonly known as CERN – described how the world’s largest particle physics laboratory manages 100 petabytes of data.

    The first step is not to collect everything. “We can’t keep all the data, the key is knowing what to keep,” says Jones. This is understandable given the cameras capturing the collisions have 150 million sensors delivering data 40 million times per second.
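
    A rough back-of-envelope calculation shows why keeping the raw stream isn’t an option; the one-byte-per-reading figure below is purely an illustrative assumption, not a CERN number.

    ```python
    # Back-of-envelope estimate of the raw data rate implied by the figures above.
    # The one-byte-per-reading figure is an illustrative assumption, not a CERN number.
    sensors = 150_000_000              # sensors on the detector cameras
    readings_per_second = 40_000_000   # readings per sensor per second
    bytes_per_reading = 1              # assumed, purely for illustration

    raw_bytes_per_second = sensors * readings_per_second * bytes_per_reading
    petabyte = 10 ** 15

    print(f"Raw stream: {raw_bytes_per_second / petabyte:.0f} PB per second")
    # ~6 PB per second even at one byte per reading -- enough to fill the entire
    # 100 PB archive in under 20 seconds, hence the need to decide what to keep
    # before anything is written to storage.
    ```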

    Jones was speaking in the ADMA Global Conference’s Advancing Analytics stream, where he described how the organisation manages and analyses the vast amounts of data generated by its huge experiments.

    Adding to the task facing Jones and CERN’s boffins is that the data has to be preserved and verifiable so scientists can review the results of experiments.

    Discovering the Higgs boson, for instance, required finding 400 positive results out of 600,000,000,000,000,000 events. That demands massive processing and storage power.
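
    To get a sense of how selective that analysis has to be, the quick calculation below works out the ratio implied by those two figures.

    ```python
    # How rare the Higgs signal is among the recorded events, using the figures quoted above.
    positive_results = 400
    total_events = 600_000_000_000_000_000   # 6 x 10^17

    print(f"Signal fraction: {positive_results / total_events:.1e}")            # ~6.7e-16
    print(f"Events per positive result: {total_events // positive_results:,}")  # 1.5 quadrillion
    # Roughly one useful event in every 1.5 quadrillion recorded, which is why
    # the processing is spread across a worldwide chain of data centres.
    ```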

    Part of the solution is to have a chain of data centres across the world to carry out both the analytics and the data storage, supplemented by tape archiving, something that creates issues of its own.

    “Tape is a magnetic medium, which means it deteriorates over time,” Jones says. “We have to repack this data every two years.”

    Another advantage of the two-year refresh is that it allows CERN to apply the latest advances in data storage and pack more data onto the medium.
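
    As a purely illustrative sketch of that second benefit, the example below assumes a steady improvement in tape density (the growth rate and cartridge capacity are invented numbers, not CERN figures) and shows how a regular repack cycle shrinks the number of cartridges needed for the same archive.

    ```python
    import math

    # Illustrative only: how a regular repack cycle lets an archive benefit from
    # improving tape density. The 10 TB cartridge and 20% annual growth are
    # assumptions for the example, not CERN figures.
    archive_tb = 100 * 1000        # 100 PB archive expressed in terabytes
    cartridge_tb = 10.0            # assumed cartridge capacity today
    annual_growth = 0.20           # assumed yearly improvement in tape density
    repack_interval_years = 2

    for cycle in range(4):
        year = cycle * repack_interval_years
        capacity = cartridge_tb * (1 + annual_growth) ** year
        cartridges = math.ceil(archive_tb / capacity)
        print(f"Year {year}: ~{capacity:.1f} TB per cartridge, {cartridges:,} cartridges")
    ```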

    CERN itself is funded by its 21 member states – Pakistan is its latest member – which contribute its $1.5 billion annual budget. The organisation provides data and processing power to other multinational projects like the European Space Agency, as well as to private sector partners.

    For the private sector, CERN’s computing power offers the opportunity to do in-depth analytics of large data sets, while the unique hardware and software requirements mean the project is a proving ground for high-performance equipment.

    Despite the high tech, Jones says the real smarts behind CERN and the Large Hadron Collider lie in the people. “All of the people analysing the data are trained physicists with detailed, multi-year domain knowledge.”

    “The reason being is the experiment and the technology changes so quickly, it’s not written down. It’s in the heads of those people.”

    In some respects this is comforting for those of us worrying about the machines taking over.

  • Towards the zero defect economy

    Towards the zero defect economy

    At 2.03 in the morning of July 11, 2012, a Norfolk Southern Railway Company freight train derailed just inside the city limits of Columbus, Ohio.

    The resulting crash and fire caused over a hundred people to be evacuated, resulted in over a million dollars in damages and created massive disruption throughout the US rail network.

    Could accidents like this be avoided by the Internet of Things? Sham Chotai, the Chief Technical Officer of GE Software, believes applying sensor technology to locomotives can detect conditions like defective rails and save US railway operators around a billion dollars a year in costs.

    “We decided to put the technology directly on the locomotive,” says Chotai in describing the problem facing railroad operators in scheduling track inspections. “We found we were mapping the entire railway network, and we were mapping anything that touched the track such as insulated joints and wayside equipment.”
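
    Chotai didn’t detail the detection algorithms themselves, but the general idea of flagging unusual readings from on-board sensors can be sketched with a simple rolling-baseline check; the readings, field names and threshold below are hypothetical and don’t describe GE’s actual system.

    ```python
    from statistics import mean, stdev

    def flag_track_anomalies(readings, window=20, sigma=3.0):
        """Flag readings that sit well outside the recent rolling baseline.

        A toy illustration of spotting possible rail defects from on-board
        sensor data; real systems fuse many signals (vibration, geometry,
        location) and use far more sophisticated models.
        """
        anomalies = []
        for i in range(window, len(readings)):
            baseline = readings[i - window:i]
            mu, sd = mean(baseline), stdev(baseline)
            if sd and abs(readings[i] - mu) > sigma * sd:
                anomalies.append((i, readings[i]))
        return anomalies

    # Hypothetical vibration readings with one spike that might indicate a defect
    vibration = [1.0 + 0.05 * (i % 5) for i in range(100)]
    vibration[60] = 4.2
    print(flag_track_anomalies(vibration))   # [(60, 4.2)]
    ```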

    This improvement in reliability, and its benefits to business, is something flagged by then-Salesforce Vice President Peter Coffee in an interview with Decoding the New Economy in 2013.

    “You can proactively reach out to a customer and say ‘you probably haven’t noticed anything but we’d like to come around and do a little calibration on your device any time in the next three days at your convenience.’”

    “That’s not service, that’s customer care. That’s positive brand equity creation,” Coffee says.

    Reducing defects isn’t just good for brands, it also promises to save lives as Cisco illustrated at an Australian event focused on road safety.

    Transport for New South Wales engineer John Wall explained how smarter car technologies, intelligent user interfaces and roadside communications all offer the potential to dramatically reduce, if not eliminate, the road toll.

    Should it turn out the IoT can radically reduce defects and accidents, it won’t be good news for all industries, as John Rice, GE’s Global Head of Operations, pointed out last year in observing how intelligent machines will eliminate the break-fix model of business.

    “We grew up in companies with a break fix mentality,” Rice says. “We sold you equipment and if it broke, you paid us more money to come and fix it.”

    “Your dilemma was our profit opportunity,” Rice pointed out. Now, he says, the engineering industry shares risks with its customers and the break-fix business is no longer the profit centre it was.

    A zero defect economy is good news for customers and the public, but for suppliers and service industries built on fixing problems it means a massive change to the way they do business.

  • Literacy in old and new terms

    Literacy in old and new terms

    I’m in Wellington, the capital of New Zealand, for the next few days for the Open Source, Open Society conference.

    During one of the welcome events Lillian Grace of Wiki New Zealand mentioned how today we’re at the same stage with data literacy as we were two hundred years ago with written literacy.

    If anything, that’s optimistic. According to a wonderful post on Our World In Data, in 1815 the British literacy rate was 54%.

    [Chart: world literacy rates over time]

    That low rate makes sense, as most occupations didn’t need literate workers, while a hundred years later industrial economies needed employees who could read and write.

    Another notable point is that the Netherlands has led the world in literacy rates for nearly four hundred years. This is consistent with the needs of a mercantile economy.

    Which leads us to today’s economy. In four hundred years’ time, will our descendants be commenting on the lack of data literacy at the beginning of the twenty-first century?

  • Big sports data – how tech is changing the playing field

    Big sports data – how tech is changing the playing field

    “When you’re playing, it’s all about the winning but when you retire you realise there’s a lot more to the game,” says former cricketer Adam Gilchrist.

    Gilchrist was speaking at an event organised by software giant SAP ahead of a Cricket World Cup quarter final at the Melbourne Cricket Ground yesterday.

    SAP were using their sponsorship of the event to demonstrate their big data analytics capabilities and how they are applied to sports and the internet of things.

    Like most industries, the sports world is being radically affected by digitalisation as new technologies change everything from coaching and player welfare through to stadium management and fans’ experience.

    Enhancing the fan experience

    Two days earlier, rival Melbourne stadium Etihad, in the city’s Docklands district, showed off its new connected ground, where spectators will get high-definition video and internet services through a partnership between Telstra and Cisco.

    While Etihad’s demonstration was specifically about ‘fan experience’, the use of the internet of things and pervasive wireless access in a stadium can range from paperless ticketing to managing the food and drink franchises.

    In the United States, the leader in connected stadiums, venues are increasingly deploying beacon technologies that allow spectators to order deliveries to their seats and let operators push special offers during the game.

    While neither of the two major Melbourne stadiums offers beacon services at present, the Cisco devices around the Etihad have the facility to add Bluetooth capabilities when ground management decides to roll them out.

    Looking after players

    Probably the greatest impact of technology in sport is on player welfare; while coaches and clubs have been enthusiastic adopters of video and tracking technologies for two decades, the rate of change is accelerating as wearable devices change game-day tactics and how injuries are managed.

    One of the companies leading this has been Melbourne business Catapult Sports, which has been placing tracking devices on Australian Rules footballers and players in other codes for a decade.

    For coaches this data has been a boon, as it’s allowed staff to monitor on-field performance and tightly manage players’ health and fitness.
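
    As a simple illustration of the kind of metric those devices make possible, the sketch below derives distance covered from a series of positional samples; the coordinates and figures are invented for the example and don’t reflect Catapult’s actual devices or data formats.

    ```python
    import math

    def distance_covered(positions):
        """Total distance in metres covered across a sequence of (x, y) samples.

        A toy example of the workload metrics coaching staff derive from
        wearable tracking data; real systems add speed bands, accelerations
        and collision loads on top of measures like this.
        """
        total = 0.0
        for (x1, y1), (x2, y2) in zip(positions, positions[1:]):
            total += math.hypot(x2 - x1, y2 - y1)
        return total

    # Hypothetical positional samples (in metres) from a short passage of play
    samples = [(0, 0), (3, 4), (6, 8), (10, 8), (10, 14)]
    print(f"Distance covered: {distance_covered(samples):.1f} m")   # 20.0 m
    ```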

    Professional sports in general have been early adopters of new technologies as a small increase in performance can have immediate and lucrative benefits on the field. Over the last thirty years clubs have adopted the latest in video and data technology to help coaches and players.

    As the technology develops this adoption is accelerating, with administrators looking at placing tracking devices within the balls, goals and boundary lines to give even more information about what’s happening on the field.

    Managing the data flow

    The challenge for sports organisations, as with every other industry, is in managing all the data being generated.

    In sports, managing that data has a number of unique imperatives; gamblers seeking access to sensitive data, broadcast rights holders wanting game statistics and stadium managers gathering their own data all raise challenges for administrators.

    There’s also the question of who owns the data; the players themselves have a claim to their own personal performance data, and there could be conflicts when a player transfers between clubs.

    As the sports industry explores the limits of what they can do with data, the world is changing for players, coaches, administrators and supporters.

    Gilchrist’s observation that there’s a lot more to professional sports than just what happens on the field is going to become even more true as data science assumes an even greater role in the management of teams, clubs and stadiums.

    Paul travelled to Melbourne as a guest of Cisco and SAP.
