Big Data News – 12 Sep 2016

Today's Infographic Link: Flat Vector Infographic Elements

Featured Article
Unless you are doing all your system configuration and implementation work internally, you'll be depending on consultants to get your cloud projects done. When consultants start doing dumb things, your project will be at risk. Even if they get your project done right, if they can't make a profit they'll eventually have to bail on you. Full disclosure: I'm a consultant, and this article is written entirely from the perspective of the consultant/system integrator. I'm not defending anybody's practices, but trying to wave a red flag for those of you in contract negotiations right now.

Top Stories
Aerohive Networks announces what it describes as the first software-defined LAN spanning both wired and wireless networking environments.

Teradata makes a number of announcements about its databases.

With several major players emerging as mega-cloud vendors, the distinctions between IaaS, PaaS, and SaaS have blurred. Here's a primer on the various offerings, based on the latest Forrester Wave report.

Aruba, a division of Hewlett Packard Enterprise, announced software today that's designed to help companies speed up secure integration of mobile devices and Internet of Things objects into their networks. Called Aruba Mobile First Platform, the software is based on application programming interfaces (APIs) for use by third-party developers and developer teams inside companies to help them boost automation with IoT devices and allow mobile workers to be more efficient. Mobile First is built on Aruba OS 8.0, the company's new operating system, which is deployed as a virtual machine on a server appliance. Also, Aruba announced enhancements to its existing Aruba ClearPass software for Mobile First to make it easier for IT security teams to integrate cloud-hosted services into ClearPass. This means customers can more easily build software workflows for Enterprise Mobility Management packages.

Accelerated change challenges change management and security. DevOps and other emerging technologies that enable business innovation and opportunity demand fast, frequent change from the enterprise. The speed and regularity of those changes, as well as their nature, strain change management and ultimately security. To secure the enterprise amid such unwieldy change, the business needs to know how each new technology affects change management and the organization's security defenses. Organizations can then begin to evolve change management and security to close those gaps and avoid security impacts. Emerging technologies such as DevOps, IoT, automation and intelligent software, IT service partnering, cloud computing and BYOD all straighten out the curves in the race to make changes that propel the enterprise forward.

Industrial products maker A.W. Chesterton wanted to ratchet up sales, but its manual processes and siloed data just weren't doing the job anymore. The company didn't have a way to pull reports and analytics from 20-plus years of data on 1.5 million products that were locked away in disparate data sources in 100 countries. "Our sales guys really didn't want to deal with complex systems, but they desperately needed information about where to find customers, where they were in their orders [and] their margins, and what they had left to sell in the month," says Tom Meier, vice president of IT.

Organizational growth can sometimes lead to IT slowdowns. That's what happened when two leading Brazilian food companies merged in 2012. The newly formed company, BRF (formerly Brasil Foods), was the seventh-largest food company in the world and Brazil's market leader, and partly because of its size it now faced a costly problem. BRF serves global superstore chains such as Wal-Mart and Carrefour, as well as supermarkets, wholesalers, food-service operations and family-owned bodegas, and its SAP system was being hit with 20,000 to 25,000 pricing requests daily. However, it took 17 days to make price changes, which isn't ideal for a business that sells perishable commodities.

Cerner, a provider of healthcare technology, is putting big data and analytics to work to get a more complete picture of people's health and predict potentially life-threatening risks. Cerner's Enterprise Data Hub, which uses a big data platform from Cloudera, brings together data from an almost unlimited number of sources. By analyzing the petabytes of data available to them, data scientists can better understand patients, conditions or trends. They're now better able to determine, for example, the probability of a bloodstream infection, such as the early onset of sepsis. Cerner developed what it calls the St. John sepsis agent, a tool that uses an algorithmic approach to detect cases of the infection. It's deployed in a cloud-hosted production system that actively monitors more than 1 million patients daily, says Bharat Sutariya, vice president and chief medical officer of population health at Cerner.

When CPL Online opened for business in 2010, it offered 20,000 interactive online courses on food safety, health and safety, fire awareness and call centers. That number grew to 35,000 courses in 2011 and exploded to 160,000 by 2012. When its SQL platform couldn't keep up anymore, CPL seized the opportunity to transform its business and deployed the HPCC Systems platform, a high-performance big data analytics system from LexisNexis Risk Solutions. CPL's HPCC deployment uses data to track user trends, spot suspicious activity (to help root out cheaters) and identify the best-performing employees. Heat mapping shows how users work their way through every mouse-click of a course and helps CPL Online improve engagement and content. "They know which areas are more or less relevant or are attracting most of the focus from the users," says Flavio Villanustre, vice president of technology at LexisNexis Risk Solutions.

In 2010, Duke Energy launched a program called My Home Energy Report (MyHER), through which it sends customers an analysis of their energy use with comparisons to similar households. The goal is to motivate people to reduce their energy use, and Duke tries to help them do that by providing timely tips and special offers, including rebates on purchases of energy-efficient appliances, free and discounted lightbulbs, and free home energy audits. Homeowners receive MyHER reports eight times a year, says Kelly Kuehn, senior products and services manager at Duke Energy. When the program started, reports were only sent out in the mail, but now they're also available online, she says.

Two years ago, GE Healthcare faced a challenge: It had to transition its Centricity EDI Services Clearinghouse off of its existing platform to something new — and it had to do it without interrupting the flow of business. A unit of General Electric, GE Healthcare provides medical systems and services, and its clearinghouse serves more than 2,100 healthcare organizations in the U.S., processing some 588 million claims annually, representing $185 billion in transactions.

Like many law enforcement agencies, the Halton Regional Police Service has been collecting data on its activities for years. But as is the case with most such efforts, the data was ending up in a "digital filing cabinet" — and users couldn't find anything unless they knew exactly what they were looking for. So in 2014, the Toronto-area police department created an analytics unit with a mandate to make things more efficient. Operating like a technology startup, the unit works to promote a culture of data-driven decision-making; it's staffed with data scientists, mathematicians, programmers and police officers.

Welcome to the stories of the 2016 Computerworld Data+ Editors' Choice Award honorees. Chosen by a panel of Computerworld editors, this year's 20 winning organizations have used data analytics to achieve a wide spectrum of gains, from improving business profitability to uncovering trends in criminal activity and reducing energy use — even keeping trains running on schedule. Interested in finding a project that might be specific to what you do? We've collected and categorized them all in a searchable table, below — click through any item in the first column to see a project profile. Or, you can browse the projects by clicking through them in the navigation strip at the bottom of this story and all the project profiles. And remember to read through the full September digital magazine, which features the winners as well as news analysis, opinions and more.

The great minds at NASA's Jet Propulsion Laboratory have always known that by analyzing what they've already accomplished — in data, documents, videos and images — they could save months of time and millions of dollars, and even enable breakthrough discoveries. However, every mission has its own unique data sets and tools are constantly evolving, creating challenges for engineers and scientists who want to quickly search and analyze petabytes of disparate data that in some cases had been created decades apart. Technology has finally caught up to JPL's dream. Its interactive search and analytics project uses open development technologies, methodologies and tools on cloud platforms to unlock the data.

Kentucky transportation officials found themselves in a tough spot after back-to-back extreme winters. Average winter costs almost doubled when unforeseen storms hit. They needed better insight into fleet management and the ability to deploy resources more quickly. Toward that end, the Kentucky Transportation Cabinet developed a system that uses real-time and crowdsourced data about road conditions to give crews the insight they need to respond to situations more quickly. To build what it calls its Intelligent Transportation System (ITS), the agency first partnered with the makers of the crowdsourced traffic app Waze, which lets drivers report what they're seeing on the roads in real time. Next, it enlisted mapping software vendor Esri to incorporate the real-time data into existing GIS applications. The beta system added data from official state weather stations and traffic sensors, as well as information about the location and status of snowplows.

Large banks, insurance companies and hedge funds are all required to gather reams of data on their clients to determine their risk and to meet regulatory requirements — a process known as client onboarding. The problem was, onboarding typically required humans to read and analyze sometimes hundreds of thousands of pages of documents, with little room for error. Professional services firm KPMG was looking for a way to automate the costly onboarding process for its clients. Reports needed to capture information from SEC filings, blog entries, social media, text messages and other sources of structured and unstructured data.

Today's fitness equipment can track your heart rate, count calories and sync with smartphones and fitness trackers. Now Life Fitness is taking things a step further by harnessing data and wireless technology to help gym owners keep their cardio machines up and running. The company now offers a remote monitoring service called LFconnect Protect that analyzes equipment diagnostics in real time and notifies gym owners of any maintenance issues it discovers. "We track data [wirelessly], collect it online and analyze certain sets of data points," says Amad Amin, senior digital product manager at Life Fitness.

In 2005, the 15-person team at MFC Netform needed to persuade potential customers to take a gamble on the upstart supplier of automotive equipment. Today, MFC is a leading supplier of powertrain parts for the automotive and agricultural industries. "We had to overcome marketing challenges and competition from larger, more established providers," says Jeff Schroeder, manager of information systems at MFC Netform. "To answer this challenge, we developed an ambitious plan to aggressively grow and secure business from large automotive and agricultural [equipment manufacturers]." From the company's inception, its leaders wanted to center operations around ERP as a competitive differentiator. "There was a keen focus on providing a strong tool set for operations, quality and engineering," Schroeder says. "Working hand in hand with these departments, we eventually chose the Plex Manufacturing Cloud as our single system."

At Pinterest, a small support team with just five staffers is responsible for helping millions of people use the popular photo-sharing social media site. They offer assistance to users — or "pinners" — who need to regain access to their accounts or report bugs, among other things. To avoid being overwhelmed, they needed a way to identify support emails that require immediate action, and they found a tool that does just that: Zendesk's customer satisfaction prediction application, says Maggie Armato, reactive team lead for Pinner Operations. "Before, we'd manually sort through tickets and mark them as 'high-priority.' Now Zendesk does it for us — and much more accurately," she says.

Life science research and diagnostic company Qiagen learned years ago that younger scientists were increasingly digital and weren't interested in traditional sales methods, such as on-site visits and in-person meetings. The company needed to reach its target audience in the digital world. As part of its digital transformation, Venlo, Netherlands-based Qiagen developed 10 new tools, including one that it calls the sales cockpit, an insight-delivering app store. As part of the effort, 10 data sources and four new data streams were combined into one analytics database. Information found there includes customers' purchase histories, biotech domain data, funding information, published articles and crawled websites.


TriCore Reference Laboratories performs more than 9.5 million tests per year and handles 70% of New Mexico's clinical laboratory services. But TriCore wanted to do more than just issue test results; it wanted to help close the care gap between when treatment is recommended and when it is received, says Patrick Prescott, vice president of financial solutions at Rhodes Group, a clinical IT solutions provider that merged with TriCore in 2015. The team developed a system that analyzes patients' full records and uses predictive analytics to provide new insights into people's health. One of its pilots identified high-risk patients needing prenatal care. TriCore plans similar pilots of its Diagnostic Optimization system for patients with hepatitis C and diabetes. The system monitors nearly 2 million people in New Mexico; in the diabetes work, for example, it identified 100,000 diabetic patients in care-gap status. That information can be sent to health plan coordinators who can arrange care.

A few years ago, Arkansas state officials had to find a way to improve the quality of medical care while cutting costs. They faced an 8% annual increase in Medicaid costs coupled with a potential budget shortfall of $140 million for the Division of Medical Services. Hoping to fulfill both goals, the state launched the Arkansas Health Care Payment Improvement Initiative, which set up a value-based reward system. Providers that demonstrated value while delivering high-quality care would share 50% of the savings.

As it prepared to launch its first oral oncology drug, Takeda Pharmaceuticals wanted to find a way to identify outliers — patients who had difficulty getting the medication they need for one reason or another. The company hoped to use the vast amounts of patient prescription and status data that pharmacies have at their disposal, reasoning that it could help identify people who were actively using its medication, as well as those who were stalled by benefits investigations or whose insurers were putting up barriers. "The person who's spending 20 days on benefits investigation status is a person who's struggling to get on therapy," says Rebecca Greenberg, director of field services and commercial systems at Takeda.

In 2013, TD Bank set an ambitious goal: to transform how it uses IT to strategically drive the business. "The goal is to provide self-serve access to the right kind of data to the right users based on their approved privilege and intended usage, such as performing data analytics, reporting and data processing," says Mok Choe, senior vice president and chief architect. "We had to navigate the complex process of automating the data-ingest process from dozens of data sources in the format of mainframe, XML and flat files that were being pulled into our big data platform," says Choe. The bank deployed a data management tool from Podium Data, which it installed on its Hadoop cluster. While the full implementation won't be complete until later this fall, there has already been an impact on the business, including a 39% improvement in identifying certain customers for marketing campaigns, and new revenue opportunities enabled by customer segmentation made possible by big data efforts.

Surgical site infections represent a serious and growing problem for hospitals and their patients. Recent data indicates that in the U.S. alone, roughly one out of every 20 patients admitted to a hospital will end up with a hospital-acquired infection, with a total cost of about $10 billion per year to the healthcare industry. Looking for a way to solve that problem, University of Iowa Hospitals & Clinics decided to see if using predictive analytics to drive real-time decision-making in operating rooms could help lower the rate. "Many factors determine whether or not a patient gets a surgical site infection, including characteristics of the patient, their medical history and the illness they're being treated for," says Dr. John Cromwell, associate chief medical officer and director of surgical quality and safety at the organization.
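The article doesn't detail the hospital's actual model, but predictive risk scores of this kind are commonly built with logistic regression over patient features such as medical history and surgery type. A minimal, purely illustrative sketch, with invented feature names and coefficients (nothing here comes from University of Iowa's system):

```python
# Illustrative only: hypothetical features and made-up coefficients,
# not the hospital's model. Shows the shape of a logistic risk score.
import math

# Hypothetical weights a model might learn from historical outcomes.
COEF = {"age_over_65": 0.8, "diabetic": 0.6, "smoker": 0.5, "emergency_surgery": 1.1}
INTERCEPT = -3.0  # baseline log-odds when no risk factors are present

def infection_risk(patient: dict) -> float:
    """Probability of a surgical site infection under the toy logistic model."""
    z = INTERCEPT + sum(w for feature, w in COEF.items() if patient.get(feature))
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) function

low = infection_risk({"diabetic": False})
high = infection_risk({"age_over_65": True, "diabetic": True, "emergency_surgery": True})
```

In a real deployment the probability would be thresholded and surfaced to the surgical team in real time, which is the kind of decision support the article describes.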

Utah Transit Authority has one of the largest service areas of any public transportation agency in the country, operating in six counties that span 1,400 square miles and are home to 80% of Utah residents. To keep trains on schedule, assess gaps in coverage, pinpoint causes of delays and identify ways to improve services, UTA employees once had to access a huge amount of data from multiple sources and then analyze it manually in Excel — or wait for IT to create custom reports. To automate the process, the UTA implemented an analytics platform that brings all of that information together. Using several tools from Information Builders, including the iWay DataMigrator, the WebFocus business intelligence portal and the InfoAssist intelligence and integration technologies, it was able to provide users with guided self-service analytics tools to query and visualize data through dashboards.


Making machines smarter.

EDP helps organisations deal with their data.

Open data science languages – Python and R – offer tremendous advantages over legacy, proprietary products like SAS and MATLAB. You can embrace modern innovation, attract a new generation of data scientists, and go from ad hoc analysis to production models in one platform that embraces the open source ecosystem. But how does your enterprise make the transition without descending into anarchy? How can you embrace R, Python, and their thousands of powerful analytic packages without their accompanying legal risks? How do you see through the legacy vendor FUD and make open source work? We are here to help – Continuum Analytics' Michele Chambers, EVP of the Anaconda Business Unit, and senior data scientist Christine Doig will help you embark on your enterprise's journey to Open Data Science in their webinar, Breaking Data Science Open, on September 15th.

Check out this demo of Statistica Network Analytics: grasp complex relationships by combining predictive analytics and human expertise. New webcast: Six (Million) Degrees of Separation: The Practical Power of Network Analytics, Thursday, September 22, 2016, at 8 a.m. PT / 11 a.m. ET / 5 p.m. CEST. The relationships hidden within complex networks are no match for the network analytics capabilities of Statistica 13.1. Join us as we discuss the practical power of network analytics and showcase Statistica's new features, use cases, visual capabilities and more. You'll learn how to tackle fraud detection, supply chain optimization, recommendation engines, and identifying influencers and affinities.

Real-Time Anomaly Detection and Analytics for Today's Digital Business (white paper). Detecting incidents in streaming business data is a unique challenge, and data-heavy companies are often faced with: static thresholds that are either meaningless or cause alert storms for seasonal data; dashboards and reports that lag behind; and delays in identifying business incidents that impact revenue. In this white paper, Jason Bloomberg of Intellyx discusses how real-time anomaly detection based on machine learning is a game changer for digital technology companies.
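The seasonality point is easy to demonstrate. As a minimal sketch (not from the whitepaper, using invented data): a fixed threshold on a metric with a daily cycle fires on every seasonal peak, while a rolling z-score that compares each point to its recent window flags only the genuine incident.

```python
# Sketch: static threshold vs. rolling z-score on simulated seasonal data.
import math
from statistics import mean, stdev

# Hourly metric for one week with a daily cycle, plus one real incident at t=100.
series = [100 + 40 * math.sin(2 * math.pi * t / 24) for t in range(168)]
series[100] += 90  # the genuine anomaly

# Naive static threshold: fires on every seasonal peak, causing an alert storm.
static_alerts = [t for t, v in enumerate(series) if v > 130]

def rolling_anomalies(data, window=24, z_cut=3.0):
    """Flag points that deviate strongly from their own recent window."""
    alerts = []
    for t in range(window, len(data)):
        w = data[t - window:t]          # trailing window, excluding the current point
        m, s = mean(w), stdev(w)
        if s > 0 and (data[t] - m) / s > z_cut:
            alerts.append(t)
    return alerts

rolling_alerts = rolling_anomalies(series)  # flags only the injected incident
```

Here the static threshold produces dozens of alerts (one batch per daily peak), while the rolling z-score flags only t=100. Real machine-learning approaches go further (trend, multiple seasonalities, multivariate signals), but the adaptive-baseline idea is the same.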

Predictive Analytics World for Healthcare – Oct 23-27 in New York – brings together top predictive analytics experts, practitioners, authors, and healthcare thought leaders to discuss concrete examples of deployed predictive analytics in the healthcare industry. This event is held alongside two additional events: PAW Financial Services and PAW Business.   

London Data Festival, November 16th & 17th. The program for the London Data Festival is filling out, with new sessions added every week; the depth and breadth of content covered is unrivaled, with three separate summits all under one roof. With more than 400 senior-level attendees, exclusive presentations and interactive workshops, this is the best place to improve your data literacy and connect with the best in the field.

The Power of R and Visual Analytics: How to Make Sense of Large, Complex Datasets Quickly (white paper). By combining the statistical horsepower of R with user-friendly visual analytics, organizations can further leverage the work of data scientists by empowering any business user to explore and understand the results. Read this white paper and learn how to easily understand sentiment trends in text data, identify similar groups within complex datasets, and optimize decision-making. You'll see how using visual analytics alongside R can speed up your data science projects and get them in front of more eyes, leading to smarter, data-driven business decisions.

Real-Time Anomaly Detection and Analytics for Today's Digital Business (white paper). Data-heavy companies face a unique challenge: detecting incidents in streaming business data. Static thresholds are either meaningless or cause alert storms for seasonal data; dashboards and reports lag behind; and revenue-impacting business incidents need to be identified in minutes, not days or weeks.

Strata + Hadoop World: the one event you can't miss. Strata + Hadoop World is September 26-29 in New York. It sold out last year with over 6,300 attendees, and many call it the biggest data gathering in the world, "the one event you cannot miss." Early price ends Friday, August 12.

  Visualization is the best way to explore and communicate insights about data. Whether you're dealing with geospatial, time series or tabular data, interactive graphics allow everyone on your team, from analysts to executives, to understand the patterns in your data.
