Big Data News – 09 Nov 2016

Today's Infographic Link: Understanding Big Data

Featured Article
Cray and SGI will supply systems as part of the DoD's program to upgrade its capabilities in high-performance computing.

Top Stories
While many smart cities are using IoT for transportation issues, there's a host of other initiatives these urban centers should start to address with the technology. Environmental and sustainability programs top a new list from Gartner.

The Upshot, FiveThirtyEight, PredictWise and the rest: their predictions for President varied over the campaign, as you'd expect as new data came in, but they consistently made Clinton a solid favorite, with her probability of a win topping 70% the day before election day. So what went wrong? As in any statistical forecast, there are three possibilities. First, the models were wrong. No model is perfect, but it seemed to me at least that the various forecasts, despite their differing methodologies, all captured the essential mechanisms of being elected President: the electoral college; the similar behaviours of some states; the influence of economic and demographic statistics; the relationship between polls and votes.
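
A toy illustration of what such a forecast does under the hood, with entirely made-up margins, error scales and state list; real models cover all states and estimate correlations from history. The key mechanism is that a single shared polling error shifts every state at once, which is exactly why the "similar behaviours of some states" matter:

```python
import numpy as np

# Hypothetical inputs: each state's polled margin for the favorite and its
# electoral votes. All numbers here are invented for illustration.
states = {
    "FL": (0.01, 29), "PA": (0.03, 20), "OH": (-0.01, 18), "NC": (0.00, 15),
}
n_sims = 100_000
rng = np.random.default_rng(42)

# Polling errors are correlated across states: one shared national error
# term moves every state in the same direction.
national_error = rng.normal(0.0, 0.02, size=n_sims)
wins = np.zeros(n_sims)
for margin, ev in states.values():
    state_error = rng.normal(0.0, 0.03, size=n_sims)
    simulated_margin = margin + national_error + state_error
    wins += ev * (simulated_margin > 0)

total_ev = sum(ev for _, ev in states.values())
print("P(favorite carries a majority of these EVs):",
      (wins > total_ev / 2).mean())
```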

The framework for managing network virtualization software is now baked into the Nutanix platform.

Developers are finding a new way to create apps and programs – here's a quick look at how microservices work, and what it means for IT shops and their development operations.
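
As a concrete, if toy, illustration of the idea, here is a minimal microservice sketched in Python with Flask; the framework choice, endpoint, and data are illustrative assumptions, not something from the article:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# A microservice owns one narrow capability and exposes it over HTTP,
# so a team can build, deploy, and scale it independently of other services.
@app.route("/inventory/<item_id>")
def inventory(item_id):
    # In a real system this would query the service's own datastore.
    stock = {"widget": 42}.get(item_id, 0)
    return jsonify({"item": item_id, "in_stock": stock})

if __name__ == "__main__":
    app.run(port=5001)
```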

The recent scandal at Wells Fargo could have been prevented if the right CRM system had been in place.

Modular and open source are now the watchwords for network infrastructure, whether you're delivering internet connections or VR cat videos. On Tuesday at the Structure 2016 conference in San Francisco, Facebook announced its most powerful modular data-center switch yet, and AT&T gave an update on its huge migration from dedicated servers to a software-based architecture. Once the same kind of hardware can do different things in a network, everyone gets more freedom to accomplish what needs to get done. That's true for Facebook, which built on its own switch innovations and software stack in the new Backpack switch, and for AT&T, which says enterprises can now order and turn on services in 90 seconds instead of 90 days. Agility is also the key selling point for cloud companies like Google, which hopes its customers can ignore hardware altogether in a few years.

Like Google, Microsoft has been differentiating its products by adding machine learning features. In the case of Cortana, those features are speech recognition and language parsing. In the case of Bing, speech recognition and language parsing are joined by image recognition. Google's underlying machine learning technology is TensorFlow. Microsoft's is the Cognitive Toolkit.
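
For a flavor of what these underlying frameworks look like, here is a minimal sketch using the TensorFlow 1.x graph API that was current at the time; the computation itself is an arbitrary illustration, and the Cognitive Toolkit exposes a comparable define-then-run model:

```python
import tensorflow as tf

# NOTE: assumes the TensorFlow 1.x graph API of the era; in TF 2.x these
# calls live under tf.compat.v1.
x = tf.placeholder(tf.float32, shape=[None, 3])   # symbolic input
w = tf.Variable(tf.zeros([3, 1]))                 # trainable weights
b = tf.Variable(tf.zeros([1]))
y = tf.matmul(x, w) + b                           # define the graph...

with tf.Session() as sess:                        # ...then execute it
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```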

The United Nations and Qlik have announced a partnership to use data analytics to improve the efficiency and efficacy of humanitarian work across the world.

It is likely that such a move will see consumers gravitate towards digital payments at a faster pace.

Earlier this year, Microsoft made a splash at its Ignite conference for IT professionals when it announced that it had been racking cards of programmable chips together with servers in its cloud data centers. The chips, called field-programmable gate arrays (FPGAs), can be reconfigured after being deployed to optimize them for particular applications such as networking and machine learning. Now, Microsoft is investing in tools that would allow customers to program the FPGAs themselves, said Scott Guthrie, the executive vice president in charge of Microsoft's cloud and enterprise division, during a talk at the Structure conference in San Francisco.

IBM's Mac Devine spoke at the 2016 Structure Conference on how IoT, big data, and cognitive computing are changing the way that enterprises are approaching their infrastructure.

Tableau is moving into the data-wrangling business, announcing plans for visual data-preparation software code-named Project Maestro. The idea is to bring the same sort of "self-service" approach to prepping and cleaning data that the company has built for data analysis, Dan Jewett, Tableau's vice president of product management, told the company's user conference this morning. "Maestro is going to make data preparation a breeze." The software is expected to be available "later next year." In a brief demo, Jewett showed visual ways of inspecting, joining and editing data; results could then be piped into Tableau for analysis.
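
Maestro itself is visual, but the inspect-join-edit-analyze flow Jewett demoed maps onto familiar code. A rough pandas sketch with made-up data:

```python
import pandas as pd

orders = pd.DataFrame({"order_id": [1, 2, 3],
                       "customer": ["acme", "ACME ", "globex"],
                       "amount": [120.0, 75.5, None]})
regions = pd.DataFrame({"customer": ["acme", "globex"],
                        "region": ["West", "East"]})

orders.info()                                                     # inspect
orders["customer"] = orders["customer"].str.strip().str.lower()  # edit
orders["amount"] = orders["amount"].fillna(0.0)                  # clean
prepped = orders.merge(regions, on="customer", how="left")       # join
print(prepped.groupby("region")["amount"].sum())    # pipe into analysis
```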

Google infrastructure czar Urs Hölzle is focused on a cloud future where customers don't think about the infrastructure underlying the workloads they're running. In his view, one of the key advantages of the cloud is that customers can get the benefits of new hardware without having to completely rework their software. "So that means you can have a million customers who move to that new hardware platform, not knowing they did," he said Tuesday at the Structure Conference in San Francisco. "Which means that you can really insert this new technology in a much faster cycle than you could if you did the same thing on-premises." That means companies get quick, seamless performance improvements, as opposed to an on-premises deployment, where they must take the time to evaluate new hardware and then to roll it out.

Artificial intelligence is as potent in the battle against malware as in other endeavors where data must be assessed, trends deduced, and plans of action put in play.

The Senseable City Lab and the transportation company Uber are launching a research initiative to explore how car- and ride-sharing networks could reshape the future of urban mobility. This initiative will explore new mobility paradigms for the 21st century, building on both parties' data and analytics strengths. "We all know how the 'sharing economy' has revolutionized many aspects of our lives. How will it challenge traditional notions of mobility and individual freedom after the advent of self-driving?" asks Carlo Ratti, professor of the practice of urban technologies and director of the Senseable City Lab. "In the United States, cars are idle 95 percent of the time, so they are an ideal candidate for the sharing economy."

Google, Amazon and Facebook can magically recognize images and voices, thanks to superfast servers equipped with GPUs in their mega data centers. But not all companies can afford that level of resources for deep learning, so they turn to cloud services, where servers in remote data centers do the heavy lifting. Microsoft has made such cloud services trendy with Azure and is one of the few companies offering remote servers with GPUs, which excel in machine-learning tasks. But Azure uses older Nvidia GPUs, and it now has competition from Nimbix, which offers a cloud service with faster GPUs based on Nvidia's latest Pascal architecture.

Give your data and analytics journey to the cloud a running start with these four considerations developed by industry analyst Claudia Imhoff.

Apple has been trying to make headway into the enterprise for a number of years now, with varying degrees of success.

Sencha adds support for JavaScript ES2015 to its Ext JS framework.

There is a fairly standard narrative around cloud computing that says the bulk of workloads will end up sitting with two or three mega-vendors in the public cloud space. At the same time, those organizations that need their workloads on an on-premises or private cloud will leverage either a proprietary platform (VMware, for example) or one from an open source initiative like OpenStack. That narrative doesn't leave much room for specialist hosting vendors, who get squeezed out between the various players. So it was interesting to hear from hosting solutions vendor Symmetry, a small firm that is built to support and manage production SAP and SAP HANA workloads (admittedly alongside more traditional IT workloads). The company has expanded its platform geographically into Phoenix and Northern Virginia, augmenting its existing Chicago and Madison, Wis., data centers. Symmetry suggests that this expansion really moves the needle on its ability to deliver products and services for these enterprise workloads.

I started my journey at Hortonworks a little over five years ago and have been to many Hadoop Summits in the US, Europe and now Asia. I just kicked off a two-day, sold-out show in Tokyo and shared my thoughts and stories on architectures and business transformation. John Kreisa's post here has…

IDC is forecasting big growth for cognitive computing and AI in the next 5 years. This infographic shows the growth, industries, and use-cases for these technologies.

One of the ugliest presidential elections in American history has been powered by advances in technology. And yet, tech is making this election more democratic than ever.

The perceived status of the consumer experience enhancement industry, from an innovation practitioner's vantage point, is that it is mature. Operational perspectives and best practices are continuing to emerge and seem to be linked to corporate…

Data on the number of people who have committed suicide tends to be reported with a substantial time lag of around two years. We examine whether online activity measured by Google searches can help us improve est…
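
In rough outline, such nowcasting can be as simple as regressing the published counts on a search-volume index and extrapolating to the not-yet-published months. A toy sketch with synthetic numbers (the study's actual method may well be more sophisticated):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

search_index = np.array([40, 42, 45, 50, 48, 52, 55, 58])  # monthly index
official = np.array([100, 104, 110, 121, 117, 126])  # published with a lag

# Fit on the months where official figures exist...
model = LinearRegression().fit(search_index[:6].reshape(-1, 1), official)
# ...then nowcast the recent months that have search data but no figures yet.
nowcast = model.predict(search_index[6:].reshape(-1, 1))
print("Nowcast for unpublished months:", nowcast.round(1))
```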

Hortonworks sees big data and IoT evolving together. After all, every business is a data business, and in a connected world everything is an IoT device. Consider the connected car, for example: it is actually multiple use cases of IoT data. There is data from the car needed for scheduled maintenance, recalls and…

Last week we announced third quarter results, and it was a milestone quarter. Customers in 60 countries chose Hortonworks to help with their on-premises, hybrid and public cloud data management strategies. A big thanks to the Hortonworks team. I wanted to stop for a second and note the significance of the fact that we just…

by Ali Zaidi, Data Scientist at Microsoft. Apache Spark and a Tale of APIs: Spark is an exceptionally popular processing engine for distributed data. Dealing with data in distributed storage and programming with concurrent systems often requires learning complicated new paradigms and techniques. Statisticians and data scientists familiar with R are unlikely to have much experience with such systems.
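
To give a flavor of the dataframe-style Spark APIs in question, here is a short PySpark sketch; it is in Python rather than R and is only an analogue of whatever interface the post discusses. The point is that the code reads like an in-memory group-by while execution is distributed across the cluster:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sketch").getOrCreate()

# A tiny dataframe; in practice this would be loaded from distributed storage.
df = spark.createDataFrame(
    [("a", 1.0), ("a", 3.0), ("b", 2.0)], ["group", "value"])

# Familiar statistician's verb, distributed execution underneath.
df.groupBy("group").agg(F.mean("value").alias("mean_value")).show()

spark.stop()
```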

SAP HANA 2 embeds a broader range of analytics, algorithms and application programming interfaces (APIs) into the core in-memory computing platform.

Data science is having its 15 minutes of fame.

The next release of SAP's in-memory HANA database plugs a few gaps for enterprise high availability features. As the in-memory database market commoditizes, SAP differentiates with analytic capabilities for extended data types.

RPA tools are software robots that use business rules logic to execute specifically defined and repeatable functions in the same way that a person would.
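
In miniature, that "business rules logic" might look like the following sketch; real RPA products drive existing application UIs rather than running standalone scripts, and these rules and thresholds are invented for illustration:

```python
# Encode the exact, repeatable steps a person would follow as explicit rules.
def process_invoice(invoice: dict) -> str:
    # Rule 1: invoices over the approval limit are routed to a manager.
    if invoice["amount"] > 10_000:
        return "route_to_manager"
    # Rule 2: known vendors under the limit are paid automatically.
    if invoice["vendor"] in {"acme", "globex"}:
        return "auto_pay"
    # Rule 3: everything else goes to manual review, as a person would do.
    return "manual_review"

for inv in [{"vendor": "acme", "amount": 500},
            {"vendor": "initech", "amount": 12_000}]:
    print(inv, "->", process_invoice(inv))
```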

SAP wants businesses struggling to keep up with the pace of innovation in its HANA in-memory database to relax as it readies a new version, to be known as HANA 2. Since introducing HANA in 2010, SAP has been releasing updates twice a year, providing customers with new capabilities but also pushing them to keep their software current to benefit from continuing support. The new version gives businesses two reasons to relax, according to Marie Goodell, vice president of product marketing at SAP. HANA 2 is designed to simplify things for the IT department, reducing the effort it takes to keep the lights on so that businesses can spend more time working on new, next-generation applications that take advantage of new types of data, she said. Even if they choose to keep upgrading, that should involve less work going forward.

Facebook users will be able to record smartphone videos that ape the style of famous artworks with a new feature unveiled Tuesday. Using a technique called style transfer, the feature takes live video and turns it into something that resembles the work of Van Gogh, Picasso and other artists. That effect is probably familiar to people who have used the app Prisma, which uses similar techniques to change the look of photos. Prisma's app can't perform live filtering, and some filters require a connection to the internet. Facebook's system can work offline and render live.
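
The technique behind these filters typically summarizes "style" as the correlations between a network layer's feature channels (the Gram matrix) and optimizes the output to match them. A minimal sketch of that style-loss idea, using PyTorch and random tensors as stand-ins for real activations (Facebook's production system is far more optimized):

```python
import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    # features: (channels, height, width) activations from one conv layer.
    c, h, w = features.shape
    flat = features.view(c, h * w)
    return flat @ flat.t() / (c * h * w)  # channel-by-channel correlations

content_feats = torch.randn(64, 32, 32)  # stand-in for real activations
style_feats = torch.randn(64, 32, 32)

# Style loss: match channel correlations, not pixel values.
style_loss = torch.nn.functional.mse_loss(
    gram_matrix(content_feats), gram_matrix(style_feats))
print(style_loss.item())
```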

In an interview with Salvi Mittal, Dinesh Aggarwal explains how he is leveraging analytics at IndiaMart.

In 2017, edge analytics will find itself in the limelight for companies of all sizes in a wide array of industries. What's driving this trend and how can organizations prepare for the change? Keep on reading: 2017 will be the launchpad for IoT edge analytics

This opinion piece shares insights into U.S./China market dynamics for companies trying to navigate this challenging path. Keep on reading: The IoT demonstrates the interdependence of Chinese and U.S. tech companies

Apple's success with the iPod and iPhone didn't really lead to another digital revolution with the iPad.

Business, from factory floor to training room and the showroom, is likely to be the biggest beneficiary of the fascinating technology of VR and AR.

Cyber crime is no longer a mere nuisance but is quickly becoming a huge problem. Just recently a cyber criminal was charged with wire fraud and computer fraud as he tried to steal more than $1.5 million. Cyber criminals are becoming increasingly more brazen as they exploit vulnerabilities in new technologies. According to Robert L. Capers, US Attorney for the Eastern District of New York, cyber criminals roam the Internet for information they can steal.

Do you find yourself increasingly having to make decisions amid uncertain conditions? The advanced capabilities offered by IBM SPSS Statistics aim to make Monte Carlo simulation a part of your risk analysis by bringing these two worlds together in a single software solution.
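
The idea behind Monte Carlo risk analysis is straightforward to sketch in code: model each uncertain input as a distribution, simulate many scenarios, and read risk metrics off the resulting distribution. A bare-bones sketch with invented numbers (SPSS exposes this through its UI rather than code):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Each uncertain input is a distribution, not a single point estimate.
units_sold = rng.normal(10_000, 1_500, n)   # uncertain demand
price = rng.uniform(9.0, 11.0, n)           # uncertain selling price
unit_cost = rng.normal(6.0, 0.5, n)         # uncertain unit cost

# Simulate n scenarios and summarize the risk in the profit distribution.
profit = units_sold * (price - unit_cost) - 20_000  # fixed overhead
print("P(loss):", (profit < 0).mean())
print("5th-95th percentile profit:", np.percentile(profit, [5, 95]).round(0))
```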
