Big Data News – 28 Oct 2016

Today's Infographic Link: Customer Journey Map

Featured Article
Of all the ways in which advanced analytics and machine intelligence can impact the enterprise business model, perhaps none is more crucial than their effect on IT itself. As infrastructure becomes more distributed and data loads become more complex, IT must become more adaptive, to the point where the work exceeds any technician's ability to collect operating data, figure out what it all means and implement the required changes. So before organizations turn Big Data loose on functions like sales, marketing and compliance, it makes sense to implement it on the infrastructure and operational layers of the data environment itself.

Top Stories
If you know how to code with internationalization in mind, then the day your company decides to go global, the transition will be much faster and easier.

Sanovi Technologies is a provider of orchestration and visualization tools optimized for managing data protection.

Databricks, the corporate provider of support and development for Apache Spark, the in-memory big data project, has spiced up its cloud-based implementation of Spark with two additions that top IT's current hot list. The new features — GPU acceleration and integration with numerous deep learning libraries — can in theory be implemented in any local Apache Spark installation. But Databricks says its versions are tuned to avoid the resource contention that complicates the use of such features.

New funds are aimed at fueling growth by accelerating development, expanding sales and marketing, and growing international operations.

In recent years, the best-performing systems in artificial-intelligence research have come courtesy of neural networks, which look for patterns in training data that yield useful predictions or classifications. A neural net might, for instance, be trained to recognize certain objects in digital images or to infer the topics of texts. But neural nets are black boxes. After training, a network may be very good at classifying data, but even its creators will have no idea why. With visual data, it's sometimes possible to automate experiments that determine which visual features a neural net is responding to. But text-processing systems tend to be more opaque. At the Association for Computational Linguistics' Conference on Empirical Methods in Natural Language Processing, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new way to train neural networks so that they provide not only predictions and classifications but rationales for their decisions.

It seems that Google Fiber — which is a division of Alphabet — has come to a crossroads.

At the first OpenPower European Summit, the group unveiled new projects and offerings based on the open architecture.

The recent DDoS attacks launched from IoT devices demonstrate that the internet of things spans all parts of IT and that most companies deploying it still need a lot of help. That's the message from ARM, the chip design company behind nearly every smartphone and a big chunk of IoT, at its annual TechCon event this week in Silicon Valley. Small, low-power devices like sensors and security cameras are the most visible part of IoT, and they're right in ARM's wheelhouse as the dominant force in low-power chips. But on Wednesday, the company highlighted a cloud-based SaaS product rather than chips or edge devices themselves. IoT depends on back-end capabilities as much as edge devices, and the company wants to play a role in all of it.

End users think they are tech savvy or knowledgeable about security issues, but in reality, they aren't as informed as they think.

SAP aims to close the divide between operational systems and the back-end IT platforms that need to converge with them in an IoT environment.

We recently hosted a webinar on the newest features of Hortonworks DataFlow 2.0, highlighting:

- the new user interface
- new processors in Apache NiFi
- Apache NiFi multi-tenancy
- Apache NiFi zero-master clustering architecture
- Apache MiNiFi

One of the first things you may have noticed in Hortonworks DataFlow 2.0 is the new user interface based on Apache…

Over the last few months, Microsoft has turned around its SharePoint platform and embarked on a mission to modernize the popular collaboration and portal product. This is great news for customers, who can look forward to a number of improvements, including a complete overhaul of SharePoint's aging web user interface. However, all this change introduces a lot of uncertainty. While it's clear that Microsoft is investing in SharePoint again, how can enterprise customers invest without worrying that Microsoft will make their work obsolete? The SharePoint experts at BlueMetal have written a white paper to address these concerns, and to help clients plan a roadmap to the future of SharePoint.

IT organizations should prioritize their self-service analytics tools projects to enable business users to experience the value of analytics and big data investments.

As much as we data scientists would like to work solely in R, there inevitably comes the time when our managers or customers want to see the fruits of our labours as a Word document or PowerPoint presentation. And not just any old document: it needs to be in the official corporate template, and will no doubt go through a series of back-and-forth revisions — marked up directly in the document — before it's considered final. R has several options for generating Word and PowerPoint documents, notably RMarkdown and Slidify. But in both these cases, R is in the driver's seat, and if you want to make any changes to the document that will persist through changes to the R code, you'll need to make those changes with R. And as Aimee Gott from Mango points out, that doesn't work so well if the document will go through a review cycle with non-R users.
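For context, the RMarkdown route mentioned above is driven entirely from the R side. A minimal sketch (`corporate-template.docx` is a hypothetical stand-in for the official template; the real `reference_docx` option tells the renderer to copy that file's Word styles):

````markdown
---
title: "Quarterly Results"
output:
  word_document:
    reference_docx: corporate-template.docx
---

```{r, echo=FALSE}
summary(cars)
```
````

Rendering this with `rmarkdown::render()` regenerates the .docx from scratch every time, which is exactly why hand edits made in Word during a review cycle are lost on the next render.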

Apache Spark is open source's new kid on the block. Companies are using Spark to develop sophisticated models that enable them to discover new opportunities or avoid risk. But what does the future, or at least the near future, hold for Spark? In this blog we have outlined five trends we see in…

Announcements related to IBM Watson and cognitive continue to pour forth from IBM Insight at World of Watson 2016. Discover what the conference has highlighted thus far in this comprehensive overview.

The complexity of multiple data sources contributing to the rising tide of data keeps executives at many enterprises up at night, with concerns spanning risk, regulation and compliance. See why information governance is especially vital in today's complex ecosystem of voluminous data sources, helping to protect information in collaborative, integrated environments.

Over the past two years, IBM has been moving in the direction of becoming a data-driven cognitive and cloud company. As part of this transformation, IBM acquired The Weather Company, which provides some of the most accurate weather data available, able to pinpoint the impact of an impending weather event down to a specific address.

Bank of America Merrill Lynch is in the midst of a sweeping digital transformation involving all aspects of its customer-facing operations. InformationWeek spoke with several of the company's technology executives to find out where IT fits in its evolving technology roadmap.

The optimism and demand for innovation have never been higher. We are on the verge of a massive change in the day-to-day experiences of employees and customers as applications become more cognitive, but there is significant work to be done. Today, our most advanced applications are intelligent; look no further than IBM Watson or Salesforce Einstein. Bluewolf's recent State of Salesforce report showed that over half of the companies surveyed described their most essential applications as at least somewhat intelligent already, able to anticipate and either take or suggest the next action. Increasing investment in intelligent applications is one key element of driving business results, but that alone is not enough. Companies must also invest in their employee and customer experience, and focus on translating their overwhelming collections of data into intuitive, automated employee experiences that, in turn, can power incredible customer moments.

A year ago the management consulting giant McKinsey & Co. predicted that the internet of things (IoT) could unlock $11 trillion in economic value by 2025. It's a bold claim, particularly given that IoT currently proves more useful in launching massive DDoS attacks than in recognizing that I need to buy more milk. Now, McKinsey has a new projection. It involves cars, and it declares that data "exhaust" from autos will be worth $750 billion by 2030. The consulting firm even goes so far as to lay out exactly how we can grab that revenue. If only it were as easy to make money off car data — which consumers may not want to share — as it is to prognosticate about it.

Apache Ignite is an in-memory computing platform that can be inserted seamlessly between a user's application layer and data layer. Apache Ignite loads data from the existing disk-based storage layer into RAM, improving performance by as much as six orders of magnitude (1 million-fold). The in-memory data capacity can be easily scaled to handle petabytes of data simply by adding more nodes to the cluster. Further, both ACID transactions and SQL queries are supported. Ignite delivers performance, scale, and comprehensive capabilities far above and beyond what traditional in-memory databases, in-memory data grids, and other in-memory-based point solutions can offer by themselves.
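The "inserted between the application layer and the data layer" idea is essentially read-through caching: the first read of a key loads it from slow storage into RAM, and every later read is served from memory. A toy Python sketch of that pattern (this illustrates the concept only, not Ignite's actual API; the class names are invented for the example):

```python
import time

class SlowStore:
    """Stand-in for a disk-based storage layer; every read pays latency."""
    def __init__(self, data):
        self._data = dict(data)
        self.reads = 0

    def get(self, key):
        self.reads += 1
        time.sleep(0.001)  # simulated disk/network latency
        return self._data[key]

class ReadThroughCache:
    """Sits between the application and the store, serving reads from RAM."""
    def __init__(self, store):
        self._store = store
        self._ram = {}

    def get(self, key):
        if key not in self._ram:
            self._ram[key] = self._store.get(key)  # load on first miss
        return self._ram[key]

store = SlowStore({"user:1": "alice", "user:2": "bob"})
cache = ReadThroughCache(store)

cache.get("user:1")  # miss: hits the slow store
cache.get("user:1")  # hit: served from memory
print(store.reads)   # the backing store was read only once
```

Ignite layers distributed partitioning, ACID transactions, and SQL on top of this basic idea, scaling the in-memory tier by adding nodes to the cluster.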

Many commerce firms have built their own analytics platforms to predict user behavior and sell accordingly.

NEWS ANALYSIS: The computing power of graphics processing units built into high-performance servers and supercomputers is being harnessed to bring new artificial intelligence capabilities to scientific research.

The recent DDoS attacks have shone a bright spotlight on the security problems within the Internet of Things and the cloud.

Get a firsthand look at IBM World of Watson 2016 in Las Vegas, spanning stadiums, the Cognitive Concourse and interactive booths. Also hear thoughts and impressions from big data influencer Ronald van Loon.

Many avenues can be explored at IBM Insight at World of Watson 2016. See what one observer came away with after attending several sessions devoted to the CDO and collaboration, tools and strategy within the CDO domain.

The newfound emphasis on tools and service integration is shaping a new crop of industry professionals – the actual faces behind the IT infrastructure.

According to a recent Allied Market Research forecast, we can expect a 32% CAGR in the software-defined data center (SDDC) market. This growth will be driven mainly by customers running big data analytics on ever-increasing data sets, allowing them to gain new insights into their business and build competitive differentiation. Throughout IBM's…

Whether in the automotive world or at the copier, predictive modeling is helping organizations boost their return on investment without incurring marketing expenses, augmenting staff rosters or changing product lines. In this rundown of three smaller sessions from IBM Insight at World of Watson 2016, discover why modern businesses are turning to modeling for the power to excel.

Here are the top 10 strategic technology trends that will impact most organizations in 2017.

IBM Insight at World of Watson 2016 certainly has a lot to offer, and one good place to start is the conference bookstore. Take a look at this overview for surviving the challenge of finding the right title for your technology of choice.
