In an era of complex data management systems, there is a need for tools that help organizations stay organized and save time and money. Gareth Vincent, VP and CEO of EMEA at Talend, a company that provides a unique tool based on data fabric – one of the ten biggest technology trends in data analytics in 2019 – speaks about the challenges and solutions.
What is Data Fabric?
“It is a single environment with a unified architecture, offering technologies and services that help organizations manage their data. The data fabric’s main goal is to maximize the value of data and accelerate digital transformation – and by doing so improve customer relations, comply with regulations, and optimize the supply chain.”
What is The Problem with The Old Methods?
“About 75% of organizations use six or more tools for data integration, and the amount of data and the number of its sources keep growing, as does the difficulty of analyzing, processing, and distributing it. Organizations want to consume their data as quickly as possible after it has been processed and integrated into the organizational data system. Data experts who work today with several outdated tools spend about 75% of their time on tasks other than data processing, which makes it very difficult for organizations to quickly extract the most value from the data they possess and compete in an ever-changing market. Furthermore, the old-generation tools limit the organization because their pricing model scales with data volume: the more data, the higher the cost. That pricing model makes the process up to ten times as expensive as using Talend’s solution.”
What Challenges Do Organizations Face in This Context in the Digital Age?
“Nowadays there is more data than ever. There are more bytes of data than there are stars in the sky, and the amount of data doubles every couple of years. Over-reliance on software as a service (SaaS) forces data teams to integrate more than 100 applications, draining their resources. Because data engineering is inefficient, businesses often cannot find the data they are looking for. Only 37% of respondents say that decisions are made quickly and at high quality.”
Why Is There a Need to Adopt New Data Management Methods?
“New data management methods clearly increase organizations’ profits by enabling better and faster decisions, better products, and an improved customer experience. In addition, they lower costs through greater business efficiency and reduce the risk of regulatory non-compliance.
A Gartner study showed that organizations spend 20 times as much on building artificial intelligence models and analytics solutions ($190 billion in 2019) as on data integration, integrity, and quality ($9.8 billion). As a result, the data they use is incomplete and unoptimized, and most of their analytics projects fail. To improve, organizations need to change their paradigms for data intake, optimization, repair, and distribution to digital and analytics systems. This requires a high-quality, inclusive, and holistic solution, and that is exactly what Data Fabric offers.”
How Can Talend’s Data Fabric Concept Help Organizations?
“Integrating the data fabric will help them build a single environment that enables access to and collection of all of their data, regardless of where and how it is stored, thus eliminating the hidden data islands in the organization. Talend’s data fabric is the only one that provides data integration, reliability, and governance in a single platform, so all aspects of the work are simplified and unified. This unique and holistic approach is also what enables Talend’s Trust Score, which instantly analyzes the quality and reliability of the distributed data. Using Talend, users can quickly understand, measure, and improve the reliability of organizational data.”
Yosi Rodrik, CEO of UCL and founder of Aqurate, Talend’s exclusive distributor in Israel, spoke about the advantages of the company’s unique solution: “The Data Fabric platform is different from any other solution in this field. The platform supports the entire data life cycle – starting with data intake (from 2,000 different sources), and continuing through initial optimization and data preparation using artificial intelligence, robust data optimization, and data distribution both online (ESB) and in batch (ETL). All of this is performed in a single development window, alongside a robust data catalog.”
“Another unique quality is that the platform generates readable code and does not require a dedicated runtime server, simplifying data management, distribution, and application. The platform’s pricing is based only on the number of developers and lets organizations operate without limits on the number of servers or the amount of data. Because development and runs can take place in a developer’s personal computing environment, a three-user license can serve more than 20 developers simultaneously,” he adds.
Rodrik also said: “The company invests heavily in research and development, so its product is always among the first to support new technologies and architectures as they emerge. Additionally, there are mechanisms for adopting code that was written with older tools from other vendors.”