Google takes strides towards interoperability with new Data Cloud Alliance and cross-platform solution BigLake
Google has made two announcements aimed at improving cloud and data interoperability across multiple vendors. The first is the launch of BigLake, a unified storage engine that simplifies data access for data warehouses by “providing uniform control across multi-cloud storage and open-formats”.
The solution, aimed at making it easier for enterprises to analyse data in their warehouses and data lakes, will work with AWS and Azure.
Google said that developers will get access to one uniform storage engine and the ability to query different data sets through a single system without needing to duplicate data.
“Managing data across disparate lakes and warehouses creates silos and increases risk and cost, especially when data needs to be moved,” pointed out Gerrit Kazmaier, VP and GM of Databases, Data Analytics and Business Intelligence at Google Cloud.
“BigLake allows companies to unify their data warehouses and lakes to analyse data without worrying about the underlying storage format or system, which eliminates the need to duplicate or move data from a source and reduces cost and inefficiencies,” added Kazmaier.
The second announcement introduces a new cloud alliance, the Data Cloud Alliance, which aims to provide global businesses with “seamless access and insights into the data needed for digital transformation”. The founding members of the alliance are Google Cloud, Accenture, Confluent, Databricks, Dataiku, Deloitte, Elastic, Fivetran, MongoDB, Neo4j, Redis and Starburst.
The goal is to make data more portable and accessible across disparate business systems. According to a statement, the alliance will work on improving the adoption of best practices in data analytics, artificial intelligence, and machine learning by creating common data models and open standards.