Apache could be game changer for Nutanix as CEO Pandey plots next moves

MARK ALBERTSON | November 10, 2017

Nutanix Inc. has plenty of moves it can make on the hyperconverged infrastructure, or HCI, chessboard. Integrating Apache more fully into its offerings may very well be one of them. “To me, the single biggest game changer for the company would be in what else we can do with Apache. Over time, we’ll do many more things with open source, including in the platform space,” said Dheeraj Pandey, founder, chairman and chief executive officer of Nutanix.

Spotlight

OneSource Virtual

As a partner and customer since 2008, OneSource Virtual is dedicated exclusively to Workday and is one of the most experienced service providers in the ecosystem. Using an innovative Business Process as a Service (BPaaS) model, we are able to operate within your Workday application to deliver services and support as an extension of your team. Our team of Workday experts has performed more than 380 initial Workday deployments and includes more than 320 Workday-certified consultants holding over 860 total certifications.

OTHER ARTICLES
APPLICATION INFRASTRUCTURE

Adapting Hybrid Architectures for Digital Transformation Implementation

Article | August 8, 2022

Digital transformation (DX) has emerged as a significant priority for the majority of businesses. By incorporating digital technologies into every aspect of an organization's operations, digital transformation is a continuous process that changes how organizations operate, how they deliver goods and services to customers, and how they connect with them. Employing hybrid network infrastructures can help businesses put DX strategies into action.

A hybrid infrastructure is an IT architecture and environment that combines on-premises data centers with private or public clouds. Operating systems and applications can be deployed anywhere in this environment, depending on the needs and specifications of the firm. Managing and monitoring an organization's entire IT estate requires hybrid IT infrastructure services, sometimes referred to as cloud services. Given the complexity of IT environments and needs, this is essential for digital transformation.

What Does Hybrid Network Infrastructure Have to Offer?

Flexibility
Hybrid infrastructure lets companies employ the appropriate tools for the job. For instance, a business that wants to use machine learning (ML) or artificial intelligence (AI) needs access to large volumes of data, and public cloud services such as AWS or Azure can provide it. However, those services can be expensive and may not deliver the performance required for some applications, so a hybrid setup lets the business choose the right environment for each workload.

Durability
Hybrid networks are more tolerant of interruptions. If a public cloud suffers an outage, a business can continue to operate from its private data center, because the public cloud outage has no impact on the private data center.

Security
A hybrid cloud strategy lets businesses protect sensitive data while still using the resources and services of a public cloud, lowering the chance that critical information is compromised. Analytics and applications that use data kept in a private environment will often still need to run in a public cloud, so encryption can be used to reduce the impact of a breach (see the sketch following this article).

Scalability and Efficiency
Traditional networks cannot match the performance and scalability of hybrid networks, because public clouds offer large amounts of bandwidth and storage that can be consumed on demand. With a hybrid architecture, a company can benefit from the public cloud's flexibility and capacity while keeping its business-critical data and operations in the private cloud or an on-premises data center.

Conclusion
Digital transformation is a cultural shift toward more flexible and intelligent ways of conducting business, supported by cutting-edge technology: integrating digital technologies throughout all company activities, improving current processes, developing new operational procedures, and offering higher value to clients. Hybrid network infrastructures are necessary for that transformation to succeed.
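The security point above can be made concrete with a small, hypothetical sketch: sensitive records are encrypted inside the private environment before being offloaded to a public cloud object store, so the key never leaves the on-premises side. The bucket name, function name, and the choice of AWS S3 with Fernet encryption are illustrative assumptions, not something prescribed by the article.

```python
# Hypothetical hybrid-cloud pattern: encrypt on-premises, store only ciphertext
# in the public cloud, keep the key in the private environment.
import boto3                                   # assumed AWS SDK, used purely as an example
from cryptography.fernet import Fernet

def encrypt_and_offload(plaintext: bytes, bucket: str, object_key: str) -> bytes:
    """Encrypt data locally, upload only the ciphertext, and return the key."""
    secret_key = Fernet.generate_key()         # stays in the on-premises key store
    ciphertext = Fernet(secret_key).encrypt(plaintext)
    boto3.client("s3").put_object(Bucket=bucket, Key=object_key, Body=ciphertext)
    return secret_key                          # never uploaded to the cloud

# Example (hypothetical bucket name):
# key = encrypt_and_offload(b"customer records", "example-hybrid-bucket", "records.enc")
```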

Read More
APPLICATION INFRASTRUCTURE

Network Security: The Safety Net in the Digital World

Article | August 3, 2022

Every business or organization has spent a lot of time and energy building its network infrastructure. Countless hours and the right resources go into establishing a network that offers connectivity, operation, management, and communication, with complex hardware, software, service architecture, and strategies all working toward optimal, dependable use.

Securing that network requires ongoing, consistent work, and defining a security strategy is the first step. The underlying architecture of your network should account for a range of implementation, upkeep, and continuous active procedures. Network infrastructure security requires a comprehensive approach that includes best practices and ongoing processes to guarantee that the underlying infrastructure stays safe. A company's choice of security measures is determined by:

Applicable legal requirements
Rules unique to the industry
The specific network and security needs

Security for network infrastructure has numerous significant advantages: a business or institution can cut expenses, boost output, secure internal communications, and guarantee the security of sensitive data. Hardware, software, and services are all vital, but each can have flaws that unintentional or intentional acts can exploit. Network infrastructure security is intended to provide sophisticated, comprehensive defenses against internal and external threats; infrastructures are susceptible to attacks such as denial-of-service, ransomware, spam, and unauthorized access.

Implementing and maintaining a workable security plan for your network architecture can be challenging and time-consuming, and experts can help with this crucial, continuous process. A robust infrastructure lowers operational costs, boosts output, and protects sensitive data from attackers. While no security measure can prevent every attack attempt, network infrastructure security can lessen the effects of a cyberattack and help ensure that your business is back up and running as soon as feasible.

Read More
APPLICATION INFRASTRUCTURE

Data Center as a Service Is the Way of the Future

Article | June 10, 2022

Data Center as a Service (DCaaS) is a hosting service that gives clients access to physical data center infrastructure and amenities. Through a wide-area network (WAN), DCaaS enables clients to remotely access the provider's storage, server, and networking capabilities. By outsourcing to a service provider, businesses can address the logistical and financial issues of running an on-site data center.

Many enterprises rely on DCaaS to overcome the physical constraints of their on-site infrastructure or to offload the hosting and management of non-mission-critical applications. Businesses that require robust data management solutions but lack the necessary internal resources can adopt DCaaS; it is a good fit for companies struggling with a shortage of IT staff or of funding for system maintenance.

Added Benefits
Data Center as a Service allows businesses to be independent of their physical infrastructure (a hedged API sketch follows this article):

A single-provider API
Data centers without on-site staff
Effortless handling of growing data volumes
Data centers in regions with more stable climates

Data Center as a Service helps democratize the data center itself, letting companies that could never afford the huge investments that have brought the industry this far benefit from these developments. This is perhaps the most important point, as Infrastructure as a Service enables smaller companies to get started without a large up-front investment.

Conclusion
Data Center as a Service (DCaaS) enables clients to access a data center and its features remotely, whereas data center services can include complete management of an organization's on-premises infrastructure resources. IT can be outsourced using data center services to manage an organization's network, storage, computing, cloud, and maintenance. Many businesses outsource their infrastructure to increase operational effectiveness, scale, and cost-effectiveness. It can be challenging to manage existing infrastructure while keeping up with the pace of innovation, but staying on the cutting edge of technology is critical. Organizations can stay future-ready by working with a vendor that can supply both DCaaS and data center services.
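The "single-provider API" benefit can be sketched with a short, purely hypothetical example: provisioning a server in a remote DCaaS facility over the WAN with one authenticated HTTP call. The endpoint URL, payload fields, and token are illustrative assumptions and do not describe any particular provider's API.

```python
# Hypothetical single-provider DCaaS API: one call provisions remote capacity.
import requests

API_BASE = "https://api.example-dcaas.com/v1"   # illustrative provider endpoint

def provision_server(api_token: str, cpu_cores: int, ram_gb: int, region: str) -> dict:
    """Request a managed server from the provider and return its metadata."""
    resp = requests.post(
        f"{API_BASE}/servers",
        headers={"Authorization": f"Bearer {api_token}"},
        json={"cpu_cores": cpu_cores, "ram_gb": ram_gb, "region": region},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. {"id": "...", "status": "provisioning"}
```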

Read More
IT SYSTEMS MANAGEMENT

Enhancing Rack-Level Security to Enable Rapid Innovation

Article | July 6, 2022

IT and data center administrators are under pressure to foster quicker innovation. Giving workers and customers access to digital experiences means deploying more devices and managing larger enterprise-to-edge networks. The security of distributed networks has suffered as a result of this rapid growth, though. Because compliance standards and security needs vary by application, some colocation providers can install custom locks for your cabinet if necessary. Physical security measures remain of utmost importance, because theft and social engineering can affect hardware as well as data.

Risks Companies Face

Remote IT work will continue over the long run
Attacking users is the easiest way into networks
IT may be deploying devices with weak controls

When determining whether rack-level security is required, there are essentially two critical criteria to take into account: the sensitivity of the data stored, and the importance of the equipment in a particular rack to the facility's continued functioning (a small decision sketch follows this article). Due to the nature of the data being handled and kept, some processes will always have a higher risk profile than others.

Conclusion
Data centers must rely on a physically secure perimeter that can be trusted. Clients, in particular, require unwavering assurance that security can be put in place to limit user access and guarantee that safety regulations are followed. Rack-level security locks that enforce physical access limitations are crucial to maintaining data center security. Compared with their mechanical predecessors, electronic rack locks, or "smart locks," offer a far more comprehensive range of feature-rich capabilities.
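The two criteria above can be expressed as a tiny, hypothetical decision helper. The field names, 1-to-5 scales, and threshold are assumptions made purely for illustration; real policies would come from an organization's own compliance requirements.

```python
# Hypothetical rack-level security decision: flag racks whose data sensitivity
# or operational criticality justifies an electronic (smart) lock.
from dataclasses import dataclass

@dataclass
class Rack:
    rack_id: str
    data_sensitivity: int   # assumed scale: 1 (public) .. 5 (highly sensitive)
    criticality: int        # assumed scale: 1 (non-essential) .. 5 (facility-critical)

def needs_smart_lock(rack: Rack, threshold: int = 4) -> bool:
    """Return True when either criterion meets the (illustrative) threshold."""
    return rack.data_sensitivity >= threshold or rack.criticality >= threshold

# Example:
# needs_smart_lock(Rack("A-17", data_sensitivity=5, criticality=2))  # -> True
```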

Read More

Related News

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Web3 Decentralized Storage Company W3 Storage Lab Changes Name to Fog Works

Fog Works | September 23, 2022

W3 Storage Lab announced today it has changed its name to Fog Works. The new name better reflects the company’s positioning, has greater brand-building potential, and is more indicative of the company’s vision of being a key builder of Web3 infrastructure, applications, and devices.

The name Fog Works is derived from the term fog computing, which was coined by Cisco. Fog computing is an extension of cloud computing: a network architecture where computing and storage are mostly decentralized and pushed to the edge of the network, but a cloud still exists in the center. Web3 is a fully decentralized, blockchain-enabled iteration of the internet. By being entirely decentralized, Web3 is essentially the ultimate fog computing architecture, with no cloud in the center.

“Our goal is to make Web3 a reality for everyday consumers. Because we’re making Web3 work for everyone, the name Fog Works really encapsulates our vision. We’re excited to build a brand around it,” said Xinglu Lin, CEO of Fog Works.

Fog Works has co-developed a next-generation distributed storage ecosystem based on the public blockchain CYFS and the Datamall Coin. CYFS is a next-generation protocol that re-invents basic web protocols – TCP/IP, DNS, and HTTP – to create the infrastructure necessary for the complete decentralization of Web3. It has been in development for over seven years, practically eliminates latency in file retrieval – a major problem with current decentralized storage solutions – and has infinite scalability. Fog Works is developing a series of killer applications for both consumers and enterprises that will use both CYFS and the Datamall Coin, which facilitates a more efficient market for decentralized storage.

To further the development of decentralized applications (dApps) on CYFS, Fog Works is co-sponsoring the CodeDAO Web3 Hackathon. CodeDAO is the world’s first fully decentralized code hosting platform. During the hackathon, developers will compete for prizes by developing dApps using CYFS. Teams will have seven days to develop their projects. The CodeDAO Hackathon runs October 15, 2022, to October 21, 2022. For more information, please visit https://codedao.ai/hackathon.html.

About Fog Works
Fog Works, formerly known as W3 Storage Lab, is a Web3 decentralized application company headquartered in Sunnyvale, CA, with operations around the world. Its mission is to leverage the power of Web3 to help people manage, protect, and control their own data. Fog Works is led by an executive team with a unique blend of P2P networking experience, blockchain expertise, and entrepreneurship. It is funded by Draper Dragon Fund, OKX Blockdream Ventures, Lingfeng Capital, and other investors.

Read More

HYPER-CONVERGED INFRASTRUCTURE, DATA STORAGE, IT SYSTEMS MANAGEMENT

Kyndryl and Elastic Announce Expanded Partnership to Enable Data Observability, Search and Insights Across Cloud and Edge Computing Environments

Kyndryl | September 23, 2022

Kyndryl, the world’s largest IT infrastructure services provider, and Elastic (NYSE: ESTC), the company behind Elasticsearch, today announced an expanded global partnership to provide customers full-stack observability, enabling them to accelerate their ability to search, analyze and act on machine data (IT data and business data) stored across hybrid cloud, multi-cloud and edge computing environments.

Under the partnership, Kyndryl and Elastic will collaborate on joint solutions and delivery capabilities designed to provide deep, frictionless observability at all levels of applications, services, and infrastructure to address customer data, analytics and IT operations management challenges. The companies will focus on delivering large-scale IT operations and AIOps capabilities to joint customers by leveraging Kyndryl’s data framework and toolkits and Elastic’s Enterprise Search, Observability, and Security solutions, enabling streamlined migrations, modernized infrastructure and tenant management, and AI development for efficient and proactive IT management.

As part of the partnership, Kyndryl and Elastic plan to collaborate on joint offerings and solutions across the following areas:

IT Data Modernization – Helping organizations manage exponential storage growth and giving them the capability to search for data wherever it resides.
IT Data Management Services for Elastic – Providing flexibility to users of Elastic by letting Kyndryl manage the entire stack infrastructure and analytics workloads for IT operations.
Intelligent IT Analytics – Enabling actionable observability through AI/ML capabilities that deliver unified insights for proactive and efficient IT operations with technology domain-specific insights.
Data Migration Services for Elastic – Delivering the capability to streamline migrations and deploy self-managed Elastic workloads to the hyperscalers of a customer’s choice.

Kyndryl’s global team of data management experts will also participate in the global Elastic certification program to expand their expertise in advising, implementing and managing Elastic solutions across critical IT projects and environments.

“Customers in all industries are seeking to improve their capacity to search and analyze the data stored in the cloud and on edge computing environments. We are happy to partner with Elastic to create and bring forward a unified approach that will help customers overcome hurdles and improve their ability to access and gain insights at scale from their business data,” said Nicolas Sekkaki, Applications, Data & AI global practice leader for Kyndryl.

“Enabling customers to gain actionable insights from their data is a key enabler of data-driven digital transformation,” said Scott Musson, Vice President, Worldwide Channel and Alliances at Elastic. “The combination of Kyndryl’s global expertise in managing mission-critical information systems and the proven scale and flexibility of the Elastic Search Platform provides the critical foundation to help organizations drive speed, scale, and productivity, and address their observability needs across hybrid cloud, multi-cloud and edge computing environments.”

For more information about the Kyndryl and Elastic partnership, please visit: https://www.kyndryl.com/us/en/about-us/alliances

About Kyndryl
Kyndryl is the world’s largest IT infrastructure services provider, serving thousands of enterprise customers in more than 60 countries. The company designs, builds, manages and modernizes the complex, mission-critical information systems that the world depends on every day.

About Elastic
Elastic is a leading platform for search-powered solutions. We help organizations, their employees, and their customers accelerate the results that matter. With solutions in Enterprise Search, Observability, and Security, we enhance customer and employee search experiences, keep mission-critical applications running smoothly, and protect against cyber threats. Delivered wherever data lives, in one cloud, across multiple clouds, or on-premises, Elastic enables 18,000+ customers and more than half of the Fortune 500 to achieve new levels of success at scale and on a single platform.
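As a rough illustration of the kind of machine-data search this partnership centers on, the sketch below queries recent error logs with the Elasticsearch Python client. The cluster URL, index pattern, and field names are assumptions for illustration only and are not part of any Kyndryl or Elastic offering described above.

```python
# Minimal observability-style query: ERROR-level log entries from the last 15 minutes.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")    # assumed local/dev cluster

resp = es.search(
    index="app-logs-*",                        # hypothetical log index pattern
    query={
        "bool": {
            "must": [{"match": {"level": "ERROR"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-15m"}}}],
        }
    },
    size=20,
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"].get("message"))
```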

Read More

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE, IT SYSTEMS MANAGEMENT

Fluree and ZettaLabs Announce Merger to Serve Enterprises Seeking Data-Centric Architecture and Legacy Data Infrastructure Modernization

Fluree | September 22, 2022

Fluree, a company headquartered in Winston-Salem, North Carolina, which has developed a distributed ledger graph database platform, and New Jersey-based ZettaLabs, a business that uses artificial intelligence and machine learning to prepare raw data for analytics use, today announced the merger of the two companies.

The combination of Fluree and ZettaLabs will enable Fluree to expand its offerings beyond its established expertise in “green-field,” new data-centric initiatives that encompass bleeding-edge innovation. With ZettaLabs now part of Fluree, the company has the capability to tackle enterprise legacy data architectures and take the first steps toward modernization. All ZettaLabs employees will integrate into the Fluree ecosystem, bringing Fluree’s total headcount to 50.

“At Fluree, we are building the data infrastructure for the future,” said Brian Platz, Fluree co-founder and CEO. “While many of our customers enjoy the unique benefits of our semantic graph distributed ledger database technology, we recognize that organizations first need a way out of their entrenched silos in order to build their end-goal infrastructures. Dealing with legacy infrastructure is one of the biggest challenges for modern businesses, but nearly 74% of organizations are failing to complete legacy data migration projects today due to inefficient tooling and a lack of interoperability. By adding the ZettaLabs team and product suite to our own, Fluree is poised to help organizations on their data infrastructure transformation journeys by uniquely addressing all major aspects of migration and integration: security, governance and semantic interoperability.”

ZettaSense has been rebranded as Fluree Sense: a data pipeline that uses AI and machine learning, as well as ontologies, to normalize, cleanse and harmonize data from disparate sources that need to be integrated, in a way that eliminates any requirement for additional data governance, master data management or data quality software. Fluree Sense makes data in existing legacy databases, data warehouses and data lakes ready for downstream enterprise consumption and sharing, whether in analytic repositories like Snowflake or Databricks, or in Fluree’s immutable knowledge graph database.

“We developed our flagship product, ZettaSense, to ingest, classify, resolve and cleanse big data coming from a variety of sources,” said Eliud Polanco, co-founder and CEO of ZettaLabs, who will become Fluree’s president. “The problem is that the underlying data technical architecture – with multiple operational data stores, warehouses and lakes, now spreading out across multiple clouds – is continuing to grow in complexity. Now with Fluree, our shared customer base and any new customers can evolve to a modern and elegant data-centric infrastructure that will allow them to more efficiently and effectively share cleansed data both inside and outside their organizational borders.”

The merger, the first in Fluree’s history, makes Fluree a go-to company for the roughly 90% of businesses hindered by legacy infrastructure and database systems that lack the toolset or talent to undergo an effective transformation. It also augments Fluree’s customer base, which now includes large enterprise financial-services customers.

Use cases for Fluree Sense include:

Legacy data migrations that cleanse and harmonize data from multiple sources to enable migration from a legacy enterprise business platform to a target digital platform;
Customer data integrations that integrate customer, account, product and transaction data from across multiple data sources into a single golden 360-degree customer record;
Consent management that enables active customer consent and control over how data is shared across products, regions and business functions within an organization; and
Cross-border data residency that allows secure sharing of information across borders, adhering to the various national data-privacy regulations, using multi-party computation.

“We don’t have a lack of data today — we have a lack of high-quality data,” said Peter Serenita, retired Chief Data Officer and current chairman of the New York City-headquartered nonprofit organization Enterprise Data Management Council. “This is why it is essential for enterprises to take a data-centric approach to their modernization initiatives in order to truly transform their legacy infrastructure and eliminate their data silos for good. Joining forces with the ZettaLabs team and product will allow Fluree to continue its mission of turning big data into better data for sustainable business outcomes.”

While Fluree currently serves the existing enterprise data management market as an innovative database solution, it is mostly used for new data projects that have identified a specific requirement for data trust, integrity, sharing or security. The merger with ZettaLabs enables Fluree to provide value to all enterprise data teams looking to get a handle on their legacy infrastructure and modernize their platforms to satisfy increasingly complex business goals. Fluree now has a full spectrum of data management capabilities for organizations — from the first step of integrating and migrating legacy system data infrastructure with ZettaLabs’ technology to building modernized operational and analytical data infrastructure atop Fluree’s database system.

“Fluree’s merger with ZettaLabs is directly in line with Fluree’s vision to deliver data-centric capabilities to modernize enterprise data abilities,” said Dan Malven, managing director of 4490 Ventures, a Madison-based venture capital firm and Fluree lead investor. “Enterprises seeking data-centric architectures now not only have a landing place with Fluree’s core ledger graph database technology, but also a starting point for their legacy infrastructure to onboard their data management into data centricity.”

About Fluree
Co-founded in 2016 by CEO Brian Platz and Executive Chairman Flip Filipowski, Fluree PBC is headquartered in Winston-Salem, North Carolina. Fluree is pioneering a data-first technology approach with its data management platform. It guarantees data integrity, facilitates secure data sharing and powers data-driven insights. The Fluree platform organizes blockchain-secured data in a scalable semantic graph database — establishing a foundational layer of trusted data for connected and secure data ecosystems. The company’s foundation is a set of W3C semantic web standards that facilitate trusted data interoperability. Fluree currently employs 50.
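The “W3C semantic web standards” and “semantic graph database” mentioned above rest on the RDF triple model, which can be illustrated in a few lines with the open-source rdflib package. This is a generic sketch of the data model only; the namespace and names are hypothetical, and none of it relates to Fluree’s or ZettaLabs’ own APIs.

```python
# Generic RDF illustration: data expressed as subject-predicate-object triples,
# serialized to Turtle, one of the W3C interchange formats.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/")          # hypothetical namespace
g = Graph()
customer = EX["customer/42"]
g.add((customer, RDF.type, FOAF.Person))
g.add((customer, FOAF.name, Literal("Ada Example")))

print(g.serialize(format="turtle"))            # interoperable, standards-based output
```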

Read More
