HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Web3 Decentralized Storage Company W3 Storage Lab Changes Name to Fog Works

Fog Works | September 23, 2022 | Read time: 02:50 min

W3 Storage Lab announced today it has changed its name to Fog Works. The new name better reflects the company’s positioning, has greater brand-building potential, and is more indicative of the company’s vision of being a key builder of Web3 infrastructure, applications, and devices.

The name Fog Works is derived from the term fog computing, which was coined by Cisco. Fog computing is an extension of cloud computing: a network architecture in which computing and storage are mostly decentralized and pushed to the edge of the network, while a cloud still exists in the center. Web3 is a fully decentralized, blockchain-enabled iteration of the internet. By being entirely decentralized, Web3 is essentially the ultimate fog computing architecture, with no cloud in the center.

“Our goal is to make Web3 a reality for everyday consumers. Because we’re making Web3 work for everyone, the name Fog Works really encapsulates our vision. We’re excited to build a brand around it.”

Xinglu Lin, CEO of Fog Works

Fog Works has co-developed a next-generation distributed storage ecosystem based on the CYFS public blockchain and the Datamall Coin. CYFS is a next-generation protocol that re-invents basic Web protocols – TCP/IP, DNS, and HTTP – to create the infrastructure necessary for the complete decentralization of Web3. It has been in development for over seven years, practically eliminates latency in file retrieval – a huge problem with current decentralized storage solutions – and has infinite scalability. Fog Works is developing a series of killer applications for both consumers and enterprises that will use both CYFS and the Datamall Coin, which facilitates a more efficient market for decentralized storage.
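The retrieval properties described above rest on content addressing: an object's identifier is derived from its bytes, so any node holding a copy can serve it. The sketch below illustrates that idea only; it is not the CYFS API, and the in-memory NODES map and put/get helpers are hypothetical stand-ins for a real peer-to-peer network.

```python
import hashlib

# Toy in-memory "network": node name -> {content_id: data}. A real
# decentralized store (CYFS included) adds peer discovery, replication,
# and incentive accounting; none of that is modeled here.
NODES = {"node-a": {}, "node-b": {}}

def content_id(data: bytes) -> str:
    """Derive an identifier from the content itself (content addressing)."""
    return hashlib.sha256(data).hexdigest()

def put(data: bytes) -> str:
    """Replicate the data to every toy node, keyed by its content ID."""
    cid = content_id(data)
    for store in NODES.values():
        store[cid] = data
    return cid

def get(cid: str) -> bytes:
    """Fetch from the first node that holds the content ID."""
    for store in NODES.values():
        if cid in store:
            return store[cid]
    raise KeyError(f"{cid} not found on any node")

cid = put(b"hello, decentralized web")
assert get(cid) == b"hello, decentralized web"
print(cid)
```

Because the identifier is computed from the data rather than from a server location, retrieval latency in a real network comes down to how close the nearest replica is, which is the problem the paragraph above says CYFS targets.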

To further the development of decentralized applications (dApps) on CYFS, Fog Works is co-sponsoring the CodeDAO Web3 Hackathon. CodeDAO is the world’s first fully decentralized code hosting platform. During the hackathon, developers will compete for prizes by developing dApps using CYFS. Teams will have seven days to develop their projects. The CodeDAO Hackathon runs from October 15 to October 21, 2022. For more information, please visit https://codedao.ai/hackathon.html.

About Fog Works
Fog Works, formerly known as W3 Storage Lab, is a Web3 decentralized application company headquartered in Sunnyvale, CA, with operations around the world. Its mission is to leverage the power of Web3 to help people manage, protect, and control their own data. Fog Works is led by an executive team with a unique blend of P2P networking experience, blockchain expertise, and entrepreneurship. It is funded by Draper Dragon Fund, OKX Blockdream Ventures, Lingfeng Capital, and other investors.

Spotlight

When Discovery’s Corporate IT discovered that its infrastructure dependencies were affecting the stability of its VMware environment, it turned to VMware’s vSAN solution to alleviate these dependencies and better architect its virtualized environment. “The vSAN platform has given us greater control of our infrastructure stack as well as improved management of our VMware environment. It has also provided greater performance and simplification of the SAN infrastructure, afforded us freedom when scaling as well as flexibility in our choice of hardware.”


Other News
HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Delinea Opens New Data Centre in the UK

Delinea | September 15, 2022

Delinea, a leading provider of privileged access management (PAM) solutions for seamless security, today announced the addition of a new data centre in the UK, broadening compliance options for organisations in the region. The new facility offers access to locally hosted instances of Secret Server Cloud, optimising product performance and giving UK-based customers the peace of mind that their cybersecurity investments with Delinea align to legislative requirements for data protection.

As companies continue their digital transformation journeys, they are increasingly migrating their infrastructure to the cloud, including IT security systems. In a recent global survey carried out by Delinea, 55% of UK respondents said they are storing privileged identities in the cloud, and 36% indicated that integration into the cloud will be a priority over the next 12-18 months in relation to privileged access security.

The new data centre complements Delinea's existing facilities in Canada, the East and West Coast US, Germany, Singapore and Australia, and further enhances the company's cloud infrastructure to meet the growing demand for cloud-based PAM, offering customers increased deployment options and better serving organisations with stringent data residency requirements.

"The new data centre provides customers with the security, flexibility and performance they need to protect their digital assets," said Spence Young, VP EMEA at Delinea. "In addition to delivering benefits, such as lower latency and increased capacity, it enables organisations to confidently plan their migration to the cloud, helping them remain compliant with data protection regulations."

About Delinea
Delinea is a leading provider of privileged access management (PAM) solutions that make security seamless for the modern, hybrid enterprise. Our solutions empower organizations to secure critical data, devices, code, and cloud infrastructure to help reduce risk, ensure compliance, and simplify security. Delinea removes complexity and defines the boundaries of access for thousands of customers worldwide. Our customers range from small businesses to the world's largest financial institutions, intelligence agencies, and critical infrastructure companies.

Read More

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE, APPLICATION STORAGE

Hewlett Packard Enterprise Introduces Next-Generation Compute Engineered for a Hybrid World

Hewlett Packard Enterprise | November 01, 2022

Hewlett Packard Enterprise today announced a next-generation compute portfolio that delivers a cloud operating experience designed to power hybrid environments and digital transformation. The new HPE ProLiant Gen11 servers provide organizations with intuitive, trusted, and optimized compute resources, ideally suited for a range of modern workloads, including AI, analytics, cloud-native applications, graphic-intensive applications, machine learning, Virtual Desktop Infrastructure (VDI), and virtualization.

“The foundation of any hybrid strategy is compute,” said Neil MacDonald, executive vice president and general manager, Compute, at HPE. “HPE Compute brings businesses closer to the edge, where data is created, where new cloud experiences are delivered, and where security is integral. The new HPE ProLiant Gen11 servers are engineered for the hybrid world to deliver an intuitive cloud operating experience, trusted security by design, and optimized performance for workloads.”

Intuitive cloud operating experience
On HPE ProLiant servers, an HPE GreenLake for Compute Ops Management subscription provides a cloud-native management console. This increases operational efficiency by securely automating the process to access, monitor, and manage servers, no matter where the compute environment lives. The console provides simple, unified, and automated capabilities that allow customers to control their compute with global visibility and insight. Customers can also easily onboard thousands of distributed devices and benefit from faster server firmware updates, focusing effort on business operations rather than on managing complex IT infrastructure. HPE GreenLake for Compute Ops Management also includes carbon footprint reporting, letting customers view emission metrics, from individual servers to full compute environments, to monitor energy usage.

Trusted security by design
HPE continues to lead and deliver secure infrastructure, from edge to cloud, starting at the silicon level with the HPE Silicon Root of Trust, an industry-exclusive security capability that protects millions of lines of firmware code from malware and ransomware with a digital fingerprint that is unique to the server. Today, the HPE Silicon Root of Trust secures millions of HPE servers around the world. The next-generation HPE ProLiant servers build on this security innovation with the following new features to protect data and systems:
- Ensure verification and authentication for device components with the new version of HPE Integrated Lights-Out (iLO), iLO 6. iLO is remote server management software that enables customers to securely configure, monitor, and update HPE servers seamlessly. The latest version features new authentication using the Security Protocol and Data Model (SPDM), a key security capability in servers for authenticating and securely monitoring devices in an open standards-based approach.
- Prevent alterations to unique server identity access with the inclusion of platform certifications and Secure Device Identity (iDevID) by default.
- Gain an additional layer of authentication by monitoring secure boot and system state through the Trusted Platform Module (TPM).
- Adopt the highest level of security through the HPE Trusted Supply Chain, which advances end-to-end security with certified servers that feature hardened data protection during the manufacturing process. Recently, HPE extended options for certified servers, from US-based factories, to produce and ship worldwide.1

Optimized performance for any workload
As organizations run more demanding workloads, including AI, machine learning, and rendering projects, they require optimal compute and accelerated compute performance. The next-generation HPE ProLiant servers are optimized to deliver high performance on an organization’s most data-intensive workloads and support a diverse set of architectures, including 4th Generation AMD EPYC™ processors, 4th Gen Intel® Xeon® Scalable processors, and Ampere® Altra® and Ampere® Altra® Max Cloud Native Processors. Compared to the previous generation, the new HPE ProLiant Gen11 servers support twice as much I/O bandwidth for the most demanding applications, 50% more cores per CPU for improved workload consolidation, and 33% more high-performance GPU density per server to support AI and graphic-intensive workloads. Service providers and enterprises that are embracing cloud-native workloads require dedicated, cloud-native compute to deliver agile and extensible capabilities that drive innovation. In June 2022, HPE announced that it was the first tier-one server provider to offer compute with optimized cloud-native silicon, using Ampere® Altra® and Ampere® Altra® Max Cloud Native Processors in the new HPE ProLiant RL300 Gen11 server.

Delivering a pay-as-you-go consumption model with HPE GreenLake
Organizations looking to transition from one generation to the next can adopt HPE’s next-generation compute through a traditional infrastructure purchase or through a pay-as-you-go model with HPE GreenLake. HPE GreenLake is an as-a-service platform that enables customers to accelerate data-first modernization and provides over 70 cloud services that can run on-premises, at the edge, in a colocation facility, and in the public cloud. Additionally, through HPE Financial Services (HPEFS), customers can convert existing technology assets into capital to purchase new or upgraded technology.

Expanding the customer experience with new services
Through HPE Pointnext Services, an award-winning team of over 15,000 experts, customers adopting the HPE ProLiant Gen11 servers can leverage in-depth global expertise to deploy next-generation HPE ProLiant servers, create new experiences, gain real-time insights from their data, and modernize IT to unlock value. Today, HPE unveiled enhancements to its customer experience supporting the HPE ProLiant Gen11 servers, including:
- HPE Pointnext Complete Care Secure Locations offers customers assigned experts to deliver support to locations where access, connectivity, and electronic and verbal communications are subject to specific security measures.
- HPE Expert on Demand provides customers with access to services professionals with dedicated expertise related to HPE’s next-generation compute offerings.
- HPE Support Center, which provides online services and a support platform, has been enhanced to include greater collaboration, case management, enhanced virtual agent troubleshooting, and a new digital insights dashboard.
- Support for HPE’s next-generation compute has been extended from three to five years, and up to seven years.

About Hewlett Packard Enterprise
Hewlett Packard Enterprise (NYSE: HPE) is the global edge-to-cloud company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way people live and work, HPE delivers unique, open and intelligent technology solutions as a service. With offerings spanning Cloud Services, Compute, High Performance Computing & AI, Intelligent Edge, Software, and Storage, HPE provides a consistent experience across all clouds and edges, helping customers develop new business models, engage in new ways, and increase operational performance.
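As a rough illustration of the "digital fingerprint" idea behind the silicon root of trust described above: firmware is measured (hashed) and only allowed to run if the measurement matches a value anchored in trusted hardware. This is a conceptual sketch only, not HPE's iLO/SPDM implementation; the TRUSTED_DIGEST value and the firmware path are hypothetical placeholders.

```python
import hashlib

# Hypothetical known-good measurement. In a real root of trust this value
# is anchored in tamper-resistant silicon rather than held in source code.
TRUSTED_DIGEST = "replace-with-known-good-sha256"

def measure(firmware_path: str) -> str:
    """Hash the firmware image to produce its 'digital fingerprint'."""
    sha256 = hashlib.sha256()
    with open(firmware_path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

def verify_before_boot(firmware_path: str) -> bool:
    """Allow boot only if the measured fingerprint matches the trusted one."""
    return measure(firmware_path) == TRUSTED_DIGEST

if __name__ == "__main__":
    if not verify_before_boot("/firmware/bios.bin"):  # hypothetical path
        raise SystemExit("Firmware measurement mismatch: refusing to boot")
```

SPDM, mentioned in the security list above, standardizes how such measurements and device identities are exchanged and authenticated between components in an open, standards-based way.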

Read More

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

DJIB Launches First Ever Enterprise Grade Decentralised Data Storage Drive

DJIB | September 06, 2022

Today DJIB launched the first-ever end-to-end encrypted, enterprise-grade decentralised data storage drive with embedded multi-chain non-fungible token functionality. It enables the widespread adoption of NFTs in business applications. Cloud data storage is dominated by services such as Amazon AWS, Google Cloud and Microsoft Azure. However, in the age of blockchains, users find traditional storage limiting because it is centralised in the hands of individual corporations: user data can potentially be accessed without their knowledge by employees of such providers. The currently missing ability to save objects as NFTs will also be increasingly required in business applications.

This is why, while being AWS S3 compatible and blazingly fast, the DJIB data storage drive for the first time addresses all of these concerns by being end-to-end encrypted, censorship resistant, and equipped with built-in NFT functionality. It reimagines the concept of NFTs, treating them as a new type of file format whereby users can "Save as NFT" any file stored on the drive, thus demystifying the creation of NFTs. Files can be as large as 5 TB, which removes existing technical constraints. Users can either attach custom business logic to their NFTs or use pre-defined templates from a library without knowing how to code. For example, a musician can publish a song with pre-defined licensing rights, or a pharmaceutical company can allow patients to share and profit from their medical data with very granular permissions and usage rights - all without the need for any intermediaries or specialist software. Any asset can now be tokenised. Any financial director can issue share certificates in NFT format. Such NFTs are immediately interoperable with all the blockchains with which DJIB has a connector. It started with Solana, Ethereum and BSC, but will soon cover all key networks. DJIB is already working on connectors with teams from major blockchains, starting with those that are enterprise-focused and see this as an opportunity to foster the development of applications within their ecosystems.

Moe Sayadi, DJIB CEO, whose background is as a solutions architect at Microsoft and Avaloq, says: "Making our decentralised drive available to enterprise customers and removing the mystery behind the creation of NFTs opens an unimaginable trove of opportunities. It puts a powerful tool into the hands of non-technical domain experts. They can focus on the business logic attached to any document and potentially physical item, and move entire business processes to the cloud. This enables Object Oriented Business Process Management and many other exciting innovations which are in our pipeline and will be announced soon. We are discussing with corporate CTOs some very interesting use cases and I can confidently say that NFT evolution has finally passed the apes stage."
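To make the "Save as NFT" idea concrete, the sketch below packages a file's content hash and some licensing logic into ERC-721-style metadata. It is a generic illustration, not DJIB's actual format: the field names, the djib:// URI scheme, and the licensing structure are all hypothetical.

```python
import hashlib
import json

def save_as_nft(file_bytes: bytes, name: str, license_terms: dict) -> dict:
    """Build NFT metadata that references a stored file by its content hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return {
        "name": name,
        # Hypothetical URI pointing at the encrypted object on the drive.
        "uri": f"djib://objects/{digest}",
        "content_sha256": digest,
        # Business logic carried with the token, e.g. licensing rights.
        "license": license_terms,
    }

metadata = save_as_nft(
    b"<audio bytes>",
    name="Demo Track",
    license_terms={"streaming": True, "resale_royalty_pct": 5},
)
print(json.dumps(metadata, indent=2))
```

In this sketch, minting would amount to publishing the metadata through a connector for the target chain; keeping the metadata itself chain-agnostic is what makes the same token usable across connected networks.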

Read More

HYPER-CONVERGED INFRASTRUCTURE, APPLICATION INFRASTRUCTURE

Adeo Modernizes its Data Infrastructure with Datometry on Google BigQuery

Datometry | October 12, 2022

Datometry, the database virtualization company, announced today that leading French home goods retailer Adeo has completed a key milestone in the modernization of its data infrastructure by migrating from its legacy Teradata data warehouse to Google Cloud's BigQuery. The largest DIY retailer in Europe and the third largest in the world, Adeo has 150,000 employees and 1,000 retail shops in 20 countries that meet a range of home goods needs, including renovations, DIY, and décor. The company needed to move from its highly entrenched data system to BigQuery without disrupting business users worldwide.

Using the Datometry Hyper-Q virtualization platform - the industry's first to make existing applications seamlessly interoperable with cloud databases - Adeo was able to deliver a multi-project architecture on BigQuery that serves users in multiple geographies simultaneously. In a joint effort between Datometry, Google Cloud, Adeo, and Adeo's service partners Sopra Steria and CGI, Adeo redesigned and implemented its ETL. Datometry then generated the mapping of over 200,000 database objects from Teradata to BigQuery fully automatically. That set the stage for Adeo to virtualize the entire reporting and consumption layer using Datometry Hyper-Q. This approach made for a completely seamless transition, so that Adeo's business users around the world would not even know that their core data platform had been replaced with BigQuery.

"Adeo, a global leader in its industry, was acutely aware that a conventional database migration would come with significant disruptions to its business," said Mike Waas, CEO, Datometry. "Therefore, together with Google Cloud, we delivered on a vision that empowered Adeo to adopt BigQuery as their new data warehouse platform faster than with any other methodology—and, notably, without disrupting the business."

Datometry Hyper-Q uniquely addressed Adeo's business objectives, enabling the company to transfer its existing applications natively to BigQuery without costly rewrites of SQL code, at a fraction of the time and risk associated with typical database migrations. Had Adeo chosen to rewrite and redesign its reporting layer instead of virtualizing it, it would have faced additional expenses of millions of dollars over the course of several years, with no guarantee of success.

"When Google Cloud first introduced us to Datometry, we were thrilled. Datometry enabled us to migrate at a highly accelerated pace and quickly unlock the benefits of Google Cloud. With Datometry Hyper-Q, we were able to implement a multi-project architecture that would have been close to impossible to build otherwise."

Eric Foratier, Digital Domain Leader, Adeo

"We're proud to partner with Datometry and help Adeo along its digital transformation journey," said Sudhir Hasbe, Sr. Director of Data Analytics, Google Cloud. "By migrating its data warehouse with Datometry to BigQuery, Adeo teams can access and leverage data insights at global scale, enabling elevated customer experiences for its end customers."

Datometry Hyper-Q is used by leading Fortune 500 and Global 2000 enterprises to accelerate cloud modernization and move workloads between data warehouses. The Datometry Hyper-Q virtualization platform eliminates risk-laden, expensive, and time-consuming application rewrites. For more information, visit www.datometry.com.

About Datometry
Datometry is the global leader in database system virtualization. Datometry empowers enterprises to run their existing applications directly on the cloud database of their choice without the business disruption of costly and risk-laden database migrations and application rewrites. Leading Fortune 500 and Global 2000 enterprises worldwide realize significant cost savings and consistently outpace their competition by using Datometry during this critical period of transformation to cloud-native data management.
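The kind of dialect gap a virtualization layer has to bridge shows up even in trivial Teradata-to-BigQuery differences. The sketch below rewrites two well-known ones (the SEL abbreviation and TOP n row limiting); it is a toy illustration of the problem space, not how Datometry Hyper-Q works internally.

```python
import re

def teradata_to_bigquery(sql: str) -> str:
    """Rewrite two common Teradata idioms into BigQuery-compatible SQL."""
    # Teradata accepts SEL as shorthand for SELECT; BigQuery does not.
    sql = re.sub(r"^\s*SEL\b", "SELECT", sql, flags=re.IGNORECASE)
    # Teradata's SELECT TOP n becomes a trailing LIMIT clause in BigQuery.
    match = re.match(r"(?is)^\s*SELECT\s+TOP\s+(\d+)\s+(.*)$", sql)
    if match:
        n, rest = match.groups()
        sql = f"SELECT {rest.rstrip().rstrip(';')} LIMIT {n}"
    return sql

print(teradata_to_bigquery("SEL TOP 10 * FROM sales;"))
# -> SELECT * FROM sales LIMIT 10
```

A production system has to handle this across the full SQL surface (data types, functions, stored procedures, session semantics), which is why mapping the more than 200,000 database objects in Adeo's warehouse was done automatically rather than by hand.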

Read More

