Sep 15, 2021 | Nikola Apostolov
Organizations planning their long-term digital transformation face several challenges. Whether the long-term goal is off-site data protection, distributed workflows, or hybrid workflows, they have to provision not only for storage and networking, including scalability, but also adopt an architecture that will let them remain current in the fast-evolving cloud services landscape and scale in the future without disruption. To put the rise of cloud-based services into perspective, each of the leading cloud providers currently offers more than 6,000 cloud-based services.
In this article we’ll look at the most important factors when designing a cloud infrastructure in relation to storage, services and workflows, and identify the key ingredients of any successful future-proof cloud adoption strategy.
Common concerns that organizations face when mapping out their cloud adoption strategy include a lack of dedicated IT staff, cost constraints, potential data loss, ineffective data management, evolving customer needs and more.
When looking at how to achieve that transformation, they face further challenges such as differing file formats, fear of disruption, architectural complexity and cost, and compliance requirements. While disruption and complexity are immediate problems that enterprises actively try to solve, proprietary file formats (i.e., being locked in by a particular vendor) and rising TCO are typically overlooked. Most early adopters who chose a vendor that locks in their data, or who built the wrong architecture, try to change course during the next renewal cycle, mostly because of cost constraints and lack of flexibility.
Many solutions on the market use proprietary formats that make direct connectivity to cloud-based services impossible. They also tie customers to their platforms, which makes cost control difficult. On top of that, migrating out of such solutions can take a long time or generate excessive migration and change management costs. Being dependent on such a solution, and thus unable to easily transition to another vendor without significant cost and complexity, is what is known as vendor lock-in.
Disruption & IT Complexity
The exponential growth of unstructured data requires enhanced data management, data protection, tiering and intelligent archiving. All of these can be addressed by existing cloud services today. However, replacing or modifying traditional applications that cannot use cloud storage, or fronting the cloud with gateways that expose NFS/SMB access, is a difficult and disruptive process while the data is in active use. Changing the UI and the workflow, including user access and administration policies, requires substantial change management and training resources, and it considerably increases the risk of losing data, a crucial factor during risk assessment. Choosing a less disruptive solution saves a great deal of change management and training effort and is crucial for adopting a hybrid workflow.
Cost of adoption
Big-data businesses that store their assets on on-premises storage and tape, or that plan to ingest large amounts of content into the cloud to extract more value from it, face steep costs. For example, an average hospital planning to modernize its pathology workflow would require tens of petabytes of storage just to scan and digitize 5-10 years' worth of archived slides, not to mention the new scans coming in every day. Such organizations cannot keep up with the increasing costs of on-premises storage infrastructure. The same applies to video surveillance, or even an ordinary set of corporate file servers. Gateway solutions that would let them move to the cloud typically cost many times as much as the cloud storage itself, while the real value they provide is a fraction of it.
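To put those numbers in perspective, here is a minimal back-of-the-envelope sketch in Python. The per-GB prices and capacity are illustrative assumptions only, not actual vendor or on-premises rates:

```python
# Illustrative monthly cost comparison for a large digitized archive.
# All prices below are hypothetical placeholders, not real rates.

def monthly_storage_cost(capacity_tb: float, price_per_gb_month: float) -> float:
    """Return the monthly cost of storing `capacity_tb` terabytes."""
    return capacity_tb * 1024 * price_per_gb_month

archive_tb = 10_000          # assumed ~10 PB of scanned pathology slides
on_prem_price = 0.02         # assumed all-in on-premises cost per GB/month
cloud_archive_price = 0.002  # assumed cloud archive-tier price per GB/month

on_prem = monthly_storage_cost(archive_tb, on_prem_price)
cloud = monthly_storage_cost(archive_tb, cloud_archive_price)
print(f"on-premises:  ${on_prem:,.0f}/month")
print(f"archive tier: ${cloud:,.0f}/month")
```

Even with generous assumptions for on-premises efficiency, the gap compounds month after month as the archive keeps growing.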
Various regulations require that sensitive data assets be managed, organized, and stored in a manner that protects against loss, corruption, theft, and misuse. Existing on-premises workflows that comply with these regulations must remain compliant when the workflow becomes hybrid. While most cloud vendors are compliant, many solutions that act as middleware are not, typically because they rely on proprietary hardware, formats, encryption, and protocols. As a result, organizations with regulated workflows look for solutions that will not affect their compliance and will get a green light from their data privacy officers; failing that review is considered a major setback in a hybrid project.
1. Ensure future compatibility – use open file formats
An open format involves storing data in a way that allows anyone to access it in the future.
Any solution that stores data in the cloud in an open format is considered non-proprietary. This means that if an organization stops using the solution, it should still be able to access its data without the solution's help. In some cases the data itself is stored in an open format, but businesses still have to rely on proprietary means to access the metadata, which is, in a manner of speaking, held hostage. Such solutions cannot be considered free of vendor lock-in.
Open formats are essential to the concept of future-proof architecture because they enable cloud-based services to read data directly. Universal access to data allows enterprises to use a single data repository for many use cases, including tapping into the new services that cloud vendors release every month. One good example is keeping inactive data in an archive tier. The archive tier costs up to five times less than the cool tier, and archive data typically does not require immediate access; at some point in the future, though, the data can be moved back to a cool tier where a service can process it, extracting additional value for the enterprise. All of this can happen without incurring additional costs or sacrificing compliance and resiliency, provided open file formats are used.
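As a minimal illustration of what "open format" means in practice, the sketch below keeps the payload as a plain file and writes its metadata to a human-readable JSON sidecar that any tool, not just the solution that wrote it, can consume. The file names and metadata fields are hypothetical:

```python
# Sketch of an open-format layout: payload stays a plain file, metadata
# lives in a JSON sidecar next to it. No proprietary reader required.
import json
import tempfile
from pathlib import Path

def write_with_sidecar(path: Path, payload: bytes, metadata: dict) -> None:
    path.write_bytes(payload)                        # data in its native format
    sidecar = path.with_name(path.name + ".json")    # e.g. scan_0001.tiff.json
    sidecar.write_text(json.dumps(metadata, indent=2))

def read_sidecar(path: Path) -> dict:
    """Any JSON-aware tool or cloud service can do the equivalent of this."""
    return json.loads(path.with_name(path.name + ".json").read_text())

# Usage (illustrative names, not a real workflow):
with tempfile.TemporaryDirectory() as tmp:
    doc = Path(tmp) / "scan_0001.tiff"
    write_with_sidecar(doc, b"...image bytes...", {"case": "anon-42", "tier": "archive"})
    assert read_sidecar(doc)["tier"] == "archive"
```

Because both files are in open formats, a cloud analytics or AI service can read them directly, with no vendor-specific layer in between.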
2. Prevent disruption to demanding on-premises workflows – do not change existing data structure and I/O patterns
While the benefits of cloud file storage are clear (scalability, interoperability and cost, to name but a few), standard applications cannot use the storage without a cloud tiering solution. Several types exist:
File-based hybrid tiering does not discard the existing infrastructure and keeps all legacy applications and functionality unchanged. It uses filter drivers or direct integration with the application. More importantly, it enhances the scalability, continuity, security and interoperability of the existing infrastructure and the related workflow. As a result, the cloud augments existing workflows rather than being an external network share that requires significant change in order to be used. Let's face it: even a single application in the workflow that cannot support a network share (or has limited functionality with one) can cause significant disruption.
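The stub-and-recall idea behind file-based tiering can be sketched in a few lines. This is a toy model, with a Python dict standing in for the cloud bucket; real products do the interception with filter drivers, as noted above, and all names here are illustrative:

```python
# Toy sketch of file-based hybrid tiering: cold files are replaced by
# tiny stubs pointing at a cloud object; reading a stub transparently
# recalls the data. A dict simulates the cloud bucket.
import tempfile
from pathlib import Path

STUB_PREFIX = b"STUB:"
cloud: dict[str, bytes] = {}   # object key -> bytes (simulated bucket)

def tier_out(path: Path) -> None:
    """Move the file's contents to 'cloud' storage, leaving a stub behind."""
    key = path.name
    cloud[key] = path.read_bytes()
    path.write_bytes(STUB_PREFIX + key.encode())

def read(path: Path) -> bytes:
    """Read a file; if it is a stub, recall and rehydrate it first."""
    data = path.read_bytes()
    if data.startswith(STUB_PREFIX):
        data = cloud[data[len(STUB_PREFIX):].decode()]
        path.write_bytes(data)   # rehydrate so later reads stay local
    return data

# Usage: the application sees the same path before and after tiering.
with tempfile.TemporaryDirectory() as tmp:
    f = Path(tmp) / "report.docx"
    f.write_bytes(b"quarterly numbers")
    tier_out(f)
    assert f.stat().st_size < 20              # only a stub remains locally
    assert read(f) == b"quarterly numbers"    # transparently recalled
```

The key property is that paths, names and access patterns are unchanged, which is why legacy applications keep working without modification.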
3. Go beyond storage – tap into cloud-based services
A key factor in digital transformation is the ongoing market shift from product-based to service-based models. Offerings such as self-driving vehicles, modern healthcare workflows, predictive maintenance, and many more will not be possible without the help of cloud-based cognitive artificial intelligence and machine learning services, which are rising in popularity.
The modernization capabilities of the cloud are not limited to file sharing, off-site data protection, migration, archiving and collaboration; those are just pieces of the puzzle. Each of the major cloud vendors offers around 6,000 services. The end game for any modern enterprise will be workflow augmentation using cloud services: everything from business intelligence and analytics to cognitive AI/ML will deliver a new class of smarter, faster and better services at scale. At the same time, those services will continue to be needed at the edge, in legacy workflows, for a long time. On top of that, services appear to be evolving faster than cloud-native applications are being developed, another reason to believe that edge infrastructure will not go away in the near future.
Tiger Bridge is a non-proprietary, software-only data management solution that blends on-premises and multi-tier cloud storage into a single space and enables hybrid workflows. This human-friendly and transparent file-based hybrid tiering solution enables millions of Windows server users to benefit from cloud scale and services, while securely preserving legacy applications and workflows.
Tiger Bridge enables businesses to meet major cloud adoption goals including:
– Disaster Recovery
– Backup & Archive
– Continuous Data Protection
– Remote Collaboration
– Multi-site Sync
– Storage Extension
– Hybrid Workflows
– Cloud Migration
– AI Integrated Workflows: Tiger Bridge can sync data with cloud AI services bi-directionally, so that the augmented metadata is available immediately without the need for third-party solutions.
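The bi-directional metadata sync mentioned in the last bullet can be pictured as a simple merge step, sketched below with a plain dict standing in for the AI service's response. The field names are hypothetical illustrations, not Tiger Bridge's actual schema:

```python
# Sketch: merge AI-derived fields into a file's local metadata record so
# downstream tools see the augmented metadata without a third-party broker.

def merge_ai_metadata(local: dict, ai_result: dict) -> dict:
    """Merge AI fields into local metadata; local fields win on conflict."""
    merged = dict(ai_result)
    merged.update(local)   # the service never overwrites authoritative local fields
    return merged

# Usage with illustrative fields:
local_meta = {"file": "scan_0001.tiff", "tier": "cool"}
ai_result = {"labels": ["tissue", "stained"], "confidence": 0.97}
merged = merge_ai_metadata(local_meta, ai_result)
assert merged["labels"] == ["tissue", "stained"]
assert merged["file"] == "scan_0001.tiff"
```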
When creating a long-term strategy for your organization, future-proof architecture is vital. Focusing only on imminent short-term goals can create difficulties later. Storing tens or hundreds of terabytes in the cloud may look affordable today, but doing it the wrong way may become a major obstacle in the future. When planning their future architecture, organizations must weigh all of these factors. Ensuring future interoperability by implementing the right strategy will reduce costs, prevent disruption and avoid unnecessary migration.
Want to learn more about Tiger Bridge? Check out our 5-minute overview video.