Modernize Your On-Premises SQL Server Infrastructure with Azure and Azure Data Studio

Data estates are becoming increasingly heterogeneous as data grows exponentially and spreads across data centers, edge devices, and multiple public clouds. Beyond the complexity of managing data across different environments, the lack of a unified view of assets, security, and governance presents an additional challenge.

Leveraging the cloud for your SQL infrastructure has many benefits: cost reduction, improved productivity, and accelerated insights and decision-making can make a measurable impact on an organization’s competitiveness, particularly in uncertain times. Meanwhile, infrastructure such as servers and networking is maintained by default by the cloud provider.

With SQL Server 2008 and 2012 reaching their end of support, it is advisable to upgrade them or migrate them to Azure cloud services. Modernizing any version of SQL Server to Azure brings many added benefits, including:

  • Azure PaaS provides a 99.99% availability SLA
  • Azure IaaS provides a 99.95% availability SLA
  • Extended Security Updates for SQL Server 2008 and 2012
  • Backing up SQL Server running in Azure VMs is made easy with Azure Backup, a stream-based, specialized solution. The solution aligns with Azure Backup’s long-term retention, zero-infrastructure backup, and central management features.
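To put those availability figures in perspective, here is a quick back-of-the-envelope calculation (a minimal illustrative sketch; only the SLA percentages above come from the source) showing the monthly downtime each SLA permits:

```python
# Approximate monthly downtime allowed under each availability SLA.
HOURS_PER_MONTH = 730  # average hours in a month

def max_monthly_downtime_minutes(sla_percent: float) -> float:
    """Return the maximum downtime per month (in minutes) permitted by an SLA."""
    return HOURS_PER_MONTH * 60 * (1 - sla_percent / 100)

for tier, sla in [("Azure SQL PaaS", 99.99), ("Azure IaaS", 99.95)]:
    print(f"{tier} at {sla}%: ~{max_monthly_downtime_minutes(sla):.0f} minutes/month")

# Output:
# Azure SQL PaaS at 99.99%: ~4 minutes/month
# Azure IaaS at 99.95%: ~22 minutes/month
```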

Tools leveraged

For modernizing the SQL infrastructure, SNP leveraged a variety of tools from Microsoft, such as the following.

  • The Azure Database Migration Service has long been used to modernize on-premises SQL Servers. Using this tool, you can migrate your data, schema, and objects from multiple sources to Azure at scale, while simplifying, guiding, and automating the process.
  • Azure Data Studio is one of the newest tools for modernizing SQL infrastructure through its Azure SQL Migration extension. It is designed for data professionals who run SQL Server and Azure databases on-premises and in multicloud environments.

Potential reference architecture diagram

Let’s take a closer look at the architecture: which components are involved and what Azure Data Studio does to migrate or modernize the on-premises SQL infrastructure.

Three components are involved in an Azure Data Studio migration or modernization: the source SQL Server (the on-premises instance to be modernized), the destination server (the Azure SQL VM or other Azure target to which the on-premises SQL Server will be moved), and the staging layer (a storage account or a network share folder) for the backup files. Backup files are a major component of the modernization.

Azure Data Studio and the Azure SQL Migration extension rely primarily on backup files: a full backup of each database plus the subsequent transaction log backups. Another important component is the staging layer, where the backup files are stored.

Azure Data Studio can use a network share folder, an Azure storage container, or an Azure file share. In each case, the backup files must be placed in a specific structure: as shown in the reference architecture, the backup files for each database must be placed in their own folder or container.
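As an illustration of that structure, the sketch below (with a hypothetical share path and database names) creates one folder per database on the staging share, which is the layout Azure Data Studio expects:

```python
from pathlib import Path

# Hypothetical staging share; each database gets its own folder,
# holding its full backup and subsequent transaction log backups.
staging = Path(r"\\fileserver\sqlbackups")  # illustrative network share path

for db in ["SalesDB", "HRDB"]:
    (staging / db).mkdir(parents=True, exist_ok=True)

# Resulting per-database layout:
# \\fileserver\sqlbackups\SalesDB\SalesDB_full.bak
# \\fileserver\sqlbackups\SalesDB\SalesDB_log_001.trn
# \\fileserver\sqlbackups\HRDB\HRDB_full.bak
# ...
```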

As part of the migration to Azure, Azure Data Studio with the Azure SQL Migration extension relies on the Azure Database Migration Service (DMS) as the core technology behind the scenes. DMS is integrated with Azure Data Factory, which runs a pipeline at regular intervals to copy the backup files from the on-premises network share to Azure and restore them on the target, or to restore them directly if they are already in storage containers.

When the backup files are in a network share folder, Azure Data Studio uses a self-hosted integration runtime to establish a connection between on-premises and Azure. Once the connection is established, Azure Data Studio begins the modernization process leveraging Azure DMS.

Initially, the full backup and all subsequent transaction log backup files for each database are placed in that database’s folder or container. If the backups are in a network share folder, Azure Data Studio copies them from there to an Azure storage container.

Azure Data Studio then restores them to the target Azure SQL VM or Azure SQL Managed Instance; if the backup files are already in a storage account, it restores them directly from there to the Azure target.
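For teams that stage backups in a storage account themselves, the upload step might look like the following minimal sketch, assuming the azure-storage-blob Python package; the connection string, container name, and paths are illustrative placeholders:

```python
from pathlib import Path
from azure.storage.blob import BlobServiceClient

# Connection string and names below are illustrative placeholders.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("salesdb")  # one container per database

local_dir = Path(r"\\fileserver\sqlbackups\SalesDB")
for backup in sorted(local_dir.glob("*.bak")) + sorted(local_dir.glob("*.trn")):
    with open(backup, "rb") as fh:
        # Upload each backup file; DMS restores them in sequence on the target.
        container.upload_blob(name=backup.name, data=fh, overwrite=True)
```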

Following the completion of the last log restore on the target Azure SQL database, we need to cut over the database and bring it online on the target. During restoration of the backup files, the databases remain in the Restoring state, which means they cannot be accessed until the cutover has been completed.
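A simple way to verify where a database stands in this process is to query its state on the target. The sketch below is illustrative, assuming the pyodbc package and hypothetical connection details (the cutover itself is triggered from Azure Data Studio/DMS, not from this script):

```python
import pyodbc

# Illustrative connection details for the target Azure SQL VM.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:target-sqlvm.example.com;UID=admin_user;PWD=<password>;"
    "TrustServerCertificate=yes"
)
cur = conn.cursor()

# While log backups are still being applied, state_desc is RESTORING;
# after cutover completes it flips to ONLINE and the database is usable.
cur.execute("SELECT name, state_desc FROM sys.databases WHERE name = ?", "SalesDB")
for name, state in cur.fetchall():
    print(name, state)
```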

Your next steps

If you like what you have read so far, let’s move forward together with confidence. We are here to help at every step. Contact SNP’s migration experts.

Microsoft Fabric: Meet your Organizational Goals for Effective Data Management and Analytics

In 2023 Microsoft announced major updates to its Azure data platform. While its Azure OpenAI Service has been dominating the headlines, questions about Fabric, Microsoft’s new, comprehensive analytics solution, have been just as central in our customer discussions.

Adopting a new data analytics platform is no easy feat, with concerns around skilling, tool integration, obsolescence, security, and so forth. The concerns and the path forward will vary depending on the organization’s circumstances, from those with a legacy SQL Server implementation on-premises to those with an advanced, cloud-native analytics deployment, and everywhere in between. That said, these four high-level questions should be top of mind for any organization evaluating Fabric’s potential:

  1. Should we consider exploring Microsoft Fabric given our existing data platform maturity and investment?
  2. Which Fabric capabilities would be the most suitable and beneficial for my organization?
  3. How can I preserve my current data platform investment while capitalizing on the advantages offered by Fabric?
  4. What steps are necessary to establish governance and cost management in this new platform?

Before we continue with our recommendations to address these questions, let’s take a few minutes to level-set on what Fabric is and is not.

The Microsoft Fabric Ecosystem

At its core, Fabric is a software-as-a-service (SaaS) offering that integrates three Microsoft products: Azure Data Factory, Power BI, and the Synapse data platform, forming an all-in-one suite designed for performance, scalability, and ease of use. Underlying Fabric are the OneLake unified data foundation and Microsoft Purview for persistent data governance.

Microsoft Fabric Ecosystem

Source: What is Microsoft Fabric – Microsoft Fabric | Microsoft Learn

Facets of the Microsoft Fabric Platform:

  • Data Engineering: Empowers data engineers to transform and democratize large-scale data through Apache Spark within a lakehouse.
  • Data Factory: Equips business and technical users with tools for data integration, offering over 300 transformations (including AI-based) via 200+ native connectors while managing data pipelines.
  • Data Science: Integrates with Azure Machine Learning, enabling data scientists to develop and deploy ML models and leverage SynapseML for scalable ML pipelines.
  • Data Warehouse: Natively stores data in the open Delta Lake format, separate from compute, to promote scalability and performance for analytics workloads (a short sketch of reading this format follows this list).
  • Real-Time Analytics: Facilitates querying and analysis of observational data, including real-time streams, unstructured/semi-structured data, and IoT data.
  • Power BI: An integral component of Fabric, providing data visualization integrated with Microsoft 365 apps.
  • Dataflows Gen2: A new generation of dataflows that accelerates authoring with a no-code/low-code experience.

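Because Fabric’s warehouse and lakehouse data lands in the open Delta Lake format (see the Data Warehouse facet above), it remains readable with standard Spark APIs. Below is a minimal, illustrative sketch as it might appear in a Fabric notebook, where `spark` is the session pre-provisioned by the runtime and the table and column names are hypothetical:

```python
# In a Fabric notebook the Spark session is pre-provisioned as `spark`.
# Table and column names here are hypothetical.
df = spark.read.format("delta").load("Tables/sales_orders")  # lakehouse-relative path
df.groupBy("region").count().show()

# The same Delta table is also addressable through Spark SQL:
spark.sql("SELECT region, COUNT(*) AS orders FROM sales_orders GROUP BY region").show()
```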
While Fabric is a SaaS offering, its capacity is still provisioned as a resource in an Azure subscription. As such, a landing zone with the prerequisite identity, networking, security, and governance must be in place.

While a Fabric analytics solution can be fully composed with Fabric tooling, within the Azure tenant an organization can integrate Fabric with Azure resources that sit outside Fabric, for instance Cosmos DB, Azure AI Services and Azure Monitor.

For more information about Fabric, please see the Resources section at the end of this blog.

Microsoft Fabric Evaluation Criteria

When working with our customers, SNP recommends evaluating the high-level success criteria below when contemplating Microsoft Fabric as your Data & Analytics platform:

  • Seamless Integration: Evaluate how Fabric, together with external tools, can achieve data sharing and workflow orchestration without disruption to your established ecosystem.
  • Improved Efficiency: The true value of any platform lies in its ability to simplify processes. Microsoft Fabric should reduce the time and effort required for data engineering, data science, and analytics tasks. Evaluate how migrating workloads to Fabric will increase efficiency and productivity across data engineering services and shorten the time to insight in BI apps.
  • Data Democratization: Microsoft Fabric empowers business users and data scientists alike, offering self-service access to data and analytics capabilities. Evaluate how this can help extend the utility of data throughout your organization.
  • Scalability: As your organization grows, so do your data needs. Evaluate Microsoft Fabric’s ability to scale effectively, accommodating increased workloads without compromising performance.
  • Cost Optimization: Financial considerations are paramount. Evaluate how Fabric can deliver cost reduction, optimized resource utilization, and improved cost management capabilities compared to your existing data platform architecture.
  • Enhanced Data Governance: Data governance is critical, especially in today’s regulatory environment. Evaluate how Fabric facilitates effective enforcement of data governance policies, ensures data quality, and maintains compliance standards.
  • Data Security: With data breaches an ever-present threat, success means strengthening data security and privacy, especially for sensitive or regulated data. Evaluate the security features of Fabric to cover these risks and implement tighter security policies.

Conclusion

In this blog, we’ve explored the evaluation criteria that can guide your organization’s adoption of Microsoft Fabric as your Data & Analytics platform. By keeping these criteria in mind, you can maximize the value of this platform alongside your existing investments, leading to more effective decision-making and a competitive edge in your industry.

Your Next Steps:

If you like what you’ve read so far, let’s move forward together with confidence. SNP recommends the approach below to start understanding, exploring, and evaluating Microsoft Fabric for your business. We’re here to help at every step. Contact SNP’s Data & AI experts here

SNP process

Modernize and Migrate your SQL Server Workloads with Azure

Modernizing SQL Server is just one of the many reasons a company might migrate its data. Other common reasons include mergers, hardware upgrades, or moving to the cloud. In most cases, however, data migrations are associated with downtime, data loss, operational disruptions, and compatibility problems.

With SNP Technologies Inc., these concerns are alleviated, and the migration process is simplified. We help businesses migrate complete workloads seamlessly through real-time, byte-level replication and orchestration. For enhanced agility with little to no downtime, we can migrate data and systems between physical, virtual and cloud-based platforms.

When it comes to the modernization of SQL Server, we can migrate and upgrade your workload simultaneously. Production sources can run older versions of SQL Server that are upgraded during the move; for example, SQL Server 2008, 2008 R2, and 2012 can be moved to newer versions of Windows and SQL Server, or to Azure.
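One detail worth noting when upgrading versions during a move: restored databases keep their old compatibility level until it is raised explicitly. The sketch below is a hedged illustration using pyodbc with hypothetical connection and database names, not part of SNP’s tooling:

```python
import pyodbc

# Illustrative connection to the upgraded target server.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:target-sqlvm.example.com;UID=admin_user;PWD=<password>;"
    "TrustServerCertificate=yes", autocommit=True
)
cur = conn.cursor()

# Databases migrated from SQL Server 2008/2012 keep their old compatibility level.
cur.execute("SELECT name, compatibility_level FROM sys.databases WHERE name = ?", "SalesDB")
print(cur.fetchone())

# Raise it once post-migration validation passes (150 = SQL Server 2019).
cur.execute("ALTER DATABASE [SalesDB] SET COMPATIBILITY_LEVEL = 150")
```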

 

Some key benefits of modernizing or migrating your SQL workloads include:

  • Built-in high availability and disaster recovery for Azure SQL PaaS, with a 99.99% availability SLA
  • Automatic backups for Azure SQL PaaS services
  • A 99.95% availability SLA for Azure IaaS
  • The ability to leverage Azure automatic backups or Azure Backup for SQL Server on Azure VMs

Listed below are the steps SNP follows to migrate an on-premises SQL Server to Azure PaaS or IaaS:

  • An assessment to determine the most appropriate target and its Azure sizing.
  • A performance assessment conducted before the migration to identify potential issues with the modernization.
  • A performance assessment conducted post-migration to determine whether there is any impact on performance.
  • Migration to the designated target.

 

As part of our modernization process, we leverage a variety of tools and services that Microsoft provides, described below.

Assessment with Azure Migrate to determine the most appropriate target and its Azure sizing:

Azure Migrate is an Azure service that uses the Azure SQL assessment to evaluate the customer’s on-premises SQL infrastructure. It analyzes all objects on the SQL Server against each target (Azure SQL Database, Azure SQL Managed Instance, or SQL Server on Azure VM) and calculates the recommended target and Azure size by considering performance parameters such as IOPS, CPU, memory, and cost. Following the assessment, SNP gets a clearer picture of what needs to be migrated, and the assessment report recommends the most appropriate migration path.

This assessment generates four types of reports:

  • Recommended type: compares all the available options and gives us the best fit. If the SQL Server is ready for all targets, it recommends the best fit considering factors such as performance and cost.
  • Recommendation of instances to Azure SQL MI: indicates whether the SQL Server is ready for Managed Instance. If it is, the report gives a recommended target size; if there are issues with SQL MI, it lists them along with corresponding recommendations.
  • Recommendation of instances to Azure SQL VM: assesses each instance individually and provides a suitable configuration specific to that instance.
  • Recommendation of servers to SQL Server on Azure VM: if the server is ready to move to SQL Server on Azure, the report gives the appropriate recommendation.

Our assessment checks for post-migration performance impacts with Microsoft’s Data Migration Assistant:

To prepare for modernizing our SQL infrastructure to Azure, we need to know which objects will be impacted post-migration so we can plan remediation steps. A second assessment is performed using Microsoft’s Data Migration Assistant (DMA) to identify all such objects. DMA categorizes them into five categories (four when the target is SQL Server on Azure VM, since migration blockers apply only to Azure SQL PaaS).

Some key factors considered at this stage include:

  1. Breaking changes: changes that will break the behavior of a particular object. Following a migration, we need to ensure that breaking changes are addressed.
  2. Behavior changes: changes that may impact query performance and should be addressed for optimal results.
  3. Informational issues: information we can use to identify issues that might affect the workload post-migration.
  4. Deprecated features: features that are deprecated in the target version.
  5. Migration blockers: objects that will block the migration; we must either remove them prior to migration or change them per the business requirements.

Note: Migration blockers are specific to the modernization of SQL Server to Azure SQL PaaS.
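One way to operationalize these categories is a simple triage map from DMA finding category to follow-up action. The mapping below is a hypothetical sketch of such a rule set, not DMA output:

```python
# Hypothetical triage rules for DMA assessment findings.
TRIAGE = {
    "BreakingChange":    "Fix required; remediate the object after migration",
    "BehaviorChange":    "Review; tune affected queries for optimal results",
    "Information":       "Log only; monitor post-migration",
    "DeprecatedFeature": "Plan a replacement before the feature is removed",
    "MigrationBlocker":  "Must be removed or reworked before migration (PaaS only)",
}

def triage(category: str) -> str:
    """Return the recommended follow-up action for a DMA finding category."""
    return TRIAGE.get(category, "Unknown category; review manually")

print(triage("MigrationBlocker"))
```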

 

Modernization using Azure Data Studio:

Once we have an Azure target, the Azure size, and the list of affected objects, we can move on to modernization, where we migrate the SQL infrastructure to the Azure target. In this phase, the SQL infrastructure is modernized using Azure Data Studio with the Azure SQL Migration extension, leveraging the Azure Database Migration Service (Azure DMS).

In Azure Data Studio, you can modernize the SQL Server infrastructure using native SQL backups (the most recent full backup plus the subsequent transaction log backups). In this method, backup files of the SQL Server databases are copied to the staging location and restored on the target. Azure Data Studio automates the backup-and-restore process; all we must do is place the backup files into a shared network folder or Azure storage container, and the tool recognizes the backups and restores them automatically.
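For illustration, the backups the tool consumes can be produced with native T-SQL BACKUP commands written to the per-database share folder. The sketch below is a minimal example using pyodbc, with hypothetical server, database, and path names:

```python
import pyodbc

# Illustrative connection to the on-premises source SQL Server.
# autocommit is required because BACKUP cannot run inside a transaction.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=onprem-sql01;"
    "Trusted_Connection=yes", autocommit=True
)
cur = conn.cursor()

share = r"\\fileserver\sqlbackups\SalesDB"  # per-database folder on the share

# Full backup first; WITH CHECKSUM lets the restore validate integrity.
cur.execute(f"BACKUP DATABASE [SalesDB] TO DISK = N'{share}\\SalesDB_full.bak' "
            "WITH CHECKSUM, COMPRESSION")

# Then periodic transaction log backups until cutover.
cur.execute(f"BACKUP LOG [SalesDB] TO DISK = N'{share}\\SalesDB_log_001.trn' "
            "WITH CHECKSUM, COMPRESSION")
```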

Post Migration:

Upon completion of the modernization, all issues with objects impacted by the modernization should be resolved for optimal performance. DMA provides information about all impacted objects and offers recommendations on how to address them.

Your Next Steps:

If you like what you’ve read so far, let’s move forward together with confidence. We’re here to help at every step. Contact SNP’s migration experts here