This article is intended for administrators wishing to use SvSAN as the storage repository for their backups.
Information
Any storage device can be used as a repository by applications such as Veeam Backup and Replication, and SvSAN is no different; its value proposition here is synchronously mirrored storage.
We typically think of SvSAN as the storage for our production workloads; however, it can also be attached via iSCSI directly to a Windows server (bare metal or virtual) and used with Veeam as a backup repository.
Having a software-defined storage platform such as SvSAN serve up some space for these backups brings benefits in terms of data resiliency.
The SvSAN storage will likely be mirrored, giving two “copies” of this backup data, and it can also be cached in memory and on SSD:
https://support.stormagic.com/hc/en-gb/articles/5887719016989-SvSAN-Caching
as well as encrypted:
https://support.stormagic.com/hc/en-gb/articles/5978263848861-SvSAN-Encryption
It is also flexible enough to be tailored to the use case, with hardware selected for optimal performance vs. cost:
https://support.stormagic.com/hc/en-gb/articles/5986612738717-SvSAN-Performance
Having reliable storage from which to restore in the event of a catastrophe is as important as backing up in the first place. Gone are the days of offsite tape backups and rotating disks taken home by staff; in their place are live backup repositories with fully functional connectivity, able to replace the core infrastructure in full or in part with just a few clicks and no physical movement of data.
With this in mind, what differences are there between these use cases, and what settings/features should be considered when troubleshooting problems or optimizing efficiency?
SvSAN was tested and validated as the storage for a backup solution, with SvSAN acting as both the source and the target of the investigation.
StorMagic recently applied for and passed the Veeam Ready alliance partnership, which consisted of running a suite of tests against SvSAN storage in a controlled way.
The tests ranged from a simple backup of VMs to a full synthetic backup – built by reconstructing data from existing backups rather than re-copying data already in the backup repository – alongside a simple restore and an instant recovery, where the VM actually runs from the backup repository on the original host it was running on. Together these test network connectivity as well as the suitability of the backup storage itself to support these features in the real world.
Veeam Ready Test Setup
Veeam test infrastructure setup
Hardware:
2x ThinkSystem SR850 servers
- Intel(R) Xeon(R) Gold 6142 CPU @ 2.60GHz
- 768GB RAM
- Broadcom ThinkSystem RAID 930-16i 4GB Flash PCIe 12Gb Adapter
- ThinkSystem 1610-4P NVMe Switch Adapter
--2x 222GB RAID1 – VMware and StorMagic
--2x NVMe Datacenter SSD [3DNAND] ME 2.5" U.2 (P4610) – Source Datastore and Target Repository
2x 10GbE management connection (172.20.70.0/24)
2x 82599 10 Gigabit Dual Port Network Connection (direct attached) (192.168.70.0/24)
RAID and performance settings
The submitted results quoted throughout this document were produced using 2x NVMe drives in RAID0, providing a 2.9TB volume on each server. This was due to drive availability and the fact that the test required a minimum of 1.6TB.
Software:
VMware ESXi 7.0.3
VMware vCSA 7.0.3
StorMagic SvSAN 6.3
Veeam Backup and Replication 12 (Windows Server 2022, 16 cores, 32GB RAM)
SvSAN Storage configuration
Server 1: 2x 1.45TB NVMe presented to SvSAN via Raw Device Mapping, then combined in software RAID0 and shared as a simple target to both hosts as the “VM Store” VMFS datastore.
Server 1 hosts all the guest VMs included in the test, including the managing vCenter and a Linux jump VM created to access the system.
Server 2: 2x 1.45TB NVMe presented to SvSAN via Raw Device Mapping, then combined in software RAID0 and presented as a simple target to the Veeam Windows server as “VeeamRepository – V:\”.
Server 2 hosts the Veeam Backup and Replication Windows VM, which runs on the SvSAN shared datastore from Server 1.
VeeamRepository Target on svsan-b
This is a simple target presented only to Windows (note the initiator that has been added).
Target mounted in Windows iSCSI initiator
The Windows VM, with a virtual NIC on the same vSwitch as the pair of VSAs used for iSCSI, was able to mount the target over a 10GbE connection using a single path.
Disk initialised, and a new simple volume created, ReFS 64K allocation size
The Veeam test conditions stipulated a ReFS-formatted disk with a 64K allocation unit size, as above. Veeam uses a 1MB block size before compression; with typical 2x compression, each block written to the repository is 512KB or less. Aligning this with the filesystem, and in turn the storage subsystem, can have a significant impact on write performance.
For example, the 64KB allocation unit size used here means Veeam requires 8 IOs per 512KB block (8 x 64KB), a 2x improvement over a 32KB allocation unit size, which would require 16 IOs for the same 512KB block (16 x 32KB).
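As a rough illustration of this arithmetic (a sketch only, not part of the Veeam Ready test; it assumes Veeam's default 1MB block size and an illustrative 2x compression ratio), the following Python snippet shows how the allocation unit size affects the number of IOs per written block:

import math

# Illustrative assumptions only: 1MB Veeam block before compression, 2x compression ratio.
VEEAM_BLOCK_BYTES = 1024 * 1024
ASSUMED_COMPRESSION_RATIO = 2  # real-world ratios vary with the data being backed up

def ios_per_block(allocation_unit_kb):
    """IOs needed to write one compressed Veeam block for a given allocation unit size."""
    compressed_block = VEEAM_BLOCK_BYTES // ASSUMED_COMPRESSION_RATIO  # ~512KB
    return math.ceil(compressed_block / (allocation_unit_kb * 1024))

for au_kb in (4, 32, 64):
    print(f"{au_kb}KB allocation unit -> {ios_per_block(au_kb)} IOs per 512KB block")

# Output: 4KB -> 128 IOs, 32KB -> 16 IOs, 64KB -> 8 IOs per block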
Veeam file system settings
Once created and formatted, the volume can be selected as the backup repository; in this case V:\ within Windows is used for testing.
Use this newly created volume as a backup repository
On Server 1, the storage is presented more traditionally as a simple target to both hosts using the same underlying disk configuration.
Simple target from svsan-a on Server 1
Datastore mounted on both hosts
Veeam Ready – Repository
The Veeam Ready partnership we have applied for involves testing the software storage solution to ensure it meets the standards required to be considered a “Veeam Ready – Repository”.
Veeam Ready alliance partnership categories
There are other categories, as shown above; for more information, see: https://www.veeam.com/alliance-partner-technical-programs.html?page=1
For Veeam 12, the testing is done via an automated script, but it essentially runs the same suite of tests as the Veeam 11 validation:
1. Main backup and incremental jobs
2. Restore Job
3. Instant Recovery
4. Synthetic Full Backup
5. NAS NFS Backup Full and Incremental
6. NFS Restore
Script output samples: