Author name: Managecast Technologies

General Cloud Backup

Considering a Low Cost Cloud Backup Solution?

Ouch, Carbonite is not having a good day. I see some people choose these low-cost cloud backup providers without realizing they are not the same as enterprise-class backup providers like Managecast. It would seem you get what you pay for: Carbonite has been forced to reset user passwords after a password reuse attack. Interested in learning how Managecast can help your business with its cloud backup and disaster recovery solutions? Fill out this form for more information!

General Cloud Backup

Top 5 Cloud Backup Myths

As a cloud backup managed service provider, we often encounter common myths about cloud backup. Here, we aim to dispel some of the most pervasive, and incorrect, perceptions:

Myth 1: Cloud Backup Isn't Secure

One of the biggest concerns people have about cloud backup is the security and privacy of their data, especially with constant news of data breaches. Ironically, the right cloud backup solution can make your data much more secure than traditional backup methods. Often, when we hear someone express concerns about cloud security, it becomes clear their existing backup solution is far less secure. Traditional, customer-managed backup systems struggle to get data offsite quickly and securely, and they often don't follow best practices around media rotation, encryption, and data protection. Security is a top priority for cloud backup service providers, with stringent protocols to ensure data protection. For instance, we offer highly encrypted services (AES 256-bit, FIPS 140-2 certified encryption), with clients maintaining control of the encryption key. This means we do NOT store your encryption key, ensuring we don't have access to your data unless you request it.

Summary: Cloud backup providers use cutting-edge security measures to ensure your data is protected, far beyond traditional backup methods.

Myth 2: Restoring Data from the Cloud Takes Too Long

It's true that restoring massive amounts of data from the cloud could take time, but most enterprise cloud backup solutions also store data locally. In fact, 99% of recoveries are made from local storage at LAN speed, so restoring from the cloud is rarely needed. In the rare case of a site-wide disaster where local backups are compromised, most business-class cloud backup providers can ship your data on portable, fully encrypted media within 24-48 hours. Some providers can even spin up recovered servers in the cloud for fast recovery.

Summary: Restoring data from the cloud is rarely necessary, and cloud backup providers offer quick, alternative recovery methods.

Myth 3: Too Much Data to Back Up

Many people believe they have too much data to back up in the cloud. With modern cloud backup solutions, this is rarely an issue. Traditional backup systems often rely on full backups, which are time-consuming and impractical to push over an internet connection. Cloud systems instead use an "Incremental Forever" approach: the initial full backup is performed only once, and after that only incremental backups, transferring just the changed data at the block level, are made, significantly reducing the amount of data being sent. The initial full backup is typically performed to mobile media (like a USB drive) and shipped, encrypted, to the data center, avoiding a large data transfer over the internet. As a rule of thumb: for every 1TB of protected data, you need about 1 T-1 (1.54 Mbps) of bandwidth, so a 20Mbps internet connection can support roughly a 12TB environment (a worked example follows at the end of this post).

Summary: Cloud backup solutions efficiently manage large amounts of data through incremental backups, making data volume rarely a concern.

Myth 4: Incremental Forever Means Hundreds of Restores

A common misconception about "Incremental Forever" backups is that restoring data will require restoring hundreds of small backups. This is far from the truth. Modern incremental backup software is designed to assemble your data automatically at any point in time, allowing you to restore to any moment with just a few clicks in a single operation.

Summary: Restoring with incremental backups is quick and straightforward: just one operation to restore data to any point in time.

Myth 5: Cloud Backup Is Too Expensive

Nothing is more costly than losing your business-critical data. Our solution is priced on the size of your backups, not the number of devices or servers being backed up. Plus, data is deduplicated and compressed, reducing overall storage costs. Older, archived data can also be stored at a lower cost, helping you align backup costs with the value of your data. In many cases, we can reduce costs by moving older data to lower-cost storage tiers. Additionally, you're getting expert management, monitoring, and support services from your cloud provider. Without a managed service, backups often go unmonitored, untested, and unrestored. With us, you receive full expert support and monitoring, ensuring your data is safe, at a much lower cost than doing it all in-house.

Summary: Cloud backup costs are justified when you consider the security, management, and peace of mind that come with a managed solution.
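To make the Myth 3 rule of thumb concrete, here is a minimal sketch of the arithmetic. The 2% daily change rate and 24-hour transfer window are illustrative assumptions (not measurements from any particular environment), and the helper names are ours:

```python
# Back-of-the-envelope version of the "~1 T-1 (1.54 Mbps) per TB" rule of thumb.
# Illustrative assumptions: about 2% of protected data changes each day and the
# block-level incremental is spread across a full 24-hour day.

def mbps_per_tb(daily_change_rate: float = 0.02, window_hours: float = 24.0) -> float:
    """Bandwidth (Mbps) needed per TB to push one day's changed blocks offsite."""
    changed_bits = 1e12 * daily_change_rate * 8          # bits changed per TB per day
    return changed_bits / (window_hours * 3600) / 1e6    # megabits per second

def supported_tb(link_mbps: float) -> float:
    """How many TB a link can protect under the same assumptions."""
    return link_mbps / mbps_per_tb()

print(f"~{mbps_per_tb():.2f} Mbps needed per TB")          # ~1.85 Mbps, same ballpark as the rule of thumb
print(f"a 20 Mbps link supports ~{supported_tb(20):.0f} TB")  # ~11 TB, close to the 12TB figure quoted above
```

Different change rates or backup windows shift the numbers, which is why the T-1-per-TB figure is a rule of thumb rather than a hard limit.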

Veeam

Managecast is Now a Proud Member of the Veeam Cloud Connect Partner Program

Managecast is a featured partner on Veeam's list of service providers offering Veeam Cloud Connect services. Veeam Cloud Connect enables you to quickly and efficiently get your Veeam backups offsite, safe and secure, so you can always recover your data no matter what! Our services powered by Veeam allow for fast, secure offsite backup and cloud-based disaster recovery. Managecast is offering 30-day, no-obligation free trials, and enabling your existing Veeam installation could not be easier: get your existing Veeam backups offsite using the familiar Veeam management console. Managecast can also provide updated Veeam software and licensing if required. Our cloud-based disaster recovery and offsite backup powered by Veeam can now be easily used to provide offsite disaster recovery capabilities for your organization. Contact us for a free trial.

General Cloud Backup

Tape is Not Dead, and Why I Finally Bought a Tape Library

Being the "Cloud Backup Guy" I've made a living off replacing tape. Tape is that legacy media, right? It's true that for most small to medium businesses, tape is hard to manage, expensive to rotate offsite, and has virtually been replaced by disk-to-disk (or disk-to-disk-to-cloud) technologies. However, I am finally willing to say tape definitely has its place. Given that I have been so anti-tape for many years, I thought it was news worth sharing when I finally decided that tape had its place.

Don't get me wrong. I've had nearly 30 years of IT consulting experience. In the old days I used nothing but tape, as it was the only real option for data protection. I've also had my share of bad experiences with tape (mostly the old 4mm and 8mm drives and tapes). I hated the stuff and never wanted to rely on it. Like many seasoned IT professionals, I have nightmares to tell about tape backup. When I got into the cloud backup business, the passion I had for disliking tape really helped me convince folks not to use it.

I still think that for most SMBs tape is dead. However, as your data volume grows, and I am talking 50TB+ of data, you cannot ignore the efficiency and cost effectiveness of good old tape. Tape has also come a long, long way over the years. Gone are the days of 4mm and 8mm DAT tapes. LTO, the clear tape standard of the modern era, has reached LTO-7, with a native capacity of 6TB (15TB compressed) per tape cartridge. LTO offers a reliable and cost-effective way to store huge quantities of data at a much lower cost than disk storage technology.

What brought about this decision to finally embrace tape? The decision became apparent as we were gobbling up more and more disk space for cloud backups. Our growth rate has been significant, and trying to keep up with backup growth meant buying more and more disk. It's not just the cost of the disk we had to buy, but the rack space, power, cooling, and other costs associated with hundreds of spinning disks, plus the cost of replicating the data to another data center with more spinning disks! A significant segment of our backup storage was consumed by long-term archival storage of older data, which continued to grow rapidly as data aged. Our cloud backup solution allows tiering of the data so that older, less frequently used data can be pushed to longer-term archival storage. Once I faced the choice of buying even more disk versus the cost of a tape solution to store the ever-growing mountain of archive data, it became a no-brainer. Tape was the clear winner in that scenario.

Allow me to stress that I am not a proponent of tape except for the largest of companies, or for those who require long-term archive of a large amount of data. It still introduces manual labor to swap and store tapes, take them offsite, and so on. For near- and medium-term data, we still keep everything stored on disk for quick and easy access. However, for long-term archival data, we are using tape and love the stuff. The nice thing is that our customers still don't have to worry about using tape, as we manage everything for them.

Asigra

The Requested Operation Could Not be Completed Due to a File System Limitation (Asigra)

While trying to back up an Exchange database using Asigra, we were seeing the message "The requested operation could not be completed due to a file system limitation" after about 4 hours of backing up. This was an Exchange database backup (non-VSS), and it was copying the database to the DS-Client buffer. The Exchange database was 1TB+. The DS-Client was running on Windows 8.1.

The message:

The requested operation could not be completed due to a file system limitation (d:\buffer\buf\366\1\Microsoft Information Store\database1\database1.edb)

Solution: There is a default limitation in NTFS that affects very large, heavily fragmented files. We had to reformat the buffer drive on the DS-Client using the command:

format d: /fs:ntfs /L /Q

(The /L switch formats the volume with large file record segments, and /Q performs a quick format.) After making this change we no longer experienced the error message and backups completed successfully.
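If you want to confirm whether a buffer volume already uses large file record segments before or after reformatting, fsutil reports this as "Bytes Per FileRecord Segment". The snippet below is a minimal sketch that wraps that check; it assumes a Windows host with Python installed, and the helper name is ours, not part of Asigra:

```python
# Minimal sketch: check whether an NTFS volume (e.g. the DS-Client buffer
# drive) was formatted with large file record segments (the /L switch).
import subprocess

def bytes_per_file_record_segment(volume: str) -> int:
    """Parse 'Bytes Per FileRecord Segment' from fsutil fsinfo ntfsinfo output."""
    output = subprocess.run(
        ["fsutil", "fsinfo", "ntfsinfo", volume],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in output.splitlines():
        if "Bytes Per FileRecord Segment" in line:
            return int(line.split(":")[1].strip())
    raise RuntimeError(f"Could not read file record segment size for {volume}")

if __name__ == "__main__":
    size = bytes_per_file_record_segment("d:")
    # 1024 is the NTFS default; 4096 indicates the volume was formatted with /L.
    print(f"d: uses {size}-byte file record segments")
```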

General Cloud Backup

Why You Need a Backup/Disaster Recovery MSP

From the one-person office to the largest enterprise, every business has valuable information that needs to be managed, backed up, and securely stored. Today, more companies, large and small, are turning to Managed Service Providers (MSPs) to handle their backup and disaster recovery (DR) needs. This raises an important question for businesses of all sizes: Why do YOU need a backup and disaster recovery MSP?

The answer is simple: Expertise. Experts agree that disaster recovery planning requires complex preparation and flawless execution. This level of execution may be difficult, if not impossible, for businesses to achieve on their own without the support of a skilled MSP. An MSP ensures that DR plans are fully prepared and continuously monitored, helping to minimize recovery time objectives (RTO) and keep your business running smoothly. We all know that disasters, whether natural or man-made, can happen at any time. It's the MSP's job to ensure companies are prepared and that their data is protected. In today's digital world, protecting your company's data is a critical task.

For businesses without an MSP, backups and disaster recovery are often neglected. These processes tend to become part-time tasks within a company, and there's rarely someone dedicated solely to managing backup and recovery. Regardless of the company's size, full-time monitoring and planning are essential for effective backup and disaster recovery. Without an MSP, backups can easily be sidelined by other projects. Often, companies assign data backup to someone with little experience, sometimes "the new guy or girl." This leads to a lack of consistent monitoring, which puts the company at serious risk of data loss. A backup administrator without the right expertise may struggle to keep backups efficient and secure. Additionally, system restores are rarely tested without the guidance of an MSP, which leaves companies vulnerable in a real disaster.

How Managecast Solves These Problems: At Managecast Technologies, we provide comprehensive backup and disaster recovery solutions for businesses of all sizes. Using industry-leading software from trusted partners like Asigra, Veeam, and Zerto, we ensure that your data is monitored, protected, and stored securely. Our team brings the expertise needed to execute thorough DR planning and business continuity strategies. Instead of risking your company's critical data, let the experts at Managecast Technologies handle it. We'll manage everything from setup and retention policies to backup schedules, ensuring your data is protected in the most cost-effective way possible.

General Cloud Backup

Is Backup Tape Dead?

I just had someone contact me and ask my opinion on whether backup tape is dead. Maybe 6 years ago I would have enthusiastically said "Yes!", and did so many times. However, after spending the last 6 years dedicated to cloud backup and immersed in the backup industry, my views on tape have evolved. Instead of asking "Is tape dead?", the better question is "Has the use of tape changed?". Tape is far from dead and very much alive, but its use has substantially changed over the past 10 to 15 years. In the past, tape was the go-to medium for backups of all types. However, disk has certainly displaced a lot of tape when it comes to nearline storage of recently created backup data. Many modern backup environments consist of disk-to-disk backup, with backup data written to tape after some period of time for longer-term storage and archive. Disk storage costs significantly more than tape storage, but for near-term backup data the advantages of disk outweigh the cost penalty. For long-term archive of older data, where quick access is not needed, tape is the clear winner. [Read about aligning the cost of data protection vs the value of the data] In my experience, many SMBs have shifted to a disk-to-disk-to-cloud solution with no tape, so in the SMB space one could argue that tape has largely died (or at least diminished greatly). However, at the enterprise level, or for organizations that require long-term retention of backup data, there is no better alternative than storing large amounts of data on tape, and this will probably remain the case for the next 10 years or beyond. So, no, tape is not dead, but its use has changed. Interested in learning how Managecast can help your business with its cloud backup and disaster recovery solutions? Fill out this form for more information!

Asigra

Asigra Reporting “Cannot Allocate Memory” During Seed Import

We have DS-Systems running on Linux; we connect the Windows seed backups to a Windows 7/8.1 machine and then use CIFS to mount the Windows share on Linux. The command we use on Linux to mount the Windows share is:

mount -t cifs //<ipaddress of windows machine>/<sharename> -o username=administrator,password=xxxxxx /mnt/seed

We were importing some large backup sets with millions of files and started noticing "cannot allocate memory" errors during the seed import process. When the import completed, it would indicate that not all files were imported. At first we thought this was an Asigra issue, but after much troubleshooting we found it was an issue with the Windows machine we were using, related to using the CIFS protocol with Linux. A sample link to the issue we were seeing is: http://linuxtecsun.blogspot.ca/2014/12/cifs-failed-to-allocate-memory.html

That link indicates to make the following changes on the Windows machine via regedit:

HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management\LargeSystemCache (set to 1)
HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters\Size (set to 3)

Alternatively, start Command Prompt in Admin Mode and execute the following:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v "LargeSystemCache" /t REG_DWORD /d 1 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v "Size" /t REG_DWORD /d 3 /f

Do one of the following for the settings to take effect: restart Windows, restart the Server service via services.msc, or from the Command Prompt run 'net stop lanmanserver' and 'net start lanmanserver' (the Server service may automatically restart after stopping it).

After we made these changes the memory errors were resolved!
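As a quick way to confirm both values actually took effect, here is a minimal sketch that reads them back with Python's built-in winreg module. Run it on the Windows machine hosting the share, not the Linux DS-System; the helper name is ours:

```python
# Minimal sketch: read back the two registry values recommended above on the
# Windows machine sharing the seed data. winreg ships with Python on Windows.
import winreg

def read_dword(path: str, name: str) -> int:
    """Return a REG_DWORD value from HKEY_LOCAL_MACHINE."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        value, _type = winreg.QueryValueEx(key, name)
        return value

large_system_cache = read_dword(
    r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management",
    "LargeSystemCache",
)
lanman_size = read_dword(
    r"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters",
    "Size",
)
print(f"LargeSystemCache = {large_system_cache} (expected 1)")
print(f"LanmanServer Size = {lanman_size} (expected 3)")
```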

Asigra

Asigra Slow Seed Import

We recently discovered that Asigra DS-System v13.0.0.5 seems to have a serious problem with importing seed backups. This problem exposed itself as we attempted to import 5.5TB of seed data. We then performed additional testing by backing up a small Windows 2008 server. The seed backup was a little under 3GB. On v13.0.0.5 the seed import took 55 minutes. On the same infrastructure, the same server seed backup imported into a v12.2.1 DS-System in less than 3 minutes. In addition, we are also seeing the error "cannot allocate memory" during the seed import process even though we have tons of free RAM and disk space. We have notified Asigra and they are attempting to reproduce the problem.

Update 12/4/2015: In testing, and working with Asigra, we have found that if you create the seed backup without using the metadata encryption option, the seed import speed is acceptable and imports quickly.

Update 12/8/2015: Asigra released DS-System v13.0.0.10 to address this issue. Testing shows it does indeed solve the speed issue. Thanks Asigra!

Asigra

Asigra BLM Archiving – Align the Value of Your Data With the Cost to Protect it

Years ago, we treated all data as being equal. Every piece of data originated on one type of storage and remained there until it was deleted. However, we now understand that not all data is created equal. Some data types are more important or accessed more frequently than others. Backup Lifecycle Management (BLM) is a concept that helps organizations manage data more efficiently by storing it on one system initially, then migrating it to lower-cost storage systems as it ages. This strategic data management approach can reduce storage costs while ensuring critical data remains accessible.

Understanding Asigra Backup Tiers

Data Classification and Storage Tiers

DS-System – Business-Critical Operational Data

Business-critical data such as files, databases, and email systems necessary for daily operations should reside in the DS-System tier. This tier is optimized for speed and accessibility, ensuring that your mission-critical information is always available.

BLM Archiver – Policy-Based Retention for Aging Data

Large file servers or repositories containing older data can be migrated to BLM Archiver. The primary advantage is cost savings, as this system automatically moves older data into lower-cost storage tiers based on pre-configured retention policies. At Managecast, we help analyze your data to identify the optimal protection methods suited to your recovery needs and budget. There are many strategies to protect your business's data by aligning its value with the costs to protect it.

BLM Cloud Storage – Low-Cost, Long-Term Data Storage

BLM Cloud Storage is a cost-effective solution for rarely retrieved files, typically those older than one year. Large data sets, ranging from 250GB to multiple terabytes, can be moved to long-term storage to ensure compliance and maintain records while reducing storage expenses.

Storage Solutions for Rarely Accessed Data

Older data can be grouped into long-term cloud storage, making retrieval simple when necessary. Customers can choose between Amazon S3 Cloud Storage or Managecast Enterprise Cloud Storage for scalable, secure storage.
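To illustrate the tiering idea (this is not Asigra's actual policy engine), here is a minimal sketch of an age-based classification like the one described above. The 90-day and one-year thresholds and the helper names are illustrative assumptions; real BLM retention policies are tuned per client to match recovery needs and budget:

```python
# Illustrative sketch of age-based backup tiering, mirroring the tiers above.
from datetime import date, timedelta
from typing import Optional

TIERS = [
    (timedelta(days=90),  "DS-System (business-critical, operational data)"),
    (timedelta(days=365), "BLM Archiver (aging data, policy-based retention)"),
]
LONG_TERM = "BLM Cloud Storage (rarely retrieved, long-term)"

def assign_tier(last_accessed: date, today: Optional[date] = None) -> str:
    """Pick a storage tier based on the age of the data."""
    age = (today or date.today()) - last_accessed
    for threshold, tier in TIERS:
        if age <= threshold:
            return tier
    return LONG_TERM

if __name__ == "__main__":
    print(assign_tier(date.today() - timedelta(days=30)))   # DS-System
    print(assign_tier(date.today() - timedelta(days=200)))  # BLM Archiver
    print(assign_tier(date.today() - timedelta(days=800)))  # BLM Cloud Storage
```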
