I would still suggest that you work with those users, though. Maybe have a brainstorming session with them to explain the issue and then work out a standard way of working. You could even look at something like SharePoint so that you can keep the metadata in a proper system rather than in the filename.
What finally got them to understand was when they tried to save a document at the end of one of these paths and it broke, or when they tried to open one and it kept giving them an error. They have to see the error of their ways on their own. Now I have them preaching to other users about the dangers of long file names and absurd symbols in file and folder names. It may take explaining to them that there are no backups if they do it wrong. Either they change, or they live with the consequences of ignoring the rules.
Maybe even put it in terms they understand. IT Director of a law firm here -- I feel your pain!!! We don't yet do cloud backups; instead we use Synology DiskStations as backup repositories in Veeam. As far as accessing or working with long file paths goes, I wish there were a magic bullet, as we deal with it here too, but so far I simply have not found one.
They then rename a parent folder to have a longer name, and suddenly they can't access, delete, or otherwise interact with files they already saved. We have also used this for quick backups to an external disk or other quick hodgepodge backups.
No folder path issues there, and Veeam restores work well. Consider Veeam for cloud backups, as it supports many cloud targets, assuming you use the paid version. If you are not a large firm, Veeam Backup Essentials is priced very reasonably and works just like their full version, apart from a socket or VM-count limit.
Let's say you have a server named server1. I wish there were a magic wand. We've been dealing with this for a while, and if anyone here knows of a full fix, please let us know! EDIT: I re-read your original post and see it is a small group with no servers. It may seem rudimentary, but robocopy scripts can work great for this purpose, as long as you can "map" or "subst" the cloud drive; see the sketch below. It might be the simplest and cheapest fix.
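As a rough illustration of what such a script could look like, here is a minimal sketch that wraps subst and robocopy. The drive letter, folder paths, and switch choices are assumptions for illustration, not anything prescribed in this thread; robocopy's documented behaviour (exit codes of 8 or higher meaning failures, /LOG+ appending to a log) is the only thing relied on.

```python
import subprocess

# Hypothetical paths -- adjust to your environment.
CLOUD_SYNC_DIR = r"C:\Users\office\CloudDrive"   # locally synced cloud folder
SHORT_DRIVE = "B:"                                # short alias to keep destination paths short
SOURCE_DIR = r"C:\ClientFiles"                    # data to back up
LOG_FILE = r"C:\Logs\robocopy_backup.log"

def run(cmd):
    """Run a command and return its exit code without raising."""
    return subprocess.run(cmd, shell=False).returncode

# Map the cloud folder to a short drive letter so the destination path stays short.
run(["subst", SHORT_DRIVE, CLOUD_SYNC_DIR])

# Mirror the source into the mapped drive; /E copies subfolders, /R and /W limit retries,
# /LOG+ appends to a log file we can actually check afterwards.
rc = run([
    "robocopy", SOURCE_DIR, SHORT_DRIVE + "\\ClientFiles",
    "/E", "/R:2", "/W:5", "/LOG+:" + LOG_FILE,
])

# Robocopy exit codes below 8 mean success (possibly with extras); 8 or above means failures.
if rc >= 8:
    print(f"robocopy reported failures (exit code {rc}), check {LOG_FILE}")
else:
    print("copy completed, exit code", rc)

# Remove the temporary drive mapping again.
run(["subst", SHORT_DRIVE, "/D"])
```

Scheduling something like this with Task Scheduler keeps it hands-off, and the log file gives you something to check rather than trusting a silent copy.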
This is, unfortunately, a management problem, and not one that can be fully solved from a technology standpoint. Management needs to be made aware that current practices undermine or remove the ability to back up and restore critical business data. It's that, or you'll need some form of document management system that abstracts the underlying file structure away from users. Either way, your users will have to learn something new. One of these options is free, the other is not. However, whichever way you go, you will need management buy-in in order to change user behaviour.
What you really need is a document management system where the metadata currently being encoded in the path and file name is stored alongside a simpler path and filename, or with no visible paths at all. It also helps during discovery, or simply when searching for files. Many DMSs actually house the files in a relatively flat structure, and categorization, classification, and chronology all become metadata. This brings several benefits; a rough sketch of the idea is below.
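To make the "flat storage plus metadata" idea concrete, here is a minimal sketch. The field names, storage layout, and example values are assumptions for illustration, not the schema of any particular DMS; the point is only that client, matter, and date become queryable metadata while the file itself lives under a short, opaque path.

```python
import uuid
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record layout: everything users used to encode in folder and
# file names becomes searchable metadata, while the file sits at a flat path.
@dataclass
class DocumentRecord:
    doc_id: str
    stored_path: str            # e.g. a short, opaque path in a flat store
    client: str
    matter: str
    doc_type: str
    doc_date: date
    tags: list[str] = field(default_factory=list)

def ingest(original_name: str, client: str, matter: str,
           doc_type: str, when: date) -> DocumentRecord:
    """Assign a flat storage path and capture the old naming habits as metadata."""
    doc_id = uuid.uuid4().hex
    return DocumentRecord(
        doc_id=doc_id,
        stored_path=rf"D:\store\{doc_id}.pdf",
        client=client,
        matter=matter,
        doc_type=doc_type,
        doc_date=when,
        tags=[original_name],   # keep the old descriptive name as a searchable tag
    )

# Example: the stored path never grows, no matter how descriptive the metadata gets.
rec = ingest("Smith v Jones - Motion to Dismiss - FINAL v3.pdf",
             client="Smith", matter="2024-017", doc_type="Motion",
             when=date(2024, 3, 5))
print(rec.stored_path, rec.client, rec.matter)
```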
I have a client that uses PaperPort!! It was NOT designed for this. They've complained about the software's performance for years, and after MANY discussions, tests, and experiments they have finally realized that PaperPort is the problem they have to fix.
They have yet to fix it. I appreciate the time. In a larger firm I'd definitely have more resources to throw at this, but where we are for a lot of my solo-practitioner clients is a PC with Veeam Agent for Windows backing up to a local external drive, plus consumer cloud backup software.
I've thought seriously about just uploading the Veeam files to Backblaze, or about trying to convince them to buy a Synology just for backup and letting the NAS handle the uploads to B2. I'm still thinking about it. I sent an email to the worst offender today, and we'll see what the response is.
I'm exploring options with the users. Good luck! What is the goal? Backup systems may have prerequisites, and if you don't meet those prerequisites, the product will not work for you. What's the bigger killer? Can Symantec support restoration successfully? The OS is Windows Server. What I think happens is that some software components can read and write at deeper depths while other components can't, which makes it possible to create data at a depth that is not officially supported by the operating system, and therefore not by us either.
It also means that some programs will be able to access those deep paths and some won't. I can remember, back before I worked at Symantec (so probably on NT4 or a Windows operating system of that vintage), someone copied a directory structure that was already quite deep into another directory structure that was itself very deep. Even though the copy worked, we had all sorts of random issues with the result.
With regards to Backup Exec, we officially support what the operating system officially supports. Anything else might work, but it might equally give you problems, and you would have to test both backups and restores yourself.
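Since the practical question is usually "which of our existing paths are already past what the OS officially supports", a quick scan like the sketch below can flag offenders before a backup or restore trips over them. The classic Windows MAX_PATH limit of 260 characters is documented; the root folder and reporting format here are assumptions for illustration.

```python
import os

# MAX_PATH for the classic Windows APIs is 260 characters; paths at or beyond
# that limit are exactly the ones backup and restore jobs tend to choke on.
MAX_PATH = 260
ROOT = r"C:\SharedDocs"   # hypothetical root to scan -- point this at your data share

def find_long_paths(root: str, limit: int = MAX_PATH):
    """Yield (length, path) for every file whose full path meets or exceeds the limit."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if len(full) >= limit:
                yield len(full), full

if __name__ == "__main__":
    offenders = sorted(find_long_paths(ROOT), reverse=True)
    for length, path in offenders:
        print(f"{length:4d}  {path}")
    print(f"{len(offenders)} paths at or over {MAX_PATH} characters")
```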
Here you can specify which files you want copied and how many revisions you would like to keep. For example, you could set up BackupChain to keep just one copy of each file, as shown above. However, you could also keep more than one copy. This type of retention is unique to BackupChain and can be used in various different ways.
There are quite a few solutions you can craft with this server backup software, and particularly with the screen above; regardless of whether you use it on Windows 10 or Windows Server R2, the features for file server backup are the same. Simple answer: Windows Explorer is quite lacking, even on Windows Server. On Windows 10 they added even more fancy graphics, but the underlying issues are still the same.
Copy commands may fail, and Explorer might simply close the window without even telling you it failed! It does the same when it finishes, so what happened, did it succeed? Also, when things fail and it does display a message, it may sit there waiting forever for an answer. What we really want is a copy job that is reliable; a rough sketch of that idea follows below.
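As a generic illustration (not how any particular product works), the sketch below shows a copy pass that never fails silently: every file either succeeds or lands in an error list you can read afterwards. The source and destination paths are placeholders.

```python
import os
import shutil

# Hypothetical source and destination -- adjust to your environment.
SRC = r"C:\Data"
DST = r"D:\Backup\Data"

errors = []
copied = 0
for dirpath, _dirnames, filenames in os.walk(SRC):
    rel = os.path.relpath(dirpath, SRC)
    target_dir = os.path.join(DST, rel)
    os.makedirs(target_dir, exist_ok=True)
    for name in filenames:
        src_file = os.path.join(dirpath, name)
        try:
            shutil.copy2(src_file, os.path.join(target_dir, name))
            copied += 1
        except OSError as exc:
            # Record the failure instead of swallowing it.
            errors.append((src_file, str(exc)))

print(f"copied {copied} files, {len(errors)} failures")
for path, reason in errors:
    print("FAILED:", path, "->", reason)
```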