Before me sits a window containing what is supposed to be my third and final piece on managing user data. Getting here has taken weeks of research, several test environments, seemingly endless conversations and debates with co-workers, and an awful lot of reflection on the decade I’ve spent trying to solve this problem.
While I can thankfully say that the end of this process has left me feeling less lost than the beginning, clouds of doubt still hang over my head. When I consider the problems of dealing with user data in the networks I will be personally responsible for over the coming years, I fear that the advancements in user data management simply haven’t kept pace with the real world.
During my career in IT, I have found that almost everyone who gets involved with planning tends to bring unneeded complexity to the table. Ideas get bashed around and plans become more and more complex until finally someone has to stop the process and ask: “What is the bare minimum required to make this work, what do we currently have, and where’s the gap?” Start small, work your way up, keep it redundant and reliable, and don’t add a layer of complexity unless you absolutely have to.
The problem is that these user data questions aren’t purely a technological problem. When I stare at the whiteboard and try to perform the same trick here, I realise that this is where some “blue sky thinking” would really come in handy.
Turning my usual thinking process on its head, I’m going to look at the most complicated situation I can realistically envision in my environment, and work backwards from there. (I generally prefer to avoid this approach because reality has a nasty habit of eventually coming up with something more difficult and complex than I could possibly have foreseen.)
My nightmare scenario involves a particularly grumpy and impatient remote user with whom, if something is going to go wrong, it will go wrong first. This user is above menial and trivial things like saving files in a designated location (such as an H:\ drive), instead choosing whatever default location the program presents. The user has terrible clickitis: not only will they never read all the way through an IT policy document (or even one of our emails), but they click “yes” to the first thing that pops up on the screen. They are “that user” who is simply impatient that the computer is “getting in their way”.
This user lives in another province, hundreds of kilometres from any IT staff, on the end of an ADSL connection that I am pretty sure has a length of connectivity provided by two cans and a stretched-out piece of old liquorice. The user travels to the weirdest parts of the continent, ending up behind firewalls whose rulesets were devised by soulless sociopaths as some means of aggravating systems administrators from a distance.
The user visits company locations on a semi-regular basis and complains loudly if their “user experience” from home isn’t as good as that of the in-office staff. This user sometimes has a requirement for real-time collaboration with their co-workers. This is something I have only figured out how to provide by offering up a virtual machine inside the corporate network to remote users; sometimes you need to be “on-net” to get certain things done.
Since this user happens to be incomprehensibly in the CEO’s favour, they can and will ignore any and every policy IT creates. Just for fun, I have to do it all with the software and hardware I’ve already got.
If you’ve been keeping up with my previous articles, you’ve probably guessed that the answer to this riddle will involve folder redirection and roaming profiles. The real trick, as I mentioned before, is how to get a user who never shuts down or logs off their laptop to synchronise the files back to the server.
There are things I can do to trigger synchronisation whenever I feel like it, but I still need more granular control. The computer needs to be able to tell when it is on-net or behind a reasonably high-speed VPN connection, where regular synchronisation is a great idea. It also needs to be able to tell when the connection is so slow that synchronisation would be a very bad plan.
Enter the “slow link” group policy settings. I won’t bore you with details Google can provide for you, but suffice it to say that a “slow link” is determined by the speed of the connection between your computer and its domain controller. It’s a configurable setting that, since it is a GPO, can be set per user, per computer, or per group. Offline files and folders (and by extension folder redirection) can be configured not to synchronise if a slow link is detected.
Dispensing with the theory, what this means in the real world is that with the right scripting and some clever GPOs I can force the laptop to check whether or not it has a “slow link” every time it attaches to a new network. If the link passes muster, then synchronisation is triggered and the user’s files are backed up to the server.
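The decision that scripting has to make can be sketched quite simply. The sketch below is illustrative Python, not the actual GPO mechanism: the threshold default of 500 Kbps matches Group Policy’s slow-link detection default, but the measured speed is assumed to come from some real probe of the connection to the domain controller, which isn’t shown here.

```python
# Illustrative sketch of the slow-link decision driving synchronisation.
# 500 Kbps is Group Policy's default slow-link threshold; the GPO makes
# it configurable per user, per computer, or per group.

DEFAULT_SLOW_LINK_KBPS = 500.0

def is_slow_link(measured_kbps: float,
                 threshold_kbps: float = DEFAULT_SLOW_LINK_KBPS) -> bool:
    """A link counts as 'slow' when the measured speed to the domain
    controller falls below the configured threshold."""
    return measured_kbps < threshold_kbps

def should_synchronise(measured_kbps: float,
                       threshold_kbps: float = DEFAULT_SLOW_LINK_KBPS) -> bool:
    """Only trigger offline-files synchronisation over a fast link."""
    return not is_slow_link(measured_kbps, threshold_kbps)
```

Run on every network change: if `should_synchronise` comes back true, kick off the sync and the user’s files end up safely on the server without them ever noticing.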
Let’s extend this a little and throw in computers that are to be run as demo units at trade shows. My luck is so amazing that my previously discussed nightmare user is often the individual in charge of getting the demo computers up and running at a trade show. Not all trade shows have decent internet available; many have some form of Wi-Fi, but it can occasionally be like trying to suck oatmeal through a straw.
There are two kinds of demos we can do, based on the presence or absence of reasonable internet. As you have surmised, the onsite personnel expect that their training in how to cope with a lack of internet will boil down to “it will work exactly the same as if there were internet.” The dilemma is that “I have internet” versus “I don’t have internet” isn’t in this case something I solve with the replication abilities of offline files and folders. We prefer doing our demos with access to the internet because it allows us to get the most up-to-date datasets, imagery and other things that make our demo good. If we don’t have access to the internet, it would require not just offline copies of certain files, but a completely different profile.
I can’t rely on the onsite staff to have the time and training to make judgement calls about the quality of the internet connection, so I turn to super mandatory profiles. In my previous article, I explained that mandatory profiles were essentially “read only” profiles.
Super mandatory profiles are identical, with the exception that they will simply refuse to log on if a slow link is detected. This is exactly what I need to solve my problem. I can create a profile for these demo systems that contains all the links, folder redirections and so forth that depend on stable, reasonably fast internet. I can also ensure that if that internet connectivity isn’t available, it will simply refuse to log on. If it does refuse to log on, the onsite staff can use an alternate logon that will contain links to local datasets and folders.
It’s certainly not a perfect solution to my problems; I’m going to have to add some scripting and scheduled tasks to the mix to have the system continually poll for a “slow link” during the day. If one is detected, it will have to pop up a warning to the on-site staff letting them know it’s time to log off the super-mandatory user and use the local one. Still, this is a solution that can solve a fairly unique problem using nothing more than what is already built into Active Directory. An honourable mention needs to be made here regarding what I believe is the most important tool in the Windows system administrator’s arsenal: resultant set of policy (RSOP.msc).
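The polling piece of that scheduled task boils down to a simple loop. This is a minimal sketch, assuming a hypothetical bandwidth probe and warning mechanism (both injected as functions here; in production they would wrap a real measurement against the domain controller and something like a message box for the on-site staff).

```python
import time

SLOW_LINK_KBPS = 500.0        # illustrative threshold, matching the GPO default
POLL_INTERVAL_SECONDS = 300   # re-check the link every five minutes

def poll_link(measure_kbps, warn, poll_once=False):
    """Periodically measure the link and warn on-site staff when it drops
    below the threshold, so they know to log off the super-mandatory user
    and switch to the local demo account.

    measure_kbps and warn are injected callables purely so the logic is
    testable; poll_once exercises a single iteration of the loop."""
    while True:
        if measure_kbps() < SLOW_LINK_KBPS:
            warn("Slow link detected: log off the super-mandatory user "
                 "and switch to the local demo account.")
        if poll_once:
            break
        time.sleep(POLL_INTERVAL_SECONDS)
```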
Active Directory offers plenty of ways to organise your user and computer objects. You can place them in various types of groups or within organisational units. These can of course be nested, with the end result that a given computer or user object could have many layers of group policies applied to it. Active Directory isn’t exactly designed for the uninitiated, and the rules regarding inheritance and precedence of group policy objects can make understanding what is actually going to be applied to your user confusing.
RSOP.msc will show you exactly which policies are being applied to your user. More importantly, it will tell you which GPO “won” and ended up being the one to exert its policy. If you open an individual GPO setting, RSOP.msc is even kind enough to explain the precedence results that led to that GPO being declared the winner. There isn’t time or space here to discuss it at length, but suffice it to say that if you are working on an even moderately complex user data management scenario, RSOP.msc will end up being your best friend.
As I bring my set of user data management articles to a close I find myself reflecting once more on the overall state of this most stressful aspect of systems administration. In the past ten years, we’ve come a long way in what’s possible.
Still, I don’t have to look beyond the boundaries of my own organisation to find scenarios that can’t be resolved within the “accepted” paradigms of user data management as determined by Microsoft. The latest iterations of Windows have solved many bugs that plagued me for the past decade, yet this isn’t the year 2000 anymore.
Microsoft can’t act like it is the only real player out there. People are going to start using Macs, iPads, Google Whatever, cloud storage, and yes, even Linux.
While Microsoft has made minor inroads into interoperating with others, I am faced with the realisation that they aren’t at all prepared for a mixed-environment world. Even when using technologies and operating systems designed by Microsoft, interoperating only with Microsoft, I find myself having to resolve certain issues with scripts and scheduled tasks. Given that we don’t use anything particularly exotic in our environment, as a systems administrator looking into the future this leaves me with some very serious reservations.
My server environment is already a mixture of RHEL, Microsoft and VMware. The applications I must support span decades, and the user devices that will be accessing these applications and storing local data have a potential variability that frightens me.
The company I work for has less than one hundred employees, and yet even as the systems administrator of such a small organisation I am coming face to face with these realities. User data management is hard. It’s hard even within a Microsoft monoculture. Faced with the reality that very few people will have the luxury of software monoculture even in the near future I am certain only of one thing: It’s about to get a lot harder. ®