Hello! I'm Defron and this is my blog.

Monday, September 18, 2017

My Operating System Journey

My current primary desktop: KDE Neon LTS

It's interesting to think back on how my sentiments about what an OS should or shouldn't be have changed over the years. My current opinion is "Every OS sucks, it's a matter of choosing the one that sucks the least for you", which for me is currently KDE Neon LTS, but more on that later.

I'm relatively young. The first computer I ever got was an old hand-me-down Compaq that's probably still collecting dust in my parents' garage. I was around 11 at the time, and it came with Windows 95 on it. My dad had a copy of Windows 98SE available for me to upgrade it, so upgrade it I did! It was cool having a computer in my room, though when I wanted internet, I had to run a 150 ft ethernet cable from my computer all the way to the other side of the house to plug it in (after a few months of doing this, I finally convinced my dad to properly install ethernet for me). Those first couple of years were pretty uneventful. I just did normal computer stuff that anyone my age would do (Pokémon, Neopets, etc.) but from the comfort of my own room rather than fighting over the family PC. The operating system didn't matter much.

Computers came and went. I used Windows XP, and then for Christmas one year I got my own brand-new computer running... Windows Vista. Honestly, I should have been much more grateful than I was. The computer was actually pretty decent and properly specced to run Windows Vista (underpowered hardware being the biggest problem with the OS at the time). However, since it was a new computer (meaning it didn't come with much software) and I was a poor high school student, it led me into the wonderful world of open source via OpenOffice.org, and later Linux (downgrading to XP wasn't an option for me, so that just left Linux). The first Linux distro I tried was Ubuntu 7.04. It worked, and things were good, but I never fully switched. For me, at the time, Windows Vista did what I needed with the least amount of hassle.

Somewhere along the way I was given a hand-me-down laptop that had all sorts of problems with Windows XP on it and no recovery disks. I put Lubuntu on it and was pretty happy; it served me well for a while. As I continued in my education I came across a need to use MS Access, so I got a new laptop with Windows 7. I put Kubuntu 10.04 on it and enjoyed a dual-boot. While using this laptop, I was pretty evenly split between Windows 7 and Kubuntu, probably because I made booting into Windows 7 so painful in the name of security: I used TrueCrypt with the boot image stored on a flash drive, so I could only boot into Windows with the flash drive plugged in.

Then I built my own desktop... and I felt no need to put any Linux distro on it. I continued to play with Linux distros like Lubuntu, Arch, and Fedora (and various Windows Server OSes) in virtual machines for the next few years, but I felt no reason to switch to Linux as my main OS. Windows 7 did what I needed. It was the beginning of my pragmatic approach to OS choice: I built the PC specifically for gaming, so why would I use Linux? At the time, Windows simply did what I needed better.

Somewhere along the way I decided I needed a NAS and decided to build my own. I had already used both Linux and Windows server OSes in my professional work at this point, and knew instantly that I wanted it to run Ubuntu.

Then came Windows 8. Oh, how I hated Windows 8. The Start screen and deep integrations with Microsoft's cloud were incredibly off-putting. For those reasons, I never upgraded to Windows 8 or Windows 8.1. However, I was still into gaming, and I thought Microsoft might straighten things out by the next release.

Then I saw Windows 10. I instantly knew there was no future for my personal computing in Microsoft's OS. I have no interest in Microsoft's cloud or having my information collected by yet another company. I started a clock; I wouldn't be using Windows as my primary OS after Windows 7's End of Life.

In the end, I beat that timeline easily. I no longer play games all that much, and those I do play work well enough in Wine. I finished finding alternatives for software I need and regularly use earlier this year and switched to Linux full time. In my next PC build I plan on buying components that will allow me to do VGA passthru, but it's more of a "just in case I want to".

For the distro, I settled on KDE Neon LTS because I quite like the KDE stack and Qt framework, and I quite like the Ubuntu base. Is it bug-free? No, but I've yet to use an OS that is. It does what I want best, though: easy access to the software I use, automated patching of security updates (plus a few other specific core packages like Google Chrome), and good documentation with a decently sized online community that has some buy-in from the corporate software world (I don't live free of proprietary software). Fedora and Debian, while fine OSes, just don't quite meet all my needs without too many additional hoops. Arch's bleeding edge was initially appealing, but I care more about staying up and staying secure than I do about having the latest version of nmap. And while I know some people love the AUR, I personally love automated security updates much more than I liked the AUR. None of that is meant to bag on your OS; they are all fine Linux distros, just not the right one for me.
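For the curious, the automated security patching I lean on comes from Ubuntu's unattended-upgrades mechanism, which Neon inherits. A minimal sketch of the relevant config (the values here are illustrative, not necessarily my exact setup; out of the box unattended-upgrades pulls only from the security pocket):

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

Extra repos (like Google Chrome's) can be whitelisted in /etc/apt/apt.conf.d/50unattended-upgrades if you want them auto-updated too.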

Oh, and yes, I've used macOS too, both as a Hackintosh and on official hardware. I personally was not a fan of the UI, especially the UI of Finder. The fact that many default system packages are really old (like Bash) and the general philosophy of Apple were also big negatives for me. Whenever I use it, I only feel productive in a terminal, and even then the experience is just meh.

So I guess I could be classified as an Ubuntu fanboy if you want. Until something else checks all my boxes and gives me a good reason to switch, I plan on staying with KDE Neon LTS.

Wednesday, September 13, 2017

Life After Crashplan (AKA My New Backup Plan)

If you use Crashplan, you, like me, should have gotten an email around August 22nd announcing the end of the service. Crashplan was a pretty decent service that did what I wanted (offsite backups) without too much hassle at a price I was willing to pay. It wasn't without issues (the buggy, memory-hungry Java client being the biggest), but I felt I was getting my money's worth out of it. Crashplan Home, unfortunately, is no more. Once your current subscription runs out, that's the end of it, which for me is this upcoming November.

So, if you are like me, you need a new backup plan, or at least to modify your existing one. Below are my requirements for a backup solution:

  • Robust file history
  • Relatively set-and-forget
  • Optional encryption
  • Off-site backup
  • Headless server support
  • Linux support
Crashplan checked all those boxes. It had one of the nicest file history options, allowing me to choose which version of a file to restore from (almost) any point in the past. It ran as a background process all the time and emailed me the status of my backups regularly (though it is always good practice to periodically verify your backups with a test restore to make sure everything is working correctly). The more effort a backup takes to do and maintain, the less likely you are to do it. For encryption, it provided a plethora of options, letting it be as user-friendly or as secure as you could hope for. The paid Home service came with an unlimited cloud backup option to satisfy your off-site needs as well. Finally, while the default setup isn't headless, it could be made headless with a little effort, and Linux support was better than the major competition's in that it actually existed. Of course, none of that matters now that it's going the way of the dodo. So my search for a replacement began.

For local backups and remote backups via SSH, Borg is the most full-featured backup tool I found. I am sorry for any Windows users out there, but it's not really officially supported on that platform. Almost everything I own runs some Linux distro, including my NAS, which is where I store most of my files. Borg can be slow, but then again so was Crashplan if you didn't have enough RAM.
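For a concrete picture, here's roughly what a nightly Borg job looks like (a sketch only: the repo location and source path are made up, and it assumes the repository has already been created with `borg init`):

```shell
# Write a minimal nightly backup script (paths and host are hypothetical).
cat > backup-nas.sh <<'EOF'
#!/bin/sh
REPO=backupuser@remotehost:/srv/borg/nas
# Archive today's data; Borg deduplicates against previous archives.
borg create --stats --compression lz4 "$REPO::{hostname}-{now}" /srv/data
# Thin out old archives while keeping a sensible version history.
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6 "$REPO"
EOF
chmod +x backup-nas.sh
```

Dropped into cron, that covers the set-and-forget and file-history requirements in one go.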

The problem is, at this time, I don't have a suitable remote server I can SSH into for remote backups using Borg. Based on my current usage, and planning for the future, I decided 5TB is the sweet spot, and I'd be willing to pay up to $150/yr for it. If I had a family member somewhere sufficiently far from me with fast enough Internet, that would be my first choice. Unfortunately, most of my family lives in the same area as me and is exposed to the same environmental risks, so that doesn't make for a good option.

As mentioned above, most of the big names don't support Linux for backups. If you are on a consumer Windows version or macOS (so headless and Linux support don't matter), then Backblaze can meet your needs (the one caveat on robust file history being that it doesn't keep deleted files past 30 days, something to be aware of). Their Linux "offering" is B2, which would cost me over $300/yr, plus additional charges for file restoration. No thanks. So if you are on Linux like me, what are your options for a cloud backup?

I've narrowed it down to three options that fit within my $150/yr price range:
  • Crashplan Small Business: This one is mentioned in the Crashplan email. Existing users can switch over their existing plan to small business for free and keep data intact (if less than 5TB). After your current subscription expires, it will be discounted significantly for one year, and then finally will be $10/mo/device after that. On paper it seems like a pretty good deal (if you just need to back up one NAS like me), but after being burned by Crashplan on their Home plan, I do not consider this viable in the long-term.
  • Google Drive Unlimited through GSuite Business + rclone: If you have GSuite Business already (which I do), it's just an additional $10/mo/user to get unlimited Google Drive space added to it. The catch? According to the terms, you need 5 users to get unlimited space, otherwise you are limited to 1TB. While currently Google does not enforce this limit, they could at any time. As such, at the current time, I do not consider this viable in the long-term either.
  • hubiC + rclone: For around $60/yr hubiC offers 10TB of storage, which is more than enough for my needs. Like Google Drive, on Linux you will need a third-party tool, of which there are a few; rclone is the one that looks the most promising to me. The catch? They throttle, hard. 10 Mbit/s is the max speed you will see, so uploading a large amount of data is going to take some time.
So which one will I be going with? For now, Crashplan Small Business until my subscription runs out; then I plan on switching to hubiC + rclone. Rclone lets you keep old versions of files and offers encryption, and hubiC offers enough storage for my needs. While the throttling is annoying, I never saw super-great upload speeds through Crashplan either.
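The hubiC plan would look something like this with rclone (a sketch: "hubic-crypt" is an assumed crypt remote configured on top of the hubiC remote via `rclone config`, and the paths are made up):

```shell
# Write a minimal off-site sync script (remote name and paths are hypothetical).
cat > offsite-backup.sh <<'EOF'
#!/bin/sh
# "hubic-crypt" is an assumed rclone crypt remote wrapping the hubiC remote,
# so everything lands in the cloud encrypted.
# --backup-dir keeps old versions of changed/deleted files instead of discarding them.
# --bwlimit 1.25M (bytes/s) stays under hubiC's ~10 Mbit/s throttle.
rclone sync /srv/data hubic-crypt:nas-backup \
  --backup-dir "hubic-crypt:nas-old/$(date +%Y-%m-%d)" \
  --bwlimit 1.25M
EOF
chmod +x offsite-backup.sh
```

That gets encryption, versioning, and off-site storage out of what is otherwise a plain file-sync tool.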

In the future, if Google Drive for GSuite Business raises the limit for fewer than 5 users to 5TB, I will definitely switch to that. I will also continue to look for a good option for an off-site "server" (most likely just something like a Raspberry Pi with two mirrored 6TB drives) that I can set up with SSH access and 5TB of storage.

Tuesday, January 28, 2014

Data Privacy Day: 2014 Edition

It's once again January 28th, which means it's once again Data Privacy Day! This is a great day to read up on privacy matters, check your privacy settings on various websites, see what web and mobile apps are doing with your data, etc. Below are various relevant links, along with my own articles from last year's Data Privacy Day.
  • MyPermissions has shortcuts to the permission and privacy settings for many popular web apps. They also have a mobile app for alerting you of when new apps are getting info from your account.
  • It's a great day to visit the Electronic Frontier Foundation and read up on the latest privacy concerns and what you can do to protect yourself. The last year saw huge violations of privacy by various government and non-government organizations. The EFF is all of our friends in fighting for our privacy.
  • Android 4.3 users had the option to more easily review app permissions and selectively allow them via App Ops. Unfortunately, if you're running 4.4.2 you can't any longer (hopefully the feature gets re-added in the near future).
  • Google released an official Device Manager for Android that allows you to locate, lock, and wipe your lost phone. Fewer third-party apps are always a good thing, and it's free. Losing your phone is a huge potential privacy problem.
  • Facebook privacy settings are always changing, and Facebook is still a big part of many people's online presence. Lifehacker has a great writeup on Facebook privacy.
  • A comprehensive guide to passwords by me.
  • Most of us have a smartphone, and it usually holds a lot of personal info. Keep it safe.
  • Keeping things private on the web is no easy feat, but here's my guide on it.
  • Keep private information secure by encrypting it.
  • If your network is compromised, everything you do on it is potentially compromised too. Here's how to keep safe on networks.
Remember to spread the word about Data Privacy Day. Awareness is step #1, and the more people who are aware of Data Privacy Day, the bigger a success it can be and the better off everyone will be.

Friday, January 10, 2014

What's So Good About A Tablet?

Aka: I Got A Nexus 10

At the beginning of this year, I bought myself a Nexus 10, specifically the 32 GB model. I have long been an advocate of the tablet, but had yet to get one. Why? It wasn't that I didn't want one, but rather that I was waiting on the wrong company. I kept waiting for Asus to get their act together in terms of quality assurance, but with the botched release of the Transformer TF701, I finally realized they weren't going to. As such, after the holiday season, I went out and got the Nexus 10, and have been quite happy with it since then.

Tablets Are Better Than Laptops

This has basically been the foundation of my argument since tablets first started getting popular. For the vast majority of tasks, tablets outperform laptops by a significant margin. Tablets generally have better battery life. They are also lighter and more portable, and for standard daily tasks I'd even argue they are faster than your average laptop. The Nexus 10 has the unique feature of front-facing speakers, meaning that sound clarity and quality on it outshine not only most other tablets but most laptops too.

The Internet Has Made Everything Equal

XKCD covered this a little over two years ago. It's pretty true, especially for most of us most of the time. Email, communication, even productivity and office suites all exist in our web browsers these days. Needing a specific application from a specific platform is getting rarer and rarer for most people. Obviously this isn't the case for everyone, but it is for many. And I think many of us technology people have another option: remote desktop and administration.

It's actually because of remote access that I think I am able to enjoy my Nexus 10 as much as I do. Just this last October, Microsoft released an official RDP app for both Android and iOS. There have been many RDP apps before it (I personally have used Jump Desktop, which I still like and which offers some advantages over the official one), but what makes the official app unique is that it supports RD Gateways and RD Web Access. On the Linux front, NoMachine is readying a release of NX for Android and iOS; currently there is an alpha build available for download for Android users, and I am looking forward to trying it out soon. Of course, there are also the standard-fare remote desktop apps, but I'm a DIYer and generally untrusting of the cloud. Through any of these, you can access not only your desktop's programs but also its raw power, from wherever you need it.

10 Inches is Big Enough

OK, bigger is better, but not when it comes to portability. This is actually why I don't think I could ever use a laptop as my main computing device: 15'' or 17'' isn't comfortable to me. For Android, however, the full-screen-only style of tablet apps combined with the app switcher makes 10 inches work, and on the go it works quite well. I have relatively small hands, yet with auto-correct I have been pretty comfortably typing out this entire blog post on my tablet. I plan on getting a Bluetooth keyboard in the long run, mainly to make SSHing a bit more comfortable, but the on-screen keyboard works quite well once I enable the PC-style layout. I still do my heavy lifting on my desktop, but when I want to be lazy, I can just use my tablet and read or surf the web without getting up.

Obviously this isn't for everyone. The hardcore on-the-go gamer won't be able to do it, but most others can. A designer type can remote in to much more powerful equipment to do multimedia editing on the go, and those who need Windows software everywhere can get a Surface Pro. That's what's so good about tablets: they're better than laptops for most people's needs, and I am quite enjoying mine. If you have been on the fence about a tablet, or are still lugging a laptop everywhere you go, I say give a tablet a shot as your go-to mobile device and set up some sort of remote access to your desktop.

Thursday, June 20, 2013

Proper Computer Infection Triage

It's been a looong time since I last cleaned up an infection. I do my best to keep them from happening on my computers that run Windows, as well as those I manage. Of course, eventually something gets through. Yesterday (my day off, since I work Saturdays), I got a call from the office about a computer acting weird. It turned out to be the System Care Antivirus rogue software. According to file timestamps, infection occurred around 1:34 PM. I was notified approximately 10 minutes later.

As with a stroke, I believe fast action is important, and the mnemonic FAST still applies, albeit a bit differently, since here it relates to triage instead of identification:
  • F[ull stop]: Once you think you're infected, don't do anything with the computer.
  • A[lert (me)]: I need to know ASAP.
  • S[hutdown]: I'll do it if I'm there; otherwise, get that computer turned off ASAP (usually a hard poweroff; I'd rather have a single PC damaged than risk malware spreading over the LAN).
  • T[ime]: This one stays the same. Time is important. The longer a virus has to act, the more extensive the damage, and the less likely infection removal will be the right course of action. Files may be irreversibly damaged without a reinstall, and you just never know.
As hinted at in that last one: I'm a strong believer in disk imaging. Clonezilla is awesome; RedoBackup is great for those who want a GUI. I haven't gotten around to messing with FOG, but it's definitely a project that interests me (and of course Windows 7 and 8 can create an image natively, as can Windows Server in a few server-side ways). I don't usually deal with viruses, because it takes more time to clean up the mess than it does to restore an image. This time was an exception. The EHR software we use at work had been upgraded, along with some other programs on that particular computer, and I hadn't imaged it since those upgrades (my bad). Reinstalling those programs would have taken longer (with some of them, all other clients need to be exited before a new client can be added) than cleaning the infection, since triage had been followed. Had triage not been followed, I probably wouldn't have tried. Triage really makes that big of a difference in my opinion.

So now to explain FAST.

F -- Full Stop

Many infections start out as a simple file that was able to execute itself in the %AppData% area of your computer. It doesn't have many permissions yet, and the damage is usually not that bad. It will then try to trick you into giving it more power by clicking on something. By stopping everything and not touching anything, you can in many cases stop the virus in its tracks. That wasn't the case this time: it looks like the virus used an exploit in Adobe Flash Player to dig in a bit deeper -- more on this later.

A -- Alert (me)

If you aren't computer savvy, now is the time to get help. In the case of my office, I'm the one alerted. The sooner the problem is brought to the attention of others, the more easily it will be resolved. If you are at work, please note this: you will not be able to keep an infection a secret. Eventually it will come out, and all you are doing in the meantime is putting your coworkers' computers, and the business, at risk. Tell someone and tell them fast. I am quite proud of how well it was handled here: the person whose computer was infected told the office manager, who promptly instructed her to call me, just as it should be done.

S -- Shutdown

This one goes along with Full Stop. If the computer has been truly compromised (which the alerted person should be able to tell), then it's time to power that bad boy off. Some malware will try to stop this. Solution: hit the switch on the power cord. A hard power-off is much better than other computers getting infected. I instructed the coworker to turn off the PC, and it stayed off until I got there. She was given a laptop to work on in the meantime.

T -- Time

Time is of the essence in an infection. Just like in a real medical emergency, triage is designed to quickly ascertain the severity of the problem. The longer it takes to triage, the more at risk the bad cases are. In the case of a computer infection, the longer it takes to get a PC squared away the worse it is going to be and the less likely cleanup will be at restoring a PC to its former glory.

I have two time counters. The first is time from infection until the end of triage: I give this 30 minutes. If more than 30 minutes have passed and the PC is still being actively used, most likely that infection is going to be in every nook and cranny of the PC, maybe even jumping across the network. The second is cleanup time: I give this 60 minutes. If no progress on cleaning up the infection has been made in one hour, it's probably time to wipe and start from scratch. Infection cleaning is a race against the clock in every aspect; the longer you spend cleaning, the more appealing the wipe-and-reinstall method becomes. I find one hour to be a good compromise. If I've made good progress and everything seems in order by then, I'll continue cleaning up the infection. If I haven't even come close to getting it under control, then it's time to wipe and reinstall.

My Case

As mentioned, this was my first cleanup in a long time. My first in years, in fact. It was quite pleasant, or at least as pleasant as a cleanup can be. I attribute most of this to the triage method described above. None of the network shares were infected, and the PC is back up and running.

The infection appears to have been due to an outdated Flash Player install. I don't know how that happened; it should have been updated, but wasn't. The user had visited a website (the website in question seems to have been compromised; it doesn't appear to have been a malware website) and then, wham, the System Care popup started screaming its alerts at her. She made the right next move by telling the office manager, who then told her to call me. I got the call, told her to shut down, and she got a laptop. I arrived the next day and got cracking.

First, the PC was taken out and brought back to my office room. It was disconnected from the network and booted into safe mode. I had done my research beforehand and quickly deleted the files and removed the registry entry related to the malware via the command prompt. Total time? A few minutes (I spent more time getting the PC to my room than deleting those files). All was looking good so far, so I booted into Windows normally, expecting the worst. It wasn't bad: I was able to launch things without a problem. First I launched the antivirus software (Vipre Business). Lookie there! It had caught two files. It may not have been a full success, but it did catch part of the malware, which was probably why it was so easy for me to delete the rest manually (along with the blitzkrieg tactics in my removal methods). So now it was time to get some better malware-scanning software and get the AV up to date.

Before I did that, I noticed there was an Action Center alert. Apparently the malware had disabled the Windows Security Center Service. I went into Services to try to re-enable it, only to find it didn't exist. Uh-oh. Looks like the malware did more than I initially thought; most likely it deleted a few registry values, causing the service to disappear. Knowing that, I decided to gamble on a System Restore. I consider System Restores a gamble because many malware programs will infect the restore points, so when you restore, you just end up restoring the malware too. I figured if it didn't work, I wouldn't have wasted much time and would just wipe and reinstall. I chose a somewhat older restore point in hopes of lowering the chances of restoring the malware along with it, since the malware had had all of about 20 minutes before I deleted its core files and registry entries.

It was successful! Security Center and firewall and everything were all back after the system restore, and still no traces of the malware! So now I needed Internet access.

Even though I had done some cleanup and everything was looking good, I'm far too paranoid to just plug a computer into the LAN after it's been infected. It won't get LAN access until I've given it a clean bill of health. So what to do? I don't have a secondary Internet connection to use. This is where my quarantine LAN comes into play. Using a DD-WRT router and some iptables rules, I made it so the desktop could connect to the Internet, but not to any computer on the LAN. By using a different subnet for the quarantine router and blocking the DHCP pool of the servers outside of it, I guaranteed that this computer couldn't infect my LAN even if it was filled with the nastiest of nasty malware (which it wasn't at this point). I then went on and installed some more antimalware tools and updated all the cleanup tools to the latest definitions and versions. CCleaner took care of any temporary files, with me manually cleaning up some it missed. The antimalware software hummed along, removing traces in cached and temporary files every now and then. It was the log in Vipre that informed me the infection seemed Adobe Flash Player-related. The computer this entire time hadn't been exhibiting any signs of infection: everything was running fine and nothing weird was going on. After a few runs with the various scanners, things were coming up clean. HijackThis logs were clean of anything worrying too. I put the PC back, and that was it.
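The heart of the quarantine LAN is just a couple of iptables rules on the DD-WRT router. A sketch (the subnets are assumptions for illustration: main LAN on 192.168.1.0/24, quarantine on 192.168.20.0/24):

```shell
# Write a DD-WRT-style firewall script for the quarantine router.
# Quarantined hosts may reach the Internet but nothing on the main LAN, and vice versa.
cat > quarantine-fw.sh <<'EOF'
#!/bin/sh
iptables -I FORWARD -s 192.168.20.0/24 -d 192.168.1.0/24 -j DROP
iptables -I FORWARD -s 192.168.1.0/24 -d 192.168.20.0/24 -j DROP
EOF
```

In DD-WRT this goes under Administration > Commands as the firewall script, so the rules survive reboots.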

I ended up spending a few hours due to being cautious: rebooting, rescanning, scanning with all sorts of things. It eats a lot of time, but I wanted to be confident in my clean bill of health before I put the PC back in place. During that time I was also running scans on the network shares and other computers, just to be safe.

In the end, it was a pretty successful cleanup story. And for the future? Well, I might implement click-to-play for Flash content now.

Monday, April 1, 2013

World Backup Day

No April Fooling here.

World Backup Day was yesterday. It's a little-known holiday; in fact, it only came into being in 2011. It's got a pretty catchy slogan: "Don't be an April Fool. Backup your files. Check your restores." I'm a firm believer in backups. Backing up is practically an art, and not one most people understand. There are many things people consider backups that aren't backups at all. A backup has to satisfy certain qualifiers to be a true-blue backup:
  • A backup is not a mirror. Mirrors copy file corruption and don't version files. Human error is the biggest source of data loss in my experience, and a mirror doesn't protect from the human factor.
  • RAID is not a backup. This goes with the first point. Things like RAID1 (which is mirroring) and parity-based RAID levels are sometimes thought of as backups. They are not. RAID was designed to increase performance or availability, and it doesn't protect your files beyond what's needed to do one of those two things.
  • A backup is versioned. That is to say, the data in a backup represents files as they were at some time in the past. A good backup has many versions of the same file for you to choose from when restoring. A system that keeps only one version of a file is the weakest kind, but it still has one important distinction over mirroring: changes to original files don't instantaneously replicate to the backup. This allows a restoration window in which to correct human error.
  • A backup is verified. An unverified backup is nothing more than a hope. You hope the data in the backup is good. You hope the data in the backup is what you need. Until you verify those facts (which can only truly be done with a test restore), you never know if your backup is actually working as you intend. This point is so important that it's in the slogan for World Backup Day. It's also the most overlooked aspect of a backup.
Those are the key aspects of a true backup. Beyond that, a backup should be automated (human error is the greatest problem, and it's so easy to postpone backing up), and it should generate feedback/reports. Exception reporting, where the backup software tells you that something went wrong, is the most common type of feedback. Software may also offer statistical/general/informative reporting, where it tells you that everything is going fine. What's good about that is that if you don't get the report, then you know something is wrong (so wrong that it's affecting the reporting feature).

Going further down the path to backup enlightenment, backups should be stored on different mediums and securely stored in different locations. When you have this, you have backup nirvana. Power surge fries your hard drives? You've got tape/cold storage backups to restore from. Your house burns down? Your data's still safe, and the more distant, the better. Cloud backup services have made off-site backups achievable for the average person (I consider cloud storage a storage medium too). Tape isn't really a viable solution for the average person, but flash storage, optical media (DVDs, BDs), and cold storage (an unplugged hard drive) all are. The problem with the latter two is that you can't automate them on the same level, and flash storage is still expensive.

Security of backups is the most important thing when dealing with off-site backups. Encryption is the most obvious way to get it. Any good cloud backup solution will encrypt your files; the best will give you options on how. Of course, good ol' sneakernet of an encrypted hard drive (maybe with TrueCrypt) is also a viable, secure off-site backup strategy.

Of course you don't need to do all this at once. Backups can be improved over time. The whole idea behind World Backup Day is to get people started and moving in the right direction.

Backup Software

I consider backup software to be like operating systems: they all suck, so it's a matter of picking the one that sucks least for your purposes. Try out different programs and see which ones work best for you. IMO, the best backup software I've used has had a client-server architecture, like BackupPC (I've been meaning to try out Bacula). It works well for my purposes, but it's obviously not ideal for everyone. Below are some free backup programs to try out and get started with:


Windows:

Windows is what most people use. Unfortunately, it's the platform whose free backup software I've found most lacking. There are some gems I've enjoyed from time to time, though:
  • File History (Windows 8): File History is the one great feature of Windows 8 IMO, and I truly think it is amazing. Unfortunately, I'm not crazy about the rest of Windows 8 on the desktop, and I doubt many Windows 8 users actually use File History.
  • Windows Backup and Restore (Windows Vista/7): The predecessor of File History. It offers a lot of nice features, including the ability to do a full system image. It's pretty efficient and I've never had it goof up, but setup is a bit clunky (which is where File History really improved), and depending on the version you have, you can be quite limited in backup destinations. Windows 7 Home Premium lacks network backups, for example.
  • Cobian Backup: Cobian Backup offers a lot of neat features and a pretty good scheduler. The main problem is that the lack of an integrated restore feature makes continuous incremental backups quite painful. Instead, it's best to do sets of incremental backups, say keeping 9 sets of 10-day incrementals (90 days' worth).
  • FreeFileSync: Yes, mirrors aren't backups, but FreeFileSync is unique among file synchronization tools in that it can keep deleted/changed files by moving them to a timestamped directory, creating a basic file versioning system. It's no-frills, which will undoubtedly appeal to those after a simple (feature-wise) solution.
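The set-rotation idea above is generic: keep the newest N sets and drop the oldest when a new one starts. A minimal sketch of the pruning half, assuming each set lives in its own date-named directory (the layout is hypothetical, not Cobian's):

```shell
# Keep only the newest $2 set directories under $1; delete the rest.
# Assumes directory names sort chronologically (e.g. 2013-03-01, 2013-03-11).
prune_sets() {
    SETDIR=$1
    KEEP=$2
    ls -1 "$SETDIR" | sort | head -n -"$KEEP" | while read -r old; do
        rm -rf "${SETDIR:?}/${old:?}"   # :? guards against empty variables
    done
}
```

Note that head -n -N (print all but the last N lines) is a GNU coreutils extension.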

Mac OS X:

Anyone reading this use Mac OS X? I personally don't, so the only Mac OS X-only backup program I'm aware of is Time Machine, which is an included utility. It's quite nice for local backups. If you don't want to buy an expensive Time Capsule, you can use FreeNAS for networked backups.


Linux:

Linux has backup software galore: rsnapshot, rdiff-backup, a million others, and many different GUIs. Here are some GUI backup programs for Linux:
  • Back In Time: Based on some no longer maintained backup programs and inspired by Time Machine, it is a very nice backup program.
  • LuckyBackup: A very feature-rich backup program offering GUI options for many advanced features not commonly found in a GUI. NOTE: it's technically cross-platform, but I don't consider the non-Linux builds stable enough to trust with data, so I don't list it as cross-platform.
  • fwbackups: Designed to be simple, and it largely succeeds. It doesn't have the bells and whistles others offer, but it gets the job done. NOTE: it's technically cross-platform, but it doesn't play nice with UAC on Windows.
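The technique underneath rsnapshot and Back In Time is rsync's --link-dest option: each snapshot looks like a full copy, but unchanged files are hard links into the previous snapshot, so only changed files cost disk space. A minimal sketch (paths hypothetical; pass absolute paths so --link-dest resolves correctly):

```shell
# Take a hard-link snapshot of $1 into $2/<timestamp>, linking unchanged
# files against the most recent previous snapshot (rsnapshot-style).
snapshot() {
    SRC=$1
    SNAPDIR=$2
    STAMP=$(date +%Y%m%d-%H%M%S)
    mkdir -p "$SNAPDIR"
    PREV=$(ls -1 "$SNAPDIR" 2>/dev/null | sort | tail -n 1)
    if [ -n "$PREV" ]; then
        # Files identical to the previous snapshot become hard links to it.
        rsync -a --link-dest="$SNAPDIR/$PREV" "$SRC"/ "$SNAPDIR/$STAMP"/
    else
        rsync -a "$SRC"/ "$SNAPDIR/$STAMP"/   # first snapshot: full copy
    fi
}
```

Deleting an old snapshot directory is then safe: a file's data survives as long as any snapshot still links to it.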


Cross-Platform:

There are some great cross-platform backup programs out there. I'll note that many require Java to run. Here are my top three:
  • Areca Backup: A very advanced backup program. It offers many settings to satisfy anyone. This comes with a steep learning curve and a desire for experimentation to uncover them all, though.
  • CrashPlan: A friendly backup program. The free version is ad-supported. It offers the unique ability to back up to a friend's computer over the Internet without an involved setup, which is a very nice feature for people untrusting of the cloud. The subscription version also offers a cloud backup option. The Windows version doesn't require Java; the Linux and Mac OS X versions do.
  • JBackPack: A Java-based GUI for rdiff-backup that incorporates EncFS functionality for file encryption.
Finally, there are cloud-based backup programs. I won't go into detail here since I don't have any experience with them besides CrashPlan. I chose CrashPlan because it's cross-platform, and so far I've enjoyed it. I don't trust it with my most private files (its regex filtering makes excluding them easy), and it gives me a weekly report of the backup. I've heard good things about BackBlaze, but it doesn't support Linux, which is a no-go for me. (I list these two in particular because 1. I use CrashPlan and 2. both are supporters of World Backup Day.) Others are probably good too. I will say: DO NOT USE CARBONITE. They seriously throttle your speed, a practice that is deceitful and hurtful to customers and should not be acceptable. Also note that online storage and sync platforms like Dropbox and Google Drive aren't really backup solutions, but they may suit your purposes fine so long as you realize they aren't catering to the backup market.

Thursday, March 7, 2013

Oracle iSQL*Plus Overview

This post is a bit different from my usual ones. It's specifically aimed at classmates in one of the classes I'm taking right now, where we are learning Oracle Database. I found out the school has an iSQL*Plus web access license and it's publicly accessible (so you can use it from home), so anyone can use it any time.

iSQL*Plus is a web interface to SQL*Plus. Pretty much everything that works in SQL*Plus will work in iSQL*Plus, but not quite everything.

Login and Use

Logging in to iSQL*Plus is similar to logging in to SQL*Plus, except you use your web browser. You'll navigate to the iSQL*Plus web server and enter your username, password, and connect identifier (if applicable; for the school iSQL*Plus server it is). It works with any browser; I've used it in the latest releases of Chrome, Firefox, and Opera with no problem. The login screen looks like this:

Once you've logged in, you'll be presented with a nice web interface. Here's me executing a simple command:

As you can see, by default it displays the output of your commands as standard HTML below the entry field. Also by default, it paginates the output, so you only see a number of lines at a time, 24 by default. This can be changed in the preferences, but first, the main workspace UI:

Your SQL commands stay in the text box until you clear them, either by manually deleting them or by using the "Clear" button.

The "Execute" button executes all the commands in your textbox, by default showing the output below.

The "Load Script" button will take you to a page where you can browse for a *.sql file to upload. After you've located it, press the "load" button and it'll put the contents of the file in the text box. It's kinda pointless since you can just copy and paste the contents faster most of the time.

The "Save Script" button will auto-generate a file called "myscript.sql" and present a download prompt, which is pretty nifty.

The "Cancel" button will terminate any running commands, useful if your commands are being unresponsive.

You may notice a "History" tab on the top-right corner. This will store a history of your last few executions of commands. You can then reload them, which is useful if you accidentally hit the "clear" button. Do note: the History is cleared upon logout.


Preferences are where you can change various default settings. The most notable are:

In Interface Configuration:
  1. The number of scripts to save in "History" for the current session (10 by default).
  2. The default size of the script input text box (pointless in modern browsers that let you expand it yourself)
  3. Whether to display the output below the text box or generate a downloadable html file (this is the closest equivalent to spooling, which isn't available in the iSQL*Plus interface)
  4. Option to have everything display on one page or multiple pages (and set the number of lines per page)
In Script Formatting:
  1. Option to have line numbers (kinda nifty)
  2. Option to display commands in the output by default (equivalent to running SET ECHO ON at the top of every execution)
There are many other options too. Here's Oracle's page on iSQL*Plus preferences and the equivalent SET commands.

Differences Between iSQL*Plus and SQL*Plus

Most things in SQL*Plus will also work in iSQL*Plus, as I've noted. One notable exception is spooling. If you try to spool you'll see this:
spool on 
SP2-0850: Command "spool" is not available in iSQL*Plus 
Any other command that can't be run will similarly kick up an error like that, and the rest of your script will execute properly. Spooling doesn't work because, obviously, you cannot set a save location on the server from your web browser; that'd be a security nightmare! The closest equivalent is to have the output generate an isqlplus.html file instead of displaying the command results below the text box.

By far, my favorite feature is that you can edit multiple lines very easily in iSQL*Plus and recall history. It's much nicer in that respect than SQL*Plus is. Likewise, the commands don't disappear after execution, so if you made a small mistake, it's easy to fix and re-execute. Very nice.

The other nice thing is that any user can set preferences for many SET commands, which can save you time.

The main downside is that there are no TNS aliases, so your connect identifier has to be the full, proper connect descriptor normally contained in the tnsnames.ora file.

You should also be careful of something: iSQL*Plus doesn't care if you don't end SQL commands with a semicolon. If you plan on using your code elsewhere or submitting it, make sure you include them, as otherwise it may not work for someone else.

One thing to note is that iSQL*Plus has a pretty aggressive timeout/auto-logoff setting, so if you just leave it open for a while, you'll probably be forcibly logged out. Do make note of that.


Well, that's pretty much all there is to iSQL*Plus. Just make sure to log out. It's a great way to practice your SQL off-campus.

Saturday, March 2, 2013

Expanding C: Partition on Win2k3 and Remote GParted

As I mentioned in my last post, the C: partition on the Windows server at work had become completely full. I immediately did some temporary stopgaps to hold her over until I could properly repartition her. Today was that day.

Repartitioning modern Windows (Server 2008+, Vista+) is no problem, as you can do it with the included disk management utility. XP and Server 2003(R2) are different, as their disk management utility isn't nearly as capable. This server is a Windows Server 2003R2 box, so I had to use a third-party utility. For home users, the EaseUS Partition Manager family is pretty good. For a corporate server, though, it's $160. I wasn't approved for that spending (for good reason: the server is slated for replacement in a year or so, so the money would have been wasted in the long run), so I had to go with free options.

I ended up going with two different tools: one for shrinking the data partition and one for expanding the system partition. I used GParted to shrink the data partition and ExtPart to expand the C: drive. The reason is that Windows doesn't really like GParted, and messing with the C: partition sometimes requires a repair action from the Windows disc afterward. I didn't want to deal with that, and ExtPart is a small, simple, free utility for extending a partition (hence the name) from within Windows.

The day started at 8:40 am. I fired up CloneZilla and cloned the hard drive. If there was a power outage or some other freak accident during the repartitioning, I could then simply restore the image in a short time. I always recommend imaging your system before repartitioning for this reason. There are lots of disk cloning tools; I like CloneZilla. I tried doing this the day before, but the version of CloneZilla I had didn't work with my server's RAID card (a SAS 6/iR). I brought a freshly burned copy of the latest CloneZilla release and it recognized my drive just fine. This ran until 12:35 pm.

Next it was partitioning time. I inserted the GParted Live CD and got busy. Unfortunately, Dell thought it was a good idea to make the data partition an extended partition, which means an extra action: first shrink the data partition, and then shrink the extended partition it resides in. That takes more time. I originally figured it would be done by 2:30 pm, but now I figured 5:30 pm (it ended up beating my expectations by finishing just shy of 4:50 pm). I didn't want to stay at work until 5:30 pm, so it was time to get remote access to the GParted Live CD.

This actually proved pretty easy. First I configured the network, which GParted Live includes a nice desktop shortcut for. Next I opened up a terminal. GParted Live is based on Debian, so I did a sudo apt-get update && sudo apt-get install x11vnc... This didn't work. Turns out GParted Live comments out the repos in sources.list. So I did a sudo nano /etc/apt/sources.list and un-commented the repos.
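If you'd rather script that step than open nano, a sed one-liner does the same un-commenting (a sketch; it assumes the repo lines were disabled with a leading '#'):

```shell
# Strip the leading '#' (and any following whitespace) from disabled
# 'deb'/'deb-src' lines in an apt sources file.
uncomment_repos() {
    sed -i 's/^#[[:space:]]*deb/deb/' "$1"
}
```

On the live CD you'd run it as root against /etc/apt/sources.list.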

So now to install and run x11vnc:
user@debian:~$ sudo apt-get update && sudo apt-get install x11vnc
user@debian:~$ x11vnc -forever -display :0
The installation pulled in a few packages besides x11vnc: libvncserver0, libxssl, openssl, tcl, tcl8.5, tk, tk8.5, and x11vnc-data. It only ended up taking 11.2 MB more space, so no big deal; my server has plenty of RAM.

The -forever flag tells x11vnc to keep running after a client disconnects; without it, x11vnc stops as soon as you disconnect the first time. I planned on connecting a few times to do periodic check-ins on GParted's progress. The -display :0 flag tells x11vnc to attach to the current session instead of creating a new one; it would be useless to VNC in to check on GParted's progress if I were given a fresh session. I also didn't want to risk x11vnc disconnecting and me being SOL, so I decided to enable ssh on GParted Live as well. This is simple.

First, we need to set a password and configure hosts.allow so I can ssh in. This is done with sudo passwd user to create a password for the user 'user'. Without a password you'd have to set up passwordless ssh login, which would require more configuration; it's easier to create a simple password. Next, edit hosts.allow with sudo nano /etc/hosts.allow and add sshd subnetblock. (don't forget the trailing period!) to the end. In my case it was sshd 192.168.1. that I needed to add. Then I just restarted networking and started ssh:

sudo /etc/init.d/networking restart && sudo /etc/init.d/ssh start

End result? I drove home at 3:00 pm and started checking in on it. I use LogMeIn to remote into the office, and then fired up TightVNC to check on GParted.

And then a bit later I saw:

Success! Done! Well, with the GParted part. I drove to the office to finish up, since GParted hangs after ejecting the CD, and the server itself hangs at a step in the preboot environment due to an alert I cannot disable.

Upon reboot, Windows did a consistency check on the data partition, no biggie. I then rebooted again and was almost done. Now to expand the C: drive with ExtPart:
C:\Documents and Settings\Administrator>cd C:\Dell\ExtPart
C:\Dell\ExtPart>extpart c: 15325
That's the default directory ExtPart creates when "installed" (it's a self-extracting archive when you download it). As for the extpart syntax, it's simple: you specify the drive letter and then the amount you want to expand it by, in megabytes. In my case, that's c: and 15325. The end result? A 26.9 GiB C: partition with 16.4 GiB of free space. I call that a rousing success for an honest day's work.

Tuesday, February 26, 2013

Inconsiderate Behavior

I thought today was going to be a good day: completely over the flu I had caught, night class canceled, and my workload not large (despite taking the majority of last week off)... I was mistaken.

When I checked my work email I saw the usual day-to-day emails, but one caught my eye. It was from my Spiceworks install, and the title read "C: has less than 5% remaining on [one of the work servers]"... My heart sank. Goodbye carefree day, hello hellish day full of trying to figure out what's going on.

I turned to the server (running Windows Server 2003R2). I've always been well aware of the storage problems of its C:\ partition. This wasn't even the first time C:\ had filled up: back when I first started working here, it filled up due to an out-of-control program producing a five-mile-long error log. The system, like most in the office, predates me. It was bought through a "value-added" retailer, and the VAR decided it was a good idea to give C:\ only 12 GiB. I probably should have done something about it back then, but on a Windows Server 2003 box there's no integrated option to shrink one partition and expand the other. The budget was, as always, $0, and uptime was considered critical, so I freed up 4 GiB of space and called it a day. As of last week (some 3 years later), 2 GiB of that free space remained. The server never really got new software beyond security updates, and I ran her lean and mean to reduce the chances of some log file going crazy. As I saw it, the remaining 2 GiB would last me until this summer, when I planned to finally shrink the D:\ partition and expand the C:\ partition. It didn't make it that far, due to some inconsiderate behavior by an outsider.

A company, which will remain nameless for now, decided it was OK to do an automatic update of their software without telling me. Not only did they not tell me, they didn't announce it at all ahead of time, even on their website. The software doesn't have an option to disable automatic updates, and the fact that it can auto-update isn't even a listed feature. The software in question uses Microsoft SQL Server for the backend. Why? I dunno. I guess they thought that was a good idea (I disagree with that conclusion); it didn't use a SQL backend two revisions ago. Part of the upgrade included a forced upgrade to MS SQL Server 2008 (we were on MS SQL Server 2005)... That might be acceptable in an ideal world where each server does only a single role, but working for a small business, I don't have that kind of budget. The server in question is an archive server for patient records, and that functionality also uses MS SQL Server. The update also required .NET Framework 4.0, which I had no need for until then, so I didn't have it installed (free space being at a premium, after all).

None of this would have been a problem had I been given prior notice of the update. If I had been told ahead of time that this update was coming, I could have done something about it, and there would be no issue. Instead, due to the company's inconsiderateness, I found myself with... 85 MB of free space on the C:\ partition. The .NET Framework update was also still running and complaining about a lack of free space (obviously); I had to cancel that. The next step was getting myself some breathing room and calling the company up. They gave me the usual company blah about it and didn't even apologize for not telling me about the upgrade beforehand. They told me they couldn't revert the upgrade, so my only options were to clear up the space myself or uninstall the software.

Uninstalling it is very tempting, but I'll need my boss's approval before I can do that. In the meantime, I have a good feeling the software is hosed and useless anyway. The services related to it wouldn't start up, so I disabled them, leaving the software effectively dead; at least it's not a further threat. I scavenged for free space and got back to a bit over 800 MB, so at least it won't fall over from a single hiccup. That'll buy me the time to defrag the other partition (which is going smoothly so far, but since it's an archive server, it still has a ways to go), shrink it, and expand the C:\ partition. If my boss doesn't approve spending some money, that'll mean downtime as I boot off GParted to shrink. I'll expand the C:\ partition with extpart from Dell so Windows doesn't get too grouchy.

A number of factors led to the current situation, but the one thing that definitely shouldn't have been the case, and that would have made a world of difference, is that I wasn't told about this major upgrade beforehand so I could prepare for it.

At least Spiceworks is doing its job properly.

Thursday, February 14, 2013

Blog Status Update

It's been about a week and a half since my last blog post, so I thought I'd fill everyone in on what's going on. I knew this would happen eventually, but was hoping I'd have been able to post a bit more before it does.

I'm juggling school, work, this blog, and a few other projects right now. At the beginning of the semester this wasn't an issue, as school work wasn't very demanding. Now, though, school work is requiring more and more of my time. I'm a straight-A student and would like to stay one, so I need to focus on my studying and cut back on my blogging. On top of that, I got sick and am going back and forth to the dentist, taking up even more of my free time. Oh, and my first midterms are in the next two weeks.

I definitely won't be able to keep up a post a day like I did for that little stretch, but I am going to try to do one or two posts a week. I have plenty I'd like to post about, but it's too time-consuming right now. For the immediate future, this blog is going to run at a lower priority than I'd otherwise give it: it's school, work, one of my other projects, and then this blog. Hopefully in two weeks' time I'll be able to pick up the pace again. Until then, I might manage one a week; we'll see.

I'm still getting a feel for blogging, too, so it definitely hasn't worked its way into my schedule yet. In the meantime, we'll just see how this all goes.