Patrick Lambert - Freelance Content Creator

Newest blog entries

  • Making cross platform binaries
  • Fri, Aug 15 2014 19:16:40 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    I don't consider myself a software developer, even though I've been writing code for well over 15 years now. I started on Linux, using the pico editor and coding in C. I produced things like a text editor, an IRC bot, and so on. This old code will most likely not compile anymore but it's still available on this site. Still, it's always been a hobby, not something I wished to become a full time gig.

    Since then, I've done everything from web development in PHP and JavaScript to Batch scripts, playing with Visual C# and, lately, a lot of Perl coding. Like many geeks, I use various systems, from Windows XP all the way to 8.1, Windows Server and Linux, both in VMs and in the cloud through SSH or RDP. So I began looking for a way to make cross platform coding easier while still retaining the ability to produce native binaries. I've always hated Java since my short-lived experiments with the language many years ago, so JIT techniques left a bad taste in my mouth. I wanted something that could produce dependency-free binaries able to run on any platform.

    This is when I came across Perl packers. I've tried two of them, one free and one commercial: the free one is PAR::Packer, and the commercial one is Perl Dev Kit by ActiveState. After running into issues with PAR's binary package not liking the versions of some of my modules, and then refusing to compile, I settled on Perl Dev Kit for its ease of use, the wonderful way it deals with modules, and because it simply works.

    This is what the interface looks like:

    First, a brief overview of what a Perl binary does. Perl is an interpreted language: scripts are run by the Perl interpreter, which in turn loads whatever modules they need. The Perl Dev Kit simply takes the script, the Perl runtime, and any module your script requires, and packages it all into one nifty executable. Through the interface, you can download this environment of libraries and modules for every target system and use them to create binaries. This means that from one Windows machine, I can take a single script and output binary files for Windows, Linux, OS X and so on.
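    The same packaging idea exists in other ecosystems. As a rough illustration of the concept (not how Perl Dev Kit itself works), Python's standard zipapp module bundles an entry point and its modules into one runnable archive; the file names below are made up for the example:

```python
import pathlib
import tempfile
import zipapp

# Build a throwaway "project" directory: one module plus an entry point.
app_dir = pathlib.Path(tempfile.mkdtemp()) / "app"
app_dir.mkdir()
(app_dir / "greet.py").write_text("def hello():\n    return 'hello'\n")
(app_dir / "__main__.py").write_text("import greet\nprint(greet.hello())\n")

# Pack the directory into a single self-contained archive, app.pyz.
target = app_dir.parent / "app.pyz"
zipapp.create_archive(app_dir, target)
```

Running `python app.pyz` executes the bundled `__main__.py` with the archive on the module path. The difference is that a Perl packer goes one step further by also embedding the interpreter itself, so nothing needs to be installed on the target machine.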

    The benefit is obvious. Perl is great because it's already one of the most widespread languages out there. It has hundreds of modules that make any type of coding a breeze for the developer, and its inclusion in many popular systems means any web site will likely be able to run your scripts as-is. So being able to take that same script and make binaries that anyone can run, without even having to download Perl and the required modules, is very nice. Also, you only have a single file to distribute, while most modern software comes with tons of dependencies that get installed at the same time.

    All is not rosy, however. To run, the binary has to unpack its Perl runtime and modules before beginning execution, so start-up is slower than native code made with Visual C# on Windows, or GCC on Linux, for example. Also, your file size will tend to be bigger, since the modules you need are linked in statically instead of living alongside a small binary in a lot of supporting files.

    Overall I'm pretty happy with the result. Speaking of Perl, I would like to say kudos to ActiveState for making PPM, the best module manager I've seen, which simply works, unlike CPAN, which seems to fail on every platform I've tried it on (although some Linux distributions have started to include Perl modules in their own software repositories, which is ironic). You can see some of the things I've made in the downloads section.

    On a final note, I'm now going to need a way to turn Perl code into native iOS and Android apps to make this process truly complete. I've actually found a project that does this for Android, but it seems to be in early beta. I can always hope...

  • How to lose a potential user
  • Sat, Aug 2 2014 17:12:21 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    While checking out Twitter on my iPhone, I saw an interesting image and read that more were available on Flickr. This Yahoo service is an interesting study in undervalued web properties. It was, and still is, one of the primary locations where people go to share photos, and since Yahoo decided to reinvent themselves, they have been trying hard to get people to use their services.

    Anyways, I thought it was a good opportunity to download the Flickr app and see how things are. Here is how it went:

    * Download the Flickr app, launch it, see the pretty 'getting started' screens.

    * Try to go explore, notice that you can't do anything in this app without logging in. I have no real interest in uploading anything to Flickr, just to explore what others have to offer. Still, I suppose I can't fault them for wanting to convert users.

    * I don't have a Yahoo account, or rather I haven't had one for over a decade, so noticing the Facebook sign on link, I figure it's going to be the fastest option.

    * That turns out not to be the case, as after I'm done logging in, the app informs me that Facebook isn't good enough, and I still need to create a Yahoo account.

    * I decide to use my throw away password, since I'll likely only use it this once to check things out. But my throw away password doesn't have lower case letters. The app informs me that I need both lower and upper case letters.

    * Fortunately I have a stronger throw away password with lower and upper case letters, numbers, and over twelve characters. Then of course the app informs me that my password is no good because it doesn't have punctuation.

    * I decide that I already lost enough time trying to just access some images. I close the app and delete it.

    So what did we learn from this little UI nightmare? Flickr is already known as a place where photos are shared publicly. Why can't someone just explore them without creating an account? Why is the app offering Facebook login as an option when Yahoo has decided it's no longer sufficient on its own? And why does Flickr have a password policy stricter than most banks?

    I may not represent the majority, but somehow I suspect that my experience is just the tip of the iceberg behind why Yahoo is a small shadow of what they once were in people's eyes. And honestly, I don't think my requests are unreasonable. I may extend my absence from Yahoo services for another decade...

  • Image editing on the go
  • Sat, Aug 2 2014 12:14:44 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    In a world where we have powerful computers in our pockets, you would think that something as basic as image editing would be a solved problem. You would think our smartphone platforms should be filled with options of powerful editing software. Yet when I recently looked for something better on iOS, I was sorely disappointed.

    The AppStore does have a ton of editing apps, don't get me wrong. Unfortunately, it may as well only have one. Most of the photo related apps out there are almost identical: Lacking in features, focusing on trivial details. By trivial, I'm talking about those famous filters popularized by Instagram.

    Sure, Instagram was innovative. It still is very popular, and the one-touch ability to add strange filters to photos is arguably the reason why. But every app that has copied them since is not innovative. Even Photoshop Express, the Adobe app which one would think would be made by people who know image editing, is bare. It has less than what I consider the minimum for editing software: cropping, red eye removal, brightness control, sharpening tools and, of course, filters.

    After a long time looking, I finally found a decent app called PicShop that I think is worthy of being called image editing software. It has all the basic features of the other apps, but also levels, image tilt, blemish removal, straightening tools, and the ability to add frames, draw on the image and add text. Finally, you can actually choose what resolution to save at, not just with a 'low, medium, high' slider but in actual pixels. These are all things I need, whether I'm preparing images for a blog post, embedding them in a document, or simply making a photo more presentable.

    I find it amazing that most companies figure that the main use of smartphones is to take funny pictures at the club or share photos of your food, and whether it looks purple or is displayed through a sepia lens should be the main feature of their apps. We need more innovation and less cloning.

  • My Wolfenstein review
  • Mon, Jul 28 2014 16:53:23 PDT - [Permalink] - Category: gaming - by: Patrick Lambert

    Wolfenstein 3D was one of the first FPS games I played, and it had a key role in PC gaming history. So when Wolfenstein: The New Order came out, I already knew I wanted to play it. Like most games, I wasn't about to buy it full price, but surprisingly enough Steam had it for half price a few weeks back, not that long after release, so it was time for me to play.

    I would say that these days not a lot of action games impress me, but Wolfenstein was pretty good, better than I expected. The gameplay was nice, with weapons feeling the way they should, although I'm not a fan of weapon wheels. The graphics are very nice, and there is a lot of attention to detail. The difficulty level is right, with bosses not relying too much on tricks to be beatable, and the time to completion was just right, taking me 16 hours to get through it.

    It doesn't beat something like Mass Effect on gameplay or depth, but it certainly is a worthwhile game to play. Overall, I would give this game a 7.5/10.

  • Local Bitcoin wallets have no future
  • Fri, Jul 25 2014 14:42:51 PDT - [Permalink] - Category: business - by: Patrick Lambert

    There's no denying that Bitcoin is becoming big, with Dell now accepting it as payment, pundits seeing a bright future for the currency, and more. But for the average user, Bitcoin is still a mystery, and can be a giant pitfall if they aren't careful. One of the virtual currency's most fabulous features is also one of its biggest drawbacks, and that is the concept of a Bitcoin wallet.

    If you download a traditional Bitcoin client and then start receiving coins, whether from sales, currency exchanges or otherwise, the keys that control those coins now exist in only one place: a single wallet file on your hard drive. That fact is immensely important. First, it means that Bitcoin is anonymous and free from external control. Those coins are yours, and no one else knows where your personal wallet is. However, it also means that should you lose that file, all of your coins are gone forever, and no one can restore them.

    Let's be realistic. People are awful at keeping track of their digital goods. We lose documents and files all the time. Even the biggest organizations can lose crucial financial assets when they are in digital form. Just think what it would be like if losing your ATM card or credit card, arguably among the most precious items on your person, meant you lost all your money. So if we're going to ask people to keep all of their cash in a single digital file, it's going to end in disaster, regardless of how many techniques and suggestions we have for creating an offline wallet, doing backups, and so on.
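    For those who do keep a local wallet, the standard advice boils down to copying the wallet file somewhere safe and verifying the copy. A minimal sketch of that idea in Python (the file names are made up for illustration; a real client may lock its wallet file while running, so stop it before backing up):

```python
import hashlib
import shutil
from pathlib import Path

def backup_wallet(wallet: Path, backup_dir: Path) -> Path:
    """Copy a wallet file to backup_dir and verify the copy byte-for-byte."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / wallet.name
    shutil.copy2(wallet, dest)
    # Compare SHA-256 digests to catch a corrupted or truncated copy.
    src_hash = hashlib.sha256(wallet.read_bytes()).hexdigest()
    dst_hash = hashlib.sha256(dest.read_bytes()).hexdigest()
    if src_hash != dst_hash:
        raise IOError(f"Backup verification failed for {dest}")
    return dest
```

Verifying the digest matters, because a silently corrupted backup is exactly the kind of loss the paragraph above describes.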

    This is why cloud wallets are the future of Bitcoin. Not because it's better for the virtual currency, it most definitely is not, nor is it better for security or anonymity, since you're basically trusting some company with your coins and all of your transactions. But simply because of human nature. Today Coinbase, perhaps the biggest site for all things Bitcoin, introduced a new feature called Bitcoin Vault, which is basically a cloud wallet meant to store larger quantities of coins, with modern security features like two-factor authentication and co-signers.

    I think if Bitcoin becomes popular with the mass population, which it has a big chance of doing, we will need this sort of solution to avoid a disaster. A local wallet can work for people who know the technology and for professionals who routinely make backups, but for the average person, secure and easy to use cloud solutions are the future. Despite the potential problems, such as a centralized place where coins are kept, having to trust an external organization, and the much greater potential for breaches since no FDIC is going to insure your cloud wallet, it may be the only way forward for Bitcoin.

  • What the cloud means for the future of system admins
  • Fri, Jul 25 2014 13:28:50 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    On this System Administrator Appreciation Day, I came to wonder about the future of the SysAdmin role and what big challenges we're likely to face. When I first started in IT doing tech support, help desk and writing about technology, it was the early 2000s, when a Microsoft certification meant you were ready for the enterprise, and every year was going to be the year Linux got big on desktops.

    Now, Linux still doesn't have much desktop market share, but the enterprise side is changing quickly. Cloud computing is the number one topic for system admins, and there have been many articles about how every business will move into the cloud, with the IT department getting gutted shortly after. Certainly companies move slowly and the transition is unlikely to finish by next year, but there's no denying that things are moving to the cloud faster than ever before. So is the job of a system admin a thing of the past?

    I think it's worthwhile to look at history to see what is likely to happen. Any good historian will tell you that things are often cyclic. What happened in the past is likely to repeat in the future. In the early days of computing, end users had very little power. They used thin client terminals, and everything was done on mainframes. Then, as the personal computer became more common in the 80s, and corporations started to give fully featured desktops to people, mainframes became extinct. A similar discussion happened back then as to the fate of those system admins, since now everybody could manage their own PCs.

    Of course, it was hardly the end of IT. Instead, we saw the rise of smart networks to get all those PCs talking together, authentication systems like Active Directory to handle all these devices, web apps to provide services to people, group policy management to enforce corporate policies, and so on. Despite the decentralization of computing, IT professionals thrived. Yet now, cloud computing is making us go back to the previous model, where everything is once again stored in a centralized location. The main difference however is that now they aren't mainframes in the basement of our buildings, but cloud servers in data centers.

    Doing simple things like email or docs in the cloud is easy, but handling everything a business expects, such as shared documents, unified communications, license management, government compliance and so on, isn't so easy. System admins will still be required in most cases, even if anyone can sign up for an Office 365 or Google Docs account. Similarly, those data centers still need professionals to keep them running, so these jobs are unlikely to go away; they may just be relocated. Starting a virtual machine on AWS may become easy enough for the employee in finance to do, but when you want scalability, high availability, and so on, you need professionals to handle the back end.

    I also think that this was inevitable. Look at the BYOD phenomenon. Now, most employees expect to be able to remote into the organization from their laptops, or use their iPhone or Android device at work to get email and IMs. Having everything in the cloud makes it much easier on admins, so you don't have to worry about remote access servers, VPN, policy management tools from many different vendors, and all the security that goes with it. By using Office 365, for example, everyone can access their files from any device or any location through the Microsoft clients, with management being centralized in a single location for IT.

    In the long run, I definitely think the IT field will see a decrease in personnel. As things become more automated and devices easier to use, there will simply be less need for IT staff, but businesses will always need to keep some system admins around. So what does it take to ensure you will be the one with a job in this cloud world? Diversity is key. An MCSE certification was fine 10 years ago, but now you need to know VMware and Citrix to handle virtual machines, you need to learn Azure, AWS and other cloud solutions, and you need to keep up with how all of them interoperate. That way, when the CFO comes asking whether the corporation could save money by moving from in-house systems to the cloud, you can realistically suggest a migration path to a private, public or hybrid cloud that makes sense.

    I think in the future we'll see an even bigger share of the pie going to consultants and smaller companies that specialize in quick IT architectures, where they can come in, set small or medium businesses up with a cloud environment, and then manage everything for them remotely. If there isn't going to be any critical server running on premise, why should system admins have to sit in a cubicle? Again, diversity, not only of skills but also in business and entrepreneurship, will be key for IT professionals to survive in the ever evolving world of cloud computing.

  • 6 ways to get YouTube on your big screen TV
  • Thu, Jul 24 2014 16:45:48 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    Like many geeks out there, I've experimented with various ways to get YouTube on my TV over the years. Why YouTube in particular? Well, while I do use a variety of entertainment sources, YouTube is the one I probably use the most. And according to surveys, it's the same for many people. But while watching YouTube videos can be fine on a tablet or laptop, when you're sitting at home it's much better on a TV, especially since YouTube now has many videos in 1080p and even some in 4K!

    1. Smart TVs: The first and probably easiest way to do it is by buying a so-called smart TV. Most TVs 40 inches and up now provide on-screen apps, including YouTube. Whether you go with Sony, LG, Samsung or any of the big TV manufacturers, they all have their own interfaces, which are very similar. The problem is that most of them are really bad. Since the whole interface has to be driven by the chip embedded in your screen, most of them end up underpowered and slow. The interface can also be wonky and hard to navigate. And if you ever want to change it, well, that's pretty hard since it's built right into the TV.

    2. Apple TV: The Apple TV has a YouTube app and was actually one of the first boxes to provide such functionality. The problem is that, unfortunately, the app seems never to have been updated since that first introduction. The interface is really, really bad; you can't even see a list of all new videos from your subscriptions.

    3. Consoles: A better option is one of the game consoles, if you play games anyway. The advantage here is that both the Xbox One and PlayStation 4 have updated their YouTube apps recently, and the interface is miles ahead of the Apple TV's. Of course, the devices are also much more expensive, so this only applies if you need a console for actual gaming.

    4. Set top boxes: There are many other devices out there that are cheaper than consoles and come with a YouTube app. Perhaps the most popular among them is the Roku. Interestingly enough, Roku only recently added a YouTube app, and you will need the latest generation box to use it, but the interface looks quite decent.

    5. Chromecast: If you're interested in going the set top box route, and YouTube is your app of choice, why not go for the option that Google itself provides? Not only is Chromecast built by the same people behind YouTube, but it's also the cheapest option. The main drawback is that unlike some of the other options, it is a bit more limited on features. For example, it comes with no remote, so you have to use a phone or tablet to control it.

    6. Computer connected to TV: Finally, perhaps the cheapest option (assuming you already have a computer) and the most customizable one is to simply plug your computer into your TV and use it as a second display. This gives you the full web interface, a real keyboard to type searches with, and many more options, like browser extensions and actually doing real work on your TV. Of course, it does require running a cable between the TV and computer, and it uses your computer's resources to play videos. This may be best left to enterprising folks, but it's certainly worth noting.

    Lastly, whatever option you pick (except for the Apple TV and some of the rarer set top boxes, which don't support it), I would recommend checking out the pairing page, which allows you to link your TV to your YouTube settings. This adds a button at the bottom right of any video on the web so you can play it right on the TV, which is very handy.

    My recommendation is to use what works for you. Don't just go by price or feature set; go to a store and try out the interface. I've found that this alone makes or breaks your experience when watching YouTube, or using any app on a large screen. It's amazing how annoying typing long search queries with a remote can get, and how many key features some of these devices get wrong, like the lack of a unified subscriptions feed on the Apple TV, or how slow some underpowered devices can be to load videos.

  • Don't use Firefox for privacy
  • Tue, Jul 22 2014 11:46:04 PDT - [Permalink] - Category: security - by: Patrick Lambert

    A lot of people decide against using Chrome on principle, because they don't want to feed all of their digital fingerprints to Google and its advertisers. I certainly agree with that, and Firefox has always been the browser of choice for sophisticated or privacy minded individuals. Unfortunately, as time went on, Mozilla turned their key product into a carbon copy of Google's offering, at least when it comes to privacy.

    If you've never run a network monitor to see what traffic your computer sends out to the web, it's a very worthwhile experiment. You would be amazed at how often your browser contacts Google's servers. Let's make things very clear: Firefox is deeply tied into Google's services; they are Mozilla's main source of revenue. Let's look at a couple of ways your traffic ends up in their data centers, regardless of what you type in your URL bar.

    The main way Firefox sends data to Google is when you type something into the URL or search bar. Every letter you type is sent instantly to Google, unless you went in and modified advanced settings. That means even if you make a mistake, or change your mind and delete what you wrote before pressing enter, it doesn't matter: everything was still sent. This, of course, is what provides that instantaneous drop down menu showing results as you type, but it also means everything has to be sent out for the feature to work.

    Even if you use Bing for searching, or if you don't mind search terms going out, every URL is also sent out to Google for something called phishing and malware detection, a service that scans web pages and detects potentially infected pages. That way, if you go to a URL that may be compromised or trying to infect your computer, a red alert page will show up. But for this to work, that means hashes of all your web visits have to be sent out as well.

    Oh, and in case you're thinking of using the private window feature, also called incognito mode, think again. Every browser handles private browsing the same way. The goal of this feature is to keep your stuff private from your spouse, your kids, or other people who may access your computer locally. It keeps that tab out of your local history. But what you do in this mode has exactly the same impact on the network side. Google still gets to see all of your search queries and every URL you go to, and interacts with your browser in all of the same ways.

    Finally, this week Mozilla announced that Firefox would integrate even more with Google. Now, every download you do will also be sent off to Google, in order to make use of their virus analyzing tools. That way, if you download something that Google thinks might be malware, Firefox won't let you access it. Again, a great feature if you need to be protected from yourself, but in order for this to work, it also means even more private data being shuffled out to the advertising giant.

    I guess in the end it's up to you to decide how you feel about all of that. It's easy to just give up and go along for the ride, enjoying all of these benefits and not worrying about the privacy implications. Or you can take the opposite approach and go into settings, disable each and every one of those features, hoping you didn't forget any of them, and then hope your browser won't communicate with Google's servers anymore (it will, by the way). Don't use Firefox if privacy, at least from Google, is something you care about.
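    For reference, these are the about:config preferences behind the features discussed above, as of Firefox around 2014 (preference names change between releases, so treat this as a sketch rather than a complete list):

```
browser.search.suggest.enabled          false   ; live search suggestions
browser.safebrowsing.enabled            false   ; phishing page checks
browser.safebrowsing.malware.enabled    false   ; malware page checks
browser.safebrowsing.downloads.enabled  false   ; download scanning
```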

    I think Mozilla could do a better job here. It's the same thing as when they revamped the UI: why does Firefox need to become more and more a clone of Chrome? At the very least, there should be a comprehensive set of privacy options, a page that explains clearly what goes on behind the scenes, and a single button to turn all the privacy-breaking features off. But I suppose that's not what makes Mozilla money, so I doubt we will see it anytime soon.

  • Most business users still rely on Apple mobile devices
  • Mon, Jul 21 2014 12:17:52 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    There's a lot of talk about market share, and how Apple has been slowly losing ground to Android, especially Samsung devices. There are lots of reports pointing towards a myriad of factors, from the slow release cycle of one new iPhone per year to the large number of Android manufacturers. Most surveys are also based on fairly indirect measures, like devices shipped or asking what type of devices people have at home, not necessarily what they use day to day.

    In Montreal, there are several open wi-fi hotspots in the downtown area where a large number of office workers congregate during lunch time. Since these are fully open access points, every device freely broadcasts its MAC address to every other device. So all it takes to get a comprehensive list of connected devices is to join the wi-fi. I used Net Analyzer to perform a simple passive scan of the manufacturers of devices connecting to 3 access points spread around downtown Montreal, for 5 consecutive days during lunch time, and compiled the results:

    Now I will say right away that these results are not scientific, and not foolproof. The unique sample size was 647. I would suspect that doing such a survey in tourist areas, or places outside downtown, would give different results. But I thought it was an interesting experiment. First, by far, people bring phones and tablets with them to public access points, not laptops. This is likely because, again, these access points are mostly frequented by business users who work in downtown Montreal, not tourists who may bring a laptop with them. Then, despite the growing market share of Android phones, I was surprised to see that over half the devices on Montreal public wi-fi are Apple based.
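    The tallying step behind a survey like this is just an OUI lookup: the first three octets of a MAC address identify the manufacturer. A minimal sketch in Python (the OUI table is a tiny illustrative sample rather than the real IEEE registry, and the MAC addresses are made up):

```python
from collections import Counter

# Tiny illustrative OUI table; a real survey would use the full IEEE registry.
OUI_VENDORS = {
    "28:CF:E9": "Apple",
    "8C:77:12": "Samsung",
    "00:1A:11": "Google",
}

def vendor_of(mac: str) -> str:
    """Map a MAC address to a vendor via its OUI (first three octets)."""
    return OUI_VENDORS.get(mac.upper()[:8], "Other")

def tally(macs):
    """Count unique devices per vendor, deduplicating repeat sightings by MAC."""
    return Counter(vendor_of(m) for m in set(macs))

counts = tally([
    "28:cf:e9:01:02:03", "28:cf:e9:aa:bb:cc",  # two Apple devices
    "8c:77:12:00:00:01",                        # one Samsung device
    "28:cf:e9:01:02:03",                        # duplicate sighting, ignored
])
```

Deduplicating by full MAC first matters, since the same phone reconnecting every lunch hour would otherwise be counted five times.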

    In my opinion, it's still the case that people who have the money to afford it mostly choose Apple over Android. As a last interesting tidbit, among the Others category I found various manufacturers I didn't recognize, such as Xiaomi Technology, Aruba Networks and Compal. I also saw a number of Canon and Epson printers, and a single Apple TV.

  • The rule of two
  • Thu, Jul 17 2014 15:09:20 PDT - [Permalink] - Category: lifestyle - by: Patrick Lambert

    Many companies are working hard to provide us with a single consolidated portal from which all our digital goods can be purchased, organized, downloaded and viewed, whether it's Amazon with videos and ebooks, Apple with music and TV shows, or Microsoft with OneDrive and productivity suites. There are a lot of advantages to having all of our digital content centralized like this. It means easier organization, since everything is in one location. Updates are faster, since we rely on only one portal or app. And we have only one destination to visit, and one place to give payment information to when buying new releases.

    But centralized content also has one big downside: you rely fully on a single source, or as the old saying goes, you put all your eggs in one basket. We all know what can go wrong when the cloud goes down. If the particular service you rely on for your media, documents, music or storage goes down, and all your eggs happen to be in that particular basket, you may be in for a bad day.

    This is why I tend to believe in the power of the number two: two different services for any one type of data. For example, I rely heavily on iTunes. I find that it's a great solution for movies, music, podcasts and TV shows. Regardless of where I am, whether I have my TV, tablet or phone, if I feel like accessing some content, it's right there, available. Plus, I can get the latest releases right from home or on the go, without waiting for the local store to get that blockbuster film on Blu-Ray or DVD. However, I'm not one of those people who went all out and moved their whole DVD collection to the cloud. I still have a significant amount of physical disks, and the collection is still growing whenever I happen to find a good sale on something I'd like.

    I apply the rule of two to many more things as well. Video games are a prime example. While console gamers are just now getting into digital distribution, PC gamers went through that transition years ago. Most PC gamers like me haven't bought a physical copy in a long time, because physical copies always cost more and aren't available as quickly. The only reason to get a physical box is if you want a collector's edition with extra swag, which I don't happen to care about. Steam is by far the preferred platform, but even though it's very stable and incredibly popular, I would still encourage you to use the rule of two. Whether you use GOG, Gamefly, Origin or physical boxes, you just never know what is going to happen to a particular company or platform years down the road. Even spreading your assets between two different cloud providers is a useful endeavor. If the Steam cloud is down, it's too late to realize you have nothing to play until they fix it.

    Whether we're talking about entertainment, games, ebooks, documents, even things like where you keep your finances or any other type of digital content, I always try to use the rule of two. Some would ask, if I'm not going to embrace the convenience of a single platform for all my content, why use just two, why not spread things around to dozens of platforms? The reason is that this convenience is still useful. I found that opening two apps, two web portals, or looking in two locations for a particular piece of content is easily done. Three or more, and it starts becoming a chore. So if you're thinking about doing the same, what are some options you can use? I'm not going to do full reviews of every cloud service out there, but here are some brands you can start with:

    * For file storage, there's physical media like removable HDs and USB keys, along with cloud services like OneDrive and Dropbox.

    * For movies and TV shows, there's Apple iTunes, Amazon Instant Video, Google Play, along with streaming services like Netflix, and of course physical disks.

    * Gaming depends greatly on the platforms you use. For consoles, if you have the means for it, I would encourage going with the two big ones, Xbox and PlayStation, since they both have a lot of exclusives. For PC gaming, Steam is the obvious one, but there's also Uplay, Origin, Gamefly and GOG.

    * For productivity, if you're on Windows then I think Microsoft has a decent offering with both the standalone Office suite and its web apps, while Apple has iWork and a cloud version of Pages, Numbers and Keynote in beta. Of course, Google also has Google Drive, which is compatible with both PCs and Macs.

    * Finances may not be something you've thought about, but many people rely on the various digital services their bank offers for things like keeping track of assets, budgeting and so on. You can also use cloud services like Mint or Personal Capital to link those accounts and do your investment planning from a central location.

    * Email, contacts and calendars also don't have to live in a silo. For example, many people have both a Gmail and Hotmail account, but you can link the two and sync them to your phone or computer, so you can access your appointments, to do lists and contacts from anywhere, online and offline, something that just a few years ago was only possible in corporate settings using things like Microsoft Exchange.

    There are of course tons more examples that can be given, but I think if one thing is certain, it's that the digital world will be converging more and more. But no service lasts forever, nothing is foolproof, and it's up to you to ensure you aren't left naked in the street when the lights turn off in a data center on the other side of the world.

