Patrick Lambert - Freelance Content Creator Follow me on...

Newest blog entries

  • My Draenor review
  • Thu, Nov 20 2014 5:28:03 PST - [Permalink] - Category: gaming - by: Patrick Lambert



    It's been a week now since World of Warcraft, a ten-year-old game, released its latest expansion: Warlords of Draenor. This brought the game back up to over 10 million players, and of course I was one of them. I would say that most of the hype and comments out there are pretty accurate. Draenor is, so far, a pretty damn good expansion. It's certainly better than MoP already, and might turn out better than Cataclysm and Burning Crusade as well. Leveling from 90 to 100 was more fun than in the previous few expansions, with more varied quests and more cinematics than ever before. I was apprehensive about the concept of garrisons before release, but I think they turned out well, acting as a kind of personal daily quest hub. I do hope they will expand on the concept in the future by adding more customization options. I find the dungeons more interesting and varied than the Pandaren ones as well, and gearing up isn't much of a chore at all.

    Of course, it wasn't all good. The first two days of the expansion were plagued with stability issues; from a performance standpoint, this was the worst release since vanilla. I understand that an incredible number of people came back to play it, but they should have been better prepared. As for the world PvP zone, Ashran, it's too early to tell, but I do worry about its longevity: every game so far has been either a complete stomp by one side over the other, or a standstill in the middle of the map. Located out of the way as it is, I wish they had just made it a battleground, and left the faction cities in their originally planned locations.

    Overall, I would give Draenor an 8/10.


  • Kijiji scammers
  • Sat, Nov 8 2014 11:41:34 PST - [Permalink] - Category: security - by: Patrick Lambert

    This is the second weekend I've tried to sell my iPhone 5 for $280 on Kijiji, and it's unbelievable what a terrible experience it is. I can assure you, scammers are alive and kicking: throughout both weekends, I've had nothing but scam attempts. Tons. Some I didn't even bother replying to, once I noticed that all of the texts I was receiving came from numbers outside the country, never mind the fact that this is supposed to be a local classifieds board. Those I did contact all replied in the exact same way:








  • The holy grail of cloud privacy is possible
  • Wed, Nov 5 2014 4:21:06 PST - [Permalink] - Category: security - by: Patrick Lambert

    When talking about cloud services, privacy is always a touchy subject. The usual saying is that for a cloud service to be useful, the cloud provider needs to hold the keys to unlock user data. That's the only way to provide indexing, sharing, sorting and searching features. If users want perfect privacy and encrypt all of their data before sending it to the cloud, then not much can be done with a single blob of ciphertext.

    But we've progressed beyond that point, and this argument should no longer be part of the equation. One solution is CryptDB, now several years old and a tested approach, which provides a database system that can be used to do queries on encrypted data. The idea is that rows and columns can still be defined, but the actual text in those cells can be encrypted in a smart way. For example, let's say you store notes in the cloud. Those notes can have tags attached to them. Even if you encrypt each tag with a key that only you possess, so that your cloud provider (or hackers, or spying agencies) cannot decrypt your data, you can still do comparisons between each tag. The server can know whether your encrypted search query matches the encrypted tag, without knowing what the actual text says.
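    At its core, the tag-matching trick boils down to a deterministic, keyed transformation: equal plaintexts map to equal tokens, so the server can test equality without ever holding the key. Here's a minimal sketch of that idea in Python, using an HMAC as the deterministic blinding step. The key and tag names are made up for illustration, and a real system like CryptDB layers several encryption schemes on top of this rather than using a bare HMAC:

```python
import hashlib
import hmac

# Hypothetical client-side secret; it never leaves the client.
SECRET_KEY = b"client-only-key"

def blind_tag(tag: str) -> str:
    """Deterministically blind a tag: equal tags yield equal tokens."""
    return hmac.new(SECRET_KEY, tag.lower().encode(), hashlib.sha256).hexdigest()

# The client uploads only blinded tags; the server stores them
# without ever seeing the plaintext or the key.
stored_tokens = {blind_tag(t) for t in ["work", "recipes", "taxes"]}

def server_match(query_token: str) -> bool:
    """The server answers an equality query on blinded tokens alone."""
    return query_token in stored_tokens

print(server_match(blind_tag("taxes")))     # True
print(server_match(blind_tag("vacation")))  # False
```

    The tradeoff is that deterministic schemes leak equality patterns: the server can see which items share a tag, even without knowing what the tag says. That's part of why systems built for this adjust the encryption layer to the kind of query being run.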

    And that is the key to reaching the holy grail of cloud privacy. You don't need to be a privacy nut to demand this level of security. Google is among the few cloud companies really stepping up encryption, but as long as the keys remain in the hands of providers, there will always be risks. I think it's past time that client software, not cloud servers, became the keeper of encryption keys. Most of the benefits of using the cloud can still be achieved with encrypted query processing.


  • Poor user experience
  • Mon, Nov 3 2014 5:57:36 PST - [Permalink] - Category: tech - by: Patrick Lambert

    Last night before going to sleep I decided to look up a Kindle book on my phone. Here's how the process went:

    * Load amazon.ca and start searching, eventually finding the book I want.
    * Click on Sample; the site asks me to log in, so I type my email and long password on the iPhone keyboard just to see a sample.
    * Get told that my account isn't allowed on amazon.ca, and that I need to go to amazon.com instead.
    * Get redirected to the main page instead of the specific book I wanted, so I have to do the search all over again.
    * Once I find the book once more, I click on Sample, and the site asks me to log in once again.
    * Sigh, and turn off the iPhone.

    I understand that Amazon has to restrict accounts; that's something publishers force upon them. But the link from one site to the other should at the very least send you to the right page, so you don't have to conduct your search all over again. And the site clearly knows that my account exists, so it should pass the credentials along, so I don't have to log in twice in a row. Overall, this user experience should be better.


  • Civilization Beyond Earth review
  • Mon, Oct 27 2014 17:04:36 PDT - [Permalink] - Category: gaming - by: Patrick Lambert



    I've always been a Civilization fan. My very first PC game was the original in the series, on four 1.44MB floppy disks. Since then, the sequels have mostly improved the formula, but the series has also become somewhat stale. With Civilization: Beyond Earth, it seems like the developers were attempting to inject some new blood into the series. Unfortunately, I don't think they achieved much.

    First, this version plays very much like Civilization V. Once you look past the alien landscape, it's still the same mechanics; even the units play the same. As for the added flavor, it's not all good. Being able to send satellites into orbit adds an additional layer to the game, but it's a thin one. The rest is mostly cosmetic. For some reason they added a lot more natives, aliens in this case instead of the barbarians of previous games, but they are mostly annoying rather than adding anything worthwhile. They also scaled back how much diplomacy you can do.

    I guess between this version and, say, Civilization 3 or 4, I'd rather play one of the originals. In fact, I find myself playing more SimCity than this right now.


  • Why Perl is still my main language
  • Tue, Oct 21 2014 14:51:26 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    There are many programming and scripting languages out there, and as time goes on, new ones get created and take flight. Right now, some of the popular newcomers include Node.js, Go and Swift. I've said before that I don't consider myself a programmer, but I still write quite a bit of code as part of my work, and also for fun. I like to try out new languages, and in the past year I've probably written projects in half a dozen different languages. It allows me to do cool stuff like the Rosetta Stone of Coding, for example.

    But among all of these different languages, I still come back to Perl as my main choice, for two main reasons. The first is the giant catalog of modules available out there. One issue with new languages is that while they have the advantage of starting fresh and reinventing concepts in new, imaginative ways, they don't have the array of libraries that older languages have. Perl is said to have a module for everything you could possibly want, and if you browse CPAN, you will see what that means. Whether you want to access the Windows Registry, do low-level network coding, authenticate against Twitter or run complex math functions, there's a module for that.

    The other reason is portability. A lot of new projects have installation instructions that say something like "Use bower to install". This certainly makes things fast and easy if you already have a development platform set up, but if you don't, you still need to go through the non-intuitive steps of getting the multiple layers of dependencies installed. A lot of projects, especially in the open source world, also assume you're on Linux; things start to break pretty quickly on Windows, OS X or other platforms. Perl is not only native to pretty much every platform out there, it comes pre-installed on most of them. Plus, there are neat utilities like the Perl Dev Kit, which I use to create cross-platform binaries, allowing me to release a single executable with no dependencies. Sure, it's not the most efficient way to do things, but it's often the most convenient, like when you want to use a script on a system with no development libraries installed, or share it with someone who wouldn't know what a command line looks like.

    This isn't to say that I don't enjoy writing in other languages. Lately I've been playing with Node.js, and for certain things it's very neat. But even though Perl doesn't seem to have that cool factor anymore, and isn't discussed as much on Internet forums, I haven't found a language that can replace it yet. Oh, and as an aside, my least favorite language of all: PowerShell. A thing of nightmares...


  • Is Docker the virtualization of the future?
  • Sun, Oct 19 2014 3:48:22 PDT - [Permalink] - Category: tech - by: Patrick Lambert

    Virtual machines have become the norm in the IT industry. Most servers out there have been virtualized in one way or another, whether in the public cloud on Microsoft Azure or AWS, or on-premise, where a single server will often run a dozen VMs for various purposes. But VMs have one inherent flaw: overhead. Each VM is a full operating system, with its own kernel and services, running alongside whatever processes it actually exists to host. In many cases this is fine, but as VMs have become more and more ubiquitous, the model is sometimes pushed to the extreme. It's not rare to see a VM running a single app. Worse, if those apps need things like SQL Server, IIS, or other common roles, each VM will be configured with those roles turned on. You end up with many VMs replicating the same roles when it isn't always necessary, which can make things like management and backup quite complex.

    There's been a lot of development aimed at minimizing this overhead, like hypervisors managing memory in a way that allows a single page in RAM to be shared by many VMs when they all need the exact same data, but there are limits to this type of mitigation. That's where Docker comes in. Docker, and other similar systems, takes the concept of a VM and scales it down to the process level. Instead of spinning up a whole operating system, with all of its overhead, in order to run an application, we now have applications that live in their own containers. RightScale has a good article describing the differences between VMs and Docker. In a nutshell, you take an application, its libraries, files and other dependencies, and put them all inside a container. The result is something that takes less time to start and fewer resources, because all of your contained apps share a single OS, yet they can still be distributed easily.
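    To make the "app plus its dependencies in one container" idea concrete, here is what packaging a small web app might look like. This is a hypothetical sketch (the app, files and port are made up, not from any project mentioned here); the image carries the application and its libraries, while the kernel is shared with every other container on the host:

```dockerfile
# Hypothetical container for a small Node.js app.
FROM node:0.10            # base image that already includes the runtime
WORKDIR /app
COPY package.json .
RUN npm install           # bake the app's dependencies into the image
COPY . .
EXPOSE 3000               # the port the app listens on
CMD ["node", "server.js"]
```

    Building and running it would then be a matter of `docker build -t myapp .` followed by `docker run -d -p 3000:3000 myapp`: the container starts in a fraction of the time it takes to boot a full VM, because there is no second operating system to bring up.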

    Last week Microsoft announced that Docker support was coming to Windows Server. I think this will help propel the technology, and while the past 10 years have been the years of VMs, I think in a few years we will see that start to shift. VMs are certainly not going away, but instead of having the impulse to start a new VM for every project, I suspect a lot of people will keep a single powerful system running, and spin up containers instead.


  • Six years of renders
  • Sat, Oct 11 2014 8:15:35 PDT - [Permalink] - Category: art - by: Patrick Lambert

    This week I was doing some work on my web site, and I decided to take a look at the database holding my gallery images to get some statistics on all of those renders I've done over the years. In total, this site holds 776 images, which isn't every render I've made, but most of them: the ones I deemed good enough to be shown. The first image dates from April 2008, over 6 years ago.

    The distribution of those images is as follows:

    Category     Images
    Fantasy         282
    Sci-fi          120
    Urban           164
    Star Wars       171
    Fractal           5
    Comics           34

    Another interesting data point is that even though many of my images contain nudity, those are far from the majority: only 256 images, or about 33%. Finally, the time spread was an interesting revelation as well:

    Year    Images
    2008         2
    2009       182
    2010       267
    2011       167
    2012        71
    2013        60
    2014        27
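    As a quick sanity check on the arithmetic, both breakdowns above add up to the same 776-image total, and the nudity share falls out of a simple division:

```python
# Counts copied from the two tables above.
categories = {"Fantasy": 282, "Sci-fi": 120, "Urban": 164,
              "Star Wars": 171, "Fractal": 5, "Comics": 34}
years = {2008: 2, 2009: 182, 2010: 267, 2011: 167,
         2012: 71, 2013: 60, 2014: 27}

total = sum(categories.values())
assert total == sum(years.values()) == 776  # both breakdowns cover the gallery

# Share of the gallery containing nudity (256 images).
print(round(256 / total * 100))  # → 33
```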

    Are my golden years of rendering behind me? I think it's likely. The main reason is probably inspiration: after so many images, I've rendered the scenes I wanted to render, and now it too often feels like making another one would simply be a variation on something I've done in the past. I also have far more hobbies than before, including coding, writing, and so on.

    If you'd like to know more about how I actually create these renders I invite you to check out some of the tutorials I've made, including 3D Concepts Tutorial, The Making of Agony and Vue Lighting Tutorial.


  • Now using GitHub
  • Sun, Oct 5 2014 8:29:52 PDT - [Permalink] - Category: misc - by: Patrick Lambert

    Even though I've been writing code for over 15 years, I've never considered myself a developer. I'm a hobbyist who writes things because I need them, to learn, or just for fun. Because of that, and because I've never worked on large enough projects, I haven't used SVN or other source management systems. Lately, however, it seems like I've been writing more and more actual apps, albeit small ones, and I decided it was time to publish my source code in a central location.

    I selected GitHub because it's pretty much the go-to place for the majority of online developers. I also like the way their interface works, including the Windows client, which makes it pretty easy to clone repositories and push changes. You can access my GitHub page, where I'm going to put most of the code I write as a hobby, usually under the MIT license. It's nothing groundbreaking, but if anyone wants to build upon it, feel free to.



    Categories: art | business | entertainment | gaming | lifestyle | misc | science | security | tech