Patrick Lambert - Freelance Content Creator

Newest blog entries

  • Server monitoring with instant notifications
  • Jan 21, 2015 - Category: tech

    Logs are central to the work of any IT professional: they keep track of events and help find potential problems. But more often than not, those logs are useless without proper monitoring. A typical Windows system may produce thousands of Event Log entries every day, and on top of that you have logs from web servers, network devices, and so on. That's why complex log analysis solutions like Nagios and Kibana were invented: they can ingest millions of lines of logs, categorize them in various ways, and produce nice graphs. But sometimes you need something far simpler than that.

    That was the need I had: a way to monitor backups and file access on a remote web server. Since this was a remote system, I couldn't go with one of those big centralized logging systems, nor did I really need to. I just needed a daily notification telling me whether backups were being done correctly, and an alert whenever important files were modified. So I opted for a service called Pushbullet, which lets you send notifications to any device for free. They have clients for Windows, iPhone, Android and so on. More importantly, they have an API.

    All I had to do was create an API key and then write a bash script that uses curl to call the API and send a notification at the end of the task:

    curl -u APIKEY: -X POST https://api.pushbullet.com/v2/pushes --header 'Content-Type: application/json' --data-binary "{\"type\": \"note\", \"title\": \"Backup completed\", \"body\": \"New backup size is $(stat -c%s backup.tar) bytes\"}"

    They do have a Windows client, but it doesn't offer a command line interface yet, so I made my own so that events can be scripted on Windows hosts as well. Here's a bit of PowerShell that lists the files modified in the past day, then sends you a link to that log:

    # List files modified in the past day, publish the log under the web root, then push a link to it
    $time = (Get-Date).AddDays(-1)
    Get-ChildItem "C:\files\*.*" -Recurse | Where-Object -Property LastWriteTime -ge $time |
        Sort-Object LastWriteTime -Descending |
        Format-Table -Property FullName,LastWriteTime -AutoSize > C:\inetpub\www\last.log
    pushbullet -title "Last log" -link "http://server.name/last.log" -apikey APIKEY

    You can also configure a trigger to have these notifications go out whenever a file is committed to a document management system, an Event Log entry is made, or CPU utilization goes too high:

    # Grab the current total CPU utilization and push an alert if it crosses 80%
    $a = (Get-Counter "\Processor(_Total)\% Processor Time" | Select-Object -ExpandProperty CounterSamples).CookedValue
    if ($a -ge 80) { c:\pushbullet -apikey APIKEY -title "CPU trigger reached" -message "CPU utilization is at: $a" }
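
    The same pattern works for Event Log triggers. As a rough sketch (the log name, level, time window and the wrapper's parameters are just placeholder choices, not a prescription), something like this could run as a scheduled task and push a summary of recent errors:

    # Look for Application log errors from the past hour and push a summary of the most recent one
    $recent = Get-WinEvent -FilterHashtable @{ LogName = 'Application'; Level = 2; StartTime = (Get-Date).AddHours(-1) } -ErrorAction SilentlyContinue
    if ($recent) {
        $latest = $recent | Select-Object -First 1
        c:\pushbullet -apikey APIKEY -title "Event Log trigger" -message "$($recent.Count) error(s), latest: $($latest.Message)"
    }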

    This was all pretty simple to set up, free, and it gives me real-time updates on the things I need to keep track of.


  • Desktop upgrade
  • Jan 17, 2015 - Category: misc

    It's time for my January desktop upgrade, but since my current system is only a year old, I'm only changing a few pieces this time. I ordered an SSD to replace my primary drive, as I/O speed is always a big bottleneck. I used to have a Kingston SSD a few years back, before it died; this time I'm going with a 250GB Samsung EVO drive.

    The other part I changed was the graphics card. When I built this system a year ago I kept the card from a previous build, so it was getting a bit old. I replaced it with an Asus GTX 760, which runs faster and much cooler.



    This should keep me content for another year.


  • Why IT tasks can take so long
  • Jan 14, 2015 - Category: tech



    This week I faced a number of issues that illustrate well why computer-related tasks often take so much longer than many non-technical folks expect. There are cases, like this one, where the true time frame of a task cannot be determined up front, or even estimated.

    I've installed hMailServer on many systems already. I like the software a lot: it's a lightweight, easy to use and easy to configure email system, yet it has many of the features people use products like Exchange for, without all the overhead. Every VM I'd installed it on in the past happened to be running Windows Server 2008 or 2008 R2, and a full installation typically takes around 5 minutes. But this week I tried installing it on a Windows Server 2012 R2 VM, and that took over 2 hours.

    First, I came upon an actual bug in Windows Server 2012. hMailServer requires .NET 3.5, which is not installed by default on Server 2012. But when you go to enable the feature, it fails because of a buggy security update that is supposed to be uninstalled before the feature can be enabled. It turns out that workaround doesn't work for everyone, and it didn't work for me. I then stumbled upon a Microsoft blog post from two years ago describing another way to enable the feature from the command line, and that finally fixed it.
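
    The screenshot from the original post isn't reproduced here, but the commonly documented command-line workaround points DISM at the installation media as an alternate feature source. As a sketch, assuming the Server 2012 R2 install media is mounted as D:, it looks like this:

    # Enable .NET 3.5 using the install media as the payload source (the D: drive letter is an assumption)
    dism /online /enable-feature /featurename:NetFX3 /all /source:D:\sources\sxs /limitaccess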

    Then the built-in compact database would not work. I'm not sure why exactly, but I didn't spend a lot of time trying to fix it; I just went ahead and installed MS SQL Express, which brought me to yet another issue. When configuring the database connection, no matter what I tried, I could not make it connect to localhost. I eventually tried using the machine's hostname instead, and that worked. Both localhost and the hostname worked fine in SQL Management Studio, so I'm not quite sure where the fault lies on this one, but that was another half hour lost.

    In the end, the software was installed successfully, but what is usually a 5-minute job took hours because of various weird and unforeseen issues. So next time your IT professional takes longer than expected to do something, remember that it may not be anybody's fault.


  • It's all about the software
  • Jan 5, 2015 - Category: tech



    Big Data has been a popular buzzword for the last couple of years, and it will remain an important concept. With more data, you get more information, and you can potentially make better decisions about a number of things. So far we've mostly been concerned with gathering and storing this information, and we've seen advances in databases like Google BigQuery and the NoSQL engines. But to be truly useful, data has to be interpreted, and I don't mean processing it to create nifty-looking graphs. We've barely scratched the surface of what can be done with the proper interpretation of data. To make sense of so much information, humans can't be the ones doing the interpretation; software will have to do the work.

    Let's look at a simple example: a dataset of weather information such as temperatures, rainfall, cloud cover, and so on. By itself, various results can be extrapolated from it, such as weather predictions. But taken in context, that same data can become far more useful. What if a software process could also take in population data, geographic maps and movement tracking, and predict which people are about to be trapped in a flood, hurricane, or other extreme weather event? This requires far more data inputs and processing power than a single person can handle, but the proper software program can.

    This type of automation is inevitable because in many cases humans can't make all the decisions, whether it's monitoring hazardous materials in places unsuitable for people, or making adjustments to a space probe millions of miles away from Earth. Intelligent software processes will need to become the deciders, and for that to happen they will need to gather data, process it, contextualize it, draw conclusions, then act. By its very nature, a software process with enough smarts, given the proper data, can reach much more precise results in far less time than a human being.

    Many people fear the singularity, the idea that machines will turn against humans once they become self-aware. But the singularity will not be a single machine. It's not going to be Terminator-style, with killer robots turning on us; it's far more likely to be like the Geth in Mass Effect. The Geth are 100% software, with hardware platforms used like disposable vehicles they can move between freely: millions of processes networked and constantly thinking, analyzing problems and coming up with solutions. I think the coming years will be all about the software and making it as intelligent as possible, and I can't wait to see where it leads us.


  • 2014 end of year summary
  • Dec 31, 2014 - Category: lifestyle



    Another year behind us, and as usual it's time to reflect. 2014 was a mixed year. I moved up to a much higher pay grade, managed to write a lot more code and acquire new skills, so that's good. However, I've been sick more often than before, almost as if I'm getting old...

    Looking ahead to 2015, I expect it to be mostly a continuation of all that. There are things I'd like to do, however, like earn a few more certifications. I haven't done any certification in several years now, and while I do learn all the time, I think it's important to stay up to date, especially in the IT field. Maybe something different this time; I'm thinking Microsoft SQL Server.

    On the hobby side, the year was pretty good for PC gaming, with titles like Dragon Age: Inquisition turning out quite decently, along with quite a few good movies too.

    Here's to one more year in the bag.


  • A complete and in-depth overview
  • Dec 26, 2014 - Category: tech

    Today I read a fascinating article that rang true in more ways than one about how complex IT systems have become at large organizations. Talking about opaque systems speaking to other opaque systems from different vendors, my favorite quote was: "Nobody has a complete and in-depth overview any longer."

    Having worked in IT for some time now, I can attest to how true this is. Any moderately complex organization gets set up and then grows organically, and before long you have systems relying on other systems to function, with no one fully aware of how it all works. I started a new job last year at a company that makes exactly this kind of complex IT software, and even though technology is our bread and butter, there have been many cases of people asking "Why do we do it this way?" only to be answered "Because that's how it works." I personally love to know how things really work; many of the utilities on my downloads page were written as learning experiences. I'm always slightly uncomfortable writing a line of code that doesn't seem right just because that's what everyone else does.

    Let me give you an example. We have a library at work that makes API calls from various systems to a target server. We often wrote code against it, each new employee learning from an older one how to use it, but no one actually knowing what went on behind the scenes. So one day I decided to find out, and a few days later, with the help of a network monitor and a bit of scripting, I had worked my way through the SOAP API that the library was actually masking. It turned more than one head that I managed to do this, even though it was trivial once I understood what was actually going on. But that's the problem: SOAP is an open standard, and writing code against it is easy and doesn't take much time. Digging into the opaque box to figure out what was going on was the hard part.
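
    To give a flavor of what going around such a library looks like, here is a minimal sketch of calling a SOAP endpoint directly from PowerShell. The endpoint, SOAPAction and request body are purely hypothetical placeholders, not our actual API:

    # Bare-bones SOAP envelope (hypothetical service and operation, for illustration only)
    $body = '<?xml version="1.0" encoding="utf-8"?><soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"><soap:Body><GetStatus xmlns="http://example.com/api" /></soap:Body></soap:Envelope>'

    # Post it like any other HTTP request; the response is plain XML you can inspect
    Invoke-WebRequest -Uri "https://example.com/service.asmx" -Method Post `
        -ContentType "text/xml; charset=utf-8" `
        -Headers @{ SOAPAction = "http://example.com/api/GetStatus" } `
        -Body $body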

    I understand the reasons for this opacity: business concerns, plus, as the IT world becomes ever more complex, it's unrealistic to expect every part of the machinery to understand every other part. But I feel an overview shouldn't be too much to ask. Catastrophes like the one barely averted in that article are only going to become more prevalent, with root cause analysis, which here is really a synonym for reverse engineering complex black boxes, remaining the hardest part of the equation.


  • Dragon Age: Inquisition review
  • Dec 22, 2014 - Category: gaming



    Back in 2011 I was disappointed by Dragon Age 2, even though I loved the original, so of course I approached Dragon Age: Inquisition apprehensively. But I'm happy to say DA:I is a big improvement over DA:2. First, this has to be the most visually appealing game I've seen so far. The environments are spectacular, which is ironic since they were one of the lowest points of the previous release. The regions are surprisingly vast and diverse: the game has deserts, forests, mountains and plains. Everything about the environment is well done.

    The story is, I would say, on par with Dragon Age 2: good, but not at the level of Origins. You get a ton of dialogue with your companions, and many interesting group events like the card game. Romance is also well implemented, as we have come to expect from Bioware games. As for the quests out in the world, many of them are interesting enough to keep you going. Speaking of going, this is also the longest Dragon Age game yet; the vastness of the zones and the sheer amount of things to do kept me playing for over a hundred hours.

    There are some negative points, however. The combat is very much inspired by DA:2 rather than DA:Origins, which is a bit of a shame. The game also has bugs: several times I encountered a missing NPC, an unreachable map point, or a quest that refused to complete. Fortunately, most of these can be fixed by logging out and back in. Overall, it's still a very positive experience. I would place it above DA:2 but still below DA:Origins on my chart.


  • HP Envy x2 review
  • Dec 11, 2014 - Category: tech



    A very large tablet-type device recently caught my eye in a local Future Shop: the HP Envy x2. It's a so-called detachable, basically a giant tablet with an attachable keyboard cover. It has a 15.6" screen and runs Windows 8.1. After a few weeks of checking it out, I decided to buy it.

    The first thing to notice is the CPU, an Intel M-5Y70, along with 8GB of RAM. It's a very fast machine, snappy to use and with hardly any lag. Most Windows tablets have abysmal performance, and even many laptops will lag at simple tasks like scrolling through modern apps; this one feels very much like a PC. The full HD screen is also very nice, with wide viewing angles and bright colors, typical of a high-end tablet. The 500GB hard drive is also a rare thing to find in such a thin device.

    On the topic of Windows 8, I've never been a fan; I still run Windows 7. I gave the Start screen a try, but even on a fully touch device (I use it almost exclusively without the keyboard) I still ended up installing Start8 and ModernMix. I actually use a lot of modern apps, since they work better with a touch screen, but I much prefer always having the taskbar at the bottom to switch between apps rather than the clumsy Start screen interface.

    On the downside, the volume key on the side is very clunky, and I have to resort to the on-screen volume display for any precise control. The device also came with the usual load of crapware, but I got rid of most of it. And of course it only has the Intel on-board video chip. I certainly would not use this as a full desktop replacement personally, but for my uses, which are watching videos, browsing Reddit, Twitter, a few web sites and email, all through modern apps, it's a great device.


  • Configuring a web app under IIS
  • Dec 7, 2014 - Category: tech

    This weekend I had to figure out how to properly install a CGI script on Windows IIS and give it the permissions it needs to run. Surprisingly, there is little information about this out there, especially if you run something other than ASP.NET.

    By default, any CGI runs under a very restricted user called IUSR. This user can't write to files, access the Registry, or do much of anything locally other than write to the temporary folder, which makes it impossible to store configuration information of any kind. For that you need to run the script under a local user account.

    Here is how I solved the issue using the command line, so the whole thing can be scripted. First, create the account and add it to a group with the proper privileges:

    net user %USER% %PASSWORD% /add /y
    net localgroup "Administrators" %USER% /add /y


    Obviously, replace %USER% and %PASSWORD% accordingly. Then, configure four items in IIS:

    %windir%\system32\inetsrv\appcmd set site "Default Web Site" -virtualDirectoryDefaults.userName:%USER% -virtualDirectoryDefaults.password:%PASSWORD%
    %windir%\system32\inetsrv\appcmd add vdir /app.name:"Default Web Site/" /path:/%FOLDER% /physicalPath:"%CD%"
    %windir%\system32\inetsrv\appcmd set config -section:isapiCgiRestriction /+[path='%CD%\app.exe',allowed='true']
    %windir%\system32\inetsrv\appcmd set config /section:handlers /accessPolicy:Execute,Read,Script


    The first line configures all upcoming virtual folders to use the new user we just created. Then, we create a new virtual folder %FOLDER% which maps to the path %CD%, in this case the current directory. This means that if you run this under C:\my-app\www and %FOLDER% is set to my-app, then accessing http://localhost/my-app will load the files in that local folder.

    Next, we add an entry under ISAPI and CGI Restrictions in IIS to allow the execution of app.exe. Once again, the name of your own app goes here. Without this line, IIS will not allow your CGI to run unless you turn on the option to run unspecified scripts.

    Finally, we grant the proper access policies to all handlers. This is needed because, in this example, we use an exe file, and by default the CGI-exe handler is not enabled. Once again, these lines can be adapted to any type of CGI script, from binaries to PHP, Perl, and so on.

    Remember also that IIS has to be restarted or it will not see the newly created user:

    iisreset


    Once this is all done, assuming IIS is installed with the CGI and ISAPI Extensions features, your script should run fine.
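
    To sanity-check the setup, something like the following PowerShell can confirm the virtual directory is registered and that the CGI answers over HTTP. The folder and executable names here are just the placeholders used above:

    # List registered virtual directories, then request the CGI directly
    & "$env:windir\system32\inetsrv\appcmd" list vdir
    Invoke-WebRequest -Uri "http://localhost/my-app/app.exe" -UseBasicParsing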


  • My Draenor review
  • Nov 20, 2014 - Category: gaming



    It's been a week now since World of Warcraft, a ten-year-old game, released its latest expansion: Warlords of Draenor. This brought the game back up to over 10 million players, and of course I was one of them. I would say that most of the hype and comments out there are pretty accurate: Draenor is, so far, a pretty damn good expansion. It's certainly better than MoP already, and might turn out better than Cataclysm and Burning Crusade as well. Leveling from 90 to 100 was more fun than in the previous few expansions, with more varied quests and more cinematics than ever before. I was apprehensive about the concept of garrisons before release, but I think they turned out well, kind of like a personal daily quest hub. I do hope they expand on the concept in the future by adding more customization options. I also find the dungeons more interesting and varied than the Pandaren ones. Gearing up isn't much of a chore at all.

    Of course, it wasn't all good. The first two days of the expansion were plagued with stability issues; in fact, from a performance standpoint, this was the worst launch since vanilla. I understand that an incredible number of people came back to play, but they should have been better prepared. As for the world PvP zone, Ashran, it's too early to tell, but I do worry about its longevity, with every match so far being either a complete stomp by one side or a standstill in the middle of the map. Located out of the way as it is, I wish they had just made it a battleground and left the faction cities in their originally planned locations.

    Overall I would give Draenor an 8/10.



    Categories: art | business | entertainment | gaming | lifestyle | misc | science | security | tech
    View full archive