Every now and then you need to check whether your servers or client computers have pending updates. You can generate a simple list of these with PowerShell, and I have created a script for this on my GitHub named ListPendingWindowsUpdates.ps1. Here is a quick breakdown of the script; feel free to use and modify it any way you like, and please comment below with what you ended up doing with it.
In PowerShell the functions have to be declared at the top of the script, but I will start with the locally executed code and dig into the script's only function below. The script has one locally executed part and a function that is executed on each and every server/client it lists. For this to work you need to run the script with domain admin rights, both to query Active Directory and to remotely execute the code on each server/client.
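The core of the approach can be sketched like this. This is a simplified, untested outline, not the script itself — the AD filter and output formatting are placeholders, and the real thing is in ListPendingWindowsUpdates.ps1 on GitHub:

```powershell
# Runs on each remote machine: ask the local Windows Update agent
# for updates that are not yet installed.
function Get-PendingUpdates {
    $session  = New-Object -ComObject Microsoft.Update.Session
    $searcher = $session.CreateUpdateSearcher()
    foreach ($update in $searcher.Search('IsInstalled=0').Updates) {
        [PSCustomObject]@{
            Computer = $env:COMPUTERNAME
            Update   = $update.Title
        }
    }
}

# Locally executed part: list the computers from Active Directory,
# then run the function on each one. Requires domain admin rights
# and PowerShell remoting enabled on the targets.
Import-Module ActiveDirectory
$computers = Get-ADComputer -Filter * | Select-Object -ExpandProperty Name
Invoke-Command -ComputerName $computers -ScriptBlock ${function:Get-PendingUpdates} |
    Format-Table Computer, Update -AutoSize
```

In a real environment you would narrow the `-Filter` (or use `-SearchBase` with an OU) rather than hitting every computer object in the domain.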
Working with folder and share security is too often treated as set-and-forget. A good practice is to run daily jobs to check, report on and reset permissions on shared folders and home directories. There are several ways to do this, but it can easily be done from PowerShell. This is also useful when migrating between servers and access needs to be added or removed. Here are a few useful code snippets for working with folder access and shares in PowerShell.
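To give a flavour of what those snippets look like, here are a few hedged examples. The paths and group names are made up for illustration; adjust them to your environment:

```powershell
# Report the current permissions on a home directory.
Get-Acl -Path 'D:\Shares\HomeDirs\jdoe' | Select-Object -ExpandProperty Access

# Grant a group Modify rights, inherited by subfolders and files.
$acl      = Get-Acl -Path 'D:\Shares\Projects'
$ruleArgs = 'CONTOSO\ProjectUsers',              # identity (example group)
            'Modify',                            # rights
            'ContainerInherit,ObjectInherit',    # inherit to folders and files
            'None',                              # propagation
            'Allow'
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule -ArgumentList $ruleArgs
$acl.AddAccessRule($rule)
Set-Acl -Path 'D:\Shares\Projects' -AclObject $acl

# List the SMB shares and who has access to them (Windows Server 2012 and later).
Get-SmbShare | Get-SmbShareAccess
```

The `Get-Acl`/`Set-Acl` pair is what makes the daily check-and-reset job possible: capture a known-good ACL once, then reapply it on schedule.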
New Relic is a very good tool for monitoring your servers and applications, with a bunch of metrics and features. If you landed on this page you probably already use it, so I won't go into any more detail on it. There are things I love about it and things I hate about it, the way it is with most tools you come across in your day-to-day work.
For monitoring we use the Health Map filtered to Hosts and related applications, which gives us a great overview of the overall condition of the servers and the web applications running on them. Currently there is no customization of the sorting or the layout, and no kiosk mode for a proper wallboard. When building a good wallboard for your support team or NOC you want to add additional information while being conservative with the screen real estate: everything should fit on one big screen so you get all the information you need in one glance. This is where New Relic doesn't deliver as well as it does in other areas.
After using Jira Service Desk for a while we ran into a problem with the automatic re-opening of tickets when a customer replied to a closed ticket. The automatic re-open worked fine, but any attachments were just thrown away. Usually customers respond with screenshots, logs or similar information needed to further investigate the ticket.
At the end of the comment that triggered the re-open and contained an attachment, the following error message was added: "Failed to add the following attachment to this issue because file attachments are disabled for the system."
I did a lot of research online without finding any solution; a lot of people had the problem, but no one suggested a fix. After some testing I realized that when an e-mail gets ingested by Jira Service Desk, the contents of the e-mail are added as a comment and any attachments to the e-mail are attached to the case. After this occurs the ticket is automatically re-opened due to the customer comment. A closed or resolved ticket cannot get new attachments, and since Jira Service Desk tries to attach the file before the ticket is re-opened, it fails.
The simple solution is to add a property on the Closed workflow step of the Jira Service Desk project. Edit your workflow, click the Closed step and open up Properties; there you can add a new property key jira.issue.editable with the value true. This allows closed issues to be edited, which makes sure the attachments are added during the re-open process. The only drawback is that the Edit button becomes visible on closed cases.
Setting up the Microsoft Internet Information Services (IIS) SMTP service is pretty straightforward for simple implementations. It hasn't really kept up with the times, and I'm pretty sure not too many people use it anymore. Working with an older implementation in a system that ran a distributed SMTP service on each and every IIS server, I realized we needed to centralize it so we could secure it properly. This included reconfiguring an old IIS SMTP server and then adding a bunch of aliases to make sure the server accepted all the incoming e-mail.
You can run the Unifi Controller from your computer to configure and monitor your Ubiquiti access points, but a Cloud Key is much nicer. The Unifi Cloud Key is basically just an ARM computer running off an SD-card. Sound familiar? So what's the difference between that and a Raspberry Pi? Not much besides memory and price: it costs roughly three times as much, and the extra memory is not necessary for a small office or home installation. The Unifi Controller doesn't only take care of your access points but also of firewalls and switches if you use Unifi gear. In my case I have a Ubiquiti EdgeRouter X as a firewall, and that doesn't play with the Unifi Controller. On the other hand it has a very nice UI as is and has 5 separate ports for different LANs, while the entry-level Unifi firewall has 3, one of which is WAN and one of which is for VoIP. In this article I describe how to set up the Unifi Controller on a Raspberry Pi, provision the AP and then keep the Unifi Controller in a different subnet from the WLAN. I also show how to set up a guest wifi on a separate subnet.
I love my Raspberry Pi projects and I run a lot of specialist "mini" servers at home, doing everything from torrent sharing of Linux distros to media streaming and playback. But all Raspberry Pis and other single-board computers that rely on SD-cards sooner or later come to a point where they trash the card and don't boot again.
Every time that happens I can't remember exactly what was running, and how, on that particular Raspberry Pi. I want backups: not just the backup I usually take right after installation, but a last-night backup or similar. So I put up an NFS share on my NAS to store the backups; a USB stick connected directly to the Raspberry Pi will work just as well. Here is a step-by-step guide to how I automated the backups on all my Raspberry Pis. The script creates a complete image of the SD-card while the Raspberry Pi is running. You can write that image to a new SD-card, pop it into the Pi, and it will be like nothing happened!
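The heart of such a script is a single dd run against the SD-card device. Here is a minimal sketch of the idea; the device path and mount point are assumptions (on a Pi the card is typically /dev/mmcblk0), and the full script adds more safety checks:

```shell
#!/bin/bash
# Minimal sketch of a nightly SD-card image backup with a 7-image retention.

backup_image() {
    local src="$1"    # raw device to image, e.g. /dev/mmcblk0 on a Pi
    local dest="$2"   # NFS share or USB stick mount point
    local img="$dest/$(hostname)-$(date +%Y-%m-%d).img"

    # Copy the raw device while the system is running; conv=fsync makes
    # sure the image is flushed to the share before we report success.
    dd if="$src" of="$img" bs=4M conv=fsync status=none || return 1

    # Keep only the last 7 images for this host.
    ls -1t "$dest/$(hostname)"-*.img 2>/dev/null | tail -n +8 | xargs -r rm -f

    echo "$img"
}

# On the Pi this would run from cron as root:
# backup_image /dev/mmcblk0 /mnt/nas-backups
```

A root crontab entry along the lines of `0 3 * * * /usr/local/bin/pibackup.sh` then gives you the nightly image. Note that imaging a live system can catch files mid-write, so quiescing busy services first is a good idea.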
A common need is to restore the latest production backup to a test or user acceptance test system on a regular basis. Depending on your system (database) size this can be time-consuming, so you would prefer to have it done during the night, right after the backups run. If you don't have a third-party backup solution where this feature is built in, it can be a bit tricky. The reason why is that the automated backups in a Microsoft SQL Server maintenance plan have somewhat unpredictable file names.
The solution is a simple T-SQL script that you can put in a maintenance plan, to run every night or on whatever schedule you please. The script grabs the latest backup and performs the restore.
The script can be downloaded from GitHub – T-SQL Automatic restore of latest backup
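The approach boils down to asking msdb for the newest full backup file and feeding it to RESTORE. A simplified sketch — the database and logical file names here are made-up examples, and the full script on GitHub handles more cases:

```sql
DECLARE @BackupFile nvarchar(260);

-- Find the physical file of the most recent full backup of the
-- production database; this is what sidesteps the unpredictable names.
SELECT TOP (1) @BackupFile = mf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS mf
  ON bs.media_set_id = mf.media_set_id
WHERE bs.database_name = N'Prod'
  AND bs.type = 'D'                      -- 'D' = full database backup
ORDER BY bs.backup_finish_date DESC;

-- Restore it over the test database, relocating the data and log files.
RESTORE DATABASE ProdTest
FROM DISK = @BackupFile
WITH REPLACE,
     MOVE N'Prod'     TO N'D:\Data\ProdTest.mdf',
     MOVE N'Prod_log' TO N'D:\Logs\ProdTest.ldf';
```

The logical file names for the MOVE clauses can be read from `RESTORE FILELISTONLY FROM DISK = @BackupFile` if you don't know them up front.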
I finally ended up transferring everything from my old gmail.com based Google account over to my custom-domain Google account. I had been running them in parallel for ages, switching between them to access different services. I read a number of blog posts and forum discussions on how to do this, and also took a look at Google's own documentation.
According to Google you can merge two Google accounts, but only if both are in the same organisation. Since my "new" account was the only one actually in an organisation, this wasn't an option. I couldn't figure out, from the documentation alone, whether merging two gmail.com based Google accounts is possible or if it has to be two proper organisation/custom-domain accounts.
I started out with Google Drive by sharing all my files from one account to the other, so that I could just copy the files on the new account. The issue I ran into was that all the filenames then started with "Copy of". I didn't want that, so I started looking at other options.
By just selecting all the files and choosing download, I received a zip file with everything. Most of my files had been created on Google Drive with the built-in apps, and they were converted to Excel and Word files respectively when downloaded. I then downloaded the Google Drive app for Windows, made sure the "convert uploaded files to Google…" option was turned on, and just dropped all the files in there.
Google Photos was a bit trickier than Google Drive. I found a nice option of sharing my entire library between the accounts, and could set the new account to automatically save the shared photos to its own library. It did however take ages, so I quit that and just used Google Takeout to download two huge zip files with all my photos. I then configured the Google Drive app to back up the folder with the unzipped content and add photos and videos directly to Google Photos.
Yes, I lost all my albums, but with the improvements to the Google Photos Assistant I now get good daily suggestions for creating albums matching my trips and events.
StartSSL certificates aren't trusted by several major browsers anymore, and StartSSL will probably lose all credibility and disappear from the market completely. In its place we have seen Let's Encrypt's growth explode over the last 18 months. This post covers some background and how to set up Let's Encrypt on your Amazon EC2 Apache-based server.