Tuesday, 7 April 2020

NuGet Restore failing in Azure with Error parsing solution file

I recently came across a problem where builds were failing in Azure DevOps when performing a NuGet restore for the solution.

The error details were:

2020-04-07T08:05:03.8535680Z [command]C:\hostedtoolcache\windows\NuGet\4.1.0\x64\nuget.exe restore d:\a\1\s\MyProject\MyProject.sln -Verbosity Detailed -NonInteractive -ConfigFile d:\a\1\Nuget\tempNuGet_41515.config
2020-04-07T08:05:05.3883943Z NuGet Version:
2020-04-07T08:05:05.3886378Z MSBuild auto-detection: using msbuild version '' from 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\bin'. Use option -MSBuildVersion to force nuget to use a specific version of MSBuild.
2020-04-07T08:05:05.4539665Z System.AggregateException: One or more errors occurred. ---> NuGet.CommandLine.CommandLineException: Error parsing solution file at d:\a\1\s\MyProject\MyProject.sln: Exception has been thrown by the target of an invocation.
2020-04-07T08:05:05.4540531Z at NuGet.CommandLine.MsBuildUtility.GetAllProjectFileNamesWithMsBuild(String solutionFile, String msbuildPath)
2020-04-07T08:05:05.4541882Z at NuGet.CommandLine.RestoreCommand.ProcessSolutionFile(String solutionFileFullPath, PackageRestoreInputs restoreInputs)
2020-04-07T08:05:05.4542419Z at NuGet.CommandLine.RestoreCommand.d__37.MoveNext()
2020-04-07T08:05:05.4542827Z --- End of stack trace from previous location where exception was thrown ---
2020-04-07T08:05:05.4543213Z at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
2020-04-07T08:05:05.4543673Z at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
2020-04-07T08:05:05.4544134Z at NuGet.CommandLine.RestoreCommand.d__30.MoveNext()
2020-04-07T08:05:05.4544520Z --- End of inner exception stack trace ---
2020-04-07T08:05:05.4545738Z at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
2020-04-07T08:05:05.4546231Z at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
2020-04-07T08:05:05.4546606Z at NuGet.CommandLine.Command.Execute()
2020-04-07T08:05:05.4546965Z at NuGet.CommandLine.Program.MainCore(String workingDirectory, String[] args)

I then re-ran a build that had previously succeeded (against the same commit) and it failed with the same error, pointing me in the direction of the Azure hosted agent being the issue.
I was then able to confirm that the Azure agent had been updated to version 20200331.1 (this can be found in the Initialize Job step of the build).
Checking the GitHub repo for the build agent confirmed that Visual Studio 2019 had been updated in that version of the agent.

After some research I realised that the version of NuGet.exe it was using was quite old and that NuGet should ideally match the version of Visual Studio (and more importantly MSBuild) you are using:
  • 4.1 of NuGet.exe matches Visual Studio 2017 Update 1 (15.1)
  • 4.7 of NuGet.exe matches Visual Studio 2017 Update 7 (15.7)
  • 5.0 of NuGet.exe matches Visual Studio 2019 (16.0)
  • 5.4 of NuGet.exe matches Visual Studio 2019 (16.4)
So in my case, running NuGet.exe version 4.1 to restore a Visual Studio 2019 solution isn't a good idea.

To resolve this issue, add a new task to your build pipeline (NuGet Tool Installer) and set it to install a newer version of NuGet.  For a YAML pipeline add:
- task: NuGetToolInstaller@1
  inputs:
    versionSpec: '5.x'
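In context, the relevant steps of a pipeline might look like this (the NuGetCommand task and the solution glob are illustrative; adjust them to match your own pipeline):

```yaml
steps:
- task: NuGetToolInstaller@1
  inputs:
    versionSpec: '5.x'

- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
```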

Or for the GUI type:

This will then ensure that you are using the correct version of NuGet, which should stop that error at least!

Hope that helps!

Friday, 13 September 2019

Using the Pi-Hole with Windows

If you haven't heard of the Pi-Hole, it is a great tool.  It is a DNS server (actually it is more than just that) which can run on a Raspberry Pi and simply blocks adverts while you're browsing the web.  While some ad-blockers are browser add-ons, this takes a different approach: it stops the adverts from being loaded before they ever reach the browser.
Effectively, you set up the Pi-Hole on a Raspberry Pi and then update the DNS settings on your router so that it uses the Pi-Hole; every device connected to the network then stops seeing adverts, because each time one is requested the Pi-Hole handles (and blocks) the request.  It's great.

But what about your laptop?  Laptops are meant to be taken with you, so you'll see adverts when you're elsewhere.

Thankfully the Pi-Hole project also offers Docker images, meaning all you require is Docker for Windows installed on your laptop.

So what do you need to do?
  1. Install Docker for Windows.  I'm not going to detail all of the steps, but the Pi-Hole image requires a Linux container (which is handy given the size of Windows containers).  Downloading Docker for Windows requires you to create an account with (or log in to) Docker.
  2. Ensure that Windows containers are not the default; to set this up we need to embrace Linux containers.
  3. Download the Pi-Hole image, to do this open PowerShell and run:
    docker pull pihole/pihole
  4. This will take a couple of minutes (not long) to download the Linux container with the Pi-Hole installed.
  5. Create the following directories on your machine:
    • C:\pihole\
    These are locations that the Pi-Hole image will use to store files that persist (for when upgrading the container to a newer version of Pi-Hole).
  6. Run the following command to start the Pi-Hole image:
    docker run -d --name pihole -p 53:53/tcp -p 53:53/udp -p 80:80 -p 443:443 -v "c:/pihole/pihole/:/etc/pihole/" -v "c:/pihole/dnsmasq.d/:/etc/dnsmasq.d/" -e WEBPASSWORD=vRz0n36IWF --restart=unless-stopped pihole/pihole:latest
  7. I strongly suggest that you use a strong password, as the web interface to the Pi-Hole will require it to log in.  You can now browse to localhost in a browser and you should see a page showing that the Pi-Hole is running, although no requests are currently going to it (so it won't actually be blocking any adverts yet).
  8. Docker may ask you for an account to share files on your C drive (or wherever you placed them).
  9. Finally, you need to update the DNS settings for your connection so that adverts are blocked.  To do this:
  10. In File Explorer right click on Network and select Properties
  11. Click on your connection
  12. Select Properties in the dialog
  13. Then select TCP/IPv4 and then properties
  14. Then set the DNS server address to point at your own machine (as the Pi-Hole container is running on your laptop).
  15. Click Ok to dismiss the dialog boxes and you're done.
  16. To see the interface for the Pi-Hole, type localhost into a browser.  Click on Login and enter the password (in my example vRz0n36IWF, but please change it!).
  17. Adverts are now being blocked!
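As an alternative to clicking through the dialogs in steps 10–15, the DNS change can be made from an elevated PowerShell prompt.  This is a sketch: the interface name "Wi-Fi" is an assumption (check yours with Get-NetAdapter), and DNS points at the loopback address because the container runs on this machine:

```powershell
# Point this machine's DNS at the local Pi-Hole container
Set-DnsClientServerAddress -InterfaceAlias "Wi-Fi" -ServerAddresses 127.0.0.1

# To undo it later, revert to DHCP-assigned DNS
Set-DnsClientServerAddress -InterfaceAlias "Wi-Fi" -ResetServerAddresses
```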

Monday, 10 December 2018

Running Jenkins from a Docker image in Azure

I am (relatively) new to Docker and I want to know some more about Jenkins so I thought I'd use Docker to run the latest version of Jenkins.  This is a warts and all step through my progress.

I've heard that Docker containers can run in Azure without a virtual machine but I wanted to understand how it all works so I decided to create a Windows 10 virtual machine and install Docker on that.

I created the Azure VM (using the UI), using the Windows 10 N (x64) image.

Once the machine had been created I then installed the Desktop edition of Docker, which can be found here:

Note:  You need to be logged in to be able to download.

Once it had downloaded (550MB) I ran the installation:

I went with the default options and clicked Ok to let it unpack the files:

After the installation had completed it wanted to log me out; I went for a reboot:

Once the machine had rebooted I logged in and after Docker had started I was presented with this message:

As Docker uses the Containers and Hyper-V Windows features I'm only too happy for this to be set up automatically for me.

Once this had completed and the VM had rebooted, Docker prompted me to log in with my Docker account.

Ok, so at this point Docker is installed and the VM has all the components to run Docker containers.

Next step was to get Jenkins running!

As I wanted the data to persist I created a folder structure on the VM; C:\Gruss\Docker\Jenkins.

I opened an Administrative PowerShell window (not sure if I needed to run it with Administrator privileges or not) and ran the following command:

docker run -p 8080:8080 -p 50000:50000 -v C:\Gruss\Docker\Jenkins:/var/jenkins_home jenkins/jenkins:lts

I am no Docker expert but to break down the command:
  • -p 8080:8080
    • This maps port 8080 on the VM to port 8080 within the container (the Jenkins web interface)
  • -p 50000:50000
    • This maps port 50000 in the same way as above
  • -v C:\Gruss\Docker\Jenkins:/var/jenkins_home
    • This creates a volume mount so that information in the container can be persisted after reboots etc.  In this case I'm storing the data in C:\Gruss\Docker\Jenkins
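Putting that together, here is the same command laid out with PowerShell backtick line continuations for readability (behaviour is identical to the one-liner above):

```powershell
# 8080 = Jenkins web UI, 50000 = inbound build agent port,
# the volume mount persists Jenkins data on the VM
docker run `
  -p 8080:8080 `
  -p 50000:50000 `
  -v C:\Gruss\Docker\Jenkins:/var/jenkins_home `
  jenkins/jenkins:lts
```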

As the Docker image was not local to my Azure VM, Docker proceeded to download it for me:

Docker informed me that to store the configuration in the 'C:\Gruss\Docker' folder it needed permission to do this:

Having clicked 'Share it', an account was needed:

At this point Docker spat out an error message:

C:\Program Files\Docker\Docker\Resources\bin\docker.exe: Error response from daemon: driver failed programming external connectivity on endpoint goofy_lederberg (deaba2deeea0486c92ba8a1a32740295f03859b1b5829d39e39eff0b24613ebf): Error starting userland proxy: Bind for unexpected error Permission denied.

This is stating that it could not map port 50000 on the local machine, possibly because it was in use.  I ran netstat to list all of the ports that were in use:
netstat -a -n -o
Nothing was using port 50000, something strange was going on.
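To check a single port rather than eyeballing the full listing, the netstat output can be filtered (Windows findstr here; the ":50000" pattern assumes the port shows up in the Local Address column):

```
netstat -a -n -o | findstr ":50000"
```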

I was able to start the container by removing the ‘-p 50000:50000’ mapping, but I’ve assumed it needs to map this port in order to work correctly.

Coming back the following morning (after shutting down the VM) all was resolved, so perhaps a reboot was all it needed?
Ok, so now I’ve run the command and my Docker container is running! (wahoo!!!)

Open a browser on the VM and go to:  http://localhost:8080

However, as I started the container removing the port 50000 mapping I don't have the administrator password to start Jenkins.

However, it states that it is available by browsing to /var/jenkins_home/secrets/initialAdminPassword

As I started the container previously the Admin password isn’t now shown in the output when starting the container (it is only shown the first time), so I now need to browse the local file system of the container to get the password.

To do this I opened a new PowerShell window.
The ‘docker ps’ command lists the running containers along with a generated ‘name’ for each one; in my case the container is called dreamy_bhabha.
With that I can exec a command on the image:

docker exec dreamy_bhabha cat /var/jenkins_home/secrets/initialAdminPassword

I've since found that I could have browsed to the C:\Gruss\Docker\Jenkins\secrets folder but where is the fun in that?

Typing that password in allows Jenkins to start: 

I went with the option to install the suggested plugins and off it went:

Once they were all installed I was prompted to create the first admin user:

After creating the user Jenkins seemed to crash for me, as I was presented with a blank page.
Trying in an incognito window showed the login screen but after logging in I got the blank page.
To resolve this I stopped the container:

docker stop dreamy_bhabha

Then restarted it:

docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts

Note:  This will give me a new name for the container.
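As an aside, restarting the stopped container with ‘docker start’ would have kept the same name and the original volume mapping, rather than creating a fresh container with ‘docker run’ (the name below comes from my earlier ‘docker ps’ output):

```
docker start dreamy_bhabha
```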

Opening a browser allowed me to login and see that Jenkins is now working:

Next step will be creating a pipeline in Jenkins!

Friday, 8 December 2017

Moving TFS 2015 databases to a new server

We have an on-premises Team Foundation Server (2015) which we use for our source control.  Everything is installed on the one VM: TFS and SQL Server (also SQL Reporting Services and Analysis Services, but we don't actually use them).
Our plan was to move the databases to a new server, including the TFS configuration database.  This was so that we could uninstall SQL Server.

The steps taken:
  • As it was a virtual machine we were able to create a snapshot of it before we moved anything; always good just in case a rollback is required.
  • Log in to the VM as an account with full TFS permissions
  • Open the administration console window and select Team Project Collections, then Detach the collection.
  • Close the administration console when completed.
  • Perform a SQL backup of the TFS_ databases.
  • Restore the SQL backups to the new SQL Server
  • Grant the TFS Service account DBO access to the databases and add any logins.
  • Stop IIS, open a command prompt:
    •  IISReset /Stop
  • Stop the TFS Services:
    • Team Foundation Ssh Service
    • Visual Studio Team Foundation Background Job Agent
    • Visual Studio Test Controller
  • Open a (elevated) command prompt
    • CD C:\Program Files\Microsoft Team Foundation Server 14.0\Tools
    • TfsConfig registerDb /sqlInstance:<NewSQLServer> /databaseName:Tfs_Configuration
    • Note: If the command prompt hangs, give the TFS Admin account higher SQL permissions. 
  • Open the administration console window and select Application Tier, then Reapply account.

  • Disable the SQL Server services (to be sure that it is not being used).
    • SQL Server
    • SQL Server Agent
    • SQL Server Analysis Services
    • SQL Server Integration Services
    • SQL Server Reporting Services
  • In the TFS Administration Console select Team Project Collections and Attach the collection:
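For reference, the command-prompt steps above collected in one place (run from an elevated prompt; <NewSQLServer> stays a placeholder for your new SQL Server instance):

```
iisreset /stop
cd /d "C:\Program Files\Microsoft Team Foundation Server 14.0\Tools"
TfsConfig registerDb /sqlInstance:<NewSQLServer> /databaseName:Tfs_Configuration
```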

Wednesday, 12 April 2017

Stop remembering passwords!

I have heard a number of times recently of people unwittingly giving away their passwords or reusing passwords on different websites.  With the number of websites being hacked increasing, if one site that you have created an account with uses poor security then your email address and password will become known; hackers will then try other websites with that username and password.  If you reuse the password, they will have access.
I recently heard that someone had their Facebook account compromised (as they had reused the password) and the attacker then tried to get money from their Facebook friends.

So don't reuse passwords!  I mean it!

Ok, but this means
"I will need a different password for each website that I use, that's ridiculous, I can't remember them all!"
This is where Password Managers come in.

So what is a Password Manager?
A password manager is an application that remembers all of the passwords for you.  Most of them will automatically register when you have logged in or changed a password to a website and it will pop up and ask if you want to save or update it.  This works really well when you are using a PC but not so well (in my opinion) when on a mobile device.
All of the passwords that are stored in the password manager need to be secured by, yes you've guessed it - a password!
However, I dislike the phrasing here; they should be protected by a passphrase rather than a single word.  The longer a password is, the more secure it is; using numbers and special characters does help, but it is the length that makes it harder to crack.

So when creating a passphrase it should be something that you'll remember, fairly secure, and contain letters, numbers and symbols.
This video (from Google) gives an idea of how to create secure passwords:
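To put a rough number on the length-beats-complexity point, here is a back-of-the-envelope entropy comparison (the symbol count and word-list size are illustrative assumptions, not a rigorous model):

```shell
# Compare an 8-character password drawn from ~72 symbols with a
# 4-word passphrase drawn from a 7776-word (diceware-style) list.
# Entropy in bits = length * log2(alphabet size).
awk 'BEGIN {
  pw = 8 * log(72)   / log(2)   # ~49.4 bits
  pp = 4 * log(7776) / log(2)   # ~51.7 bits
  printf "password: %.1f bits, passphrase: %.1f bits\n", pw, pp
}'
```

Even a short four-word passphrase edges out the mixed-character password, and each extra word adds roughly 12.9 bits, whereas each extra character adds only about 6.2.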

There are a number of different password managers around but I'll limit this to just three (as I don't have enough time to review them all).

KeePass (free)
KeePass works really well if you log in from one device all the time and you want full control over where your passwords are stored, personally I think it also works best on a PC rather than a mobile device.

It works by creating your own vault, which is where you store the passwords, and you have to maintain them.  It requires you to remember to add the passwords to the vault and update them if a password has changed.
You can create folders and store the entries where you like.

The downside to KeePass is when you want to use it with multiple devices such as mobile phones.  As the vault is stored in a file, for it to be available on a mobile device the file needs to be shared using Dropbox or a similar tool.  When I did this I had problems with the database being overwritten because it wasn't in sync, and I lost entries.  This led me to look for another tool.

LastPass (free and paid for mobile use)
I was introduced to LastPass by a friend and I've been quite happy with it.
When I first installed the extension into Chrome it took me through a process that imported all of the passwords from Chrome, and importing from KeePass was fairly straightforward too (from what I can remember now).
When used on a PC LastPass will automatically populate the username and password if it knows the password, so normally I just need to hit login and the job is done.
Also when I need to create an account it abstracts away the password process and automatically stores the new entry if you choose to.
If you want to use LastPass on a mobile device you need to upgrade to the premium version, the cost for this is $12 for a year.
The mobile version uses an app (free to download) with a built-in browser that can populate usernames and passwords when you browse to a website, in the same way it does when you use a PC.
If you are using a separate mobile app you need to copy and paste the password, which involves flicking between the apps; a bit of a pain, but I believe this is a common problem for all password managers (maybe less so if the app uses Google or Facebook logins).
With LastPass it is possible to arrange how the sites are stored and to create a shared area, allowing a single username and password to be used by two or more people.

I feel obligated to say that LastPass has recently come under some criticism as some faults have been found in the way it stores passwords:  https://www.theregister.co.uk/2016/07/27/zero_day_hole_can_pwn_millions_of_lastpass_users_who_visit_a_site/
Personally I think any review of security is a good thing and LastPass have been very quick to respond and resolve the issues raised.

1Password (free and paid for)
I haven't actually used 1Password but my understanding is that it is very similar to LastPass.
The premium version is $2.99 per month (billed annually), so it is a fair bit more expensive than LastPass, but it is recommended by a number of people including Troy Hunt, who is an industry-recognised security researcher.

I urge everyone to use a password manager and not to re-use passwords.
With so many websites being hacked if you reuse a password it won't be long before someone else knows your password.

Have I been Pwned?

If you suspect your details have been exposed (or are just paranoid), Troy Hunt has a website where you can enter your email address and it will inform you if it has been exposed by a breach (a website that has been hacked).

As a final word, don't trust anything!  If an email or webpage looks too good to be true, it probably is!
If someone on Facebook suddenly asks you to take payment for something on eBay and send them the money, think twice and speak to the person.

Tuesday, 11 October 2016

Docker on Windows - Don't use TP5

I'm new to the world of Docker and being from a Windows background I've not really had the opportunity to have a look, that is until Windows Server 2016 which has native support for Docker.  Wahoo!

As Windows Server 2016 is now RTM I thought I'd go and have a play on Azure.

Unfortunately this is where I found that Azure doesn't (yet) have an RTM version of Server 2016:

As a result I decided to go for the Technical Preview 5 image and enable Docker myself.

Adding the Windows Feature was straight forward:

Then obviously as it is Windows it required a reboot.

I then ran:

I also added C:\Program Files\Docker to my Path variable.

I then registered the service:

Then I started the service:

At this point I got an error:
Start-Service : Failed to start service 'Docker Engine (docker)'.
At line:1 char:1
+ Start-Service docker
+ ~~~~~~~~~~~~~~~~~~~~

After Googling I found that you can't use Windows Server 2016 TP5; you need to use the fully-fledged version of Windows Server 2016 (which isn't available in Azure).
This is stated on this page (under Pre-Reqs):  https://msdn.microsoft.com/en-gb/virtualization/windowscontainers/quick_start/quick_start_windows_server
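For reference, the container-host setup on Windows Server 2016 at the time looked roughly like this.  This is a sketch reconstructed from the Microsoft quick-start, not necessarily my exact commands (the screenshots above showed the originals):

```powershell
# Add the Containers feature and reboot (elevated PowerShell)
Install-WindowsFeature Containers
Restart-Computer -Force

# After the reboot (and after downloading docker.exe/dockerd.exe into
# C:\Program Files\Docker): register the engine as a service and start it
dockerd --register-service
Start-Service docker
```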

For fun I thought I would try to upgrade the Azure VM I was running, but this didn't seem to work, leaving me with a dead VM.

Hopefully a RTM version of Windows Server 2016 will be on Azure soon.

Thursday, 25 August 2016

Upgrading Octopus Deploy from v2.6 - Give it some memory!

I've been tasked with upgrading Octopus Deploy to the latest version.  This is for many reasons, but mainly to look at the replacement for snapshotting: channels.

To test the upgrade before it is applied to live I have been using a test environment which has the same version of Octopus that we have in live, 2.6.

The first thing to say about Octopus 3.x is that it no longer uses a NoSQL database (RavenDB); it now uses SQL Server.  This has been widely blogged about, but from what I've seen the SQL data structure that it uses is still similar to a NoSQL database, with an NVarChar(Max) column filled with JSON.

The installation of Octopus 3.3.24 is straightforward and not really noteworthy; the wizard run after installation will create the database and provide an empty installation of Octopus Deploy.
After this has been installed the next step is to migrate your existing database using a backup (with the master key).

Clicking on "Import Data" brings up a wizard which allows you to select the Octopus backup file and enter the master key.

The preview option will simulate the process but unfortunately the "Task logs" option does not work with the preview mode.

Our production backup file is 750MB and we use Octopus for all our deployments in our CD pipeline so we have a fair amount of deployment data.

The process to import the task logs takes a long time.  I got the memory increased on our test server to 16GB and ran the process, and it had not completed after 17 hours.  It had consumed all of the memory but not much of the processor.  It is the step of upgrading the documents that appears to take the time.

Upon Googling I found that there is a parameter that can be passed to the upgrade process so that it limits the history that is brought over: -maxage=

This made the command line:
"C:\Program Files\Octopus Deploy\Octopus\Octopus.Migrator.exe" migrate --instance "OctopusServer" --file "C:\Octopus\20160729-140413.octobak" --master-key "abcdefghijklmnopqrstuvwxyz" --include-tasklogs -maxage=60

After checking the Migration log file I found the last entry was:
2016-08-13 20:57:07.7216      1  INFO  Step took 00:00:00s
2016-08-13 20:57:07.7216      1  INFO  
2016-08-13 20:57:07.7216      1  INFO  Convert documents
This didn't get updated and all of the memory on the machine (16GB) was pretty quickly consumed.
As this was running on a VM I left the machine running for a few days and the log file didn't get updated.

After raising a call with Octopus Support we found that the process requires quite a lot of memory.
Giving my VM 32GB RAM seemed to allow the migration to complete (in 20 minutes) although it was still very close to maxing out the memory.

In short, if you have a large Octopus Raven database (ours was approximately 3GB - when Windows counted the size of the Raven database) you'll need a lot of memory to upgrade, maybe more than 32GB!

Thanks to Vanessa Love (@fly401) for all the help!