I’d never really felt that impressed with VS for Mac, feeling it was always a bit of a second class citizen - which is understandable I guess; Microsoft have Visual Studio 2022 for their own operating system after all so it was probably quite surprising that they released the Mac IDE at all.
Now, Mac users are not being left out in the cold by Microsoft - oh no. They are suggesting Visual Studio Code for use on the Mac (and Windows and even Linux) which, it is said, can be augmented with powerful extensions to support C#, Blazor and even MAUI development.
But this all felt a little … I don’t know, cobbled together to me. I wanted something a little bit more … focused on the job I need it to do. Not a general purpose system that could be tailored for any number of use cases.
Enter Jetbrains Rider for Mac (and Windows and Linux) which in my opinion, despite not being a free product, is a very worthy replacement.
My first experience with Jetbrains products was ReSharper and I have to say that I wasn’t impressed because it appeared to have a massive impact on Visual Studio’s performance. When I was working with client hardware which had ReSharper installed I quickly found myself disabling it. Any productivity improvements I may have gained using it would quickly be lost waiting for Visual Studio to respond, with refactorings taking longer than they would if I just typed the changes myself.
My first experience with the Rider IDE was during the NDC 2020 event in London where Jetbrains had a stand in one of the breakout areas. I had a good chat with the guy there and shared my experience with ReSharper - and finding out that it was built into Rider wasn’t endearing me to the IDE. He explained that because ReSharper was built into the core of Rider it would not suffer the same performance issues; when running in Visual Studio it is at the mercy of Microsoft’s extension framework - something the Jetbrains guy blamed for the poor performance.
So did I switch to Rider during the conference … no, I didn’t.
Like many developers I find that changing something as fundamental as my IDE, something I use day in and day out and have the muscle memory to prove it, is not something done on a whim.
I actually put my hand in my pocket some three years later in early 2023 and quickly chastised myself for not making the move much earlier.
Not only did Rider far exceed the capabilities of Visual Studio for Mac but I quickly got to grips with the new IDE and was able to remove Visual Studio 2022 from my Windows systems.
Developers are all different - what impresses one when it comes to tooling may very well not impress another so I won’t try to explain all the features Rider has to offer, but here are some of the highlights for me:
Now I’m not saying that Rider is 100% perfect - it’s not. I get some spurious errors reported via the static code analysis from time to time and while this does not prevent the code from running it is a minor irritation.
The thing with Rider is that it is a dedicated .NET/C# IDE - that’s what it does and (in my opinion) it does it very well indeed. It’s a paid-for product and I get support should I need it - I once raised an issue with Microsoft that unit tests took an age to start building in VS2022 but never got a response from them, which was pretty much what I expected considering I was not a paying customer.
And we are back to money again and a reminder that Rider is a paid product, currently starting at £119 for the first year, but I opted for their dotUltimate bundle at £135. At the time of writing this includes five additional tools, including dotCover (enhanced Unit Testing and Code Coverage), dotTrace (for performance profiling) and dotMemory - yes, that’s a memory profiler. Now I have to confess, I’ve not used the last two in anger but it’s good knowing they are there and I consider that to be money well spent.
Finally - did I mention the 30 day free trial? Well, I have now.
Now I’d checked out a lot of things before putting my hand in my pocket and shelling out for the Macbook and I was wondering what I’d done - maybe I’d fallen at the first hurdle and should have just bought a ‘regular’ Windows laptop instead.
Well, I wasn’t having that - no, not at all.
As with so many problems in technology, I cannot possibly be the first person to have bumped into this - especially as this is an Apple M3 CPU, the 3rd generation of Apple Silicon; surely the early adopters ran into it.
Well, after trawling the internet and going down some blind alleys I managed to get SQL running on Docker on my Macbook Pro, M3 Pro.
TL;DR: basically you need to install the Apple Rosetta application, make sure you are running the latest version of Docker Desktop and enable the use of Rosetta for x86/amd64 emulation.
I mentioned above that I’ve been using Docker so that I could run multiple versions of SQL Server but I am also using it to run a Gitlab instance on my old iMac (running Ubuntu). The draw for me to use Docker was the fact that upgrading was as simple as stopping the container, pulling the new Docker image and spinning the container back up.
But things aren’t always as simple as this and sometimes things go wrong and as I’m hosting all my own source code now I don’t want to risk losing it all because an upgrade fails.
Yes, I run regular backups and send them to remote storage for safe keeping - but a backup isn’t a backup unless you can restore it - so I need to ensure that the latest backup can be successfully restored before I spin up the container against the new image.
The process requires me to create a container running the version my iMac is running, restore my backup, pull the new image and spin up the container again to apply the upgrade. I then check that I can login and compare the last commits on a few projects.
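That process can be sketched as a handful of Docker commands. The image tags, container name and volume paths here are illustrative, and the backup restore step is only indicated as a comment:

```shell
# 1. Start a scratch container matching the version 'production' runs.
docker run -d --name gitlab-test \
  -v ~/gitlab-test/config:/etc/gitlab \
  -v ~/gitlab-test/logs:/var/log/gitlab \
  -v ~/gitlab-test/data:/var/opt/gitlab \
  gitlab/gitlab-ce:16.2.4-ce.0

# 2. ...restore the latest backup into gitlab-test here...

# 3. Recreate the container against the new image; because the volumes
#    are reused, Gitlab migrates the restored data on startup - which is
#    exactly the upgrade being rehearsed.
docker stop gitlab-test && docker rm gitlab-test
docker pull gitlab/gitlab-ce:16.3.6-ce.0
docker run -d --name gitlab-test \
  -v ~/gitlab-test/config:/etc/gitlab \
  -v ~/gitlab-test/logs:/var/log/gitlab \
  -v ~/gitlab-test/data:/var/opt/gitlab \
  gitlab/gitlab-ce:16.3.6-ce.0
```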
With this being a new Mac I installed the latest version of Docker Desktop and pulled the latest Gitlab image - and that’s when the trouble started, because when I did, the image was displayed with what appeared to be a warning that its architecture might be incompatible with that of the host system.
This isn’t something I’d seen before but prior to this shiny new Macbook I was running on Intel CPUs - Mac and Windows - so this was something new.
It took a bit of trawling around but the solution was pretty simple in the end:
The first step is simple enough - open a terminal and run the following:
softwareupdate --install-rosetta
This is obviously a one-time thing but the following environment variable will need to be applied during the creation of any container that displays the above architecture warning.
| Variable Name | Setting |
|---|---|
| DOCKER_DEFAULT_PLATFORM | linux/amd64 |
That’s it - click Run and you’re good to go.
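For anyone who prefers the command line to the Docker Desktop dialog, the same thing can be done there too - either per container via Docker’s `--platform` flag or for the whole shell session via the environment variable. A sketch (the Gitlab image is just an example):

```shell
# Force x86/amd64 emulation (via Rosetta) for a single container...
docker run --platform linux/amd64 -d --name gitlab gitlab/gitlab-ce:latest

# ...or make it the default for every container started from this shell.
export DOCKER_DEFAULT_PLATFORM=linux/amd64
docker run -d --name gitlab gitlab/gitlab-ce:latest
```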
Well, today I finally pulled the trigger and ordered a new MacBook Pro (MBP) which will be delivered in a couple of days. I’m really looking forward to the performance improvements from my 2015 Macbook Pro and my i7 Windows Workstation (custom built in 2010) to the new M3 Pro CPU in my new purchase.
People I’ve worked with in the past will recall me saying things like ‘you can put an Apple device in my cold dead hands’ but times change and views need to change with them, even mine.
But I could have bought 2 or 3 laptops for what I’ve just spent on the new Macbook - why didn’t I just get a decent laptop for say £1000? Well, it mainly comes down to my need/desire to develop mobile applications.
When I bought my current Macbook Pro I wanted to write Xamarin applications for iOS devices and that was all. I still needed my Windows Workstation to do all my other development as .NET was still really a Windows technology and while Xamarin was using the cross platform project Mono to enable me to write apps using C# on the Mac it has to be said … it was a bit on the flaky side.
Then .NET Core came along and the game changed with development now being possible on Mac and Linux as well as Windows. This meant that I only needed a single device for all my development needs … but it had to be a Mac if I wanted to do iOS development.
It also has to be said that I’m not a fan of the way that Windows is going. There are a number of aspects to this and they are outside what I want to talk about here - the fact is that I’ve found the Macbook surprisingly easy to migrate over to (Linux experience certainly helped) and while I don’t think I’ll be replacing my Android with an iPhone anytime soon I do feel that Apple & MacOS tick more boxes for me than Microsoft & Windows.
I’m wondering if, as a regular Windows-based .NET/C# developer, my migration to being a Mac-based .NET/C# developer might be a suitable subject to see me blog a little more this year.
The configuration of my current MBP has evolved over time and there is probably a lot of cruft in there, so documenting the setup as I go will help me build a reliable, robust development environment that should last me (based on how well my current MBP has performed) until the end of my professional development career.
Although I knew that the contracting market had changed here in the UK I didn’t expect the desolation I found when I re-entered it in April. Back in January when I gave notice to my previous employer the market was slow, but then it always is at the start of the year until things pick up towards the start of the new financial year. However, as April approached there were no signs of this and the days when contractors were getting numerous calls every week appeared to be a distant memory.
That said, I did manage to pick up a contract towards the end of the month, working with a small company in Bristol. The rate was good enough and the contract was set to run for six months so that would get me back up and running, wouldn’t it …?
Well, no - not really.
The contract was to provide services until a replacement for their previous senior developer could be found, and the process was expected to last a few months with the further expectation that the successful candidate would probably need to give their employer 3 months notice. But that’s not how it panned out.
As it happened they managed to find a suitable candidate within a few weeks and he was immediately available having been made redundant. I provided a week of handover and after 6 weeks total duration I found myself once again ‘between contracts’.
This is the double-edged sword that comes with contracting. We provide a service and when that service is no longer required we move on.
The problem was that the market had still not recovered when I dropped back into it in June and it took me until September before my luck changed and a client I engaged with via my previous Ltd company reached out. I’m now 3 months into a six month contract and it is pretty apparent that there will be an extension - there’s a lot of work to be done. There has been a suggestion of a permanent role in the offing but I’m in two minds about that.
Contracting in the UK is under attack from the Government and HMRC - no question about it. They don’t seem to want us little people running around despite following all the legislation and taxation guidance to the letter. No, because of an acknowledged minority of contractors/freelancers running wild then we must all be treated like criminals to the point where contracting becomes non-viable.
I won’t get into an IR35 rant here as I’m trying to stay positive for the New Year celebrations this evening. Suffice to say that I flip between firmly staying a contractor and throwing in the towel again and returning to regular employment. I’m not getting any younger and could do without the hassle of having to prove that I’ve adhered to the legislation rather than the other way around - apparently ‘innocent until proven guilty’ is an outdated concept here in the UK these days.
Back in June I posted about going ‘All in on Mac’ and I’m currently on the verge of shelling out £3000 (of company money) on a new MacBook Pro to replace the aging hardware I’m currently using. I’ve had to bring my old Windows Workstation out of retirement as the current MBP doesn’t seem to play well with the VPN software required to access my current client’s systems. Frequently dropping connections was a major hit on productivity and the Windows system didn’t seem to have the same issues.
I asked one of the other developers, who runs a much newer MBP, to run some tests for me and it was pretty clear that it was an issue with either the older hardware in mine or its inability to run the latest version of macOS. I have my fingers crossed that the new laptop won’t have the same issues. Mind you, even if it does I ‘need’ a newer device to continue my development with MAUI as along with the latest version of macOS I’m also unable to install the latest version of Xcode, required to build applications for iOS (and macOS) devices. I have plans to upgrade my existing apps, The Motorhome Stopover and Smite Scoreboard, by migrating to MAUI so I can at least justify the expense to myself that way.
I’ve already upgraded my Synology NAS to a new DS224+ with two 4TB drives and some additional RAM for good measure. Not only is the unit much, much quicker than the old one but it’s also capable of running Docker and I was keen to bring a few select services in-house without all the faffing about when it came to configuration and upgrades.
So far I have Gitlab up and running and have to say I’m pretty happy with it. I’ve worked through the process of backing up and restoring as well as upgrading when a new Docker Image becomes available.
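For a containerised instance the backup itself is a one-liner against the running container (the container name here matches my setup; note that Gitlab’s backup task deliberately excludes gitlab.rb and gitlab-secrets.json, which need copying from the config volume separately):

```shell
# Create a backup inside the running container; the archive lands in
# /var/opt/gitlab/backups, which the volume mapping exposes on the host.
docker exec -t gitlab gitlab-backup create

# Restore a specific backup - BACKUP is the archive's timestamp prefix
# (the value shown here is purely illustrative).
docker exec -it gitlab gitlab-backup restore BACKUP=1700000000_2023_11_14_16.2.4
```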
This is all part of my plans to reduce my usage of services like Github and Azure DevOps - not because they are necessarily evil (some may argue otherwise) but it’s my code and I should be capable of managing it myself.
I’m also considering moving away from 1Password to a self-hosted Bitwarden instance (although that is a little on the scary side) and spinning up my own Mastodon instance as the future of the one I’m currently using is in doubt.
How I’ll get on, only time will tell I guess, but the longer I can keep the little grey cells working the better - so I may as well do something useful to give them some exercise.
I also deleted my OnTheFence Twitter account (or deactivated it anyway - and yes, I know it’s called X now) after going cold turkey for about 6 months. Using Mastodon has certainly made my life calmer.
You know, I’m not going to promise to blog more - I’ve done that previously and frankly I’d just be setting myself up to fail.
Fact is, I’m thinking that I’m the only one reading this blog anyway (I don’t have any analytics wired up) and even though I start numerous posts I normally end up thinking ‘why am I bothering, who is this helping anyway - it’s not like I’m bringing anything new to the conversation’.
Next year it looks like I have some decisions to make; I’m not getting any younger and time waits for nobody.
The biggest one is, will I remain a contractor or will I throw in the towel again and return to regular employment to see out the last 5-10 years of my career?
Right now … I’m not sure either way and I find that slightly disconcerting.
Many people have much bigger problems to deal with right now so if that’s all that’s ‘troubling’ me then I guess I will see the New Year in with a reasonable degree of hope.
Wishing everyone a great 2024
As frequently happens, a proof of concept investigation soon becomes something you rely on but it still needs some of the basics sorting out - in the case of my Gitlab instance this means docker image upgrades and a backup/restore process.
I have to say I was both surprised and pleased to see the frequency of the docker image updates but so far I’ve not installed any of them as I need to investigate and verify that I can backup and restore everything in the event that an image update goes wrong or I have a hardware failure of some sort.
With the iMac running my ‘production’ instance and Gitlab not apparently supporting a Docker installation on Windows that only leaves my MacBook Pro (MBP) to perform my investigations.
The first thing was to install and configure Gitlab via Docker on the MBP … that should be a walk in the park, yes?
Well, as I’m writing about it you can correctly assume that wasn’t the case.
I already have Docker Desktop running on the MBP so once I’d identified the correct Gitlab image to fetch, the community edition of 16.2.4, I updated the same script I use on the iMac to download the image, configure volumes on the local filesystem and spin up a new container.
docker run -d -p 22:22 -p 443:443 -p 80:80 \
--name gitlab \
--restart unless-stopped \
--hostname localhost \
--shm-size 1024m \
-v /Users/dave/gitlab/config:/etc/gitlab \
-v /Users/dave/gitlab/logs:/var/log/gitlab \
-v /Users/dave/gitlab/data:/var/opt/gitlab \
gitlab/gitlab-ce:16.2.4-ce.0
Now, when the container first spins up it will perform the installation and configuration of Gitlab and all the additional components and services such as postgresql, redis, nginx etc and this can take a bit of time. Add to that the fact that even when starting a configured container it can take a while for everything to start up (I’m talking maybe 5 minutes or so on the older hardware I’m using) so I was prepared for a bit of a wait … but it soon became clear that there was a problem and the restart option in the above script had put the process into a bit of a loop.
Removing the restart option resolved the looping, but of course the container still failed to start, although I could now start to get a handle on what was happening.
Looking at the container logs in Docker Desktop I could see that there was a problem starting the postgresql database.
2023-10-31_19:06:45.43696 LOG: starting PostgreSQL 13.11 on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 11.3.0-1ubuntu1~22.04.1) 11.3.0, 64-bit
2023-10-31_19:06:45.43918 LOG: could not bind Unix address "/var/opt/gitlab/postgresql/.s.PGSQL.5432": Operation not permitted
2023-10-31_19:06:45.43952 HINT: Is another postmaster already running on port 5432? If not, remove socket file "/var/opt/gitlab/postgresql/.s.PGSQL.5432" and retry.
2023-10-31_19:06:45.43992 WARNING: could not create Unix-domain socket in directory "/var/opt/gitlab/postgresql"
2023-10-31_19:06:45.44039 FATAL: could not create any Unix-domain sockets
2023-10-31_19:06:45.44342 LOG: database system is shut down
I did previously have postgresql installed on the MBP when I was working with a previous client and wondered if the error was pointing to a port clash. However, as I had suspected, I had removed this to free up space when my engagement with the client concluded.
I also tried updating permissions on the folders I was using to mount the container volume to but this made no difference. The only way I was able to get the container to start up correctly was to remove the options to bind the volumes linking container folders to host folders. If I did this everything just worked.
I trawled the internet using the error message but didn’t find anything even remotely relevant … until I found this comment in a GitHub issue thread.
Basically the comment says that Docker Desktop has recently changed the default technology used for file sharing from gRPC FUSE to VirtioFS and that changing it back might just fix the problem.
Now that made sense - bearing in mind that dropping the volume configuration resolved the issue, this change within Docker Desktop itself could be my problem.
Well, there was only one way to find out - I duly opened up the Settings page in Docker Desktop, made the suggested change and restarted the app.
I deleted the last failed container and ran the script again … and just like that (after about 10 minutes) I was able to login to my fresh new instance of Gitlab.
So - with the easy bit out of the way, now I can look at restoring a backup I took from the iMac instance.
Love it, hate it or deny it we all need to print something at some time. Whether it’s a boarding pass, a Click & Collect receipt or a form that has to have an ink signature before being scanned and emailed back.
Before you start shouting about ‘you can store all this on your phone old man’ just remember that we don’t always have a signal or infinite battery charge - just saying!
Well, if you are running Windows this doesn’t seem to be much of a problem. Printer manufacturers create drivers and installation packages and utilities to grease the wheels for Windows users. But what about Mac users - it’s the same for them, right? Right?
Well, no … no it’s not the same, not by a long shot.
I recently tried to go without a printer after my Canon printer/scanner gave up the ghost, but it was no good - the need to print boarding passes and holiday/travel details forced me to replace it.
I opted for another Canon device, a TS3450 multi-function printer/scanner/copier which states that it is compatible with Mac via AirPrint … no drivers required.
Well, that’s good and all but what about connecting the damn thing to your Wifi in the first place? If you’re running Windows this isn’t a problem - as long as your system has Wifi, that is, because if you are using a desktop that only has a wired connection then you are out of luck regardless of the operating system.
But Wifi or not, if you’re using a Mac then there is no way to perform the installation so I had to install the Canon app on my Android phone to get the printer connected to my network. (yes, that’s right - I’m using a Mac but an iPhone? No, that’s just going too far)
With the printer hooked up to the Wifi I should just be able to print at will right … right? Well, not so much. While my MacBook Pro (MBP) detected the printer without any problems I simply couldn’t get it to print anything - I just ended up with a timeout error.
The same thing happened when I tried to scan a document. Opening the Printers & Scanners application and selecting the Scan tab I just ended up with another error.
At this point my Windows Workstation and Surface Pro could print without any problems as could my Wife’s Chromebook so there was clearly something particular about the Mac that was causing the problem.
To throw more confusion into the mix, if I connected the MBP to the wired network using a USB ethernet adapter I found I was able to print (but was still unable to scan) - it looked like it was an issue with the Wifi configuration on the MBP.
I run an always-on VPN on all of my devices and am aware that sometimes these can get in the way when accessing resources on the local network. Unfortunately disabling the VPN connection (even uninstalling the software altogether) didn’t resolve the problem though.
For a while I resigned myself to the fact that I would need to print and scan from Windows (or a Chromebook - pffft!) but it was always a little splinter in my mind. This should be possible - it’s just printing surely.
To cut a long story short - after periodic internet searches I found a clue in an Apple forum post … and it centered around IPv6. Basically a Mac user said that he had resolved the issue by disabling IPv6 on the printer (I don’t think his was a Canon).
Armed with this information I found that I needed to connect to the printer’s administration interface, over the network, and login. Simple huh!?
As a techie, finding the IPv4 address of the printer was pretty straightforward but it makes me wonder how ‘normal people’ manage this.
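For the record, this is roughly how I’d go about it from a Mac terminal (a sketch - the printer will obviously show up under its own hostname):

```shell
# List devices the Mac has recently seen on the local network; the
# printer usually shows up by hostname or by its MAC address.
arp -a

# If the printer advertises itself over Bonjour/AirPrint, browsing the
# IPP service will list it too (Ctrl+C to stop browsing).
dns-sd -B _ipp._tcp local.
```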
Typing the printer’s IP address into a browser opens up some basic information about the printer status and the option to login.
Clicking on the Login button opens a page requesting the appropriate password - so … what’s that then?
A quick internet search revealed that this was simply the serial number of the printer itself (which certainly beats a default password of 0000000).
Now the admin portal isn’t the quickest website I’ve ever used, that’s for sure, but when the page did load I was able to find the option to turn off IPv6 (although the path to it was a full 9 clicks).
(In case you are interested, the path to the page was: System info and LAN settings => LAN settings => Advanced Setup => Confirm Advanced Setup => TCP/IP settings => IPv6 => Enable/disable IPv6 => Disable => OK)
With this setting in place I rebooted my MBP and lined up a print job - and to my complete shock, the printer sprang into life and my document was printed.
After opening up the Printers and Scanners app on the MBP I was also able to scan the document back in - SUCCESS!
I’ve been printing and scanning quite happily for a few weeks now but there was another reason to access the printer’s admin portal. By default the printer will go into standby after two hours of being idle but regardless of what system I was printing from, i.e. Windows or Mac, I needed to manually wake up the printer before trying to print anything. Now this is fine when I’m in the home office but if I am working elsewhere in the house I would need to walk up and press the power button, walk back down again and send the print job … yeah, yeah, I know … a major hardship!
Surely there’s an option for ‘Wake on LAN’ in the admin portal … well, yes … sort of.
Logging in and navigating to Printer settings => Energy saving settings, selecting ‘Auto power on’ and clicking OK was all it took.
So, that’s that - all sorted and my migration to the world of Mac is still on the cards.
Well, for a number of reasons that I may post about later it’s going to be a Mac (yes, if you know me I know what I said but times change) but there was still a fly in the ointment - Visual Studio for Mac.
Visual Studio for Mac started life as MonoDevelop and despite what that link says about it, it most certainly does not employ “many of the same tools as its Windows counterpart”.
While it might be a step or three away from Notepad it is light years from Visual Studio 2022 for Windows. Although I’ve been using it for a while it’s mainly been to spin up a Xamarin iOS application in the simulator with much of the development being carried out on the Windows systems.
During a recent contract I needed to be on site a couple of days a week and as the rest of the development team was using MacBook Pros it made sense that I used my, somewhat aging, Macbook Pro. But this meant that I would be doing full blown ASP.NET Core development using Visual Studio for Mac - and it wasn’t a great experience.
Not only did it show that my Mac is certainly beginning to struggle it also highlighted the gulf between Visual Studio for Mac and Visual Studio 2022 for Windows.
Now I know that there are extensions for Visual Studio for Mac and that I can probably get pretty close to the Windows version. But that’s not the point is it? What happens when (and it will be when) one of these extensions stops being actively developed? What happens when something changes in .NET and the extensions don’t support it, or are slow to do so? I mean, we’ve all experienced that with NuGet packages being abandoned, haven’t we?
Well I don’t want to be left in a situation like that with the principal tool I use for work. So was this a show stopper for me going over to Mac?
Well, no - it isn’t.
Enter Jetbrains Rider - a full blown, cross platform IDE for .NET and it turned out to be a game changer.
Rider may seem like the new kid on the block but it’s based on the Jetbrains IntelliJ IDE, which is a mature IDE for Java and Kotlin, so it has some pedigree behind it. Now I had some reservations about an IDE with ReSharper built into it as I’ve had issues with that particular tool in the past where it brought my system to a crawl when installed as an extension to Visual Studio.
But Rider is a different kettle of fish. It’s not trying to integrate ReSharper with a third party tool, i.e. Visual Studio; it’s built into the heart of the system and performance is certainly not an issue, even on this old Macbook Pro.
When I opened the client’s solution in Rider the code editor just lit up with refactorings and other potential issues (nothing dramatic but VS for Mac certainly didn’t highlight three quarters of what Rider did).
From potential null references and unused variables to things like Use 'nameof' expression to reference parameter 'password' name and Inline 'out' variable declaration - things that the eye may gloss over when scanning the code. Small improvements that can be made to make the code more maintainable and/or easier to read.
It could be said that I, as an experienced developer, should see all these things anyway without them being pointed out - but if that was the case we’d all be writing bug free code using vim. We all need a helping hand from time to time. Mind you, it should also be remembered that I didn’t write any of this code.
The refactorings in Rider are very powerful, being driven by ReSharper, and most can be accessed quickly via keyboard shortcuts so you don’t even have to take your hands off the keyboard, which only increases your productivity.
I opted for the dotUltimate bundle which includes the additional dotTrace, dotCover and dotMemory tools. At a difference of just £15 it was an absolute no-brainer and I would have happily paid that for dotCover (a tool to view and assess unit test coverage) on its own.
So, the move from Windows to Mac is a step closer - all I need to do now is work out what model and spec to go for. I’ve been looking at the Macbook Pro but recently on Mastodon someone mentioned that the Macbook Air would probably be more than capable so I need to check that out.
I had installations for 2008 R2, 2012, 2014, 2019 and (the reason I noticed the others) the latest version, 2022. I also had three versions of SQL Server Management Studio installed - all chewing up hard drive space.
Now, it’s not like I’m a digital hoarder - these installations were required to work on client projects during my, soon to be rekindled, contracting years. While everyone wants to run the latest and greatest technologies, sometimes that isn’t possible so I needed to be sure that I was configured to run their code without compatibility issues.
But surely there is a better way than cluttering up my hard drive … and that’s where Docker comes in. Sure, Docker isn’t a new thing - it’s been around for years now but while I had a working knowledge of it I’d never had a reason to use it, or at least I didn’t think I did.
Well - that’s not quite true. I had previously used Docker on the Mac I’m using to type this post. Back in January 2020 I was attending an ASP.NET Core 3.1 workshop at NDC London and didn’t know if my Surface Pro would cut the mustard. So I decided to use the Mac, .NET Core is cross platform after all, and spun up a SQL Server instance using Docker so that I could follow along - which I did, without issues.
Turning my attention back to my Windows systems the first thing to do was remove all the installed SQL instances and tools like SQL Server Management Studio. This wasn’t as easy as you’d think with so many dependencies between one component and another - but I won’t dwell on that.
With Docker Desktop installed the process of spinning up a SQL Server instance was actually pretty easy, even the creation of volumes on my local machine to allow me to persist my database files - allowing me to blow away a Docker container without losing the actual data.
The first thing to do is pull the official Microsoft SQL Server image which is as simple as opening a command prompt and enter the following:
docker pull mcr.microsoft.com/mssql/server
Note that if you are using a Mac with Apple Silicon (currently an M1, M2 or M3) then you will need to change this to fetch the following image instead: mcr.microsoft.com/azure-sql-edge
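A small shell sketch makes the choice explicit (checking `uname -m` is the standard way to detect Apple Silicon; the resulting name is what gets passed to docker pull):

```shell
# Pick the right SQL Server image for the host CPU.
# "arm64" is what uname -m reports on Apple Silicon Macs.
if [ "$(uname -m)" = "arm64" ]; then
  IMAGE="mcr.microsoft.com/azure-sql-edge"   # Apple Silicon (M1/M2/M3)
else
  IMAGE="mcr.microsoft.com/mssql/server"     # Intel/AMD
fi
echo "Image to pull: $IMAGE"                 # then: docker pull "$IMAGE"
```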
Opening the Images tab in Docker Desktop will show the image which can now be spun up into a running container that can be accessed via SSMS.
The Docker Hub page for MS SQL Server details a couple of required environment variables that need to be configured.
| Variable Name | Setting |
|---|---|
| ACCEPT_EULA | Y |
| MSSQL_SA_PASSWORD | this will be the 'sa' password used to access the instance* |
| MSSQL_PID | This will default to 'Developer' if not specified, which is what I need, so I'll be omitting this configuration going forward |
The next step is to create a container based on this image, configuring the environment variables so that we can access it.
The easiest way to create a container is to click the ‘play’ button in the Actions column of our new image. This will display a dialog where the container settings, including the above environment variables, can be set.
Ignoring Volumes for now I entered a suitable name for the container along with a Host Port value of 1433 (a direct mapping to the default SQL Server port exposed by the container) and set the environment variables above.
Finally, clicking the Run button creates and starts the new container.
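The same container can be created from the command line with `docker run` if you prefer. This is a sketch - the container name `sql-dev` and the password are placeholders of my own choosing:

```shell
#!/bin/sh
# Placeholder password - SQL Server enforces complexity rules,
# so substitute a strong one of your own.
SA_PASSWORD='YourStrong!Passw0rd'

# Create and start the container in the background: accept the EULA,
# set the 'sa' password, and map host port 1433 to the container's
# default SQL Server port.
command -v docker >/dev/null 2>&1 && docker run -d \
  --name sql-dev \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=${SA_PASSWORD}" \
  -p 1433:1433 \
  mcr.microsoft.com/mssql/server
```

Once it’s running, `docker ps` should show the container with the 1433 port mapping.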
On Windows I can open SQL Server Management Studio and connect to the SQL Server instance running in the container just like any other.
I can now create databases, add tables, and insert, update and select data as normal - but there is a ‘problem’, if it can be called that.
The container is self-contained and while the data it contains will survive it being stopped and restarted, it will (maybe obviously) be lost if the container itself is deleted.
This may not be an issue, in fact it may be a good thing in certain circumstances, but there are plenty of reasons to want to save the database files outside of the container and to allow the container to access database backup files saved on the host system’s filesystem.
This is where Volumes come in.
Volumes allow file locations within the container to be linked to file locations on the host system’s filesystem.
For the purposes of my development environment I want to be able to define a location to hold backups (.bak files) and another to hold the data and log files (.mdf and .ldf files).
I started by creating a ‘ContainerVolumes’ folder with ‘SQL_Data’ and ‘SQL_Backups’ folders within it. With this in place I can now spin up a new container with the appropriate volume configurations in place. I could add the volumes to the container I created above but containers like this are essentially disposable so I don’t really see the need.
Before I can configure the volumes I need to know where within the container’s filesystem I need to hook my ‘ContainerVolumes’ folders. And since the SQL image I’ve pulled from Microsoft is based on the Ubuntu flavour of Linux, I can’t just assume the locations will be the same as those of a Windows SQL instance.
As it happens it’s really not that difficult, and while the Linux gurus out there will roll their eyes at my approach, I’m all in favour of experimentation and observation - and I can use the container above to work it out.
If I wanted to restore a database from a backup using SSMS it would, by default, expect it to be within the instance’s ‘Backups’ folder buried deep in the Windows filesystem. Attempting to kick off a restore while connected to my container instead opens the path /var/opt/mssql/data - which isn’t what I expected. It does however give me a place to start, and a hint as to where I should expect the database files to end up.
Looking a little further up in the filesystem I located /var/backups, which marries up a little better with its Windows equivalent, so I opted for that one for my backups.
To confirm that the instance will store its database files in /var/opt/mssql/data I just used Docker Desktop’s built-in terminal, accessed via the Terminal tab on each container’s details.
Running a couple of basic Linux commands to move into the /var/opt/mssql/data directory and list its contents, I could see a number of .mdf and .ldf files relating to the master and model databases.
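The same check can be run from any terminal with `docker exec`, without opening Docker Desktop at all - assuming a running container named `sql-dev` (substitute whatever name you gave yours):

```shell
#!/bin/sh
# 'sql-dev' is my assumed container name - change to match your own.
CONTAINER="sql-dev"

# List the SQL Server data directory inside the running container
# to confirm where the .mdf and .ldf files live.
command -v docker >/dev/null 2>&1 && \
  docker exec "${CONTAINER}" ls -l /var/opt/mssql/data
```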
So, now that I know what needs to be mapped and to where, I can click the ‘play’ button next to the SQL image and configure it in the same way as before, but this time including my Volume mappings.
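On the command line the same mappings become two `-v` flags on `docker run`. As a sketch - the host-side paths reflect my own folder layout (I’m assuming it lives under `$HOME` here), and the container name and password are again placeholders:

```shell
#!/bin/sh
SA_PASSWORD='YourStrong!Passw0rd'   # placeholder - use your own
HOST_DIR="${HOME}/ContainerVolumes" # my local folder structure

# Make sure the host folders exist before mounting them.
mkdir -p "${HOST_DIR}/SQL_Data" "${HOST_DIR}/SQL_Backups"

# Bind-mount the host folders over the locations identified above:
#   SQL_Data    -> /var/opt/mssql/data  (.mdf/.ldf files)
#   SQL_Backups -> /var/backups         (.bak files)
command -v docker >/dev/null 2>&1 && docker run -d \
  --name sql-dev-volumes \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=${SA_PASSWORD}" \
  -p 1433:1433 \
  -v "${HOST_DIR}/SQL_Data:/var/opt/mssql/data" \
  -v "${HOST_DIR}/SQL_Backups:/var/backups" \
  mcr.microsoft.com/mssql/server
```

With this in place, deleting the container no longer means losing the databases - a new container mounting the same folders picks the files straight back up.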
Now I can drop a regular SQL .bak backup file into my ContainerVolumes/SQL_Backups folder, trigger the Restore process within SSMS, select the /var/backups folder and finally select the .bak file I’d placed there.
I can also create databases within the container with the .mdf and .ldf datafiles being saved to the ContainerVolumes/SQL_Data folder.
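For completeness, the restore can also be triggered without SSMS by running sqlcmd inside the container. Everything here is illustrative: the database name `MyDb`, its backup filename, and the container name are all hypothetical, and the sqlcmd path applies to the full SQL Server image (Azure SQL Edge doesn’t ship sqlcmd):

```shell
#!/bin/sh
# Hypothetical database and file names, purely for illustration.
RESTORE_SQL="RESTORE DATABASE MyDb FROM DISK = '/var/backups/MyDb.bak'
  WITH MOVE 'MyDb' TO '/var/opt/mssql/data/MyDb.mdf',
       MOVE 'MyDb_log' TO '/var/opt/mssql/data/MyDb_log.ldf'"

# Run the restore with the sqlcmd bundled in the SQL Server image;
# 'sql-dev-volumes' and the password are placeholders of my own.
command -v docker >/dev/null 2>&1 && docker exec sql-dev-volumes \
  /opt/mssql-tools/bin/sqlcmd -S localhost -U sa \
  -P 'YourStrong!Passw0rd' -Q "${RESTORE_SQL}"
```

The WITH MOVE clauses matter because the logical file locations recorded in a Windows-made .bak won’t exist inside the Linux container.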
Now I don’t have numerous versions of SQL Server chewing up hard drive space - or CPU cycles for that matter, as they were all running instances! I have reclaimed around 15GB on my primary drive and have the flexibility to spin different SQL versions up and down as required.
When the UK went into lockdown I was engaged with a startup company, but with all the uncertainty at the time they decided to pull the plug on the project, leaving me looking for another contract just as the contracting market all but dried up.
I wasn’t that worried at the time, thinking I could use the time to sharpen my skills and work on some pet projects. However, as the months rolled on, and with changes to the tax regulations looking to make contracting much less viable, a decision had to be made.
When I reached out on LinkedIn saying that I was considering permanent roles I was flattered when a previous client contacted me and offered me a position working on the same project with the same team.
A little over a year later I was asked to take on the Technical Team Lead role, managing the developers, looking after their training requirements and progression within the company.
It’s now a little over a year since I took on that role and I’ve decided it’s time to move on, and while I’m primarily considering a return to contracting I’m not ruling out another permanent role. So I’ve had to give two months’ notice and have offered to remain until the end of March - so 10 weeks in total. This means I’m unlikely to secure anything for a few weeks yet as clients normally want contractors to start within a few days, maybe a week. But this doesn’t mean that I’m going to be sitting around idle.
The thing about contracting is that you are normally working with the latest and greatest technologies and this keeps your skills up to date.
My experience of regular employment is that options for advancement are more limited. The technology stack is essentially locked and upgrading things like .NET version is difficult to justify when everything is “working fine as it is”.
So that’s where I am now - I’ve fallen behind, or at least that’s what it feels like and I need to address that. It could just be a touch of Imposter Syndrome (or self doubt as we used to call it - it’s not a new thing) but I have a good track record and I’m driven to make it work.
Fortunately I have a Pluralsight subscription so I’m going to be hitting that pretty hard over the next few weeks - .NET 6/7 as well as C# 10/11 are all in the frame along with things like Razor Pages, Blazor and the Xamarin replacement, MAUI.
I also have to spin up another Limited Company and while that is a fairly straightforward process there are a lot of moving parts and I’m having a problem with one of the very early steps - coming up with a new company name (inability to name things is a developer trait).
So it looks like 2023 is going to be a challenging year - but you know what, after everything that’s gone on with lockdown and the frustrations that led me to hand in my notice, I’m ready for it.
In this post I’m going to take a look at my existing, somewhat aging hardware with a view to not only making sure that I can run all the required tooling, but also taking into account that I’m in the process of moving house - downsizing now that the kids have all left. I will have a much smaller office, so won’t have enough space for all my current equipment, which also needs to be downsized. Like most developers, I have accumulated quite a lot of ‘kit’ over the years and I’m never keen to just get rid of it - but I simply don’t have the room for it all.
Currently I have:
The iMac is the only device I don’t use on a regular basis - I may well restore the latest version of macOS that will run on it (Mavericks, I think) and either sell it on or give it away. One down.
The workstation is essentially my daily driver and has been for over a decade. It runs an i7 3770, 32GB RAM, SSDs and until recently a GeForce 350 GPU (long story). Running Windows 10 Pro and connected to a 32” curved Samsung primary display and a 24” Asus secondary, it has served me well (and frankly still does). I’d be loath to get rid of it.
But it won’t run Windows 11, and while I personally don’t really care about that, there is the issue of Visual Studio compatibility. Currently Visual Studio 2022 requires Windows 10 as a minimum, i.e. it will not run on Windows 8 or anything earlier.
So, what about Visual Studio vNext? What operating system requirements will that have - will it support Windows 10 or will it require Windows 11? If it’s the former then there isn’t an issue, but if it’s the latter then I have a problem - and ultimately Visual Studio will undoubtedly adopt a requirement for Windows 11. The clock is ticking.
This ultra portable device is a great piece of kit and surprisingly capable - running Windows 10 and all the current dev tools I need, e.g. Visual Studio 2022, SQL Server + Management Studio, MySQL + Workbench etc.
The problem is that, despite being a much newer device, Windows 11 is not compatible with the Surface Pro - so I have the same problem as I have with the workstation, ultimately I won’t be able to run Visual Studio.
I am aware that I can ‘force’ install Windows 11 on the Surface Pro, and maybe the workstation, but am I just storing up trouble for later?
This replaced the iMac when it stopped receiving operating system updates - and hence stopped receiving XCode updates (XCode being required to build iOS applications - even if they are written with Xamarin or MAUI).
Well, the MacBook Pro is now in the same position as the iMac it replaced - it’s no longer receiving macOS updates so it’s stuck on Monterey. What I don’t know is how long Xcode will support this version.
As soon as XCode drops support for Monterey then the MacBook Pro will no longer be able to be used to build iOS apps that the App Store will accept (apps need to be built using at least a specified version of XCode and iOS).
At the time of writing the current version of XCode is 14 while the current version of iOS is 16. Apple state that apps must be built with at least iOS 15 using at least XCode 13.
Looking at Visual Studio 2022 for Mac, the minimum macOS version is actually Catalina (that’s two operating systems back from Monterey, to save you looking) so it doesn’t look like that will be a problem - for now, and assuming the same cadence.
Taking the above into account, even if XCode 15 doesn’t support Monterey it looks like the MacBook Pro currently has some life left in it even if it’s ‘only’ for general web development.
The high level options are Windows or Mac - if I opt for the Mac then there are really only a couple of options, both MacBook Pro models varying by screen size and CPU (and of course price).
If I go for Windows then the choice is much wider, but I’d probably look at the Dell XPS 15, HP Spectre or Lenovo Thinkbook.
But there’s a problem; if I go for Windows then ultimately I’ll be unable to build iOS apps when the MacBook Pro becomes unsupported by XCode. The question is, does that matter? If it does then are there viable alternatives I can use to perform my builds, e.g. AppCenter or MacInTheCloud.
On the other hand if I go for the Mac then, as a developer using Windows for some time, I’ll need to get up to speed with the Mac way of doing things and make sure that I can do everything on a Mac that I can do on Windows. I know I can run SQL Server on a Mac using Docker but I don’t know what I don’t know ;-).
So that’s what I intend to do for the next month or two - in conjunction with the development plans from my previous post I will try to work on the Mac as much as I can and see where I get to (attempting to blog my findings of course).