Saturday, 15 June 2019

A few thoughts on the recent Moe bus re-timetabling

Photo: LVBL Volvo B5L bus outside the Latrobe City Moe Library and Service Centre. Photo Credit: Volvo

Roughly two years ago, PTV completely overhauled the bus network in the Latrobe Valley, operated by Latrobe Valley Bus Lines. Amongst other changes, the new routes and timetables split the Newborough loop out of the Moe to Traralgon trunk route (Route 1), creating two local Newborough routes (Route 14 and the new Route 15), and streamlined Route 1 to run more reliably, more regularly, and for longer hours. Whilst many Newborough locals were opposed to the changes, I personally found them to be a considerable improvement. The new Route 15 provided service where there previously was none, and the streamlined Route 1 meant you could get from Moe/Newborough to Morwell in roughly 15 minutes, and to Traralgon in around 35.

Even though I spend most of my time nowadays in Melbourne, the bus network in the Latrobe Valley remains of considerable importance to me. It’s how I got around when I lived there, and now it’s how I get around when I’m in the area visiting family or friends. As such, I have developed an array of opinions on how the service is run, most of them not positive. I’ve written before, both to LVBL and PTV, about the various faults I find in the operation of the service, such as the lack of timely connections to trains at Moe Station, or Route 1 buses arriving late into Moe Bus Interchange after the local town routes have already departed. Writing to LVBL simply resulted in them asking me to redirect my complaint to PTV, and writing to PTV garnered basically no tangible response; they simply stated that they’d keep my concerns in mind rather than opening a dialogue.

So, understandably, I was both excited and hesitant when I heard that PTV are due to enact a new set of bus timetables for the Latrobe Valley on the 23rd of June. Would we finally see a more logical set of routes and timetables for Moe, with proper connections to trains and other bus routes, or would we see just a general expansion of service with the same faults still present?

Sunday, 19 May 2019

C# Dirty Delegate Hack

Just a quick one for today. For a uni C# assignment I had to implement a multicast delegate that processes a list of data, with the delegate composed of three disparate functions, which I also had to implement according to a unit-test spec. Data has to flow between each step of the delegate to produce the correct output. The usual way to do this would be to pass by reference and alter the original data in place; however, that wasn’t possible here, as the tests required the delegate methods to accept their parameters by value. After thinking for some time, I came to the following solution. Beware, it’s fairly horrible.

In the classes containing the delegate methods, I define a helper function:

public static List<List<string>> S(List<List<string>> oldData, List<List<string>> newData) {
    for (var i = 0; i < oldData.Count; i++) oldData[i] = newData[i];
    return oldData;
}

Then, in the delegate methods, I wrap the result expression in this helper function, passing the original list as the first parameter:

public List<List<string>> StripWhiteSpace(List<List<string>> data)
     => S(data, data.Select(row => row.Select(val => val.Trim()).ToList()).ToList());

This results in the new data being returned from the method (satisfying the tests) as well as the original data being replaced (allowing the data to flow to the next delegate method). How does it work? It comes down to the fact that the data I’m working with here is a reference type; what gets passed between methods is a copy of the reference to the same underlying object, not a copy of the data itself. It did take me a while to reach this conclusion; my initial thinking was that since I’m working with strings, which behave like values, the data would effectively be passed by value. When I realised I was actually working with Lists, which are reference types, it occurred to me that I could write a helper method to modify each member of the original List in place. Because the new data is placed back into the original List object, the same object that is referenced later on, the data carries itself forward into the next methods of the delegate.
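To make the mechanism concrete, here’s a minimal, self-contained sketch of the pattern. The delegate type name, the second step (ToUpperCase) and the sample data are stand-ins of my own, not the assignment’s actual spec:

```csharp
// Minimal sketch of the "dirty delegate" pattern. DataStep, ToUpperCase and
// the sample data are illustrative stand-ins, not the assignment's real spec.
using System;
using System.Collections.Generic;
using System.Linq;

public delegate List<List<string>> DataStep(List<List<string>> data);

public static class Pipeline
{
    // The helper from above: copy the new rows into the original List object,
    // so later delegate methods (which hold the same reference) see the change.
    public static List<List<string>> S(List<List<string>> oldData, List<List<string>> newData)
    {
        for (var i = 0; i < oldData.Count; i++) oldData[i] = newData[i];
        return oldData;
    }

    public static List<List<string>> StripWhiteSpace(List<List<string>> data)
        => S(data, data.Select(row => row.Select(val => val.Trim()).ToList()).ToList());

    public static List<List<string>> ToUpperCase(List<List<string>> data)
        => S(data, data.Select(row => row.Select(val => val.ToUpper()).ToList()).ToList());

    public static void Main()
    {
        var data = new List<List<string>> { new List<string> { "  hello ", " world" } };

        // Multicast delegate: each method returns by value (satisfying the
        // tests), but S() also mutates the shared List, so the second step
        // sees the first step's output.
        DataStep steps = StripWhiteSpace;
        steps += ToUpperCase;
        steps(data);

        Console.WriteLine(string.Join(",", data[0])); // prints "HELLO,WORLD"
    }
}
```

If the first step hadn’t called S(), ToUpperCase would receive the untrimmed originals, because its `data` parameter is just another reference to the same List the caller holds.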

Saturday, 11 May 2019

Open Live Writer, Blogger and Google Drive now working


Photo: Open Live Writer running on Windows 10, taken at Docklands Melbourne. This image was published to this Blogger blog with Open Live Writer.

Beginning with the next release of Open Live Writer (0.6.3), Blogger users will once again be able to successfully post images to their blogs. After being affected by the issue myself in my own usage of the software, I took it upon myself to develop and deliver a fix. Now, instead of uploading to Google Photos via the now non-functional Picasa API, Open Live Writer will upload the image to Google Drive, publicly link-share it, and then embed the direct URL to the image within your post. This all happens automatically from within Open Live Writer; there are no work-arounds or hacks at play.
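The flow itself is simple enough to sketch. Below is a rough illustration of the upload → link-share → embed sequence using the Google.Apis.Drive.v3 client library; the class and method names here, and the direct-link format, are my assumptions for illustration, and this is not the actual Open Live Writer code:

```csharp
// Hypothetical sketch of the upload -> link-share -> embed flow, using the
// Google.Apis.Drive.v3 NuGet package. OAuth setup is omitted; 'service' is
// assumed to be an already-authenticated DriveService.
using System.Threading.Tasks;
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;

public static class DriveImagePublisher
{
    public static async Task<string> UploadAndShareAsync(DriveService service, string path)
    {
        // 1. Upload the image file to Google Drive.
        var metadata = new Google.Apis.Drive.v3.Data.File
        {
            Name = System.IO.Path.GetFileName(path)
        };
        using var stream = System.IO.File.OpenRead(path);
        var create = service.Files.Create(metadata, stream, "image/jpeg");
        create.Fields = "id";
        await create.UploadAsync();
        var fileId = create.ResponseBody.Id;

        // 2. Link-share it publicly (anyone with the link can view).
        var permission = new Permission { Type = "anyone", Role = "reader" };
        await service.Permissions.Create(permission, fileId).ExecuteAsync();

        // 3. Build a direct URL that can be embedded in the post's <img> tag.
        return DirectLink(fileId);
    }

    // Assumed direct-link format; Drive serves the raw file from this endpoint.
    public static string DirectLink(string fileId)
        => $"https://drive.google.com/uc?id={fileId}";
}
```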

Monday, 8 April 2019

An afternoon wasted and an exercise in partition frustration

…or “What to do if you lose Mac OS after resizing your Windows Bootcamp partition”

TL;DR: Scroll to the bottom of the post, where I outline the steps I took to make my Mac partition bootable again without any data loss.

I was meant to spend this afternoon completing an assignment for my course due Thursday. Instead, I nearly lost all of my programming projects dating back to when I began to take programming seriously at age twelve. This is your typical "always have and check backups" story; something I thought I was immune to – until now, at least.

It began when I decided I needed to expand my Mac’s Windows 10 Bootcamp partition. I thought this would be an easy task: I’d shrink my APFS partition from Mac OS, as Disk Utility would be the best utility to accomplish that task, and then expand my Windows partition from Windows, as I don’t trust Mac OS to expand NTFS. Shrinking the APFS partition went swimmingly; it took about 15 minutes and, apart from Disk Utility freezing, seemed to complete without a hitch.

The next step was to expand my Bootcamp NTFS partition to fill the unallocated space on the SSD. I decided to undertake this task in Windows, rather than booting a Linux live USB. The first tool I tried was the built-in Windows Disk Management, however it quickly became apparent that it cannot resize a currently mounted partition, such as the Windows C: drive. So off to the internet I went, in search of a partitioning tool that could somehow accomplish the task. I landed on EaseUS Partition Master; it was a program I had heard of before and it seemed fairly reputable, advertising itself as a safe way to manage partitions. This turned out, in my case, to be false.

Friday, 5 April 2019

Introducing PTData – easily obtain Victorian public transport data in a tabular format

Today marks the public release of my newest project, PTData. PTData makes obtaining data from the PTV API easy by displaying it in a tabular format, linking the data together via clickable links, and making all of it available for download as CSV. PTData can currently request lists of routes, lists of stops, lists of departures and a service’s stopping pattern, with more features planned for the future.

My initial intention for this project was to make it easier to import bus timetable data into Excel. Copy/paste just wasn’t formatting into the cells correctly for me, which led to me entering the times manually. This quickly became a tedious and frustrating process, and that was what initially spurred me to start the project. Almost exactly two weeks later, and here we are.

The site sports quite a simple design as it currently stands, and that has for the most part been intentional. In some ways I’m nostalgic for the bare-bones web design trends of the early 2000s, but at the same time the simplicity serves the practical purpose of being information-dense. As for the technology powering the site, I’ve kept it quite simple so as to get the site off the ground as quickly as possible. The site itself is written in Ruby, utilising the Sinatra web framework and Dylan Shaw’s ruby-ptv-api gem. As for why I chose Ruby and Sinatra, it’s pretty much what I’m most comfortable with for developing web applications of this nature. I can see myself changing languages and frameworks going forward should the need arise.

There is currently no dedicated database storing the information from PTV permanently, although I do have the rendered pages being cached by nginx and then later Cloudflare to ease load on the API. I do plan to integrate a database eventually, and this would open a whole world of opportunities in regards to working with the data. On a surface level, it would allow for the rendering of a traditional ‘timetable’ view, but going further it would make possible some interesting analysis. Average delay per route, how time of day affects the network, etc.

The site’s code is open-source over at https://github.com/nvella/ptdata, and I welcome any issue reports, ideas, or pull requests. The codebase is very rudimentary as it currently stands, however I can see it quickly becoming a rather complex project, with numerous backend services performing different tasks.

By now you’re probably wondering what I’m going to use it for. For a long time I’ve wanted to perform an analysis on the bus timetables and connections in the Latrobe Valley. From experience I’ve found the inter-modal and inter-route connections to be quite poor, but with this tool I can now state with exact numbers just how bad it actually is. Stay tuned for a future blog post…

Sunday, 31 March 2019

Dealing with composite primary keys and EntityFramework scaffolded controllers

A recent uni assignment required me to implement an ASP.NET Core REST API using EntityFrameworkCore on a pre-existing database. Whilst the EF Core tooling tends to scaffold controllers for models with single primary keys just fine, I’ve found that it has no support for composite primary keys. This requires you to alter your controller classes to add the support manually, which in my case involved the following:
  • Alter the paths of the individual Get, Put, and Delete endpoints to contain two (or more, depending on how many keys are in your composite) parameters, and then alter the method parameters to match. This requires careful consideration as there are many different ways you can structure the parameters in your route.
  • Alter every usage of _context.nameOfModelHere.FindAsync to contain all components of your primary key. Look at the model’s Entity definition in your database context class, specifically the usage of entity.HasKey, to determine the order in which to list the components in your key to FindAsync.
  • Alter the nameOfModelHereExists method to take the correct number of parameters for your key. Adjust the body of the method accordingly to check all parts of the key; I just added && at the end of the existing equality expression, and added more equality expressions.
    • Alter all usages of nameOfModelHereExists appropriately.
  • In the Put method, alter the second if statement to check all the components of your primary key, rather than just id.
  • In the Post method, adjust the CreatedAtAction call in the return statement to contain all the components of your primary key.
  • I would also recommend updating all the auto-generated comments that EntityFramework put in to keep things consistent.
Below is my implementation of these steps made to a controller titled BookCopies. The components of the primary key are int CopyCount and string ISBN.
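As a sketch of what those alterations look like in practice, here is a reconstruction of the relevant parts of such a controller. The model and context names (BookCopy, LibraryContext) and the route layout are my assumptions; the scaffolded original will differ in the details:

```csharp
// Hedged reconstruction of a scaffolded controller altered per the steps
// above. BookCopy, LibraryContext and the {isbn}/{copyCount} route layout
// are assumed names, not the exact original code.
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

[Route("api/[controller]")]
[ApiController]
public class BookCopiesController : ControllerBase
{
    private readonly LibraryContext _context;
    public BookCopiesController(LibraryContext context) => _context = context;

    // GET: api/BookCopies/{isbn}/{copyCount} — two route parameters, one per key component.
    [HttpGet("{isbn}/{copyCount}")]
    public async Task<ActionResult<BookCopy>> GetBookCopy(string isbn, int copyCount)
    {
        // FindAsync takes the key components in the order declared by
        // entity.HasKey in the database context (here: CopyCount, then ISBN).
        var bookCopy = await _context.BookCopies.FindAsync(copyCount, isbn);
        if (bookCopy == null) return NotFound();
        return bookCopy;
    }

    // PUT: api/BookCopies/{isbn}/{copyCount}
    [HttpPut("{isbn}/{copyCount}")]
    public async Task<IActionResult> PutBookCopy(string isbn, int copyCount, BookCopy bookCopy)
    {
        // Check every component of the key, not just a single id.
        if (isbn != bookCopy.ISBN || copyCount != bookCopy.CopyCount) return BadRequest();
        _context.Entry(bookCopy).State = EntityState.Modified;
        try { await _context.SaveChangesAsync(); }
        catch (DbUpdateConcurrencyException)
        {
            if (!BookCopyExists(isbn, copyCount)) return NotFound();
            throw;
        }
        return NoContent();
    }

    // POST: api/BookCopies
    [HttpPost]
    public async Task<ActionResult<BookCopy>> PostBookCopy(BookCopy bookCopy)
    {
        _context.BookCopies.Add(bookCopy);
        await _context.SaveChangesAsync();
        // CreatedAtAction needs every component of the key to build the route.
        return CreatedAtAction(nameof(GetBookCopy),
            new { isbn = bookCopy.ISBN, copyCount = bookCopy.CopyCount }, bookCopy);
    }

    // DELETE: api/BookCopies/{isbn}/{copyCount}
    [HttpDelete("{isbn}/{copyCount}")]
    public async Task<IActionResult> DeleteBookCopy(string isbn, int copyCount)
    {
        var bookCopy = await _context.BookCopies.FindAsync(copyCount, isbn);
        if (bookCopy == null) return NotFound();
        _context.BookCopies.Remove(bookCopy);
        await _context.SaveChangesAsync();
        return NoContent();
    }

    // Exists check expanded with && to cover all parts of the composite key.
    private bool BookCopyExists(string isbn, int copyCount)
        => _context.BookCopies.Any(e => e.ISBN == isbn && e.CopyCount == copyCount);
}
```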

Monday, 18 March 2019

Back in action…

So I’ve decided to start blogging again. You should be able to expect a myriad of topics from me, mainly with a focus on software development and my love for public transport.

In terms of the tools I’ve decided to equip myself with: the blog itself is hosted on Blogger, and I’m authoring the posts with Open Live Writer, giving the whole experience a very late-2000s feel for me. I’m not sure how long I’ll be able to put up with this.

Until next time…