ACM 2008 SCUSA Wormholes - 2nd Edition

Last time we left our time travelling heroes, they were using repeated applications of the Bellman-Ford algorithm (at time complexity O(|V| * |E|), where |E| is on par with |V|, and the number of requests is probably also on the order of |V|, giving a total of O(|V|^3)). Our valiant programmer desperately seeks to apply the Floyd-Warshall algorithm to the problem, just so that the time spent reading the analysis of algorithms book isn’t wasted.

The secret to successfully applying the Floyd-Warshall algorithm to this problem is the same secret to getting reasonable performance out of the previous solution: you can’t look at each individual year; you instead must look at the time spans between meaningful events. That brings the number of vertices down from many thousands to at most 301 (the starting year + 200 distinct wormhole years + 100 mission years). It has the downside of making the process of finding weights a fair bit more complex:

public int[,] BuildInitialWeights(List<int> distinctVerticies)
{
    int[,] weights = new int[distinctVerticies.Count, distinctVerticies.Count];
    for(int i=0; i<distinctVerticies.Count; i++)
        for(int j=0; j<distinctVerticies.Count; j++)
        {
            if (i == j) weights[i,j] = 0;
            else weights[i,j] = Int32.MaxValue;
        }
 
    //assumes vertices are in order, from first year to last
    for (int i = 1; i < distinctVerticies.Count; i++)
    {
        int start = distinctVerticies[i - 1];
        int end = distinctVerticies[i];
        weights[i-1, i] = end - start;
    }
    foreach (int[] years in wormholes)
    {
        int i = distinctVerticies.IndexOf(years[0]);
        int j = distinctVerticies.IndexOf(years[1]);
        int weight;
        if (years[1] > years[0]) weight = (years[1] - years[0]) / 2;
        else weight = (years[1] - years[0]) / 4;
        weights[i, j] = weight;
    }
    return weights;
}

Here we’re building a 2D matrix of weights. Informally, we’re mapping each year in the distinct vertices list to its index, then using that index as a coordinate in the weight matrix. Weights are constructed by first setting every entry to its default value (0 to stay at the current node, Int32.MaxValue for nodes with no known path). Then the default weights for moving forward in time are assigned. Finally, wormholes are accounted for. Wormholes are handled last because a wormhole edge is never more expensive than the plain passage-of-time edge it overwrites. This has the advantage that only one edge is ever stored between two vertices, whereas the previous solution allowed multiple edges between the same pair of points.
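To make that concrete, here is a tiny worked example with numbers of my own choosing: a starting year of 2000, a single mission to 2020, and one wormhole from 2020 back to 2000. The distinct vertices are [2000, 2020], so the matrix is 2x2. The diagonal is 0, the forward passage-of-time edge gives weights[0,1] = 20, and the wormhole gives weights[1,0] = (2000 - 2020) / 4 = -5; a round trip therefore costs 20 + (-5) = 15 years of aging.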

public List<int> BuildDistinctVerticies(List<int> addedYears)
{
    List<int> years = new List<int>(addedYears);
    foreach (EndPoint wormhole in wormholes)
    {
        years.Add(wormhole.Start);
        years.Add(wormhole.End);
    }
    years.Sort();
    for (int i = 1; i < years.Count; i++)
        if (years[i - 1] == years[i])
            years.RemoveAt(i--);
    return years;
}

This is the simple method for building all distinct vertices. It starts by taking all the mission years plus the starting year, adds both endpoints of each wormhole to the list, then sorts it and removes duplicates. Again we get into ‘it would be nice to have a Sorted Set’ (it should appear in .NET 4; or, alternatively, it has always been in Java… accursed Java with its nicer Collections libraries…).
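For what it’s worth, here’s a minimal sketch of what the same method could look like on top of .NET 4’s SortedSet<int> (an assumption on my part, since the rest of this code predates it); the sort-and-dedup dance goes away entirely:

// Hypothetical .NET 4 variant: SortedSet<int> keeps the years ordered
// and unique for us, so no Sort()/RemoveAt() pass is needed.
public List<int> BuildDistinctVerticies(List<int> addedYears)
{
    SortedSet<int> years = new SortedSet<int>(addedYears);
    foreach (EndPoint wormhole in wormholes)
    {
        years.Add(wormhole.Start); // duplicates are silently ignored
        years.Add(wormhole.End);
    }
    return new List<int>(years);   // enumerates in ascending order
}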

public int[,] AllPairs(List<int> missionYears, out List<int> distinctVerticies)
{
    distinctVerticies = BuildDistinctVerticies(missionYears);
    int[,] currentWeights = BuildInitialWeights(distinctVerticies);
    int[,] nextWeights = new int[distinctVerticies.Count, distinctVerticies.Count];
    for (int i = 0; i < distinctVerticies.Count; i++)
    {
        Extend(currentWeights, nextWeights, i);
        int[,] temp = currentWeights;
        currentWeights = nextWeights;
        nextWeights = temp;
    }
    return currentWeights;
}
 
public void Extend(int[,] previosLevel, int[,] thisLevel, int k)
{
    for(int i=0; i<previosLevel.GetLength(0); i++)
        for(int j=0; j<previosLevel.GetLength(1); j++)
        {
            int iToJ = previosLevel[i,j];
            int iToK = previosLevel[i,k];
            int kToJ = previosLevel[k,j];
            if (iToK != Int32.MaxValue && kToJ != Int32.MaxValue)
                thisLevel[i,j] = Math.Min(iToJ, iToK + kToJ);
            else thisLevel[i,j] = iToJ;
        }
}

I’m about 90-100% certain I misspelled previous. Ignoring that, here we can see the Floyd-Warshall algorithm implemented. I’m still wrapping my head around how it works, but my best understanding is that the ‘extension’ method asks, for every pair (i, j), whether routing through vertex k beats the best path found so far; after the k-th pass, the matrix holds the shortest paths that only use the first k+1 vertices as intermediate stops. There’s a bit more to it, but it’s too early in the morning to worry about that.
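As an aside, the textbook formulation doesn’t actually need the two-matrix ping-pong in AllPairs/Extend: updating a single matrix in place gives the same answer, because during pass k the entries in row k and column k never improve (d[k,k] stays 0 here, and as far as I can tell the aging formula can’t produce a negative cycle, since a loop’s forward legs cost at least twice what its backward wormholes give back). A minimal in-place sketch, using the same Int32.MaxValue-means-no-path convention as above:

// In-place Floyd-Warshall over the same weight-matrix conventions used
// above. A sketch for comparison, not a drop-in replacement.
public void FloydWarshallInPlace(int[,] d)
{
    int n = d.GetLength(0);
    for (int k = 0; k < n; k++)
        for (int i = 0; i < n; i++)
        {
            if (d[i, k] == Int32.MaxValue) continue; // no path i -> k
            for (int j = 0; j < n; j++)
            {
                if (d[k, j] == Int32.MaxValue) continue; // no path k -> j
                int viaK = d[i, k] + d[k, j];
                if (viaK < d[i, j]) d[i, j] = viaK;
            }
        }
}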

The distinct vertices are passed as an out parameter to the caller, so that they have the map from index to year available to them.

List<int> years = new List<int>(missions);
years.Add(startingYear);
int[,] distances = timeline.AllPairs(years, out years);
 
foreach(int destination in missions)
{
    int i = years.IndexOf(startingYear);
    int j = years.IndexOf(destination);
    int toDistance = distances[i, j];
    int fromDistance = distances[j, i];
    if(toDistance==Int32.MaxValue || fromDistance==Int32.MaxValue)
        Console.WriteLine("IMPOSSIBLE");
    else
        Console.WriteLine(toDistance+fromDistance);
}

Finally, the relevant changes to the problem-solving method. The downside to using this approach is that there isn’t a direct mapping from year to index, so a linear search is performed to get that mapping. It’s not really an issue, considering what else is being done, but it is a bit unfortunate.
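If that linear search ever became a bottleneck, a small year-to-index dictionary built once from the distinct vertices list would make the lookup constant time. A sketch (the indexOfYear name is mine, not from the full source):

// Hypothetical helper: build the year -> matrix-index map once,
// instead of calling List.IndexOf() for every mission.
Dictionary<int, int> indexOfYear = new Dictionary<int, int>();
for (int i = 0; i < years.Count; i++)
    indexOfYear[years[i]] = i;

foreach (int destination in missions)
{
    int from = indexOfYear[startingYear];
    int to = indexOfYear[destination];
    // ...same distance checks as above, using from/to as indices...
}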

Analysis
Pseudo-analysis might be more accurate. Comparing this to the previous solution, I like the fact that the algorithm seems more straightforward, and that in the end we’re left with a table of every pairwise path cost, should our time travelling agents ever need to do other strange things. It also has a slight performance advantage over the straight version from last time. (It’s about on par with the Bellman-Ford variant that stops once no relaxations occur, though that might be in part due to a poor implementation of that algorithm.)

That said, in its current form, it adds about 30 lines of code, and the indirect mapping from index to representative year feels ungood. In the end, the question for ACM problems is ‘which could you code fastest with the lowest probability of error’, and looking at the two, I think Bellman-Ford would be a bit easier, specifically because it uses a list of edges rather than a weight table. I could be wrong though, and the performance does seem to be in favor of Floyd-Warshall.

Side Rant
I seem to be rather poor at a very specific bit of contestifying: seeing the points where you can discard unimportant information. In this problem, it was my attempt to treat every year as a vertex. In a problem about a disk defragmenter, I was tempted to treat the disk as a series of sectors rather than a series of filled and unfilled ranges (which would have massively increased the simulation time, and probably overcomplicated the code; I didn’t get to find out, due to the contest ending). This seems like a common theme for me: I constantly attempt the simplest solution that could possibly work, and these problems are more often than not tuned to make that ‘the wrong way’. I suspect this comes from my many years of employment, where you’re not necessarily seeking the best solution, but rather just something that works well enough to get by. That is, of course, part of why I started this crazy blog: to get better at recognizing when I’m going down an incredibly wrong path.

Full Source Code for this solution.

ACM 2008 SCUSA Wormholes

South Central USA has to be one of the odder acronyms in common use… a single character away from SCUBA no less.

The last time I went to ACM, this problem was delegated to one of my teammates, and for one reason or another we never got it solved. I was reading through an Analysis of Algorithms book, came across a section on All Pairs Shortest Paths, and thought ‘this would apply perfectly and elegantly to this situation’. (I was wrong, but we’ll get to that soon.) (Also, judge data can be had at this location, and I have been cheating miserably.)

Synopsis
You work at Time Travel Inc., in the payroll department. Sick of keeping up with all the math related to time travel, you contracted yourself (while in college) to develop an application that determines how much a person has aged during their time travel, and whether or not their last journey was even physically possible. More specifically, time travel is achieved by portals which appear at specific moments in time and exit at other specific moments, either aging or youthifying the human in the process according to a specific formula; and ignoring the concept that you could die, or be negative years old.
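To pin the formula down (this is my reading of the weight code in BuildInitialWeights earlier in the post, not the problem statement’s wording): simply waiting ages you one year per year, a wormhole jumping forward N years ages you N/2 years, and a wormhole jumping backward N years makes you N/4 years younger. So a hop from 2000 forward to 2100 through a wormhole costs (2100 - 2000) / 2 = 50 years of aging, while returning through a 2100-to-2000 wormhole costs (2000 - 2100) / 4 = -25 years.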

Analysis
This is a graph traversal problem with negative-weight directed edges. The goal is to find the minimum weight path that takes you from the start to a destination and back to the start; or, equivalently, two separate shortest-path traversals.

As I said, I had read a bit on All Pairs Shortest Paths and was interested to see how it would handle this. Specifically, I was thinking: if you treat each year as a vertex, then each vertex has a weight-one edge to the next year, and, if a wormhole is present, an edge to that destination year. At that point in time, I thought the input was constrained to a few hundred years around the 2000s (where else would you time travel to?). The actual range of years is 1-10,000. 10,000 vertices, two 10,000 x 10,000 matrices for storing path weights… no. Not gonna work. (As we’ll see in a bit, this is actually a bit of a lie.)

Still curious to further my learning, I read up on the Bellman-Ford algorithm, which is a single-source shortest paths algorithm. It works off the fact that a shortest path passes through each vertex at most once, so you can determine the minimum cost to get anywhere by going through each edge, relaxing the edge’s destination weight if appropriate, and then repeating that |V| times. Complexity is O(|V| * |E|). That actually sort of worked, in the sense that it produced an answer for most problems in under a minute. But under a minute isn’t quite ideal.

To get reasonable response times, you can’t simulate every single meaningless year. You have to focus on only the meaningful ones, and declare edges between them. If you’re in 1950 and want to get to 2050, yes, you can look at it as 100 one-year edges, but it would be less pain to look at it as one 100-year edge… or, if a wormhole happens to open up in 2012, a 62-year edge followed by a 38-year edge. The downside to ignoring meaningless years is that it adds a fair bit of complex, non-intuitive code, just for the purpose of filling in the ‘time moves forward at a constant rate’ bits.

With that said, you could apply the same reasoning to all pairs shortest paths, so long as you only kept track of the vertices you actually cared about.

Source Code
As this one is a tad bit long, and far from being the greatest, I’m only going to post bits and pieces of it.

public static void ProcessProblem(int id, TextReader input)
{
	Console.WriteLine("DATA SET #" + id);
	Timeline timeline = new Timeline();
	int wormHoleCount = ReadInt(input);
	for (int i = 0; i < wormHoleCount; i++)
		timeline.AddWormhole(ReadInts(input));
	int startingYear = ReadInt(input);
	int missionCount = ReadInt(input);
	for (int m = 0; m < missionCount; m++)
	{
		int destination = ReadInt(input);
		int distance = timeline.ShortestPath(startingYear, destination);
		int returnDistance = timeline.ShortestPath(destination, startingYear);
		if(distance==Int32.MaxValue || returnDistance==Int32.MaxValue)
			Console.WriteLine("IMPOSSIBLE");
		else
			Console.WriteLine(distance+returnDistance);
	}
}

Above is the primary problem solving bit. We create a timeline, register wormholes, then go through each mission and determine if the destination can be reached from the starting point, and if the starting point can be reached from the destination. Note that we’re using the max value of an int as a ‘no path’ return code.

class Timeline
{
	private List<int[]> wormholes=new List<int[]>();
 
	public void AddWormhole(int[] wormhole)
	{
		wormholes.Add(wormhole);
	}
 
	public int ShortestPath(int sourceYear, int destYear)
	{
		List<Edge> edges = new List<Edge>();
		List<int> vertexes = new List<int>();
		vertexes.Add(sourceYear); vertexes.Add(destYear);
		foreach (int[] wormhole in wormholes)
		{
			vertexes.Add(wormhole[0]);
			vertexes.Add(wormhole[1]);
			edges.Add(new Edge(wormhole[0], wormhole[1], true));
		}
		vertexes.Sort();
		for (int i = 1; i < vertexes.Count; i++)
			if (vertexes[i] == vertexes[i - 1])
				vertexes.RemoveAt(i--);
		for (int i = 1; i < vertexes.Count; i++)
			edges.Add(new Edge(vertexes[i - 1], vertexes[i], false));
 
		Dictionary<int, int> distances = new Dictionary<int, int>();
		distances[sourceYear] = 0;
 
		for (int i = 0; i < vertexes.Count; i++)
			foreach (Edge edge in edges)
			{
				if (!distances.ContainsKey(edge.start)) continue;
				int distance = distances[edge.start] + edge.weight;
				if (!distances.ContainsKey(edge.end) || distances[edge.end] > distance)
					distances[edge.end] = distance;
			}
		if (distances.ContainsKey(destYear))
			return distances[destYear];
		return Int32.MaxValue;
	}
}

And here we have the meat of the problem: the graph traversal algorithm/Timeline class. The first portion of the ShortestPath method is solely responsible for building the list of edges (note: the third parameter to the Edge constructor is ‘isWormhole’). Once that is constructed, the Bellman-Ford relaxation pass is run |V| times to get the distance from the start point to every other point. Bellman-Ford performance can actually be improved slightly by not continuing to iterate if no relaxations were needed on a given pass; on my machine, execution time went from about 3s to 0.5s. (Also, as a fun aside, note the sad lack of a .NET Set class.)
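That early-exit tweak isn’t shown in the fragment above; a minimal sketch of the relaxation loop with it added (my rework of the loop from ShortestPath, same Edge and distances names) would look something like:

// Early-exit variant of the relaxation loop: stop as soon as a full pass
// over the edges makes no improvements. Worst case is still |V| passes,
// but typical inputs finish much sooner.
for (int i = 0; i < vertexes.Count; i++)
{
	bool relaxed = false;
	foreach (Edge edge in edges)
	{
		if (!distances.ContainsKey(edge.start)) continue;
		int distance = distances[edge.start] + edge.weight;
		if (!distances.ContainsKey(edge.end) || distances[edge.end] > distance)
		{
			distances[edge.end] = distance;
			relaxed = true;
		}
	}
	if (!relaxed) break; // no changes this pass; distances are final
}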

That’s all for now. I think I’ll reimplement this with an all pairs shortest path just to see if it would meet time constraints on execution; and make a post about that later.

Full Source Code for First Solution.