ACM 2008 SCUSA Wormholes - 2nd Edition

Last time we left our time travelling heroes, they were using repeated applications of the Bellman-Ford algorithm (at time complexity O(|V| * |E|) per run, where |E| is on par with |V| and the number of requests is probably also on the order of |V|, giving a total of O(|V|^3)). Our valiant programmer desperately seeks to apply the Floyd-Warshall algorithm to the problem, just so that the time spent reading the analysis of algorithms book isn't wasted.

The secret to successfully applying the Floyd-Warshall algorithm to this problem is the same secret to getting reasonable performance out of the previous solution: you can't look at each individual year; you instead must look at the time spans between meaningful events. That brings the number of vertices down from many thousands to at most 301 (the starting year + 200 distinct wormhole years + 100 mission years). It has the downside of making the process of finding weights a fair bit more complex:

public int[,] BuildInitialWeights(List<int> distinctVerticies)
{
    int[,] weights = new int[distinctVerticies.Count, distinctVerticies.Count];
    for (int i = 0; i < distinctVerticies.Count; i++)
        for (int j = 0; j < distinctVerticies.Count; j++)
        {
            if (i == j) weights[i, j] = 0;       //staying put costs nothing
            else weights[i, j] = Int32.MaxValue; //stand-in for 'no path yet'
        }

    //assumes vertices are in order, from first year to last
    for (int i = 1; i < distinctVerticies.Count; i++)
    {
        int start = distinctVerticies[i - 1];
        int end = distinctVerticies[i];
        weights[i - 1, i] = end - start; //normal aging while waiting between events
    }
    foreach (EndPoint wormhole in wormholes)
    {
        int i = distinctVerticies.IndexOf(wormhole.Start);
        int j = distinctVerticies.IndexOf(wormhole.End);
        int weight;
        if (wormhole.End > wormhole.Start)
            weight = (wormhole.End - wormhole.Start) / 2; //forward: age half the span
        else
            weight = (wormhole.End - wormhole.Start) / 4; //backward: grow younger by a quarter of the span
        weights[i, j] = weight;
    }
    return weights;
}

Here we're building a 2D matrix of weights. Informally, we're mapping each year in the distinct vertices list to its index, then using that index as coordinates in the weight matrix. Weights are constructed by first setting everything to default values (0 to stay at the current node, Int32.MaxValue for every other node), then assigning the default weights for moving forward through time, and finally accounting for wormholes. Wormholes are applied last because a wormhole edge is, by the problem's rules, always cheaper than waiting out the same span: a forward wormhole costs half the span (jumping from 1900 to 2000 ages you 50 years instead of 100), and a backward wormhole has negative cost (jumping from 2000 back to 1900 makes you 25 years younger). This has the advantage that only one edge is ever kept between two vertices, whereas the previous solution allowed for multiple parallel edges.

public List<int> BuildDistinctVerticies(List<int> addedYears)
{
    List<int> years = new List<int>(addedYears);
    foreach (EndPoint wormhole in wormholes)
    {
        years.Add(wormhole.Start);
        years.Add(wormhole.End);
    }
    years.Sort();
    //the list is sorted, so duplicates are adjacent and removable in one pass
    for (int i = 1; i < years.Count; i++)
        if (years[i - 1] == years[i])
            years.RemoveAt(i--);
    return years;
}

This is the simple method for building all distinct vertices. It starts with the mission years and the starting year, adds each endpoint of every wormhole to the list, then sorts it and removes duplicates. Again we get into 'it would be nice to have a SortedSet' territory (it should appear in .NET 4; or, alternatively, it has always been in Java… accursed Java with its nicer collections libraries…).
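
Just to see it, here's a minimal sketch of what this method could look like once a sorted set is available (speculative: this assumes .NET 4's SortedSet<int>, which this solution doesn't actually target):

//a minimal sketch, assuming .NET 4's SortedSet<int>: it keeps elements
//sorted and unique, so the Sort/RemoveAt dance goes away entirely
public List<int> BuildDistinctVerticies(List<int> addedYears)
{
    SortedSet<int> years = new SortedSet<int>(addedYears);
    foreach (EndPoint wormhole in wormholes)
    {
        years.Add(wormhole.Start); //Add silently ignores duplicates
        years.Add(wormhole.End);
    }
    return new List<int>(years); //enumerates in ascending order
}

With or without that, the all-pairs computation itself looks like this: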

public int[,] AllPairs(List<int> missionYears, out List<int> distinctVerticies)
{
    distinctVerticies = BuildDistinctVerticies(missionYears);
    int[,] currentWeights = BuildInitialWeights(distinctVerticies);
    int[,] nextWeights = new int[distinctVerticies.Count, distinctVerticies.Count];
    for (int i = 0; i < distinctVerticies.Count; i++)
    {
        Extend(currentWeights, nextWeights, i); //allow paths that route through vertex i
        //swap buffers: the freshly computed level feeds the next pass
        int[,] temp = currentWeights;
        currentWeights = nextWeights;
        nextWeights = temp;
    }
    return currentWeights;
}
 
public void Extend(int[,] previosLevel, int[,] thisLevel, int k)
{
    for(int i=0; i<previosLevel.GetLength(0); i++)
        for(int j=0; j<previosLevel.GetLength(1); j++)
        {
            int iToJ = previosLevel[i,j];
            int iToK = previosLevel[i,k];
            int kToJ = previosLevel[k,j];
            //only route through k when both legs exist; MaxValue stands in for infinity
            if (iToK != Int32.MaxValue && kToJ != Int32.MaxValue)
                thisLevel[i,j] = Math.Min(iToJ, iToK + kToJ);
            else thisLevel[i,j] = iToJ;
        }
}

I'm about 90-100% certain I misspelled previous. Ignoring that, here we can see the Floyd-Warshall algorithm implemented. I'm still wrapping my head around how it works, but my best understanding is that the Extend method checks, for every pair (i, j), whether the shortest path from i to j gets shorter by routing through vertex k. There's a bit more to it, but it's too early in the morning to worry about that.
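
As an aside, the textbook formulation doesn't actually need the two-matrix dance: Floyd-Warshall can run in place on a single matrix, because pass k never improves the entries in row k or column k themselves. A minimal sketch of that variant (same Int32.MaxValue-as-infinity convention; this is not the code used above):

//a minimal sketch of the in-place variant: one matrix, no buffer swapping
public void FloydWarshallInPlace(int[,] weights)
{
    int n = weights.GetLength(0);
    for (int k = 0; k < n; k++)
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
            {
                //skip pairs where either leg is still 'infinite', to avoid overflow
                if (weights[i, k] != Int32.MaxValue && weights[k, j] != Int32.MaxValue)
                    weights[i, j] = Math.Min(weights[i, j], weights[i, k] + weights[k, j]);
            }
}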

The distinct vertices are passed as an out parameter to the caller, so that they have the map from index to year available to them.

List<int> years = new List<int>(missions);
years.Add(startingYear);
int[,] distances = timeline.AllPairs(years, out years);
 
foreach(int destination in missions)
{
    int i = years.IndexOf(startingYear);
    int j = years.IndexOf(destination);
    int toDistance = distances[i, j];
    int fromDistance = distances[j, i];
    if(toDistance==Int32.MaxValue || fromDistance==Int32.MaxValue)
        Console.WriteLine("IMPOSSIBLE");
    else
        Console.WriteLine(toDistance+fromDistance);
}

Finally, the relevant changes to the problem solving method. The downside to using this approach is that there isn't a direct mapping from year to index, so a linear search (IndexOf) is performed to get that mapping. It's not really an issue, considering what else is being done, but it is a bit unfortunate.
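
If it ever did matter, the fix is cheap: build the reverse mapping once and look it up in constant time. A minimal sketch (yearToIndex is a name I'm inventing here, not something from the actual solution):

//a minimal sketch: precompute year -> index once, instead of IndexOf per query
Dictionary<int, int> yearToIndex = new Dictionary<int, int>();
for (int i = 0; i < years.Count; i++)
    yearToIndex[years[i]] = i;

//then, inside the loop over missions:
int i = yearToIndex[startingYear];
int j = yearToIndex[destination];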

Analysis
Pseudo-analysis might be more accurate. Comparing this to the previous solution, I like that the algorithm seems more straightforward, and that in the end we get a full table of shortest paths between every pair of event years, should our time travelling agents ever need to do other strange things. It also has a slight performance advantage over the straight version from last time. (It's about on par with the Bellman-Ford variant that stops early once a full pass performs no relaxations, though that might be partly due to a poor implementation of that algorithm.)

That said, in its current form, it adds about 30 lines of code, and the indirect mapping from index to representative year feels ungood. In the end, the question for ACM problems is 'which could you code fastest with the lowest probability of error', and looking at the two, I think Bellman-Ford would be a bit easier, specifically because it works on a simple list of edges rather than a weight table. I could be wrong though, and the raw performance does seem to favor Floyd-Warshall.

Side Rant
I seem to be rather poor at a very specific bit of contestifying: seeing the points where you can discard unimportant information. In this problem, it was my attempt to treat every year as a vertex. In a problem about a disk defragmenter, I was tempted to treat the disk as a series of sectors rather than a series of filled and unfilled ranges (which would have massively increased simulation time, and probably overcomplicated the code; I didn't get to find out, due to the contest ending). This seems like a common theme for me: I constantly attempt the simplest solution that could possibly work, and these problems are more often than not tuned to make that 'the wrong way'. I suspect this comes from my many years of employment, where you're not necessarily seeking the best solution, but rather just something that works well enough to get by. That is, of course, part of why I started this crazy blog: to get better at recognizing when I'm going down an incredibly wrong path.

Full Source Code for this solution.
