[Solved] First fit problem

Today I ran into a very simple problem, but shame on me: after more than 10 years as a software engineer, I messed everything up and could hardly figure out how it worked. Even once I had the solution, I wasn’t satisfied, because I had lost control of my code. I can tell you, that is a very, very bad feeling. At that point it’s no longer about whether the code works; it’s about confidence. When you lose your confidence, everything else collapses. Sometimes I’m just stupid at the wrong moment. 🙂

Now that I’m calm and can look back at the problem, I can finish it in 15 minutes, and I have no idea why I didn’t get the point before. The task is pretty simple:

We have a group of applications consuming resource. How many servers do we need to host all of these applications?

The meaning of “resource” here is very abstract, and that can be confusing. If we consider CPU, RAM, hard disk and so on separately, the problem suddenly becomes much more complicated. But if we treat resource as a single property of the server object and the application object, things get much simpler.

class Server
{
	public Server()
	{
		Resource = 100;
		Id = Guid.NewGuid();
		Applications = new List<Application>();
	}

	public Guid Id { get; set; }
	public int Resource { get; set; }
	public IList<Application> Applications { get; set; }

	internal bool TryToHost(Application app)
	{
		// ">=" so an app that needs exactly the remaining resource still fits
		if (Resource >= app.Usage)
		{
			Resource -= app.Usage;
			Applications.Add(app);
			app.Server = this;
			return true;
		}
		return false;
	}
}

class Application
{
	public Server Server { get; set; }
	public int Usage { get; set; }
}

Here Usage indicates how much resource an application needs, and Resource is what the server has left. Once everything is abstract, everything becomes clearer. In the code above I assume that each server has 100 units of resource and each application uses x of those 100 units.

Now I would like to host the applications on the servers, so I have to find a way to distribute them. So far I have two solutions, first-fit and best-fit, as shown below. Maybe there is a better one.

static void Main(string[] args)
{
	IList<Application> applications = new List<Application>();
	applications.Add(new Application { Usage = 50 });
	applications.Add(new Application { Usage = 60 });
	applications.Add(new Application { Usage = 70 });
	applications.Add(new Application { Usage = 40 });

	// First-fit: only the current server is considered;
	// open a new one as soon as an app doesn't fit.
	IList<Server> servers = new List<Server>();
	var currentServer = new Server();
	servers.Add(currentServer);
	foreach (var app in applications)
	{
		if (!currentServer.TryToHost(app))
		{
			currentServer = new Server();
			currentServer.TryToHost(app);
			servers.Add(currentServer);
		}
	}

	Console.WriteLine($"Number of servers: {servers.Count}");

	// Best-fit: scan the whole server list from the start for every app.
	servers = new List<Server>();
	for (int i = 0; i < 10; i++)
		servers.Add(new Server());

	foreach (var app in applications)
	{
		foreach (var server in servers)
		{
			if (server.TryToHost(app))
			{
				break; // without this line the app lands on every server that has room
			}
		}
	}

	var serversCount = servers.Count(x => x.Resource != 100);
	Console.WriteLine($"Number of servers: {serversCount}");
}



The first-fit only looks forward and creates a new server when required. It goes through the application list and allocates each app to the current server. If there is not enough resource left on the current server, it creates a new one and allocates the app there.

The best-fit looks backward and tries to take advantage of all the servers. For every application it scans the server list from the beginning and fits the app into the first server that still has room.
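As a side note, the scan-from-the-start strategy above is what textbooks usually call first-fit over the whole pool; classic best-fit additionally prefers the server that would have the least resource left over after hosting the app. A minimal sketch of that stricter variant (my own illustration, not code from this post, with simplified stand-ins for the Server and Application classes so it compiles on its own):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-ins for the post's classes.
class Server
{
    public int Resource { get; set; } = 100;
}

class Application
{
    public int Usage { get; set; }
}

static class BestFitSketch
{
    // For each app, pick the server with the tightest remaining fit;
    // open a new server only when nothing fits.
    public static List<Server> Pack(IEnumerable<Application> apps)
    {
        var servers = new List<Server>();
        foreach (var app in apps)
        {
            var best = servers
                .Where(s => s.Resource >= app.Usage)
                .OrderBy(s => s.Resource - app.Usage) // least leftover first
                .FirstOrDefault();
            if (best == null)
            {
                best = new Server();
                servers.Add(best);
            }
            best.Resource -= app.Usage;
        }
        return servers;
    }

    static void Main()
    {
        var apps = new[] { 50, 60, 70, 40 }.Select(u => new Application { Usage = u });
        Console.WriteLine(Pack(apps).Count); // 3 for the post's example
    }
}
```

For the four apps in this post it also ends up with 3 servers, but on other inputs the two strategies can give different packings.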

Because it only looks forward, the first-fit will generally need more servers than the best-fit solution, since it takes care of only the current server and skips the others.

For example, when the code above runs, first-fit requires 4 servers while best-fit needs only 3. The disadvantage of best-fit is that we have to define a fixed-size list of servers up front, which means that as the number of applications grows, the fixed list of servers can overflow.

To solve that, we can set the size of the server list equal to the size of the applications list. That means we assume the worst case, where every application needs its own server. The best-fit algorithm will then figure out how many servers we really need.
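That worst-case sizing can be sketched like this (my own simplification, not code from the original program): one slot per application, where each slot is just a remaining-resource counter standing in for a Server object.

```csharp
using System;
using System.Linq;

static class WorstCaseSizing
{
    // Pre-allocate one server slot per application, place each app on the
    // first slot with enough room, and count the slots actually used.
    public static int ServersNeeded(int[] usages, int capacity = 100)
    {
        var resources = Enumerable.Repeat(capacity, usages.Length).ToArray();
        foreach (var usage in usages)
        {
            for (int i = 0; i < resources.Length; i++)
            {
                if (resources[i] >= usage)
                {
                    resources[i] -= usage;
                    break; // the line whose absence started this whole story
                }
            }
        }
        // A slot still at full capacity was never used.
        return resources.Count(r => r != capacity);
    }

    static void Main()
    {
        Console.WriteLine(WorstCaseSizing.ServersNeeded(new[] { 50, 60, 70, 40 })); // 3
    }
}
```

Because the pool always has as many slots as applications, every app is guaranteed a place, and no fixed constant like 10 can overflow.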

What amuses me now, recalling this incident, is that I did solve the problem correctly at the time. I implemented the best-fit correctly but forgot the break; line in the second loop. It gave a wrong result back, and the nightmare began when I assumed my algorithm was wrong, tried to go another way, and got lost.

After that I tried to understand why I lost my confidence so fast over what was just a small bug.

I think it’s because I’ve been learning so many complex algorithms in machine learning. They are extremely complex, and I know I could never invent them myself. So slowly I’ve become unsure whether I’m capable of writing a ‘good’ algorithm. Therefore, even though it was just a small bug, I doubted whether my logical thinking was at fault.

You don’t need 10 years of experience to write that code. Even a student can get it right.

Just one missing “break;” and it will break your bones. A nice lesson, but the price is too expensive. :)
