Job shop scheduling


Job shop scheduling or the job-shop problem is an optimization problem in computer science and operations research in which jobs are assigned to resources at particular times. The most basic version is as follows: We are given n jobs J1, J2, ..., Jn of varying processing times, which need to be scheduled on m machines with varying processing power, while trying to minimize the makespan. The makespan is the total length of the schedule.
In the standard version of the problem, there are n jobs J1, J2, ..., Jn. Within each job there is a set of operations O1, O2, ..., On which need to be processed in a specific order (known as precedence constraints). Each operation has a specific machine that it needs to be processed on, and only one operation in a job can be processed at a given time. A common relaxation is the flexible job shop, where each operation can be processed on any machine of a given set.
This problem is one of the best known combinatorial optimization problems, and was the first problem for which competitive analysis was presented, by Graham in 1966.
The best-known problem instances for the basic model with makespan objective are due to Taillard.
The name originally came from the scheduling of jobs in a job shop, but the theme has wide applications beyond that type of instance.

Problem variations

Many variations of the problem exist; for example, machines can have sequence-dependent setup times between jobs. Since the traveling salesman problem is NP-hard, the job-shop problem with sequence-dependent setup times is clearly also NP-hard, because the TSP is a special case of the JSP with a single job (the cities are the machines and the salesman is the job).

Problem representation

The disjunctive graph is one of the popular models used for describing the job shop scheduling problem instances.
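A disjunctive graph can be built directly from the job routes. In the sketch below (plain Python, with a tiny instance invented for illustration), conjunctive arcs encode the fixed operation order within each job, and disjunctive edges connect every pair of operations that compete for the same machine:

```python
from itertools import combinations

# Each job is a list of (machine, duration) pairs in processing order.
# This two-job, two-machine instance is made up for illustration.
jobs = {
    "J1": [(0, 3), (1, 2)],
    "J2": [(1, 4), (0, 1)],
}

# Nodes are (job, operation index) pairs.
nodes = [(j, k) for j, ops in jobs.items() for k in range(len(ops))]

# Conjunctive arcs: fixed precedence between consecutive operations of a job.
conjunctive = [((j, k), (j, k + 1))
               for j, ops in jobs.items() for k in range(len(ops) - 1)]

# Disjunctive edges: unordered pairs of operations needing the same machine.
by_machine = {}
for j, ops in jobs.items():
    for k, (machine, _) in enumerate(ops):
        by_machine.setdefault(machine, []).append((j, k))
disjunctive = [pair for ops in by_machine.values()
               for pair in combinations(ops, 2)]

print(len(nodes), len(conjunctive), len(disjunctive))  # 4 2 2
```

Solving the instance amounts to orienting every disjunctive edge so that the resulting directed graph is acyclic; the makespan is then the longest path in that graph.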
A mathematical statement of the problem can be made as follows:
Let M = {M1, M2, ..., Mm} and J = {J1, J2, ..., Jn} be two finite sets. On account of the industrial origins of the problem, the Mi are called machines and the Jj are called jobs.
Let X denote the set of all sequential assignments of jobs to machines, such that every job is done by every machine exactly once; elements x ∈ X may be written as n × m matrices, in which column i lists the jobs that machine Mi will do, in order. For example, the matrix

    x = | 1 2 |
        | 2 3 |
        | 3 1 |

means that machine M1 will do the three jobs J1, J2, J3 in the order J1, J2, J3, while machine M2 will do the jobs in the order J2, J3, J1.
Suppose also that there is some cost function C : X → [0, +∞]. The cost function may be interpreted as a "total processing time", and may have some expression in terms of times Cij : M × J → [0, +∞], the cost/time for machine Mi to do job Jj.
The job-shop problem is to find an assignment of jobs x ∈ X such that C(x) is a minimum, that is, there is no y ∈ X such that C(x) > C(y).
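For very small instances, the minimization can be carried out by brute force: enumerate every interleaving of operations that respects each job's internal order, simulate the resulting schedule, and keep the minimum makespan. A minimal Python sketch, with a two-job instance invented for illustration:

```python
from itertools import permutations

# Each job: a list of (machine, duration) pairs in required order
# (made-up instance for illustration).
jobs = [[(0, 3), (1, 2)],   # job 0
        [(1, 2), (0, 4)]]   # job 1

def makespan(order):
    """Schedule operations in the given job order at earliest start times."""
    machine_free = {}              # machine -> time it becomes free
    job_free = [0] * len(jobs)     # job -> time its last operation finished
    next_op = [0] * len(jobs)      # job -> index of its next operation
    for j in order:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_free[j], machine_free.get(machine, 0))
        job_free[j] = machine_free[machine] = start + duration
        next_op[j] += 1
    return max(job_free)

# Every multiset permutation of job indices; job j appears once per operation.
tokens = [j for j, ops in enumerate(jobs) for _ in ops]
best = min(makespan(order) for order in set(permutations(tokens)))
print(best)  # 7
```

This enumerates all semi-active schedules, which are guaranteed to contain an optimal one; of course, the number of interleavings grows factorially, which is why the exact approach is hopeless beyond toy sizes.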

Scheduling efficiency

Scheduling efficiency can be defined for a schedule through the ratio of total machine idle time to the total processing time as below:

    C'eff = 1 − (Σi li) / (m · C)

Here li is the idle time of machine i, C is the makespan, and m is the number of machines. Since m · C equals the total idle time plus the total processing time, C'eff is simply the total processing time normalized to the number of machines and the makespan. This makes it possible to compare the usage of resources across JSP instances of different sizes.
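The definition above is a one-line computation; a direct transcription in Python (the idle times and makespan below are arbitrary illustration values):

```python
def scheduling_efficiency(idle_times, makespan):
    """Efficiency = 1 - (total idle time) / (num_machines * makespan)."""
    m = len(idle_times)
    return 1 - sum(idle_times) / (m * makespan)

# Two machines with makespan 10: machine 0 idle for 2, machine 1 idle for 4.
# Total machine time is 2 * 10 = 20, of which 6 is idle, so efficiency is 0.7.
print(scheduling_efficiency([2, 4], 10))  # 0.7
```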

The problem of infinite cost

One of the first problems that must be dealt with in the JSP is that many proposed solutions have infinite cost: i.e., there exists x ∈ X such that C(x) = +∞. In fact, it is quite simple to concoct examples of such x by ensuring that two machines will deadlock, so that each waits for the output of the other's next step.

Major results

Graham had already provided the List scheduling algorithm in 1966, which is (2 − 1/m)-competitive, where m is the number of machines. It was also proved that List scheduling is the optimal online algorithm for two and three machines. The Coffman–Graham algorithm for uniform-length jobs is also optimal for two machines, and is (2 − 2/m)-competitive. In 1992, Bartal, Fiat, Karloff and Vohra presented an algorithm that is 1.986-competitive. A 1.945-competitive algorithm was presented by Karger, Phillips and Torng in 1994. In 1999, Albers provided a different algorithm that is 1.923-competitive. Currently, the best known result is an algorithm given by Fleischer and Wahl, which achieves a competitive ratio of 1.9201.
A lower bound of 1.852 was presented by Albers.
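Graham's List scheduling, mentioned above, is short enough to state in full for atomic jobs on m identical machines: take the jobs in list order and always assign the next job to the machine that becomes free first. A Python sketch using a heap of machine loads (the job durations are made up for illustration):

```python
import heapq

def list_schedule(job_times, m):
    """Graham's List scheduling: assign each job, in list order,
    to the currently least-loaded machine; return the makespan."""
    loads = [0] * m            # heap of machine completion times
    heapq.heapify(loads)
    for t in job_times:
        earliest = heapq.heappop(loads)
        heapq.heappush(loads, earliest + t)
    return max(loads)

print(list_schedule([2, 3, 4, 6, 2, 2], 2))  # 10
```

Because each job goes to the least-loaded machine, no machine can finish later than the average load plus one job length, which is where the (2 − 1/m) competitive ratio comes from.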
Taillard instances have played an important role in the development of job shop scheduling methods with the makespan objective.
In 1976 Garey provided a proof that this problem is NP-complete for m > 2, that is, no optimal solution can be computed in deterministic polynomial time for three or more machines unless P = NP.
In 2011 Xin Chen et al. provided optimal algorithms for online scheduling on two related machines improving previous results.

Offline makespan minimization

Atomic jobs

The simplest form of the offline makespan minimisation problem deals with atomic jobs, that is, jobs that are not subdivided into multiple operations. It is equivalent to packing a number of items of various different sizes into a fixed number of bins, such that the maximum bin size needed is as small as possible.
Dorit S. Hochbaum and David Shmoys presented a polynomial-time approximation scheme in 1987 that finds an approximate solution to the offline makespan minimisation problem with atomic jobs to any desired degree of accuracy.
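The Hochbaum–Shmoys scheme trades running time for accuracy. A much simpler heuristic for the same atomic-job setting, longest-processing-time-first (LPT, shown here instead of the PTAS itself, which is considerably more involved), already guarantees a makespan within a factor 4/3 − 1/(3m) of optimal:

```python
import heapq

def lpt_schedule(job_times, m):
    """LPT rule: sort jobs by decreasing duration, then greedily
    assign each to the least-loaded machine; return the makespan."""
    loads = [0] * m            # heap of machine completion times
    for t in sorted(job_times, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

# A classic worst-case instance for LPT on 3 machines:
# optimal is 9 (5+4, 5+4, 3+3+3), LPT yields 11 = (4/3 - 1/9) * 9.
print(lpt_schedule([5, 5, 4, 4, 3, 3, 3], 3))  # 11
```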

Jobs consisting of multiple operations

The basic form of the problem of scheduling jobs with multiple operations, over M machines, such that all of the first operations must be done on the first machine, all of the second operations on the second, etc., and a single job cannot be performed in parallel, is known as the flow shop scheduling problem. Various algorithms exist, including genetic algorithms.

Johnson's algorithm

A heuristic algorithm by S. M. Johnson can be used to solve the case of a 2-machine, N-job problem when all jobs are to be processed in the same order. The steps of the algorithm are as follows:
Job Pi has two operations, of duration Pi1, Pi2, to be done on machines M1, M2 in that sequence.
Step 1. List A = {1, 2, ..., N}; List L1 and List L2 start empty.
Step 2. Among all operation durations of the jobs still in List A, pick the minimum.
If the minimum belongs to Pk1, remove k from List A and add k to the end of List L1.
If the minimum belongs to Pk2, remove k from List A and add k to the beginning of List L2.
Step 3. Repeat Step 2 until List A is empty.
Step 4. Join List L1 and List L2. This is the optimal sequence.
Johnson's method only works optimally for two machines. However, since it is optimal and easy to compute, some researchers have tried to adapt it to M machines.
The idea is as follows: imagine that each job requires m operations in sequence, on M1, M2, ..., Mm. We combine the first m/2 machines into an (imaginary) machining center MC1, and the remaining machines into a machining center MC2. Then the total processing time for a job Pi on MC1 is the sum of Pij over j = 1, ..., m/2, and its processing time on MC2 is the sum of Pij over j = m/2 + 1, ..., m.
By doing so, we have reduced the m-machine problem to a two-machining-center scheduling problem, which can be solved with Johnson's method.
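The steps above translate directly into code. A Python sketch of Johnson's rule for the two-machine case, together with the makespan evaluation of the resulting order (the durations are made up for illustration):

```python
def johnson_order(p1, p2):
    """Johnson's rule for 2 machines: p1[i], p2[i] are the durations of
    job i on M1 and M2. Returns an optimal processing order."""
    front, back = [], []
    # Visiting jobs by increasing minimum duration reproduces Steps 1-3:
    # each job's smallest operation is the global minimum when it is visited.
    for i in sorted(range(len(p1)), key=lambda i: min(p1[i], p2[i])):
        if p1[i] <= p2[i]:
            front.append(i)        # minimum on M1: add to end of L1
        else:
            back.insert(0, i)      # minimum on M2: add to front of L2
    return front + back            # Step 4: join the lists

def makespan_2m(order, p1, p2):
    """Makespan of the 2-machine flow shop under the given job order."""
    t1 = t2 = 0
    for i in order:
        t1 += p1[i]                # M1 processes jobs back to back
        t2 = max(t2, t1) + p2[i]   # M2 waits for M1 and for itself
    return t2

p1 = [3, 5, 1, 6, 7]
p2 = [6, 2, 2, 6, 5]
order = johnson_order(p1, p2)
print(order, makespan_2m(order, p1, p2))  # [2, 0, 3, 4, 1] 24
```

For the M-machine adaptation described above, one would first aggregate each job's durations into two machining-center totals and then feed those totals to johnson_order; the result is a heuristic, not an optimal schedule.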

Makespan prediction

Machine learning has recently been used to predict the optimal makespan of a JSP instance without actually producing the optimal schedule. Preliminary results show an accuracy of around 80% when supervised machine-learning methods were applied to classify small, randomly generated JSP instances based on their optimal scheduling efficiency relative to the average.

Example

Here is an example of a job shop scheduling problem formulated in AMPL as a mixed-integer programming problem with indicator constraints:

param N_JOBS;
param N_MACHINES;
set JOBS ordered = 1..N_JOBS;
set MACHINES ordered = 1..N_MACHINES;
param ProcessingTime{JOBS, MACHINES} > 0;
param CumulativeTime{i in JOBS, j in MACHINES} =
   sum {jj in MACHINES: ord(jj) <= ord(j)} ProcessingTime[i,jj];
param TimeOffset{i1 in JOBS, i2 in JOBS: i1 <> i2} =
   max {j in MACHINES}
      (CumulativeTime[i1,j] - CumulativeTime[i2,j] + ProcessingTime[i2,j]);
var end >= 0;
var start{JOBS} >= 0;
var precedes{i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)} binary;
minimize makespan: end;
subj to makespan_def{i in JOBS}:
   end >= start[i] + sum{j in MACHINES} ProcessingTime[i,j];
subj to no12_conflict{i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)}:
   precedes[i1,i2] ==> start[i2] >= start[i1] + TimeOffset[i1,i2];
subj to no21_conflict{i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)}:
   !precedes[i1,i2] ==> start[i1] >= start[i2] + TimeOffset[i2,i1];
data;
param N_JOBS := 4;
param N_MACHINES := 4;
param ProcessingTime:
1 2 3 4 :=
1 4 2 1
2 3 6 2
3 7 2 3
4 1 5 8;