Recall the job scheduling problem from the lectures: we have a collection of n processing jobs, and the length of job i, i.e., the time to process job i, is given by L[i]. This time, you are given a number M and are told that you should finish all your processing jobs between time 0 and M; for any job not fully processed in this window, you must pay a penalty that is the same across all jobs. The goal is to find a schedule of the jobs that minimizes the total penalty you have to pay, i.e., one that minimizes the number of jobs not fully processed in the given window. Design a greedy algorithm that, given the array L[1 : n] of job lengths and the integer M, finds a schedule that minimizes the penalty in O(n log n) time. (25 points)
Example: Suppose the lengths of the jobs are [7, 3, 9, 2, 4] and M = 7. Then one optimal solution is to run the jobs [3, 4] in the window [0 : 7] and pay a penalty of 3 for the remaining jobs. Note that we could alternatively have picked the jobs [3, 2] or [2, 4] and still paid a penalty of 3.
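One natural greedy choice (not stated in the problem itself, just a sketch of one possible approach) is to always run the shortest remaining job as long as it still finishes by time M; since any schedule of k jobs can be replaced by the k shortest jobs, this maximizes the number of completed jobs. A minimal Python sketch, assuming L is a list of positive job lengths:

```python
def min_penalty_schedule(L, M):
    """Greedy sketch: run the shortest jobs first while they still fit in [0, M].

    Returns (scheduled, penalty): the chosen job lengths in the order they run,
    and the number of jobs left unprocessed. O(n log n) due to the sort.
    """
    scheduled, used = [], 0
    for length in sorted(L):           # consider jobs from shortest to longest
        if used + length <= M:         # this job finishes no later than time M
            scheduled.append(length)
            used += length
        else:
            break                      # every remaining job is at least as long
    return scheduled, len(L) - len(scheduled)


# Example from the statement: lengths [7, 3, 9, 2, 4] and M = 7.
print(min_penalty_schedule([7, 3, 9, 2, 4], 7))   # ([2, 3], 3): penalty 3
```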
Simple bonus credit: Can you design an algorithm that instead runs in O(n + M) time? (+5 points)
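For the bonus, one way to reach O(n + M) (again only a sketch, under the assumption that job lengths are positive integers) is to replace the comparison sort with a counting sort over the lengths 1..M; any job longer than M can never finish inside the window and goes straight to the penalty:

```python
def min_penalty_counting(L, M):
    """Same greedy idea in O(n + M): sort the lengths with a counting sort.

    Assumes job lengths are positive integers; a job longer than M can never
    finish inside [0, M], so it is never placed in a bucket and simply counts
    toward the penalty.
    """
    count = [0] * (M + 1)              # count[l] = number of jobs of length l
    for length in L:
        if length <= M:
            count[length] += 1

    used, scheduled = 0, 0
    for length in range(1, M + 1):     # lengths in increasing order
        while count[length] and used + length <= M:
            used += length
            scheduled += 1
            count[length] -= 1
        if used + length > M:
            break                      # no longer job can fit either
    return len(L) - scheduled          # the penalty


print(min_penalty_counting([7, 3, 9, 2, 4], 7))   # 3
```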
