Suppose you are a system administrator. You have access to an unlimited number of servers, each with the same amount of CPU (number of cores) and memory (GB). On these servers you have to pack X applications, deployed as VMs, so that the total cost of ownership is the smallest without affecting application performance. Application resource requirements are given in the following format:
• CPU utilization in percentage of a single core system
• Memory allocated in MB
An application's resource requirement must be fully met on a single server, i.e. we cannot deploy the same application across different servers. In addition, each server has a fixed overhead in terms of:
1. Reserved CPU in % for a single core system
2. Reserved Memory in Mega Byte (MB)
The cost of a server is given by the following equation:
Cost per hr = constant initial cost + (1.5/10000) * (cpu_utiliz_of_server ^ 3) + 0.5 * (mem_utiliz_of_server)
where 0 <= cpu_utiliz_of_server, mem_utiliz_of_server <= 100 (both in percent) and constant initial cost = 100.
Your task is to find the minimum cost per hr (hour) for running all the applications.
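For example, the per-server cost term can be evaluated directly from this formula. The small C sketch below is only an illustration; the helper name server_cost and the example utilization figures are assumptions, not part of the problem statement.

#include <stdio.h>
#include <math.h>

/* Hypothetical helper: cost per hour of one server, given its CPU and
   memory utilization, both expressed in percent (0..100). */
double server_cost(double cpu_util, double mem_util)
{
    const double initial_cost = 100.0;               /* constant initial cost */
    return initial_cost
         + (1.5 / 10000.0) * pow(cpu_util, 3.0)      /* cubic CPU term */
         + 0.5 * mem_util;                           /* linear memory term */
}

int main(void)
{
    /* A server running at 33.42% CPU and 86.72% memory utilization
       (figures assumed for illustration). */
    printf("%.2f\n", server_cost(33.42, 86.72));     /* prints about 148.96 */
    return 0;
}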
Constraints:
• 1 <= number of cores <= 10
• 1 <= memory in GB <= 10
• 0 <= number of VMs <= 100
• 1 GB = 1024 MB
Input Format:
First line: the number of cores, C, in each server that hosts the applications.
Second line: the amount of memory, M, in GB in each server.
Third line: the number of applications, X, to be packed on the servers.
Next X * 2 lines: for each application, on one line each,
• CPU (Central Processing Unit) requirement in percentage of a single core
• Memory requirement in MB
Last 2 lines:
• Reserved CPU (Central Processing Unit) in percentage of a single core system
• Reserved Memory in Mega Byte (MB)
Output Format:
Least cost per hr for hosting all the applications, rounded up to the next integer.
Input:
2
1
4
29.72
366
25.53
163
28.98
206
26.64
506
11.21
176
Output:
289
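For instance, one packing that produces this answer (assuming two servers, since the 1241 MB of application memory plus the 176 MB reserve cannot fit in a single 1024 MB server): place the 28.98%/206 MB and 26.64%/506 MB applications on one server and the 29.72%/366 MB and 25.53%/163 MB applications on the other. The first server then runs at (28.98 + 26.64 + 11.21) / 2 ≈ 33.42% CPU and (206 + 506 + 176) / 1024 ≈ 86.7% memory, costing about 100 + (1.5/10000)(33.42^3) + 0.5(86.7) ≈ 148.9 per hour; the second costs about 139.9, for roughly 288.9 in total, which rounds up to 289.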
Input:
3
1
4
26.66
451
25.8
294
26.81
207
26.77
192
8.03
184
Output:
277
Solution: In C
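The sketch below reads the input in the stated format and then, for each candidate number of servers n, assumes the application load can be balanced evenly across those n servers; because the CPU term is cubic (convex), an even split is the cheapest split for a fixed n. This assumption reproduces both sample outputs, though an exact bin-packing search would be needed to guarantee the optimum for arbitrary inputs.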
#include <stdio.h>
#include <math.h>

int main(void)
{
    int C, X, n, k;
    double M, cpu_req, mem_req, reserved_cpu, reserved_mem;
    double total_cpu = 0.0, total_mem = 0.0;

    scanf("%d", &C);             /* cores per server */
    scanf("%lf", &M);            /* memory per server in GB */
    scanf("%d", &X);             /* number of applications */
    for (k = 0; k < X; k++)
    {
        scanf("%lf", &cpu_req);  /* CPU requirement in % of a single core */
        scanf("%lf", &mem_req);  /* memory requirement in MB */
        total_cpu += cpu_req;
        total_mem += mem_req;
    }
    scanf("%lf", &reserved_cpu);
    scanf("%lf", &reserved_mem);

    double server_mem = M * 1024.0;   /* server memory capacity in MB */
    double best = -1.0;

    /* Try every possible number of servers n, assuming the application
       load is spread evenly across them; the cubic CPU term makes an
       even split the cheapest split for a fixed n. */
    for (n = 1; n <= X; n++)
    {
        double cpu_util = (total_cpu / n + reserved_cpu) / C;              /* % */
        double mem_util = (total_mem / n + reserved_mem) / server_mem * 100.0;
        if (cpu_util > 100.0 || mem_util > 100.0)
            continue;            /* n servers cannot hold the load */
        double cost = n * (100.0
                      + (1.5 / 10000.0) * pow(cpu_util, 3.0)
                      + 0.5 * mem_util);
        if (best < 0.0 || cost < best)
            best = cost;
    }

    if (best < 0.0)
        best = 0.0;              /* X == 0: nothing to host */

    printf("%lld\n", (long long)ceil(best));
    return 0;
}