Hello everyone,
here is what I'm trying to do and what I've managed so far. My question is whether it is efficient and whether I could improve the algorithm any further (without adding overhead).
Situation:
1. List<double[]> list.
2. double[] arr.
Let's say
arr.Length = 20;
list.Count = 50 (Capacity is the same).
What I need to do is
1. Scan each index i and collect the value at that index from every array in the list
2. Combine what we collected into an array
3. Find the maximum value of that array
So...
tempArray[0] = Max(list[0..K-1], arr_loc = 0)
tempArray[1] = Max(list[0..K-1], arr_loc = 1)
etc., where K = list.Count
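In other words, the intended result can be sketched like this (just a hypothetical LINQ snippet to pin down what I mean; the name result is made up):
Code:
using System.Linq;

// result[i] = maximum of list[j][i] over all j (one max per index of arr)
double[] result = Enumerable.Range(0, arr.Length)
                            .Select(i => list.Max(row => row[i]))
                            .ToArray();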
Here is my code:
Code:
/// <summary>
/// Simple linear scan for the maximum value of an array.
/// </summary>
/// <param name="arrayToScan">array whose maximum is wanted</param>
/// <returns>the largest element of arrayToScan</returns>
internal double GetMaxValue(double[] arrayToScan)
{
    // start from double.MinValue rather than 0, otherwise an
    // all-negative array would wrongly return 0
    double temp = double.MinValue;
    for (int i = 0; i < arrayToScan.Length; i++)
    {
        if (arrayToScan[i] > temp)
        {
            temp = arrayToScan[i];
        }
    }
    return temp;
}
Code:
internal double[] AggregationMethod(List<double[]> listOf_doubleArr)
{
    int arrLength = listOf_doubleArr[0].Length;                      // e.g. 20
    double[] temp = new double[arrLength];                           // one max per index
    double[] aggregatedArrays = new double[listOf_doubleArr.Count];  // reused column buffer

    for (int i = 0; i < arrLength; i++)
    {
        // gather the i-th element of every array in the list
        for (int j = 0; j < listOf_doubleArr.Count; j++)
        {
            aggregatedArrays[j] = listOf_doubleArr[j][i];
        }
        temp[i] = GetMaxValue(aggregatedArrays);
    }
    return temp;
}
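For completeness, this is roughly how I call it (simplified with dummy data, not my real input):
Code:
var list = new List<double[]>();
var rng = new Random(0);

// build 50 arrays of 20 values each, just so the call can be shown
for (int j = 0; j < 50; j++)
{
    var row = new double[20];
    for (int i = 0; i < 20; i++)
    {
        row[i] = rng.NextDouble();
    }
    list.Add(row);
}

double[] maxPerIndex = AggregationMethod(list);   // 20 entries, one max per index
Console.WriteLine(GetMaxValue(maxPerIndex));      // overall maximum of everything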