@@ -16,17 +16,40 @@
* @param {Array<number>} numbers - Numbers to process
* @returns {Object} Object containing running total and product
*/
// export function calculateSumAndProduct(numbers) {
// let sum = 0;
// for (const num of numbers) {
// sum += num;
// }

// let product = 1;
// for (const num of numbers) {
// product *= num;
// }

// return {
// sum: sum,
// product: product,
// };
// }

// My analysis report
// The function has two similar loops that can be combined. Each loop runs
// numbers.length times, so the total time complexity is O(n) + O(n) = O(2n), which simplifies to O(n).
// After looping, the results are stored in two variables (sum and product), each taking O(1) space.
// The O(n) space of the input array itself is unavoidable and already optimal here.
// The area of inefficiency in this code is the looping (i.e. traversing the array twice).

// Refactored code for better efficiency.
// I used a single loop, cutting the work from 2n operations to n (still O(n), but with a smaller constant factor).

export function calculateSumAndProduct(numbers) {
  let sum = 0;
  let product = 1;

  // Single pass: accumulate both results in one traversal.
  for (const num of numbers) {
    sum += num;
    product *= num;
  }

  return {
    sum: sum,
    product: product,
  };
}
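A quick standalone sketch of the single-pass idea (the function name here is hypothetical, used only to illustrate the technique independently of the PR):

```javascript
// Single-pass accumulation: one traversal updates both aggregates.
function sumAndProduct(numbers) {
  let sum = 0;
  let product = 1;
  for (const num of numbers) {
    sum += num;
    product *= num;
  }
  return { sum, product };
}

console.log(sumAndProduct([2, 3, 4])); // { sum: 9, product: 24 }
```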
30 changes: 27 additions & 3 deletions Sprint-1/JavaScript/findCommonItems/findCommonItems.js
@@ -9,6 +9,30 @@
* @param {Array} secondArray - Second array to compare
* @returns {Array} Array containing unique common items
*/
// export const findCommonItems = (firstArray, secondArray) => [
// ...new Set(firstArray.filter((item) => secondArray.includes(item))),
// ];
// My analysis report
// The function has a hidden nested loop: filter() iterates over firstArray while
// includes() scans secondArray for every item. This makes it quite expensive for us.

// Time complexity
// .filter() performs n operations and .includes() performs up to m operations inside each of them.
// Since this is a nested loop, the time complexity is the product of the two: O(n * m).
// Space complexity
// .filter() creates a temporary array of common items, taking O(n) space, and the Set built
// from it can be just as large. This is unavoidable with this nested-loop approach.

// The inefficiency is in the hidden nested loop.

export const findCommonItems = (firstArray, secondArray) => {
const arraySet = new Set(secondArray);
const commonItems = firstArray.filter((item) => {
return arraySet.has(item);
});

return [...new Set(commonItems)];
};


// Time complexity is O(n + m): O(m) to build the Set from secondArray and O(n) for the
// filter pass, since Set.has() is O(1). I neglected the O(1) cost of creating the Set object itself.
// Space complexity stays roughly the same.
// We can say refactoring this code makes it fast.
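To see the O(n + m) behavior in isolation, here is a small standalone sketch of the Set-lookup pattern (the names are illustrative, not from the PR):

```javascript
// Build the lookup Set once (O(m)), then filter with O(1) membership checks (O(n)).
function commonItems(firstArray, secondArray) {
  const lookup = new Set(secondArray);
  return [...new Set(firstArray.filter((item) => lookup.has(item)))];
}

console.log(commonItems([1, 2, 2, 3], [2, 3, 4])); // [ 2, 3 ]
```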
32 changes: 27 additions & 5 deletions Sprint-1/JavaScript/hasPairWithSum/hasPairWithSum.js
@@ -9,13 +9,35 @@
* @param {number} target - Target sum to find
* @returns {boolean} True if pair exists, false otherwise
*/
// export function hasPairWithSum(numbers, target) {
// for (let i = 0; i < numbers.length; i++) {
// for (let j = i + 1; j < numbers.length; j++) {
// if (numbers[i] + numbers[j] === target) {
// return true;
// }
// }
// }
// return false;
// }

// My analysis result
// Time complexity: since it has two nested loops over the same array, the growth is
// O(n * n) = O(n**2), i.e. quadratic.
// Space complexity: I didn't see any stored data growing with the input, so I think the
// space complexity is just O(1).
// The inefficiency of this program is due to the nested loop: it performs redundant
// checks when comparing the numbers.

// Here is the refactored code avoiding that redundancy.

export function hasPairWithSum(numbers, target) {
  const seenNumbers = new Set();
  for (const num of numbers) {
    const complement = target - num;
    if (seenNumbers.has(complement)) {
      return true; // We found a pair!
    }
    seenNumbers.add(num);
  }
  return false;
}
// Here the time complexity is reduced to the optimal level: just one loop, which is O(n).
// The space complexity is O(n) as well because of the Set I used.
// I traded space to achieve an optimal time complexity.
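The seen-set pattern can be demonstrated on its own with a small sketch (hypothetical name, concrete inputs chosen for illustration):

```javascript
// For each number, check whether its complement (target - num) was already visited.
function hasPair(numbers, target) {
  const seen = new Set();
  for (const num of numbers) {
    if (seen.has(target - num)) {
      return true; // complement seen earlier, so a pair exists
    }
    seen.add(num);
  }
  return false;
}

console.log(hasPair([1, 4, 6, 8], 10)); // true (4 + 6)
console.log(hasPair([1, 4, 6, 8], 3)); // false
```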
60 changes: 37 additions & 23 deletions Sprint-1/JavaScript/removeDuplicates/removeDuplicates.mjs
@@ -8,29 +8,43 @@
* @param {Array} inputSequence - Sequence to remove duplicates from
* @returns {Array} New sequence with duplicates removed
*/
// export function removeDuplicates(inputSequence) {
// const uniqueItems = [];

// for (
// let currentIndex = 0;
// currentIndex < inputSequence.length;
// currentIndex++
// ) {
// let isDuplicate = false;
// for (
// let compareIndex = 0;
// compareIndex < uniqueItems.length;
// compareIndex++
// ) {
// if (inputSequence[currentIndex] === uniqueItems[compareIndex]) {
// isDuplicate = true;
// break;
// }
// }
// if (!isDuplicate) {
// uniqueItems.push(inputSequence[currentIndex]);
// }
// }

// return uniqueItems;
// }

// My analysis result
// Time complexity: it has a nested loop structure, giving it a growth of O(n * n) = O(n²), i.e. quadratic.
// Space complexity: a new array 'uniqueItems' is created that can grow up to the size of the input,
// making the space complexity O(n).
// The inefficiency of this program is due to the nested loop:
// for every item, it rescans the accumulated results to check for duplicates.

//refactored code
export function removeDuplicates(inputSequence) {
return [...new Set(inputSequence)];
}
// Here the time complexity is reduced to an optimal level O(n).
// For the space complexity, I have O(n) as well because the new Set can store up to n unique items.
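A small standalone check of the Set-based deduplication (illustrative name), which also shows that the insertion order of first occurrences is preserved:

```javascript
// A Set records each value once, in first-seen order; spreading it back yields a deduped array.
function dedupe(inputSequence) {
  return [...new Set(inputSequence)];
}

console.log(dedupe([3, 1, 3, 2, 1])); // [ 3, 1, 2 ]
```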