# Quick sort: why do similar implementations in Python and JavaScript differ so significantly?

The following implementation of quicksort in JavaScript sorts 10,000,000 random elements:

```javascript
function swap(arr, i1, i2) {
  const temp = arr[i1];
  arr[i1] = arr[i2];
  arr[i2] = temp;
}

function pivot(arr, l, r) {
  const pvt = arr[r];
  let i = l - 1;

  for (let j = l; j <= r - 1; j++) {
    if (arr[j] < pvt) {
      i++;
      swap(arr, i, j);
    }
  }
  swap(arr, i + 1, r);
  return i + 1;
}

function quick_sort(arr, l, r) {
  if (l < r) {
    const pvt = pivot(arr, l, r);
    quick_sort(arr, l, pvt - 1);
    quick_sort(arr, pvt + 1, r);
  }
}

let arr = Array.from(Array(10000000)).map((_) => Math.random());
console.time("quick_sort");
quick_sort(arr, 0, arr.length - 1);
console.timeEnd("quick_sort");
```

It takes ≈ 1.5 s to sort the array.

When I test the same algorithm in Python:

```python
import time
from numpy import random  # note: this shadows the stdlib `random` module

def quicksort(arr):
    qs(arr, 0, len(arr) - 1)

def qs(arr, l, r):
    if l >= r:
        return
    p = partition(arr, l, r)

    qs(arr, l, p - 1)
    qs(arr, p + 1, r)

def partition(arr, l, r):
    pivot = arr[r]
    i = l - 1
    for j in range(l, r):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[r] = arr[r], arr[i + 1]
    return i + 1

array = random.rand(10000000)
start = time.time()
quicksort(array)

print("Seconds:", time.time() - start)
```

It takes ≈ 131 s to sort the array when using `random.rand(10000000)`, and ≈ 60 s when using `array = [random.random() for _ in range(10000000)]` instead.
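To try to narrow it down, here is a minimal micro-benchmark sketch (my own addition, separate from the timings above) that isolates per-element indexing: it runs an identical summation loop over a numpy array and over a plain Python list of the same values, since scalar indexing into a numpy array produces a fresh `numpy.float64` object on every access:

```python
import time
from numpy import random

n = 1_000_000
np_arr = random.rand(n)
py_list = np_arr.tolist()  # same values as plain Python floats

start = time.time()
s = 0.0
for i in range(n):
    s += np_arr[i]  # each index creates a numpy.float64 scalar object
print("numpy array:", time.time() - start)

start = time.time()
s = 0.0
for i in range(n):
    s += py_list[i]  # plain float access
print("python list:", time.time() - start)
```

If the numpy loop is markedly slower here, that would suggest the gap in the sort comes largely from element access rather than from the partitioning logic itself.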

Where does this difference come from?