Given an array of points where points[i] = [xi, yi] represents a point on the X-Y plane and an integer k, return the k closest points to the origin (0, 0). The distance between two points on the X-Y plane is the Euclidean distance.
- An array of points.
- An integer k.
Input: points = [[1, 3], [-2, 2]], k = 1
Output: [[-2, 2]]
Explanation: Distance of (1, 3) is sqrt(10). Distance of (-2, 2) is sqrt(8). (-2, 2) is closer.
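One common approach, sketched below in Python, keeps a max-heap of size k keyed on distance. Since sqrt is monotonic, comparing squared distances gives the same ordering and avoids floating-point work. The function name k_closest is illustrative, not part of the problem statement.

```python
import heapq

def k_closest(points, k):
    # Max-heap of size k: store negated squared distances so the
    # farthest of the current k candidates sits at the heap root.
    heap = []
    for x, y in points:
        dist = x * x + y * y  # squared distance; sqrt not needed for ordering
        if len(heap) < k:
            heapq.heappush(heap, (-dist, x, y))
        elif -heap[0][0] > dist:
            # Current point is closer than the farthest kept point.
            heapq.heapreplace(heap, (-dist, x, y))
    return [[x, y] for _, x, y in heap]

print(k_closest([[1, 3], [-2, 2]], 1))  # [[-2, 2]]
```

This runs in O(n log k) time and O(k) extra space; simply sorting all points by squared distance and taking the first k is an O(n log n) alternative that is often fast enough. Any order of the k points is acceptable.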