Working with big data in Python and low RAM

Tags:
Big Data
Python
We have to implement algorithms for 1000-dimensional data with 200k+ data points in Python. I need to perform different operations (clustering, pairwise distances, etc.). When I try to scale any of the algorithms to the full dataset, I run out of RAM. The catch is that I need to run this on several machines that each have only a small amount of RAM. Is there a way to do this in Python? I don't mind the runtime (5 to 6 hours is fine). Thank you!
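One common approach for this kind of problem (a suggestion, not from the thread) is to keep the data on disk with `numpy.memmap` and process it in row chunks, so that only one chunk of the distance matrix is ever in RAM at a time. A minimal sketch, assuming Euclidean distance; the chunk size and the `data.dat` filename are illustrative:

```python
import numpy as np

def pairwise_dist_chunked(X, chunk_size=1000):
    """Yield (start_row, block) pairs of the Euclidean distance matrix,
    computing one chunk_size x n block at a time so the full 200k x 200k
    matrix is never materialized in memory."""
    n = X.shape[0]
    sq = np.einsum('ij,ij->i', X, X)  # squared norms, only n floats
    for start in range(0, n, chunk_size):
        end = min(start + chunk_size, n)
        block = X[start:end]
        # |a - b|^2 = |a|^2 + |b|^2 - 2 a.b, evaluated per chunk
        d2 = sq[start:end, None] + sq[None, :] - 2.0 * (block @ X.T)
        np.maximum(d2, 0.0, out=d2)  # clip tiny negatives from rounding
        yield start, np.sqrt(d2)

# Hypothetical usage: map the dataset from disk instead of loading it,
# and consume each block immediately (e.g. keep k nearest neighbours,
# or write it back out) rather than stacking the blocks.
# X = np.memmap('data.dat', dtype='float32', mode='r', shape=(200_000, 1000))
# for start, block in pairwise_dist_chunked(X, chunk_size=2000):
#     ...  # process block, then let it be garbage-collected
```

With `float32` data and a chunk size of 2000, each block is about 2000 × 200000 × 4 bytes ≈ 1.6 GB, and you can shrink the chunk size further on smaller machines. For clustering specifically, scikit-learn's `MiniBatchKMeans` with `partial_fit` on successive chunks is a similar streaming option.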