An interactive parallelization framework, especially useful for configuring data science workload distribution; e.g., it supports OpenMP and MPI runs on high-performance clusters.
Interactive Parallel Computing with IPython alternatives and similar packages
Based on the "Science and Data Analysis" category
- A library providing high-performance, easy-to-use data structures and data analysis tools.
- A fundamental package for scientific computing with Python.
- A Python-based ecosystem of open-source software for mathematics, science, and engineering.
- A Python library for symbolic mathematics.
- High-productivity software for complex networks.
- Statistical modeling and econometrics in Python.
- A Markov Chain Monte Carlo sampling toolkit.
- Versatile parallel programming with task scheduling.
- A Python JIT (just-in-time) compiler targeting LLVM, aimed at scientific Python, by the developers of Cython and NumPy.
- A community Python library for astronomy.
- Biopython, a set of freely available tools for biological computation.
- A NumPy and Pandas interface to Big Data.
- Data mining, data visualization, analysis, and machine learning through visual programming or Python scripting.
- A lightweight Python OLAP framework for multi-dimensional data analysis.
- Business Intelligence (BI) in Python (a Pandas web interface).
- Cheminformatics and machine learning software.
- A toolkit providing best-practice pipelines for fully automated high-throughput sequencing analysis.
- A collection of neuroimaging toolkits.
- A columnar data container that can be compressed.
- A collection of useful code related to biological analysis.
- Tools for running and testing different artificial neural network algorithms.
- A Python ETL framework.
- Short for Python Dynamics; assists with workflow in modeling dynamic motion, built around NumPy, SciPy, IPython, and matplotlib.
- An MIT-licensed systems and controls toolbox for Python.
- A tool to manage large and heterogeneous data spaces on the file system.
- A library for parsing and interpreting the results of computational chemistry packages.
- A low-impact profiler for measuring how much memory each task in Dask uses.
- A chemical toolbox designed to speak the many languages of chemical data.
Interactive Parallel Computing with IPython
ipyparallel is the new home of IPython.parallel. ipyparallel is a Python package and collection of CLI scripts for controlling clusters for Jupyter.
ipyparallel contains the following CLI scripts:
- ipcluster - start/stop a cluster
- ipcontroller - start a scheduler
- ipengine - start an engine
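Running the pieces by hand shows how `ipcluster` composes them; a minimal sketch (each command in its own terminal, all using the same IPython profile so the engines can find the controller):

```shell
# Terminal 1: start the scheduler/controller.
ipcontroller

# Terminals 2..N: start one engine each; every engine
# registers itself with the controller on startup.
ipengine
```

In everyday use, `ipcluster` starts and stops the controller and a set of engines for you in one step.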
```shell
pip install ipyparallel
```
To enable the IPython Clusters tab in Jupyter Notebook:

```shell
ipcluster nbextension enable
```

To disable it again:

```shell
ipcluster nbextension disable
```
See the documentation on configuring the notebook server to find your config or set up your initial config.
To install for all users on JupyterHub, as root:
```shell
jupyter nbextension install --sys-prefix --py ipyparallel
jupyter nbextension enable --sys-prefix --py ipyparallel
jupyter serverextension enable --sys-prefix --py ipyparallel
```
Start a cluster:
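For example, using the bundled `ipcluster` script (the engine count of 4 is only an illustration; pick what fits your machine):

```shell
# Start a controller plus 4 engines on the local machine;
# add --daemonize to run the cluster in the background.
ipcluster start -n 4
```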
Use it from Python:
```python
import os
import ipyparallel as ipp

rc = ipp.Client()                  # connect to the running cluster
ar = rc[:].apply_async(os.getpid)  # run os.getpid on every engine
pid_map = ar.get_dict()            # {engine_id: pid}
```
See the docs for more info.
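The fan-out pattern above (run a function on every worker, collect one result per worker) can be mimicked with the standard library's `multiprocessing.Pool` — a plain-Python analogue for illustration, not ipyparallel itself:

```python
import os
from multiprocessing import Pool

def get_pid(_):
    # Runs inside a worker process, so this returns the worker's PID,
    # not the parent's.
    return os.getpid()

def collect_worker_pids(n_tasks, n_workers=4):
    # Fan the tasks out over a pool of workers and gather the PIDs.
    with Pool(n_workers) as pool:
        return pool.map(get_pid, range(n_tasks))

if __name__ == "__main__":
    # Duplicates are possible if the pool reuses a worker for
    # several tasks.
    print(sorted(set(collect_worker_pids(4))))
```

Unlike a `Pool`, ipyparallel's engines are long-lived processes that can sit on remote hosts, keep state between calls, and be addressed individually through the `Client`.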