Keras is a minimalist, highly modular neural networks library, written in Python. Older multi-backend releases could run on top of either TensorFlow or Theano; current releases run on TensorFlow. It was developed with a focus on enabling fast experimentation: being able to go from idea to result with the least possible delay is key to doing good research.
Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through user friendliness, modularity, and extensibility);
- supports both convolutional networks and recurrent networks, as well as combinations of the two;
- runs seamlessly on CPU and GPU.
Keras alternatives and similar packages
| Popularity | Activity | Code Quality* | Package | Description |
|---|---|---|---|---|
| 9.8 | 9.6 | L1 | xgboost | Scalable, portable, and distributed gradient boosting (GBDT, GBRT, GBM) library for Python, R, Java, Scala, C++, and more. Runs on a single machine, Hadoop, Spark, Dask, Flink, and DataFlow. |
| 9.6 | 10.0 | L1 | PaddlePaddle | PArallel Distributed Deep LEarning: a machine learning framework from industrial practice (the PaddlePaddle core framework: high-performance single-machine and distributed training and cross-platform deployment for deep learning and machine learning). |
| 9.6 | 0.0 | L1 | CNTK | Microsoft Cognitive Toolkit (CNTK), an open-source deep-learning toolkit. |
| 9.5 | 6.8 | | Prophet | Tool for producing high-quality forecasts for time series data that has multiple seasonality with linear or non-linear growth. |
| 9.3 | 0.0 | L3 | TFLearn | Deep learning library featuring a higher-level API for TensorFlow. |
| 8.9 | 9.8 | | H2O | Open-source, distributed, fast and scalable machine learning platform: deep learning, gradient boosting (GBM) and XGBoost, random forest, generalized linear modeling (GLM with elastic net), K-means, PCA, generalized additive models (GAM), RuleFit, support vector machine (SVM), stacked ensembles, automatic machine learning (AutoML), etc. |
| 8.9 | 0.0 | L3 | NuPIC | Numenta Platform for Intelligent Computing: an implementation of Hierarchical Temporal Memory (HTM), a theory of intelligence based strictly on the neuroscience of the neocortex. |
| 8.0 | 0.7 | L2 | Pylearn2 | Warning: this project does not have any current developer. See below. |
| 7.9 | 6.2 | L4 | LightFM | A Python implementation of LightFM, a hybrid recommendation algorithm. |
| 7.7 | 4.2 | | Sacred | A tool to help you configure, organize, log, and reproduce experiments, developed at IDSIA. |
| 7.6 | 1.3 | L4 | skflow | Simplified interface for TensorFlow (mimicking scikit-learn) for deep learning. |
| 5.8 | 0.0 | L2 | Crab | A flexible, fast recommender engine for Python that integrates classic information filtering recommendation algorithms in the world of scientific Python packages (numpy, scipy, matplotlib). |
| 4.3 | 0.0 | | seqeval | A Python framework for sequence labeling evaluation (named-entity recognition, POS tagging, etc.). |
| 4.0 | 7.3 | | adaptive | Adaptive: parallel active learning of mathematical functions. |
| 3.9 | 7.5 | | SciKit-Learn Laboratory | SciKit-Learn Laboratory (SKLL) makes it easy to run machine learning experiments. |
| 3.5 | 0.0 | L4 | Feature Forge | A set of tools for creating and testing machine learning features, with a scikit-learn compatible API. |
| | | | | The easiest way to use machine learning. Mix and match underlying ML libraries and data set sources. Generate new datasets or modify existing ones with ease. |
| 2.9 | 9.6 | | bodywork | ML pipeline orchestration and model deployments on Kubernetes, made really easy. |
| 2.1 | 8.0 | | OptaPy | OptaPy is an AI constraint solver for Python to optimize planning and scheduling problems. |
| 1.9 | 8.7 | | openskill.py | A faster, open-license alternative to Microsoft TrueSkill. |
| 1.3 | 3.1 | | neptune-contrib | This library hosts the LegacyLogger for PyTorch Lightning. |
* Code Quality Rankings and insights are calculated and provided by Lumnify. They vary from L1 to L5, with "L5" being the highest.
Keras: Deep Learning for humans
This repository hosts the development of the Keras library. Read the documentation at keras.io.
Keras is a deep learning API written in Python, running on top of the machine learning platform TensorFlow. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result as fast as possible is key to doing good research.
- Simple -- but not simplistic. Keras reduces developer cognitive load to free you to focus on the parts of the problem that really matter.
- Flexible -- Keras adopts the principle of progressive disclosure of complexity: simple workflows should be quick and easy, while arbitrarily advanced workflows should be possible via a clear path that builds upon what you've already learned.
- Powerful -- Keras provides industry-strength performance and scalability: it is used by organizations and companies including NASA, YouTube, and Waymo.
Keras & TensorFlow 2

TensorFlow 2 is an end-to-end, open-source machine learning platform. You can think of it as an infrastructure layer for differentiable programming, combining four key abilities:
- Efficiently executing low-level tensor operations on CPU, GPU, or TPU.
- Computing the gradient of arbitrary differentiable expressions.
- Scaling computation to many devices, such as clusters of hundreds of GPUs.
- Exporting programs ("graphs") to external runtimes such as servers, browsers, mobile and embedded devices.
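The second ability above, computing gradients of arbitrary differentiable expressions, can be sketched with `tf.GradientTape` (a minimal illustration; the expression is an assumption for demonstration, not from the original text):

```python
import tensorflow as tf

# Record operations on a variable so gradients can be computed.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2 + 2.0 * x  # an arbitrary differentiable expression

# dy/dx = 2x + 2, which is 8.0 at x = 3.0.
grad = tape.gradient(y, x)
```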
Keras is the high-level API of TensorFlow 2: an approachable, highly-productive interface for solving machine learning problems, with a focus on modern deep learning. It provides essential abstractions and building blocks for developing and shipping machine learning solutions with high iteration velocity.
Keras empowers engineers and researchers to take full advantage of the scalability and cross-platform capabilities of TensorFlow 2: you can run Keras on TPU or on large clusters of GPUs, and you can export your Keras models to run in the browser or on a mobile device.
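As a hedged sketch of the mobile-export path mentioned above, a Keras model can be converted to TensorFlow Lite; the tiny untrained model here is purely illustrative, standing in for a real trained one:

```python
import tensorflow as tf

# A trivial model standing in for a real trained one.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, input_shape=(4,)),
])

# Convert to a TensorFlow Lite flatbuffer for mobile/embedded deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()
```

The resulting bytes can be written to a `.tflite` file and loaded by the TFLite interpreter on-device.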
First contact with Keras
The core data structures of Keras are layers and models. The simplest type of model is the Sequential model, a linear stack of layers. For more complex architectures, you should use the Keras functional API, which allows you to build arbitrary graphs of layers, or write models entirely from scratch via subclassing.

Here is the Sequential model:

```python
from tensorflow.keras.models import Sequential

model = Sequential()
```
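The functional API mentioned above can be sketched like this (the layer sizes and input shape are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Build a graph of layers by calling them on tensors.
inputs = tf.keras.Input(shape=(784,))
x = layers.Dense(64, activation='relu')(inputs)
outputs = layers.Dense(10, activation='softmax')(x)

# A Model ties an input tensor to an output tensor.
model = tf.keras.Model(inputs=inputs, outputs=outputs)
```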
Stacking layers is as easy as .add():

```python
from tensorflow.keras.layers import Dense

model.add(Dense(units=64, activation='relu'))
model.add(Dense(units=10, activation='softmax'))
```
Once your model looks good, configure its learning process with .compile():

```python
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
```
If you need to, you can further configure your optimizer. The Keras philosophy is to keep simple things simple, while allowing the user to be fully in control when they need to (the ultimate control being the easy extensibility of the source code via subclassing).
```python
model.compile(loss=tf.keras.losses.categorical_crossentropy,
              optimizer=tf.keras.optimizers.SGD(
                  learning_rate=0.01, momentum=0.9, nesterov=True))
```
You can now iterate on your training data in batches:
```python
# x_train and y_train are Numpy arrays.
model.fit(x_train, y_train, epochs=5, batch_size=32)
```
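To make the snippet above runnable end to end, here is a self-contained sketch with random data (the shapes, sizes, and epoch count are illustrative assumptions, not from the original text):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Illustrative random data: 100 samples, 20 features, 10 classes.
x_train = np.random.random((100, 20))
y_train = np.eye(10)[np.random.randint(0, 10, size=100)]  # one-hot targets

model = Sequential()
model.add(Dense(units=64, activation='relu', input_shape=(20,)))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='sgd',
              metrics=['accuracy'])

# fit returns a History object recording per-epoch metrics.
history = model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
```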
Evaluate your test loss and metrics in one line:
```python
loss_and_metrics = model.evaluate(x_test, y_test, batch_size=128)
```
Or generate predictions on new data:
```python
classes = model.predict(x_test, batch_size=128)
```
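Note that with a softmax output layer, predict returns per-class probabilities rather than hard labels; a common follow-up step (plain NumPy, independent of Keras, with made-up probabilities for illustration) is an argmax over the class axis:

```python
import numpy as np

# Illustrative probabilities for 2 samples over 3 classes.
probs = np.array([[0.1, 0.7, 0.2],
                  [0.8, 0.1, 0.1]])

# Index of the most likely class for each sample.
labels = probs.argmax(axis=-1)
```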
What you just saw is the most elementary way to use Keras.
However, Keras is also a highly-flexible framework suitable to iterate on state-of-the-art research ideas. Keras follows the principle of progressive disclosure of complexity: it makes it easy to get started, yet it makes it possible to handle arbitrarily advanced use cases, only requiring incremental learning at each step.
In much the same way that you were able to train & evaluate a simple neural network above in a few lines,
you can use Keras to quickly develop new training procedures or exotic model architectures.
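One example of such flexibility is writing a model entirely from scratch by subclassing; here is a minimal sketch (the class name and layer sizes are assumptions for illustration):

```python
import tensorflow as tf

class MLP(tf.keras.Model):
    """A small multi-layer perceptron defined by subclassing."""

    def __init__(self):
        super().__init__()
        self.hidden = tf.keras.layers.Dense(32, activation='relu')
        self.out = tf.keras.layers.Dense(10)

    def call(self, inputs):
        return self.out(self.hidden(inputs))

model = MLP()
outputs = model(tf.zeros((4, 16)))  # batch of 4 samples, 16 features each
```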
Here's a low-level training loop example, combining Keras functionality with the TensorFlow GradientTape:

```python
import tensorflow as tf

# Prepare an optimizer.
optimizer = tf.keras.optimizers.Adam()
# Prepare a loss function.
loss_fn = tf.keras.losses.kl_divergence

# Iterate over the batches of a dataset.
for inputs, targets in dataset:
    # Open a GradientTape.
    with tf.GradientTape() as tape:
        # Forward pass.
        predictions = model(inputs)
        # Compute the loss value for this batch.
        loss_value = loss_fn(targets, predictions)
    # Get gradients of loss wrt the weights.
    gradients = tape.gradient(loss_value, model.trainable_weights)
    # Update the weights of the model.
    optimizer.apply_gradients(zip(gradients, model.trainable_weights))
```
For more in-depth tutorials about Keras, you can check out:
- Introduction to Keras for engineers
- Introduction to Keras for researchers
- Developer guides
- Other learning resources
Keras comes packaged with TensorFlow 2 as tensorflow.keras. To start using Keras, simply install TensorFlow 2.
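For example, assuming a standard pip setup:

```shell
# Install the stable TensorFlow 2 package; Keras is included as tensorflow.keras.
pip install tensorflow
```

After installation, `import tensorflow as tf` gives you access to `tf.keras`.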
Release and compatibility
Keras has nightly releases (keras-nightly on PyPI) and stable releases (keras on PyPI). The nightly Keras releases are usually compatible with the corresponding version of the tf-nightly releases (e.g. keras-nightly==2.7.0.dev2021100607 should be used with tf-nightly==2.7.0.dev2021100607). We don't maintain backward compatibility for nightly releases. For stable releases, each Keras version maps to a specific stable version of TensorFlow.
The table below shows the compatibility version mapping between TensorFlow versions and Keras versions.
All the release branches can be found on GitHub. All the release binaries can be found on PyPI.
| Keras release | Note | Compatible TensorFlow version |
|---|---|---|
| 2.4 | Last stable release of multi-backend Keras | < 2.5 |
| 2.5-pre | Pre-release (not formal) for standalone Keras repo | >= 2.5, < 2.6 |
| 2.6 | First formal release of standalone Keras | >= 2.6, < 2.7 |
| 2.7 | (Upcoming release) | >= 2.7, < 2.8 |
You can ask questions and join the development discussion:
- In the TensorFlow forum.
- On the Keras Google group.
- On the Keras Slack channel. Use this link to request an invitation to the channel.
Opening an issue
You can also post bug reports and feature requests (only) in GitHub issues.