All Versions: 29 | Avg Release Cycle: 37 days | Latest Release: 1205 days ago

Changelog History
Page 3

  • v0.3.0 Changes

    April 27, 2020

    Kornia 0.3.0 release

    Today we released 0.3.0, which aligns with the PyTorch release cycle and includes:

    • Full support for PyTorch v1.5.
    • Semi-automated GPU test coverage.
    • Documentation has been reorganized [docs]
    • Data augmentation API compatible with torchvision v0.6.0.
    • Better integration with the ecosystem, e.g. PyTorch Lightning.

    For more detailed changes, check out v0.2.1 and v0.2.2.

    Highlights

    Data Augmentation

    We provide kornia.augmentation, a high-level framework that implements the kornia core functionalities and is fully compatible with torchvision. It supports batched mode and multiple devices (CPU, GPU, and XLA/TPU coming soon), is auto-differentiable, and can retrieve (and chain) the applied geometric transforms. To see how to reproduce torchvision results in kornia, refer to this Colab: Kornia vs. Torchvision @shijianjian

    import torch
    import kornia as K
    import torchvision as T

    # kornia
    transform_fcn = torch.nn.Sequential(
        K.augmentation.RandomAffine(
            [-45., 45.], [0., 0.5], [0.5, 1.5], [0., 0.5], return_transform=True),
        K.color.Normalize(0.1307, 0.3081),
    )

    # torchvision
    transform_fcn = T.transforms.Compose([
        T.transforms.RandomAffine(
            [-45., 45.], [0., 0.5], [0.5, 1.5], [0., 0.5]),
        T.transforms.ToTensor(),
        T.transforms.Normalize((0.1307,), (0.3081,)),
    ])
    

    Ecosystem compatibility

    Kornia has been designed to be flexible so that it can be integrated into other existing frameworks. The example below shows how easily you can define a custom data augmentation pipeline that can later be integrated into any training framework such as PyTorch Lightning. We provide examples [here] and [here].

    import torch
    import torch.nn as nn
    import kornia as K

    class DataAugmentatonPipeline(nn.Module):
        """Module to perform data augmentation using Kornia on torch tensors."""

        def __init__(self, apply_color_jitter: bool = False) -> None:
            super().__init__()
            self._apply_color_jitter = apply_color_jitter
            self._max_val: float = 1024.

            self.transforms = nn.Sequential(
                K.augmentation.Normalize(0., self._max_val),
                K.augmentation.RandomHorizontalFlip(p=0.5)
            )

            self.jitter = K.augmentation.ColorJitter(0.5, 0.5, 0.5, 0.5)

        @torch.no_grad()  # disable gradients for efficiency
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x_out = self.transforms(x)
            if self._apply_color_jitter:
                x_out = self.jitter(x_out)
            return x_out
    

    GPU tests

    It is now easy to run the GPU tests with pytest --typetest cuda
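    A flag like this is typically wired up through a pytest command-line option plus a device fixture. Below is a minimal, hypothetical sketch of that pattern (the option handling and fixture are assumptions for illustration, not the actual kornia test harness):

    # conftest.py -- hypothetical sketch of a device fixture driven by a
    # --typetest command-line option; not taken from the kornia codebase.
    import pytest
    import torch

    def pytest_addoption(parser):
        parser.addoption("--typetest", action="store", default="cpu",
                         help="device to run the tests on: cpu or cuda")

    @pytest.fixture
    def device(request):
        name = request.config.getoption("--typetest")
        return torch.device("cuda" if name == "cuda" else "cpu")

    def test_ones_on_device(device):
        x = torch.ones(2, 3, device=device)
        assert x.device.type == device.type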

  • v0.2.2 Changes

    April 26, 2020

    Kornia 0.2.2 Release Notes

    This release is a checkpoint that stabilizes a minimal data augmentation API and fixes some GPU tests before kornia upgrades to PyTorch v1.5.0.

    • API changes
    • Improvements
    • Bug Fixes
    • Documentation

    API changes

    • Decoupled return_transform from apply_* function (#534)

    Improvements

    • Improve setup packaging and the build manywheel script (#543)

    Bug Fixes

    • Fix broken GPU tests (#538)
    • Update sosnet URLs (#541)

    Documentation

    • Reorganise color docs and add ycbcr (#540)
    • Reorganise documentation into subsections (#542)
  • v0.2.1 Changes

    April 21, 2020

    Kornia 0.2.1 Release Notes

    • Highlights
    • API changes
    • New Features
    • Improvements
    • Bug Fixes
    • Performance

    Highlights

    In this release we support compatibility between kornia.augmentation and torchvision.transforms.

    We now support all the same existing operations on torch.Tensor on the GPU, with extra features such as returning, for each operator, the transformation matrix generated to produce that transformation.

    import torch
    import kornia as K
    import torchvision as T

    # kornia
    transform_fcn = torch.nn.Sequential(
        K.augmentation.RandomAffine(
            [-45., 45.], [0., 0.5], [0.5, 1.5], [0., 0.5], return_transform=True),
        K.color.Normalize(0.1307, 0.3081),
    )

    # torchvision
    transform_fcn = T.transforms.Compose([
        T.transforms.RandomAffine(
            [-45., 45.], [0., 0.5], [0.5, 1.5], [0., 0.5]),
        T.transforms.ToTensor(),
        T.transforms.Normalize((0.1307,), (0.3081,)),
    ])
    
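    As a small, hypothetical usage sketch (the batch shape and variable names below are illustrative assumptions, not part of the release notes), an operator built with return_transform=True yields both the augmented batch and the per-sample 3x3 matrix:

    # hypothetical batch of grayscale images (BxCxHxW)
    images = torch.rand(16, 1, 28, 28)

    # with return_transform=True the operator returns the augmented tensor
    # together with the Bx3x3 transformation matrix it generated
    aug = K.augmentation.RandomAffine(
        [-45., 45.], [0., 0.5], [0.5, 1.5], [0., 0.5], return_transform=True)
    images_out, trafo = aug(images)  # BxCxHxW, Bx3x3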

    Check the online documentation with the updated API [DOCS]

    Check this Google Colab to see how to reproduce the same results [Colab]

    kornia.augmentation as a framework

    In addition, we have re-designed kornia.augmentation in such a way that users can easily contribute more operators, or just use it as a framework to create their own custom operators.

    Each of the kornia.augmentation modules inherits from AugmentationBase, and one can easily define a new operator by creating a subclass and overriding a couple of methods.

    Let's take a look at a custom MyRandomRotation. The class inherits from AugmentationBase, making it an nn.Module so that it can be stacked in an nn.Sequential to compute chained transformations.

    To implement new functionality, two things are needed: override get_params and apply.

    The get_params function receives the shape of the input tensor and returns a dictionary with the parameters to use in the apply function.

    The apply function receives a tensor and the dictionary defined in get_params as input, and returns a tuple with the transformed input and the transformation applied to it.

    from typing import Dict

    import torch
    import kornia as K

    # AugmentationBase comes from kornia.augmentation (see text above);
    # W, H and imread are assumed to be defined elsewhere in this example.
    class MyRandomRotation(AugmentationBase):
        def __init__(self, angle: float, return_transform: bool = True) -> None:
            super(MyRandomRotation, self).__init__(self.apply, return_transform)
            self.angle = angle

        def get_params(self, batch_shape: torch.Size) -> Dict[str, torch.Tensor]:
            angles_rad: torch.Tensor = torch.rand(batch_shape) * K.pi
            angles_deg = K.rad2deg(angles_rad) * self.angle
            return dict(angles=angles_deg)

        def apply(self, input: torch.Tensor, params: Dict[str, torch.Tensor]):
            # compute transformation
            angles: torch.Tensor = params['angles'].type_as(input)
            center = torch.tensor([[W / 2, H / 2]]).type_as(input)
            transform = K.get_rotation_matrix2d(center, angles, torch.ones_like(angles))

            # apply transformation
            output = K.warp_affine(input, transform, (H, W))
            return (output, transform)

    # how to use it
    # load an image and cast to tensor
    img1: torch.Tensor = imread(...)  # BxDxHxW

    # instantiate and apply the transform
    aug = MyRandomRotation(45., return_transform=True)
    img2, transform = aug(img1)  # BxDxHxW - Bx3x3
    

    New Features

    kornia.color

    • Implement RGB to XYZ (#436)
    • Implement RGB to LUV (#442)
    • Implement histogram2d (#530)

    kornia.feature

    • Implement hardnet descriptor (#498)
    • Implement deep descriptor sosnet (#521)

    kornia.jit

    • Create the kornia.jit module and expose rgb_to_grayscale (#261)

    API Changes

    • Remove PIL dependency (#512)
    • Remove float casting in image_to_tensor (#497)

    Improvements

    • Add gradcheck for RandomCrop and RandomResizedCrop (#439)
    • Update spatial_soft_argmax.py (#496)
    • Add epsilon value to make the Hessian matrix robust (#504)
    • Add normalize_points flag in depth_to_3d (#511)
    • Functional augmentation performance test against Torchvision (#482)
    • AffineTransformation alignment and other fixes (#514)

    ๐ŸŽ Performance

    • Filter speed up conditional (#433)
      • Improves by far the time performance for filtering.
    • Speed-up warp_affine and fix bugs in RandomAffine (#474)
    • ๐Ÿ‘Œ Improve homography warper (#528)

    Docs

    • Make the link work for PSNRLoss (#449)
    • Change psnr to psnr_loss in docs (#450)
    • Fix import problem and fix docs for LuvToRgb and PSNR (#447)
    • Fix outdated example (#465)
    • Update color_adjust.py (#479)
    • Add missing commas in bibtex (#500)

    ๐Ÿ› Bug fixes

    • ๐Ÿ›  Fix device problem in test (#456)
    • ๐Ÿ› Bug fixed in device tests (#475)
    • โž• Add epsilon value to sobel to improve backprop stability (#513)
  • v0.2.0 Changes

    January 27, 2020

    Kornia 0.2.0 Release Notes

    • Highlights
    • New Features
      • kornia.color
      • kornia.feature
      • kornia.geometry
      • kornia.losses
    • Improvements
    • Bug Fixes

    The Kornia v0.2.0 release is now available.

    The release contains over 50 commits and updates support to PyTorch 1.4. It is the result of a huge effort in the design of the new data augmentation module, improvements to the set of color space conversion algorithms, and a refactor of the testing framework that allows testing the library with the CUDA backend.

    Highlights

    Data Augmentation API

    From this point forward, we will support the new data augmentation API. The kornia.augmentation module mimics the best of existing data augmentation frameworks such as torchvision or albumentations, all re-implemented to take torch.Tensor data structures as input, which allows running the standard (geometric and color) transformations in batch mode on the GPU and backpropagating through them.

    In addition, a feature we are very proud to include is the ability to return the transformation matrix for each of the transforms, which makes it easier to concatenate and optimize the transform process.

    A quick overview of its usage:

    import torch
    import kornia

    input: torch.Tensor = load_tensor_data(....)  # BxCxHxW

    transforms = torch.nn.Sequential(
        kornia.augmentation.RandomGrayscale(),
        kornia.augmentation.RandomAffine(degrees=(-15, 15)),
    )

    out: torch.Tensor = transforms(input)         # CPU
    out: torch.Tensor = transforms(input.cuda())  # GPU

    # same, but returning the transformation matrix
    transforms = torch.nn.Sequential(
        kornia.augmentation.RandomGrayscale(return_transformation=True),
        kornia.augmentation.RandomAffine(degrees=(-15, 15), return_transformation=True),
    )
    out, transform = transforms(input)  # BxCxHxW, Bx3x3
    
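    The returned Bx3x3 matrices can be reused directly. As a hedged sketch (the warp call below is one possible way to apply such a matrix, not something prescribed by these release notes), the accumulated transform can be re-applied to the original batch:

    # re-apply the accumulated Bx3x3 transform returned above to the input batch;
    # kornia.warp_perspective is used here purely for illustration
    height, width = input.shape[-2:]
    input_warped = kornia.warp_perspective(input, transform, dsize=(height, width))  # BxCxHxW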

    These are the features we introduce in the module:

    • BaseAugmentation (#407)
    • ColorJitter (#329)
    • RandomHorizontalFlip (#309)
    • MotionBlur (#328)
    • RandomVerticalFlip (#375)
    • RandomErasing (#344)
    • RandomGrayscale (#384)
    • Resize (#394)
    • CenterCrop (#409)
    • RandomAffine (#403)
    • RandomPerspective (#403)
    • RandomRotation (#397, #418)
    • RandomCrop (#408)
    • RandomResizedCrop (#408)
    • Grayscale

    GPU Tests

    We have refactored our testing framework and we can now easily integrate GPU tests within the library. At this moment, this feature is only available to run locally, but very soon we will integrate with CircleCI and AWS infrastructure so that we can automate the process.

    From the repository root, just run: make test-gpu

    Tests look like this:

    import torch
    import kornia

    from test.common import device

    def test_rgb_to_grayscale(device):
        channels, height, width = 3, 4, 5
        img = torch.ones(channels, height, width).to(device)
        assert kornia.rgb_to_grayscale(img).shape == (1, height, width)
    

    Ref PR:

    New Features

    kornia.color

    We have added a few more algorithms for color space conversion:

    kornia.geometry

    • Implement kornia.hflip, kornia.vflip and kornia.rot180 (#268)
    • Implement kornia.transform_boxes (#368)

    kornia.losses

    • Implement total_variation loss (#250)
    • Implement PSNR loss (#272)

    kornia.feature

    • Added convenience functions for working with LAF: get keypoint, orientation (#340)

    Improvements

    • Fixed conv_argmax2d/3d behaviour for even-size kernel and added test (#227)
    • Normalize accepts floats and allows broadcast over channel dimension (#236)
    • ๐Ÿ‘ Single value support for normalize function (#301)
    • โž• Added boundary check function to local features detector (#254)
    • Correct crop_and_resize on aspect ratio changes. (#305)
    • Correct adjust brightness and contrast (#304)
    • Add tensor support to Hue, Saturation and Gamma (#324)
    • Double image option for scale pyramid (#351)
    • Filter2d speedup for older GPUs (#356)
    • Fix meshgrid3d function (#357)
    • Added support for even-sized filters in filter2d (#374)
    • Use latest version of CircleCI (#373)
    • Infer border and padding mode to homography warper (#379)
    • Apply normalization trick to conv_softmax (#383)
    • ๐Ÿ‘ Better nms (#371)
      • added spatial gradient 3d
      • added hardnms3d and tests for hardnms 2d
      • quadratic nms interp
      • update the tests because of changed gaussian blur kernel size in scale pyramid calculation
      • no grad for spatial grad
    • Focal loss flat (#393)
    • Add optional mask parameter in scale space (#389)
    • Update to PyTorch 1.4 (#402)

    ๐Ÿ› Bug fixes

    • โž• Add from homogeneous zero grad test and fix it (#369)
    • Filter2d failed with noncontiguous input (view --> reshape) (#377)
    • โž• Add ceil_mode to maxblur pool to be able to be used in resnets (#395)

    As usual, thanks to the community to keep this project growing.
    Happy coding ! ๐ŸŒ„

  • v0.1.4 Changes

    October 05, 2019

    Table of Contents

    We have just released Kornia: a differentiable computer vision library for PyTorch.

    It consists of a set of routines and differentiable modules to solve generic computer vision problems. At its core, the package uses PyTorch as its main backend, both for efficiency and to take advantage of reverse-mode auto-differentiation to define and compute the gradient of complex functions.

    Inspired by OpenCV, this library is composed of a set of packages containing operators that can be inserted into neural networks to train models to perform image transformations, epipolar geometry, depth estimation, and low-level image processing such as filtering and edge detection, operating directly on tensors.

    It has over 300 commits and is a major refactor of the whole library, including more than 100 functions to solve generic computer vision problems.

    Highlights

    Version 0.1.4 includes a reorganization of the internal API, grouping functionality into the following components:

    • kornia | a Differentiable Computer Vision library like OpenCV, with strong GPU support.
    • kornia.color | a set of routines to perform color space conversions.
    • kornia.contrib | a compilation of user contrib and experimental operators.
    • kornia.feature | a module to perform local feature detection.
    • kornia.filters | a module to perform image filtering and edge detection.
    • kornia.geometry | a geometric computer vision library to perform image transformations, 3D linear algebra and conversions using different camera models.
    • kornia.losses | a stack of loss functions to solve different vision tasks.
    • kornia.utils | image to tensor utilities and metrics for vision problems.

    Big contributions in kornia.feature:

    • Implemented anti-aliased local patch extraction.
    • Implemented classical local feature cornerness functions: Harris, Hessian, Good Features To Track.
    • Implemented basic functions for working with local affine features and their patches.
    • Implemented convolutional soft argmax 2d and 3d operators for differentiable non-maxima suppression.
    • Implemented second moment matrix affine shape estimation, dominant gradient orientation and SIFT patch descriptor.

    Infrastructure

    Breaking Changes

    • Removed nms and normalization from the Harris response function 2209807
    • Renamed GaussianBlur -> GaussianBlur2d and added an input to specify padding b0c522e (see the sketch after this list)
    • Changed batch support for tensor2img and img2tensor 705a82f
    • Fixed torch.clamp for homogeneous division 506b0c9
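    As a hedged illustration of the rename (the module path follows later kornia versions, and the kernel size and sigma values are arbitrary choices for the example):

    import torch
    import kornia

    # after the rename, the 2D Gaussian blur module is GaussianBlur2d
    blur = kornia.filters.GaussianBlur2d(kernel_size=(5, 5), sigma=(1.5, 1.5))
    img = torch.rand(1, 3, 32, 32)   # BxCxHxW
    out = blur(img)                  # BxCxHxW, same spatial size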

    New Features

    • Several functionalities for Local Affine Frame (LAF): extract_patches_from_pyramid, extract_patches_simple, normalize_laf, ellipse_to_laf, make_upright, scale_laf, get_laf_scale 0a3cbb0
    • Differentiable SIFT descriptor 7f0eb80
    • Added implementation of the differentiable spatial to numerical (DSNT) layer and related operations. abb4afa
    • Spatial gradient 1d 362adfc
    • Scale pyramid for local features detection 413051e
    • Added geometry.depth submodule including: depth_to_3d, depth_to_normals, warp_frame_depth d1dedb8
    • Implement Gaussian Pyramid bc586cb
    • Implement Filter2D to apply arbitrary depthwise 2d kernels 94b56f2
    • Implement to save/load pointclouds 4f32351
    • implement project_points 636f4f5
    • Implement unproject_points b02f403
    • Implement denormalize_coordinates b07ec45
    • Implement harris_corner detector 977a1f6
    • Implement non_maxima_suppression_2d 84cc128
    • Implement median_filter 6b6cf05
    • Implement blur_filter d4c8df9
    • Implement sobel_filter operator 9abe4c5
    • Implement max_blur_pool_2d 621be3b
    • Implement pyrup and pyrdown a4e110c
    • Implement crop_and_resize 41b4fed
    • Implement center_crop b1188d5
    • Implement inverse_affine_matrix 6e10fb9
    • Implement opencv like remap function b0401de
    • Implement affine ceb3faf
    • Implement shear 81c5a27
    • Implement scale 75a84a3
    • Implement translate 11af4dd
    • Implement rotate 89c6d96
    • Implement Laplacian filter 5e3a89a
    • Implement rgb_to_gray 9a2bea6
    • Implement vectorised confusion_matrix f306062
    • Implement normalization on tensors 4c3f8fa
    • Implement rgb_to_bgr e25f6a4
    • Implement hsv_to_rgb 9726872
    • Implement adjust_brightness b8fd8b6

    ๐Ÿ› Bug Fixes

    • Normalize filtering functions kernels a301e3c
    • Fix the bug in spatial gradient with padding 5635e45
    • Disable JIT tests 4649317
    • Fix pyrdown with avg_pool2d b835143
    • Fix formulation issue in rotation_matrix_to_quaternion 58c6e8e
    • Switch torch.gesv -> torch.solve in get_perspective_transform c347a41
    • Fix and refactor test_warp_perspective d19121e
    • Fixed and updated the quaternion-related docs 0161f65
    • Remove some unused test functions a64a8fb

    Contributors

  • v0.1.3

    October 02, 2019
  • v0.1.2 Changes

    March 14, 2019

    Package

    • Migrated the project to the Arraiy Open Source Organization: https://github.com/arraiyopensource/torchgeometry. 48ad11f
    • Updated with support for PyTorch v1.0.1. In fact, we test every time against nightly builds. 5c9d9ae
    • Fixed an issue with the pip package's minimal PyTorch version. We now require at least v1.0.0. 6e16734
    • The package version file is auto-generated and keeps track of the sha. f337b3c
    • Added codecov support to keep track of tested code. e609b21

    Breaking Changes

    • Refactored the DepthWarper API - it now accepts PinholeCamera objects as parameters:

      >>> # pinhole camera models
      >>> pinhole_dst = tgm.PinholeCamera(...)
      >>> pinhole_src = tgm.PinholeCamera(...)
      >>> # create the depth warper and compute the projection matrix
      >>> warper = tgm.DepthWarper(pinhole_dst, height, width)
      >>> warper.compute_projection_matrix(pinhole_src)
      >>> # warp the destination frame to the reference by depth
      >>> depth_src = torch.ones(1, 1, 32, 32)  # Nx1xHxW
      >>> image_dst = torch.rand(1, 3, 32, 32)  # NxCxHxW
      >>> image_src = warper(depth_src, image_dst)  # NxCxHxW

    New Features

    • Added a new PinholeCamera API to represent pinhole camera models. b6ec592
      [figure: pinhole_model]
    • Refactored and moved code from conversions.py into a dedicated module for linear transforms, transformations.py. a1c25b1 (see the sketch after this list)
      • boxplus_transformation, boxminus_transformation, inverse_transformation, transform_points.
    • Added a collection of losses:
    • Added the SpatialSoftArgmax2d operator to extract 2D coordinates from probability maps. cf7bb29
    • Added an extract_tensor_patches routine similar to tf.extract_image_patches but for multidimensional tensors instead of images. f60fa57
    • Added boxplus_transform and boxminus_transform to compose or compute relative pose functions. e0882ea
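    A minimal, hypothetical sketch of the new transformations helpers in use (the shapes and values below are illustrative assumptions):

    import torch
    import torchgeometry as tgm

    # a batch of 2D points (BxNx2) and a batch of 3x3 homogeneous transforms (Bx3x3)
    points = torch.rand(1, 4, 2)
    trans = torch.eye(3).unsqueeze(0)  # identity transform, for illustration only

    points_out = tgm.transform_points(trans, points)  # BxNx2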

    ๐Ÿ› Bug Fixes

    • ๐Ÿ›  Fixed DepthWarper in order to accept mini-batch computation. 7175b4f
    • โž• Added missing tests for warp_affine. 57cbd29
    • Fixed and refactored quaternion_to_axis_angle and axis_angle_to_quaternion to avoid nans. 4aa0bca

    โœ… Test

    contributors:

  • v0.1.1 Changes

    January 08, 2019

    Table of Contents

    • Breaking Changes
    • New Features
    • Bug Fixes
    • Documentation improvements

    Breaking Changes

    • tgm.inverse has been removed since PyTorch now supports a batched version of torch.inverse f6c210d

    New Features

    • Added tgm.warp_perspective matching the OpenCV interface d53cbce (see the sketch after this list)
    • Added tgm.get_perspective_transform matching the OpenCV interface a7db348
    • Added tgm.get_rotation_matrix2d matching the OpenCV interface 876b2c6
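    A hedged sketch of the OpenCV-style flow these functions enable (the point coordinates and image size below are arbitrary assumptions):

    import torch
    import torchgeometry as tgm

    # four source points and the points they should map to (Bx4x2), arbitrary values
    points_src = torch.tensor([[[0., 0.], [31., 0.], [31., 31.], [0., 31.]]])
    points_dst = torch.tensor([[[4., 4.], [27., 4.], [27., 27.], [4., 27.]]])

    # estimate the 3x3 perspective transform and warp an image with it,
    # mirroring cv2.getPerspectiveTransform / cv2.warpPerspective
    M = tgm.get_perspective_transform(points_src, points_dst)  # Bx3x3
    img = torch.rand(1, 3, 32, 32)
    img_warp = tgm.warp_perspective(img, M, dsize=(32, 32))    # Bx3x32x32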

    ๐Ÿ› Bug Fixes

    • ๐Ÿ›  Fixed bug for inplace operation in tgm.inverse_pose 0aba15d

    ๐Ÿ“š Documentation improvements

    • Added notebook tutorial for tgm.warp_affine and tgm.warp_pesrpective 894bf52

    Other improvements

    • Update to PyTorch v1.0.0 3ee14c8
    • Refactored the testing framework. Removed unittest and now use pytest, since it is easy to parametrize unit tests.
      • parametrized tests for different batch sizes
      • parametrized tests for device types: cpu and cuda. Note: cuda still needs to be tested.
    • We now have an official pip package to install the library: pip install torchgeometry
  • v0.1.0 Changes

    October 03, 2018

    This is the initial release of the torchgeometry package.

    It contains a set of routines and modules for geometric computer vision, implementing multi-view reprojection primitives that work on images and feature-map warping. In addition, we provide conversion routines and utilities for pinhole camera models.

    Table of Contents

    API

    Test

    • Docker kit ae8d9e1
    • Automated tests with TravisCI c213e3e
    • Lint tests c213e3e

    Documentation

    Examples: