Projects that are tagged with symbolic differentiation.


Theano 1.0.0

by jaberg - November 16, 2017, 17:42:27 CET. 36708 views, 6160 downloads, 3 subscriptions

About: A Python library that lets you define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. It dynamically generates CPU and GPU modules for good performance. The Deep Learning Tutorials illustrate deep learning with Theano.
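The workflow described above (define a symbolic expression, let Theano optimize it, evaluate compiled code) looks roughly like the following minimal sketch, assuming a standard Theano 1.0 installation:

    # Minimal sketch: define, differentiate, compile, and evaluate a
    # symbolic expression (assumes Theano >= 1.0 and NumPy installed).
    import numpy as np
    import theano
    import theano.tensor as T

    x = T.dvector('x')                 # symbolic double-precision vector
    y = (x ** 2).sum()                 # symbolic scalar expression
    gy = T.grad(y, x)                  # symbolic gradient dy/dx

    f = theano.function([x], [y, gy])  # compiles an optimized CPU/GPU module

    val, grad = f(np.array([1.0, 2.0, 3.0]))
    print(val)   # 14.0
    print(grad)  # [2. 4. 6.]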

Changes:

Theano 1.0.0 (15th of November, 2017)

Highlights (since 0.9.0):

  • Announced that MILA will stop developing Theano: https://groups.google.com/d/msg/theano-users/7Poq8BZutbY/rNCIfvAEAwAJ

  • conda packages are now available and updated in our own conda channel, mila-udem. To install: conda install -c mila-udem theano pygpu

  • Support NumPy 1.13

  • Support pygpu 0.7

  • Raised the minimum supported Python 3 version from 3.3 to 3.4

  • Added conda recipe

  • Replaced the deprecated package nose-parameterized with the up-to-date package parameterized in Theano's requirements

  • Theano now internally uses sha256 instead of md5, so it works on systems that forbid md5 for security reasons

  • Removed the old GPU backend theano.sandbox.cuda; the new backend theano.gpuarray is now the official GPU backend

  • Make sure MKL uses GNU OpenMP

  • NB: Matrix dot product (gemm) with MKL from conda could return wrong results in some cases. We have reported the problem upstream, and we have a workaround that raises an error with information about how to fix it.

  • Improved elemwise operations

  • Sped up elemwise ops based on SciPy

  • Fixed memory leaks related to elemwise ops on GPU

  • Scan improvements (see the sketch after this list)

  • Speed up Theano scan compilation and gradient computation

  • Added a meaningful error message when inputs to scan are missing

  • Speed up graph toposort algorithm

  • Faster C compilation through extensive use of a new interface for op params

  • Faster optimization step, with new optional destroy handler

  • Documentation updated and made more complete

  • Added documentation for RNNBlock

  • Updated conv documentation

  • Support more debuggers for PdbBreakpoint

  • Many bug fixes, crash fixes and warning improvements
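Since several of the items above concern scan, Theano's looping construct, here is a minimal sketch of how it is typically used, assuming the Theano 1.0 API: it computes a**k by repeated multiplication and differentiates through the loop.

    # Minimal scan sketch: results[t] = a ** (t + 1), plus a gradient
    # computed back through the loop (illustrative, Theano 1.0 API).
    import theano
    import theano.tensor as T

    a = T.dscalar('a')
    k = T.iscalar('k')

    results, updates = theano.scan(
        fn=lambda prev, a: prev * a,   # one multiplication per step
        outputs_info=T.ones_like(a),   # initial value of 'prev'
        non_sequences=a,
        n_steps=k,
    )
    power = results[-1]                # a ** k
    grad_a = T.grad(power, a)          # gradient computed through scan

    f = theano.function([a, k], [power, grad_a], updates=updates)
    print(f(2.0, 5))                   # 32.0 and 80.0, since d/da a^5 = 5a^4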


DiffSharp 0.7.7

by gbaydin - January 4, 2016, 00:57:35 CET. 17574 views, 3222 downloads, 3 subscriptions

About: DiffSharp is a functional automatic differentiation (AD) library providing gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products as higher-order functions. It allows exact and efficient calculation of derivatives, with support for nesting.
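DiffSharp itself is implemented in F#, so the following is only an illustrative Python sketch of the underlying technique: forward-mode AD with dual numbers, exposing the derivative as a higher-order function. The names Dual and diff are hypothetical, not DiffSharp's API; the sigmoid and ReLU rules also show the kind of forward derivative addressed in the changes below.

    # Illustrative sketch of forward-mode AD with dual numbers.
    # Not DiffSharp's API: Dual and diff are hypothetical names.
    import math

    class Dual:
        # A primal value paired with its tangent (directional derivative).
        def __init__(self, primal, tangent=0.0):
            self.primal, self.tangent = primal, tangent

        def __add__(self, other):      # sum rule
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.primal + other.primal,
                        self.tangent + other.tangent)

        def __mul__(self, other):      # product rule
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.primal * other.primal,
                        self.primal * other.tangent
                        + self.tangent * other.primal)

    def sigmoid(x):                    # chain rule: s' = s * (1 - s)
        s = 1.0 / (1.0 + math.exp(-x.primal))
        return Dual(s, s * (1.0 - s) * x.tangent)

    def relu(x):                       # derivative is 0 or 1 by sign
        return Dual(max(x.primal, 0.0),
                    x.tangent if x.primal > 0 else 0.0)

    def diff(f, x):                    # derivative as a higher-order function
        return f(Dual(x, 1.0)).tangent

    print(diff(sigmoid, 0.0))              # 0.25
    print(diff(relu, 2.0))                 # 1.0
    print(diff(lambda x: x * x + x, 3.0))  # 7.0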

Changes:

Fixed: Bug in the forward AD implementation of Sigmoid and ReLU for the D, DV, and DM types (fixes #16, thank you @mrakgr)

Improvement: Better performance from removing several more Parallel.For and Array.Parallel.map operations, which works better with OpenBLAS multithreading

Added: Operations involving incompatible dimensions of DV and DM now throw exceptions to warn the user