News

We propose a novel differentiable graph neural network simulator (GNS) by combining reverse-mode automatic differentiation (AD) of graph neural networks with gradient-based optimization for solving ...
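For intuition only, here is a minimal JAX sketch of the pattern this item describes: reverse-mode AD through a differentiable simulator, driving gradient-based optimization of an unknown input. This is not the authors' GNS; the toy dynamics and the step/rollout/loss names are illustrative assumptions.

    import jax

    def step(state, dt=0.01, k=1.0):
        # Toy damped-spring dynamics; a stand-in for one simulator (GNS) step.
        x, v = state
        a = -k * x - 0.1 * v
        return (x + dt * v, v + dt * a)

    def rollout(x0, n_steps=100):
        # Unrolled simulation; reverse-mode AD differentiates through every step.
        state = (x0, 0.0)
        for _ in range(n_steps):
            state = step(state)
        return state[0]  # final position

    def loss(x0, target=0.5):
        # Inverse problem: choose the initial position so the rollout hits the target.
        return (rollout(x0) - target) ** 2

    grad_loss = jax.grad(loss)            # reverse-mode AD through the full rollout
    x0 = 1.0
    for _ in range(200):
        x0 = x0 - 0.1 * grad_loss(x0)     # plain gradient descent on the initial condition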
We implement a differentiable version of the PUMA algorithm, NeuralPUMA, which we integrate into a traditional deep learning pipeline to implicitly learn to preprocess the wrapped phase into an ...
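Again purely as an illustration of the integration pattern (not NeuralPUMA or PUMA itself): a differentiable phase-wrapping step inside a small pipeline, so end-to-end gradients reach learnable preprocessing parameters. The wrap/pipeline/loss functions and the Itoh-style cumulative-sum "solver" are assumptions made for this sketch.

    import jax
    import jax.numpy as jnp

    def wrap(phi):
        # Wrap phase to [-pi, pi); differentiable almost everywhere.
        return (phi + jnp.pi) % (2 * jnp.pi) - jnp.pi

    def pipeline(params, wrapped_phase):
        # Learned affine preprocessing of the wrapped increments, followed by an
        # Itoh-style cumulative sum standing in for a downstream differentiable solver.
        w, b = params
        d = wrap(jnp.diff(wrapped_phase))
        pre = w * d + b
        return jnp.concatenate([wrapped_phase[:1], wrapped_phase[:1] + jnp.cumsum(pre)])

    def loss(params, wrapped_phase, true_phase):
        return jnp.mean((pipeline(params, wrapped_phase) - true_phase) ** 2)

    key = jax.random.PRNGKey(0)
    true_phase = jnp.cumsum(jax.random.uniform(key, (64,), minval=0.0, maxval=0.4))
    wrapped = wrap(true_phase)
    params = (jnp.array(0.8), jnp.array(0.0))            # deliberately off so gradients are nonzero
    grads = jax.grad(loss)(params, wrapped, true_phase)  # end-to-end gradients reach the preprocessing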
A simple proof for characterizing differentiable quasiconvex functions.
Researchers from Beijing Normal University, Central University of Finance and Economics, Zhejiang Normal University, and the University of York have developed a new hierarchical pooling method for ...
Wrapping the function rosenbach2 in jax_finite_difference makes it fully compatible with JAX's automatic differentiation tooling, and it composes with other JAX primitives such as vmap.
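The snippet does not show jax_finite_difference itself, so the following is only one hedged way such a wrapper could be built, using jax.custom_vjp; the actual implementation referenced above may differ, and the rosenbach2 definition below is a stand-in (a 2-D Rosenbrock-style test function).

    import jax
    import jax.numpy as jnp

    def jax_finite_difference(f, eps=1e-6):
        # Attach a central-finite-difference gradient rule to f via custom_vjp.
        @jax.custom_vjp
        def wrapped(x):
            return f(x)

        def fwd(x):
            return f(x), x          # save x as the residual for the backward pass

        def bwd(x, g):
            # Estimate df/dx_i by central differences along each coordinate axis.
            basis = jnp.eye(x.shape[0])
            fd_grad = jax.vmap(lambda e: (f(x + eps * e) - f(x - eps * e)) / (2 * eps))(basis)
            return (g * fd_grad,)   # chain rule: cotangent times FD gradient

        wrapped.defvjp(fwd, bwd)
        return wrapped

    def rosenbach2(x):
        # Stand-in definition for illustration: the 2-D Rosenbrock function.
        return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

    f = jax_finite_difference(rosenbach2)
    x = jnp.array([1.2, 0.8])
    print(jax.grad(f)(x))                        # reverse-mode AD now uses the FD rule
    print(jax.vmap(f)(jnp.stack([x, 2.0 * x])))  # the wrapper also composes with vmap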
De Finetti did not name this class of functions: the term “quasiconvex (quasiconcave) function” was coined later by Fenchel [2]. It is well known that the above characterization is equivalent ...
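For reference, the characterization presumably at issue here is the standard first-order condition for differentiable quasiconvex functions (stated below in its textbook form, not quoted from the article):

\[
  f \text{ is quasiconvex on a convex set } C
  \iff
  \forall\, x, y \in C:\; f(y) \le f(x) \;\Longrightarrow\; \nabla f(x)^{\mathsf T}(y - x) \le 0 .
\]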