Books with my name in the title

Updated Jan 20, 2023

109 books with Wasserstein or WGAN in the title



 

1988 book 1

On a Formula for the L2 Wasserstein Metric Between Measures on Euclidean and Hilbert Spaces

Matthias Gelbrich - 1988  

Publisher

Sekt. Mathematik der Humboldt-Univ.,

Length

26 pages

1988 book
On a formula for the $L^2$ Wasserstein metric between measures on Euclidean and Hilbert spaces

Author: Matthias Gelbrich
Print Book, 1988
English
Publisher: Humboldt-Universität zu Berlin, Sektion Mathematik, Berlin, 1988
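Gelbrich's result (the subject of the records above) includes a closed form for the L2 Wasserstein distance between two Gaussian measures. A minimal numpy/scipy sketch of that formula, not taken from the book itself; the function name is mine:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    """2-Wasserstein distance between N(m1, S1) and N(m2, S2):
    W2^2 = |m1 - m2|^2 + tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2})."""
    root = sqrtm(S1)                      # matrix square root of S1
    cross = sqrtm(root @ S2 @ root)       # (S1^{1/2} S2 S1^{1/2})^{1/2}
    w2sq = np.sum((np.asarray(m1) - np.asarray(m2)) ** 2) \
        + np.trace(S1 + S2 - 2 * cross)
    return float(np.sqrt(max(np.real(w2sq), 0.0)))

# Equal covariances: the distance reduces to the Euclidean distance of the means.
d = gaussian_w2(np.zeros(2), np.eye(2), np.array([3.0, 4.0]), np.eye(2))  # 5.0
```

For non-Gaussian measures on Euclidean or Hilbert spaces this quantity is a lower bound for the W2 distance in terms of means and covariances.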




1989 book 2

Lp-Wasserstein-Metriken und Approximationen stochastischer Differentialgleichungen [Lp-Wasserstein metrics and approximations of stochastic differential equations]

Matthias Gelbrich, 1989, 154 pages

 

1990 book 3

Duality Theorems for Kantorovich-Rubinstein and Wasserstein Functionals

Svetlozar Todorov Rachev, Rae Michael Shortt - 1990 - print book.

Warszawa : Państwowe Wydawn. Naukowe, 1990. 

ISSN 0012-3862; based on a Ph.D. thesis.


 1990  book 4

Duality theorems for Kantorovich-Rubinstein and Wasserstein functionals

ST Rachev - 1990 - eudml.org

Contents: § 0. Introduction (p. 5); § 1. Notation and terminology (p. 6); § 2. A generalization of the Kantorovich-Rubinstein …

Cited by 40.
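The Kantorovich-Rubinstein duality in the title can be checked numerically on a toy finite space: the primal optimal-coupling LP and the dual 1-Lipschitz-potential LP return the same W1 value. A hedged sketch (not from the book; the example data and names are mine):

```python
import numpy as np
from scipy.optimize import linprog

# Three points on a line, metric d(i, j) = |x_i - x_j|.
x = np.array([0.0, 1.0, 3.0])
mu = np.array([0.5, 0.5, 0.0])
nu = np.array([0.0, 0.5, 0.5])
n = len(x)
D = np.abs(x[:, None] - x[None, :])

# Primal (Kantorovich): minimize sum_ij D_ij * pi_ij over couplings of (mu, nu).
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1   # row sums = mu
    A_eq[n + i, i::n] = 1            # column sums = nu
primal = linprog(D.ravel(), A_eq=A_eq, b_eq=np.concatenate([mu, nu]),
                 bounds=(0, None), method="highs")

# Dual (Kantorovich-Rubinstein): maximize <f, mu - nu> over 1-Lipschitz f,
# i.e. f_i - f_j <= D_ij for all i, j; linprog minimizes, so negate.
A_ub, b_ub = [], []
for i in range(n):
    for j in range(n):
        if i != j:
            row = np.zeros(n)
            row[i], row[j] = 1, -1
            A_ub.append(row)
            b_ub.append(D[i, j])
dual = linprog(-(mu - nu), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
               bounds=(None, None), method="highs")

W1_primal, W1_dual = primal.fun, -dual.fun  # both equal W1(mu, nu)
```

Here the optimal plan moves mass 0.5 a distance of 3 (or equivalently two moves of lengths 1 and 2), so both programs return 1.5.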


1991 book 5

Skorohod Representation Theorem and Wasserstein Metrics

J. A. Cuesta Albertos, Carlos Matrán Bea - 1991


 

1993 book 6

On Lower Bounds for the L2-Wasserstein Metric in a Hilbert Space

J. A. Cuesta-Albertos, Carlos Matrán Bea, Universidad de Cantabria. Servicio de Publicaciones - 1993


1993 book 7
On lower bounds for the $L^2$-Wasserstein metric in a Hilbert space

Authors: J. A. Cuesta-Albertos, C. Matrán-Bea, A. Tuero-Díaz
Print Book, 1993
English
Publisher:Universidad de Cantabria. Departamento de Matematicas, Estadistica y Computacion, Santander, 1993

 
<— 7 books till 1993


1999

 Shape Recognition Via Wasserstein Distance

Wilfrid Gangbo, Robert J. McCann - 1999


1999

Approximations of parabolic equations based upon Wasserstein's variational principle (Research report, Mathematical Sciences, Center for Nonlinear Analysis)

David Kinderlehrer and Noel J. Walkington | Jan 1, 1999

ABSTRACT: We illustrate how some interesting new variational principles can be used for the numerical approximation of solutions to certain (possibly degenerate) parabolic partial differential equations. One remarkable feature of the algorithms presented here is that derivatives do not enter into the variational principles, so, for example, discontinuous approximations may be used for approximating the heat equation. We present formulae for computing a Wasserstein metric which enters into the variational formulations. 

Key Words. Wasserstein Metric, Parabolic Equations, Numerical Approximations 

ASIN: B0006S8O86 · DOI: https://doi.org/10.1051/m2an:1999166
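The abstract above mentions formulae for computing a Wasserstein metric. For equal-size samples on the real line the computation is elementary, since optimal transport simply matches order statistics; a small illustrative sketch (my own, not the authors' scheme):

```python
import numpy as np

def w1_empirical(x, y):
    """1-Wasserstein distance between two equal-size 1-D empirical samples:
    sort both and match order statistics one-to-one."""
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

d = w1_empirical(np.array([0.0, 1.0, 2.0]), np.array([1.0, 2.0, 3.0]))  # 1.0
```

For unequal sample sizes or weights, `scipy.stats.wasserstein_distance` computes the same quantity from the empirical CDFs.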


<— 9 books till 1999

 

2005

Eulerian Calculus for the Contraction in the Wasserstein Distance

Felix Otto, Michael Westdickenberg - 2005 - No preview


2005  book
Gradient flows in metric spaces and in the space of probability measures

Authors: Luigi Ambrosio, Nicola Gigli, Giuseppe Savaré
Summary: "This book is devoted to a theory of gradient flows in spaces which are not necessarily endowed with a natural linear or differentiable structure. It consists of two parts, the first one concerning gradient flows in metric spaces and the second one devoted to gradient flows in the space of probability measures on a separable Hilbert space, endowed with the Kantorovich-Rubinstein-Wasserstein distance." --Book jacket
Print Book, 2005
English
Publisher: Birkhäuser, Boston, 2005



2007

Absolutely Continuous Curves in Wasserstein Spaces: With ...

Stefano Lisini - 2007 



2008

Separability and completeness for the Wasserstein distance 

By: Bolley, F. 

SEMINAIRE DE PROBABILITES XLI   Book Series: Lecture Notes in Mathematics   Volume: 1934   Pages: 371-377   Published: 2008

Times Cited: 40 


2008

Differential forms on Wasserstein space and infinite-dimensional Hamiltonian systems

W Gangbo, HK Kim, T Pacini - 2008 - cds.cern.ch

Let M denote the space of probability measures on R^D endowed with the Wasserstein metric. A differential calculus for a certain class of absolutely continuous curves in M was introduced by Ambrosio, Gigli and Savaré. In this paper we develop a calculus for the …



2008 book
Well-posedness of a parabolic moving-boundary problem in the setting of Wasserstein gradient flows

Authors: J.W. Portegies, M.A. Peletier
Book, 2008
Publication: 2008
Publisher: s.n., 2008


<— 15  book titles till 2008
end 2008



2009

The Wasserstein distances 

Book Author(s): Villani, C

OPTIMAL TRANSPORT: OLD AND NEW   Book Series: Grundlehren der Mathematischen Wissenschaften   Volume: 338   Pages: 93-111   Published: 2009


<— 16  book titles till 2009 

end 2009


2010    1

[BOOK] Differential forms on Wasserstein space and infinite-dimensional Hamiltonian systems

W Gangbo, HK Kim, T Pacini - 2010 - books.google.com

Let $\mathcal{M}$ denote the space of probability measures on $\mathbb{R}^D$ endowed with the Wasserstein metric. A differential calculus for a certain class of absolutely continuous curves in $\mathcal{M}$ was introduced by Ambrosio, Gigli, and Savaré. In this …

Cited by 35.


  

2010   2

Differential Forms on Wasserstein Space and Infinite-dimensional Hamiltonian Systems

Wilfrid Gangbo, Hwa Kil Kim, Tommaso Pacini - 2010 - Preview - More editions

Cited by 35.


2010   3

Wasserstein Metric: Mathematics, Metric (mathematics), Measure (mathematics), Metric Space, Computer Science, Earth Mover's Distance, Radon Measure, Conditional Probability, Expected Value

ISBN 10: 6130361807 ISBN 13: 9786130361808 

Publisher: Betascript Publishers, 2010 


2010    4

Wasserstein Metric 

Lambert M. Surhone

Published by Betascript Publishers Jan 2010 (2010) 

ISBN 10: 6130361807 ISBN 13: 9786130361808 


2010   5
From a large-deviations principle to the Wasserstein gradient flow : a new micro-macro passage

Authors: S. Adams, N. Dirr, M.A. Peletier, J. Zimmer
Summary: We study the connection between a system of many independent Brownian particles on one hand and the deterministic diffusion equation on the other. For a fixed time step h > 0, a large-deviations rate functional Jh characterizes the behaviour of the particle system at t = h in terms of the initial distribution at t = 0. For the diffusion equation, a single step in the time-discretized entropy-Wasserstein gradient flow is characterized by the minimization of a functional Kh. We establish a new connection between these systems by proving that Jh and Kh are equal up to second order in h as h --> 0. This result gives a microscopic explanation of the origin of the entropy-Wasserstein gradient flow formulation of the diffusion equation. Simultaneously, the limit passage presented here gives a physically natural description of the underlying particle system by describing it as an entropic gradient flow.
Book, 2010
Publication:2010
Publisher:Technische Universiteit Eindhoven, 2010
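The "single step in the time-discretized entropy-Wasserstein gradient flow" described in the summary above (the minimizing-movement, or JKO, scheme) can be imitated on a tiny finite state space: minimize W2(rho_k, rho)^2 / (2h) + sum(rho log rho) over candidate distributions, with W2^2 computed as a transport LP. A brute-force sketch under my own simplifications (three grid points, coarse simplex search); none of the names come from the cited work:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def w2_squared(x, mu, nu):
    """Squared 2-Wasserstein distance between discrete measures on points x,
    solved as a linear program over transport plans (Kantorovich form)."""
    n = len(x)
    c = np.array([(xi - xj) ** 2 for xi in x for xj in x])  # cost |x_i - x_j|^2
    A = np.zeros((2 * n, n * n))
    for i in range(n):
        A[i, i * n:(i + 1) * n] = 1   # row sums = mu
        A[n + i, i::n] = 1            # column sums = nu
    res = linprog(c, A_eq=A, b_eq=np.concatenate([mu, nu]),
                  bounds=(0, None), method="highs")
    return res.fun

def jko_step(x, rho_k, h, grid=20):
    """One minimizing-movement (JKO) step for the entropy functional:
    minimize W2(rho_k, rho)^2 / (2h) + sum(rho * log(rho)) over a coarse
    grid of candidate probability vectors rho."""
    best, best_val = None, np.inf
    for comp in itertools.product(range(grid + 1), repeat=len(x)):
        if sum(comp) != grid:
            continue
        rho = np.array(comp) / grid
        ent = float(np.sum(rho[rho > 0] * np.log(rho[rho > 0])))
        val = w2_squared(x, rho_k, rho) / (2 * h) + ent
        if val < best_val:
            best, best_val = rho, val
    return best

x = np.array([0.0, 1.0, 2.0])
rho0 = np.array([1.0, 0.0, 0.0])   # all mass at the left point
rho1 = jko_step(x, rho0, h=1.0)    # one step: mass spreads toward uniform
```

Iterating such steps and letting h shrink is exactly the discretization whose particle-level origin the cited summary explains.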


<— 21  book titles till 2010
end 2010


2011 1

Optimal Vector Quantization in Terms of Wasserstein Distance

Wolfgang Kreitmeier - 2011 


2011 2

Stability of the Global Attractor Under Markov-Wasserstein Noise

Martin Kell - 2011 - No preview

We develop a weak Ważewski principle for discrete and continuous time dynamical systems on metric spaces having a weaker topology to show that attractors can be continued in a weak sense.


2011 3
Geometric Approach to Evolution Problems in Metric Spaces: Product Formulas in CAT(0) Spaces, Fokker-Planck Equation, Maximal Monotone Operators on Wasserstein Spaces

Igor Stojković | Sep 1, 2011 | 240 pages | paperback ISBN 9783845435633

This PhD thesis contains four chapters where research material on a range of different topics is presented.

2011 4 
Passing to the limit in a Wasserstein gradient flow : from diffusion to reaction

Authors: S. Arnrich, A. Mielke, M.A. Peletier, G. Savaré, M. Veneroni
Summary: We study a singular-limit problem arising in the modelling of chemical reactions. At finite $\epsilon > 0$, the system is described by a Fokker-Planck convection-diffusion equation with a double-well convection potential. This potential is scaled by $1/\epsilon$, and in the limit $\epsilon \rightarrow 0$, the solution concentrates onto the two wells, resulting in a limiting system that is a pair of ordinary differential equations for the density at the two wells. This convergence has been proved in Peletier, Savaré, and Veneroni, SIAM Journal on Mathematical Analysis, 42(4):1805-1825, 2010, using the linear structure of the equation. In this paper we re-prove the result by using solely the Wasserstein gradient-flow structure of the system. In particular we make no use of the linearity, nor of the fact that it is a second-order system. The first key step in this approach is a reformulation of the equation as the minimization of an action functional that captures the property of being a curve of maximal slope in an integrated form. The second important step is a rescaling of space. Using only the Wasserstein gradient-flow structure, we prove that the sequence of rescaled solutions is pre-compact in an appropriate topology. We then prove a Gamma-convergence result for the functional in this topology, and we identify the limiting functional and the differential equation that it represents. A consequence of these results is that solutions of the $\epsilon$-problem converge to a solution of the limiting problem.
eBook, 2011
English
Publication:Eindhoven : Technische Universiteit Eindhoven, CASA-report, (2011), 28 pp.
Publisher:Technische Universiteit Eindhoven, 2011



2011 5

Hamiltonian Systems and the Calculus of Differential Forms on the  Wasserstein space

Hwa Kil Kim - ProQuest, UMI Dissertation Publishing, 2011 - 90 pages. ISBN 1244090794, 9781244090798

This thesis consists of two parts. In the first part, we study stability properties of Hamiltonian systems on the Wasserstein space. Let H be a Hamiltonian satisfying the conditions imposed in [2]. We regularize H via Moreau-Yosida approximation to get H_tau, and denote by mu_tau a solution of the system with the new Hamiltonian H_tau. Suppose H_tau converges to H as tau tends to zero. We show that mu_tau converges to mu, and that mu is a solution of the Hamiltonian system corresponding to the Hamiltonian H. At the end of the first part, we give a sufficient condition for the uniqueness of Hamiltonian systems. In the second part, we develop a general theory of differential forms on the Wasserstein space. Our main result is to prove an analogue of Green's theorem for 1-forms and show that every closed 1-form on the Wasserstein space is exact. If the Wasserstein space were a manifold in the classical sense, this result wouldn't be worthy of mention. Hence, the first cohomology group, in the sense of de Rham, vanishes.


<— 26  book titles till 2011
 end 2011


2012 1

Entropic Burgers' Equation Via a Minimizing Movement Scheme Based on the Wasserstein Metric

Nicola Gigli, ‎Felix Otto - 2012 

As noted by the second author in the context of unstable two-phase porous medium flow, entropy solutions of Burgers' equation can be recovered from a minimizing movement scheme involving the Wasserstein metric in the limit of ...
 
2012 ebook
Entropic Burgers' equation via a minimizing movement scheme based on the Wasserstein metric

Authors: Nicola Gigli, Felix Otto
Summary: As noted by the second author in the context of unstable two-phase porous medium flow, entropy solutions of Burgers' equation can be recovered from a minimizing movement scheme involving the Wasserstein metric in the limit of vanishing time step size [4]. In this paper, we give a simpler proof by verifying that the anti-derivative is a viscosity solution of the associated Hamilton-Jacobi equation.
eBook, 2012
English
Publisher:Niedersächsische Staats- und Universitätsbibliothek Max-Planck-Institut für Mathematik in den Naturwissenschaften, Göttingen, Leipzig, 2012


2012 2
'Burenruzie in het Jodendom' [Neighbours' quarrel in Judaism; review of: B. Wasserstein, Aan de vooravond (On the Eve)]

Author:J.T.M. Houwink Ten Cate
Book
Publication:2012

<— 28  book titles till 2012
end 2012

 start 2013


2013 1 eBook
Asymptotic equivalence of the discrete variational functional and a rate-large-deviation-like functional in the Wasserstein gradient flow of the porous medium equation
Authors: M.H. Duong
Summary: In this paper, we study the Wasserstein gradient flow structure of the porous medium equation. We prove that, for the case of q-Gaussians on the real line, the functional derived by the JKO-discretization scheme is asymptotically equivalent to a rate-large-deviation-like functional. The result explains why the Wasserstein metric as well as the combination of it with the Tsallis entropy play an important role.
eBook, 2013
English
Publication:s.n., arXiv.org, (2013), 17 pp.
Publisher:s.n, 2013


2013 2 book
Microscopic interpretation of Wasserstein gradient flows

Author:D.R.M. Renger
Summary: The discovery of Wasserstein gradient flows provided a mathematically precise formulation for the way in which thermodynamic systems are driven by entropy. Since the entropy of a system in equilibrium can be related to the large deviations of stochastic particle systems, the question arises whether a similar relation exists between Wasserstein gradient flows and stochastic particle systems. In the work presented in this thesis, such a relation is studied for a number of systems. As explained in the introduction chapter, central to this research is the study of two types of large deviations for stochastic particle systems. The first type is related to the probability of the empirical measure of the particle system, after a fixed time step, conditioned on the initial empirical measure. The large-deviation rate provides a variational formulation of the transition between two macroscopic measures, in fixed time. This formulation can then be related to a discrete-time formulation of a gradient flow, known as minimising movement. To this aim, a specific small-time development of the rate is used, based on the concept of Mosco- or Gamma-convergence. The other type of large deviations concerns the probability of the trajectory of the empirical measure in a time interval. For these large deviations, the rate provides a variational formulation for the trajectory of macroscopic measures. For some systems, this rate can be related to a continuous-time formulation of gradient flows, known as an entropy-dissipation inequality. Chapter 2 serves as background, where a number of results from particle system theory and large deviations are proven in a general setting. In particular, the generator of the empirical measure is calculated, the many-particle limit of the empirical measure is proven, and the discrete-time large deviation principle is proven. 
In Chapter 3, the discrete-time large-deviation rate is studied for a system of independent Brownian particles in a force field, which yields the Fokker-Planck equation in the many-particle limit. Based on an estimate for the fundamental solution of the Fokker-Planck equation, it is proven that the small-time development of the rate functional coincides with the minimising movement of the Wasserstein gradient flow, under the conjecture that a similar relation holds if there is no force field. The Fokker-Planck equation is studied further in Chapter 4, but with a different technique. Here, an alternative formulation of the discrete-time large-deviation rate is derived from the continuous-time large-deviation rate. With this alternative formulation, the small-time development is proven in a much more general context, so that the conjecture of the previous chapter can indeed be dropped. To complete the discussion, it is mentioned that the continuous-time large deviations can be coupled to a Wasserstein gradient flow more directly, via an entropy-dissipation inequality. Chapter 5 discusses two related systems, whose evolutions are described by a system of reaction-diffusion equations, and by the diffusion equation with decay. In order to deal with the fact that the diffusion equation with decay is not mass-conserving, the decayed mass is added back to the system as decayed matter, which results in a system of reaction-diffusion equations. This allows for the construction of a system of independent particles whose probability evolves according to the reaction-diffusion equations. Similar to the previous chapters, a small-time development of the discrete-time large-deviation rate is proven. It turns out that in general the resulting functional cannot be associated with a minimising movement formulation of a Wasserstein gradient flow. Nevertheless, it is proven that this functional defines a variational scheme that approximates the system of reaction-diffusion equations. 
As such, the functional, which involves entropy terms and Wasserstein distances, can be interpreted as a generalisation of a minimising movement. Chapter 6 is concerned with the effect of trivial Dirichlet boundary conditions on the discrete-time rate functional and its small-time development. Since the diffusion equation with trivial Dirichlet boundary conditions loses mass at the boundaries, the system is first transformed into a mass-conserving system by adding the amount of lost mass back to the system in two delta measures at the boundaries. This again allows for the construction of a corresponding microscopic particle system. For this particle system, the discrete-time large-deviation rate is calculated, and its small-time development is proven. It is still unclear whether the resulting functional can be used to define a variational approximation scheme for the diffusion equation with Dirichlet boundary conditions. In Chapter 7, general Markov chains on a finite state space are studied. The macroscopic equation is then a linear system of ordinary differential equations, and the microscopic particle system consists of independent Markovian particles on the finite state space. First, the discrete-time rate functional for this particle system is calculated, and its small-time asymptotic development is proven for a two-state system. Although the resulting function looks like a minimising movement, it is still unknown whether this function defines a variational formulation for the system of linear differential equations. Next, the continuous-time rate functional is calculated, at least formally, using the Feng-Kurtz formalism. It is then shown that the continuous-time rate function can be rewritten into an entropy-dissipation inequality. However, this inequality may imply unphysical behaviour unless the Markov chain satisfies detailed balance. Finally, the used methods and techniques and the general results are evaluated in Chapter 8. 
It is shown that one can expect to find a gradient-flow structure via large deviations of microscopic particle systems, if the macroscopic equation satisfies a generalised version of the detailed balance condition. Some important implications of detailed balance are proven for both the discrete-time approach and the continuous-time approach. The chapter ends with a discussion of the results for the different systems, and the general methods that are used in this thesis.
Book, 2013
Publication:2013
Publisher:Technische Universiteit Eindhoven, 20


2013 3   eBook
On gradient structures for Markov chains and the passage to Wasserstein gradient flows

Authors: Karoline Disser, Matthias Liero
eBook, 2013
English
Publisher: WIAS, Weierstraß-Institut für Angewandte Analysis und Stochastik im Forschungsverbund Berlin e.V.; Niedersächsische Staats- und Universitätsbibliothek; Technische Informationsbibliothek u. Universitätsbibliothek; Berlin, Göttingen, Hannover, 2013
Also available as print book

 

<— 31  book titles till 2013
end 2013

 start 2014


2014 1

The Exponential Formula for the Wasserstein Metric

Katy Craig - 2014 - ‎No preview

Many evolutionary partial differential equations may be rewritten as the gradient flow of an energy functional, a perspective which provides useful estimates on the behavior of solutions.


2014 2

Weak Solutions to a Fractional Fokker-Planck Equation Via Splitting and Wasserstein Gradient Flow

Malcolm Bowles - 2014 - ‎No preview

 Bowles, Malcolm. Weak Solutions to a Fractional Fokker-Planck Equation via Splitting and Wasserstein Gradient Flow. 

Degree: Department of Mathematics and Statistics, 2014, University of Victoria

URL: http://hdl.handle.net/1828/5591 

► In this thesis, we study a linear fractional Fokker-Planck equation that models non-local (`fractional') diffusion in the presence of a potential field. The non-locality is… (more)

Subjects/Keywords: splitting; Fractional Laplacian; Wasserstein Gradient Flow



2014  3  eBook
Multiscale modeling of pedestrian dynamics

Authors: Emiliano Cristiani, Benedetto Piccoli, Andrea Tosin
Summary: This book presents mathematical models and numerical simulations of crowd dynamics. The core topic is the development of a new multiscale paradigm, which bridges the microscopic and macroscopic scales taking the most from each of them for capturing the relevant clues of complexity of crowds. The background idea is indeed that most of the complex trends exhibited by crowds are due to an intrinsic interplay between individual and collective behaviors. The modeling approach promoted in this book pursues actively this intuition and profits from it for designing general mathematical structures susceptible of application also in fields different from the inspiring original one. The book considers also the two most traditional points of view: the microscopic one, in which pedestrians are tracked individually, and the macroscopic one, in which pedestrians are assimilated to a continuum. Selected existing models are critically analyzed. The work is addressed to researchers and graduate students.
eBook, 2014
English
Publisher:Springer, Cham, 2014

<— 34 book titles till 2014
end 2014

start 2015

  


2015 eBook
From large deviations to Wasserstein gradient flows in multiple dimensions

Authors: Matthias Erbar, Jan Maas, D. R. Michiel Renger; Weierstraß-Institut für Angewandte Analysis und Stochastik
Summary: We study the large deviation rate functional for the empirical measure of independent Brownian particles with drift. In one dimension, it has been shown by Adams, Dirr, Peletier and Zimmer [ADPZ11] that this functional is asymptotically equivalent (in the sense of Gamma-convergence) to the Jordan-Kinderlehrer-Otto functional arising in the Wasserstein gradient flow structure of the Fokker-Planck equation. In higher dimensions, part of this statement (the lower bound) has been recently proved by Duong, Laschos and Renger, but the upper bound remained open, since the proof in [DLR13] relies on regularity properties of optimal transport maps that are restricted to one dimension. In this note we present a new proof of the upper bound, thereby generalising the result of [ADPZ11] to arbitrary dimensions.
eBook, 2015
English
Publisher:Weierstraß-Institut für Angewandte Analysis und Stochastik im Forschungsverbund Berlin e.V, Berlin, 2015

<— 35  book titles till 2015
end 2015

start 2016



2016 online

Systèmes de particules en interaction, approche par flot de gradient dans l'espace de Wasserstein [Interacting particle systems, a gradient-flow approach in Wasserstein space]

by Laborde, Maxime

eBook, full text online

 

<— 36 books till 2016

end 2016


2017 1

Wasserstein Distance on Finite Spaces: Statistical Inference and Algorithms    

Max Sommerfeld

Georg-August-Universität Göttingen, 2017

Wasserstein distances or, more generally, distances that quantify the optimal transport between probability measures on metric spaces have long been established as an important tool in probability theory.

Zbl 1392.62005

Sommerfeld, Max. Wasserstein Distance on Finite Spaces: Statistical Inference and Algorithms. 

Degree: PhD, Mathematik und Informatik, 2017, Georg-August-Universität Göttingen

URL: http://hdl.handle.net/11858/00-1735-0000-0023-3FA1-C 

► Wasserstein distances or, more generally, distances that quantify the optimal transport between probability measures on metric spaces have long been established as an important tool… (more)



2017 2

Fréchet Means in Wasserstein Space: Theory and Algorithms

Yoav Zemel - 2017 - ‎No preview

Author keywords: Fréchet mean; functional data analysis; geodesic variation; optimal transportation; phase variation; point process; random measure; registration; warping; Wasserstein distance.
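On the real line, the Wasserstein Fréchet mean (barycenter) studied in this thesis has an explicit form: its quantile function is the average of the quantile functions. For equal-size samples with uniform weights that reduces to averaging sorted samples; a minimal sketch (illustrative only, not Zemel's algorithms):

```python
import numpy as np

def w2_barycenter_1d(samples):
    """W2 barycenter of equal-size 1-D empirical measures with equal weights:
    the barycenter's quantile function is the mean of the quantile functions,
    i.e. the pointwise mean of the sorted samples."""
    return np.sort(np.asarray(samples, dtype=float), axis=1).mean(axis=0)

bary = w2_barycenter_1d([[0.0, 1.0, 2.0], [4.0, 5.0, 6.0]])  # [2. 3. 4.]
```

In higher dimensions no such closed form exists, which is what makes barycenter theory and algorithms a research topic.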


<— 38 books - till 2017  

end 2017

start 2018  

2018 1

Wasserstein Variational Inference

by L. Ambrogioni - 2018 - Cited by 2

May 29, 2018 - Abstract: This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both ...


2018 2

Learning and Inference with Wasserstein Metrics

Charles Albert Frogner - 2018 

In this work, we describe a novel approximate inference method, which is based on a characterization of the diffusion as following a gradient flow in a space of probability densities endowed with a Wasserstein metric.

Learning and inference with Wasserstein metrics - DSpace@MIT

by Frogner, Charles (Charles Albert) - 2018

Dissertation/Thesis: Citation Online

Degree: Department of Brain and Cognitive Sciences, 2018, MIT

URL: http://hdl.handle.net/1721.1/120619

► This thesis develops new approaches for three problems in machine learning, using tools from the study of optimal transport (or Wasserstein) distances between probability distributions.… (more)

Subjects/Keywords: Brain and Cognitive Sciences.


2018 3

Generative Modeling Using the Sliced Wasserstein Distance

Ishan Deshpande - 2018 

  MS, Electrical & Computer Engr, 2018, University of Illinois – Urbana-Champaign

► Generative adversarial nets (GANs) are very successful at modeling distributions from given samples, even in the high-dimensional case. However, their formulation is also known to… (more)

Subjects/Keywords: Machine Learning; Deep Learning

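The sliced Wasserstein distance used in this thesis replaces one high-dimensional transport problem with many 1-D problems along random projection directions, each solvable by sorting. A hedged Monte-Carlo sketch (my own minimal version, not the thesis code):

```python
import numpy as np

def sliced_wasserstein(x, y, n_proj=200, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between two equal-size
    point clouds in R^d: average the 1-D W2^2 over random directions."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)               # uniform direction on sphere
        xp, yp = np.sort(x @ theta), np.sort(y @ theta)  # 1-D projections
        total += np.mean((xp - yp) ** 2)             # 1-D W2^2 via quantiles
    return float(np.sqrt(total / n_proj))
```

In the GAN setting this quantity (or a gradient-based estimate of it) serves as the training loss, avoiding the adversarial critic needed to estimate the full Wasserstein distance.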


2018 4

Shape Space in Terms of Wasserstein Geometry and Application   to Quantum Physics

Bernadette Lessel - 2018  

This thesis offers a mathematical framework to treat quantum dynamics without reference to a background structure, but rather by means of the change of the shape of the state.


2018 5

Propriétés statistiques du barycentre dans l’espace de Wasserstein 

by Cazelles, Elsa 

2018 eBook [French; English: Statistical properties of the barycenter in Wasserstein space]

 Propriétés statistiques du barycentre dans l'espace de Wasserstein

E Cazelles - 2018 - tel.archives-ouvertes.fr

This thesis focuses on the analysis of data presented as probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space equipped with the Wasserstein distance. A first natural notion is first-order statistical analysis, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from absolutely continuous probability measures …


2018 6
Optimización robusta distribucional con métrica de Wasserstein y algunas aplicaciones [Distributionally robust optimization with the Wasserstein metric and some applications]

Diego Fernando Fonseca Valero. Uniandes, 2018.


2018 7
Learning and Inference with Wasserstein Metrics

By Charles Albert Frogner · 2018 MIT


2018  8  online

Propriétés statistiques du barycentre dans l’espace de Wasserstein

by Cazelles, Elsa

eBook, full text online

 

  2018 9

Graph Clustering and the Nuclear Wasserstein Metric


Daniel de Roux Uribe · 2018 · No preview

We study the problem of learning the cluster structure of a random graph G from an independent sample.


<—— 38 books before 2018

+ 9 in 2018

= 47 books till 2018    

end 2018

start 2019: 17 books

2019  1

Reproducing-kernel Hilbert Space Regression with Notes on the Wasserstein Distance


Stephen Page, Lancaster University - 2019

Page, Stephen. Reproducing-kernel Hilbert space regression with notes on the Wasserstein distance. 

Degree: PhD, 2019, Lancaster University

URL: https://eprints.lancs.ac.uk/id/eprint/136512/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.784646 

► We study kernel least-squares estimators for the regression problem subject to a norm constraint. We bound the squared L2 error of our estimators with respect… (more)

Snippets: Wasserstein Balls (p. 183); 5.1 Literature Review (p. 254); 7.3 Extreme Points of Wasserstein Balls (p. 256). "…above using the Wasserstein distance from optimal transport. The optimal transport problem… transport plans. Since the Wasserstein distance is determined by a cost function on the covariate… to some metric on the covariate set. The Wasserstein distance also arises naturally in the…"

2019  2

Arterial Spin Labeling Images Synthesis via Locally-Constrained WGAN-GP Ensemble 

by Huang, Wei; Luo, Mingyuan; Liu, Xi; More... 

2019 · eBook, Citation Online

2019  3

Novel Bi-directional Images Synthesis Based on WGAN-GP with GMM-Based Noise Generation 

by Huang, Wei; Luo, Mingyuan; Liu, Xi; More... 

2019 · eBook, Citation Online


2019  4

Predictive Density Estimation Under the Wasserstein Loss

2019 · ‎No preview


2019 5

Projection au sens de Wasserstein 2 sur des espaces ... [Projection in the 2-Wasserstein sense onto structured measure spaces]

Léo Lebrat · 2019 · No preview

This work is initially motivated by the design of trajectories in MRI acquisition; however, we provide new applications of these methods.

L Lebrat - 2019 - theses.fr

… Abstract (translated): This thesis is concerned with the approximation, in the 2-Wasserstein metric, of probability measures by a structured measure … Translated title: Projection in the 2-Wasserstein sense on structured measure space. …



2019  6

Using Wasserstein Generative Adversarial Networks for the ...


Susan Athey, ‎Guido Imbens, ‎Jonas Metzger · 2019 · ‎No preview

When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies.

Using Wasserstein Generative Adversarial Networks for the Design of Monte Carlo Simulations

By Susan Athey, Guido W. Imbens, Jonas Metzger, Evan Munro

September 2019Working Paper No. 3824

Cited by 10
Print Book, English, 2019

Edition:View all formats and editions

Publisher:National Bureau of Economic Research, Cambridge, MA., 2019


2019  7

A Comparative Assessment of the Impact of Various Norms on Wasserstein Generative Adversarial Networks


Chandini Ramesh · 2019 · ‎No preview

"Generative Adversarial Networks (GANs) provide a fascinating new paradigm in machine learning and artificial intelligence, especially in the context of unsupervised learning.


2019  8

Algorithms for Optimal Transport and Wasserstein Distances


Jörn Schrieber · 2019 · ‎No preview

Optimal Transport and Wasserstein Distance are closely related terms that do not only have a long history in the mathematical literature, but also have seen a resurgence in recent years, particularly in the context of the many applications ...

 

2019  9

Using Wasserstein generative adversarial networks for the design of Monte Carlo simulations

by Athey, Susan

Working paper series, 2019

BookCitation Online

 

2019 10  online

Courbes et applications optimales à valeurs dans l'espace de Wasserstein [Curves and optimal maps valued in Wasserstein space]

by Lavenant, Hugo

eBook, Full Text Online

 


2019 11
Study of Constrained Network Structures for WGANs on Numeric Data Generation
Authors: Wang, Wei (Creator), Wang, Chuang (Creator), Cui, Tao (Creator), Li, Yue (Creator)
Summary: Some recent studies have suggested using GANs for numeric data generation, for example to complete imbalanced numeric data. Given the significant difference between the dimensions of numeric data and images, as well as the strong correlations between features of numeric data, conventional GANs normally face an overfitting problem, which in turn leads to ill-conditioning when generating numeric and structured data. This paper studies constrained network structures between the generator G and discriminator D in WGAN, and designs several structures, including isomorphic, mirror and self-symmetric structures. We evaluate the performance of the constrained WGANs on data augmentation, taking the non-constrained GANs and WGANs as baselines. The constrained structures show improvements in 17/20 groups of experiments: twenty experiments on four UCI Machine Learning Repository datasets (Australian Credit Approval, German Credit, Pima Indians Diabetes and SPECT heart) with five conventional classifiers. In particular, the isomorphic WGAN is best in 15/20 experiments. Finally, we theoretically prove the effectiveness of the constrained structures by directed graphical model (DGM) analysis.
Downloadable Archival Material, 2019-11-05
Undefined
Publisher:2019-11-05
Access Free

 

2019 12
SGD Learns One-Layer Networks in WGANs
Authors: Lei, Qi (Creator), Lee, Jason D. (Creator), Dimakis, Alexandros G. (Creator), Daskalakis, Constantinos (Creator)
Summary: Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require solving a minmax optimization problem to global optimality, but are in practice successfully trained using stochastic gradient descent-ascent. In this paper, we show that, when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.
Downloadable Archival Material, 2019-10-15
Undefined
Publisher:2019-10-15
Access Free

2019 13
Peer-reviewed
Feature augmentation for imbalanced classification with conditional mixture WGANs
Authors: Yinghui Zhang, Bo Sun, Yongkang Xiao, Rong Xiao, YunGang Wei
Summary: Heterogeneity of class distribution is an intrinsic property of real-world datasets, so imbalanced classification is a popular but challenging task. Several methods exist to address this problem. Notably, adversarial-based data augmentation, which aims to directly learn the distribution of minority classes rather than simply modifying the data, has been applied to this task. While effective, such methods focus on a certain domain and lack universality, and the generated samples lack diversity due to the mode collapse of Generative Adversarial Networks (GANs). In this paper, we propose a general framework for data augmentation using GANs in feature space for imbalanced classification. The core of the framework comprises conditional mixture WGANs (cMWGANs), which are used to approximate the true feature distribution and generate label-preserving and diverse features for the minority class of various datasets. We conduct three experiments on SVHN, FER2013, and Amazon Review of Instant Video to demonstrate the versatility of the framework and the better performance of our cMWGANs in single feature learning. The results show significant improvement with feature augmentation by cMWGANs.
Article
Publication:Signal Processing: Image Communication, 75, July 2019, 89

2019 14
How Well Do WGANs Estimate the Wasserstein Metric?
Authors: Mallasto, Anton (Creator), Montúfar, Guido (Creator), Gerolin, Augusto (Creator)
Summary: Generative modelling is often cast as minimizing a similarity measure between a data distribution and a model distribution. Recently, a popular choice for the similarity measure has been the Wasserstein metric, which can be expressed in the Kantorovich duality formulation as the optimum difference of the expected values of a potential function under the real data distribution and the model hypothesis. In practice, the potential is approximated with a neural network and is called the discriminator. Duality constraints on the function class of the discriminator are enforced approximately, and the expectations are estimated from samples. This gives at least three sources of errors: the approximated discriminator and constraints, the estimation of the expectation value, and the optimization required to find the optimal potential. In this work, we study how well the methods that are used in generative adversarial networks to approximate the Wasserstein metric perform. We consider, in particular, the $c$-transform formulation, which eliminates the need to enforce the constraints explicitly. We demonstrate that the $c$-transform allows for a more accurate estimation of the true Wasserstein metric from samples, but surprisingly, does not perform the best in the generative setting.
Downloadable Archival Material, 2019-10-09
Undefined
Publisher:2019-10-09
Access Free
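The $c$-transform discussed in this summary has a simple discrete form; as a hypothetical illustration (ours, not the authors' code), for a potential phi on a finite set X and cost c, phi^c(y) = min over x of [c(x, y) - phi(x)]:

```python
# Hypothetical sketch (ours, not the paper's code) of the c-transform from
# Kantorovich duality: phi_c(y) = min over x of [ c(x, y) - phi(x) ].
def c_transform(phi, X, y, cost):
    """Evaluate the c-transform of the potential phi (values on X) at y."""
    return min(cost(x, y) - p for x, p in zip(X, phi))

X = [0.0, 1.0]                      # support points
phi = [0.0, 0.3]                    # potential values phi(x) on X
cost = lambda x, y: abs(x - y)      # ground cost c(x, y) = |x - y|

print(c_transform(phi, X, 2.0, cost))  # min(2.0 - 0.0, 1.0 - 0.3) = 0.7
```

In WGANs the potential is the (neural-network) discriminator; the paper studies enforcing duality via this transform rather than via explicit constraints.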

2019 15
Conditional WGANs with Adaptive Gradient Balancing for Sparse MRI Reconstruction
Authors: Malkiel, Itzik (Creator), Ahn, Sangtae (Creator), Taviani, Valentina (Creator), Menini, Anne (Creator), Wolf, Lior (Creator), Hardy, Christopher J. (Creator)
Summary: Recent sparse MRI reconstruction models have used Deep Neural Networks (DNNs) to reconstruct relatively high-quality images from highly undersampled k-space data, enabling much faster MRI scanning. However, these techniques sometimes struggle to reconstruct sharp images that preserve fine detail while maintaining a natural appearance. In this work, we enhance the image quality by using a Conditional Wasserstein Generative Adversarial Network combined with a novel Adaptive Gradient Balancing technique that stabilizes the training and minimizes the degree of artifacts, while maintaining a high-quality reconstruction that produces sharper images than other techniques.
Downloadable Archival Material, 2019-05-02
Undefined
Publisher:2019-05-02
Access Free

 2019 16  eBook
Using Wasserstein Generative Adversarial Networks for the design of Monte Carlo simulations
Authors: Susan Athey (Author), Guido Imbens (Author), Jonas Metzger (Author), Evan M. Munro (Author), National Bureau of Economic Research (Publisher)
Summary: When researchers develop new econometric methods it is common practice to compare the performance of the new methods to those of existing methods in Monte Carlo studies. The credibility of such Monte Carlo studies is often limited because of the freedom the researcher has in choosing the design. In recent years a new class of generative models has emerged in the machine learning literature, termed Generative Adversarial Networks (GANs), that can be used to systematically generate artificial data that closely mimics real economic datasets, while limiting the degrees of freedom for the researcher and optionally satisfying privacy guarantees with respect to their training data. In addition, if an applied researcher is concerned with the performance of a particular statistical method on a specific data set (beyond its theoretical properties in large samples), she may wish to assess the performance, e.g., the coverage rate of confidence intervals or the bias of the estimator, using simulated data which resembles her setting. To illustrate these methods we apply Wasserstein GANs (WGANs) to compare a number of different estimators for average treatment effects under unconfoundedness in three distinct settings (corresponding to three real data sets) and present a methodology for assessing the robustness of the results. In this example, we find that (i) there is not one estimator that outperforms the others in all three settings, so researchers should tailor their analytic approach to a given setting, and (ii) systematic simulation studies can be helpful for selecting among competing methods in this situation.
eBook, 2019
English
Publisher:National Bureau of Economic Research, Cambridge, Mass., 2019
Also available asPrint Book
View AllFormats & Editions

2019 17  eBook

Nonlinear diffusion equations and curvature conditions in metric measure spaces
Authors: Luigi Ambrosio (Author), Andrea Mondino (Author), Giuseppe Savaré (Author)
Abstract: The aim of this paper is to provide new characterizations of the curvature dimension condition in the context of metric measure spaces (X, d, m). On the geometric side, our new approach takes into account suitable weighted action functionals which provide the natural modulus of K-convexity when one investigates the convexity properties of N-dimensional entropies. On the side of diffusion semigroups and evolution variational inequalities, our new approach uses the nonlinear diffusion semigroup induced by the N-dimensional entropy, in place of the heat flow. Under suitable assumptions (most notably the quadraticity of Cheeger's energy relative to the metric measure structure) both approaches are shown to be equivalent to the strong CD*(K, N) condition of Bacher-Sturm.
eBook, 2019
English
Publisher:American Mathematical Society, Providence, 2019
Also available asPrint Book
View AllFormats & Editions


2019 18  book
Predictive density estimation under the Wasserstein loss
Print Book, 2019.4
English
Publisher:Department of Mathematical Informatics, Graduate School of Information Science and Technology, the University of Tokyo, Tokyo, 2019.4


2019 19  book
Data-driven chance constrained optimization under Wasserstein ambiguity sets
Authors: Ashish R. Hota, Ashish Cherukuri, John Lygeros
Summary: We present a data-driven approach for distributionally robust chance constrained optimization problems (DRCCPs). We consider the case where the decision maker has access to a finite number of samples or realizations of the uncertainty. The chance constraint is then required to hold for all distributions that are close to the empirical distribution constructed from the samples (where the distance between two distributions is defined via the Wasserstein metric). We first reformulate DRCCPs under data-driven Wasserstein ambiguity sets and a general class of constraint functions. When the feasibility set of the chance constraint program is replaced by its convex inner approximation, we present a convex reformulation of the program and show its tractability when the constraint function is affine in both the decision variable and the uncertainty. For constraint functions concave in the uncertainty, we show that a cutting-surface algorithm converges to an approximate solution of the convex inner approximation of DRCCPs. Finally, for constraint functions convex in the uncertainty, we compare the feasibility set with other sample-based approaches for chance constrained programs.
Book, 2019-07-01
Publication:2019
Publisher:Institute of Electrical and Electronics Engineers Inc, 2019-07-01


2019 20 see 2018 1  book
Wasserstein Variational Inference
Authors: L. Ambrogioni, U. Güçlü, Y. Güçlütürk, M. Hinne, M. van Gerven, E. Maris, S. Bengio, H. Wallach, H. Larochelle, K. Grauman
Summary: This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases. The gradients of the Wasserstein variational loss are obtained by backpropagating through the Sinkhorn iterations. This technique results in a very stable likelihood-free training method that can be used with implicit distributions and probabilistic programs. Using the Wasserstein variational inference framework, we introduce several new forms of autoencoders and test their robustness and performance against existing variational autoencoding techniques.
Book, 2019
Publication:2019
Publisher:Neural Information Processing Systems Foundation, 2019
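The Sinkhorn iterations mentioned in this summary admit a very small self-contained sketch. The following is our toy illustration of entropy-regularized optimal transport between two discrete distributions (not the paper's implementation; the function name and default parameters are ours):

```python
import math

def sinkhorn(a, b, cost, eps=0.1, iters=200):
    """Entropy-regularized optimal transport between discrete distributions a, b.

    Returns the transport plan produced by Sinkhorn's matrix-scaling iterations.
    """
    K = [[math.exp(-c / eps) for c in row] for row in cost]  # Gibbs kernel
    u, v = [1.0] * len(a), [1.0] * len(b)
    for _ in range(iters):
        # Alternately rescale rows and columns toward the target marginals.
        u = [a[i] / sum(K[i][j] * v[j] for j in range(len(b))) for i in range(len(a))]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(len(a))) for j in range(len(b))]
    return [[u[i] * K[i][j] * v[j] for j in range(len(b))] for i in range(len(a))]

a, b = [0.5, 0.5], [0.5, 0.5]
cost = [[0.0, 1.0], [1.0, 0.0]]   # zero cost on the diagonal
plan = sinkhorn(a, b, cost)       # mass concentrates on the cheap diagonal
```

In Wasserstein variational inference, gradients of the loss are obtained by differentiating through exactly these iterations.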

<—-  47 before 2019

 + 20 in 2019

 =  67 books till 2019 

end 2019

start 2020  22 books ->


2020 1

An Invitation to Statistics in Wasserstein Space

Victor M. Panaretos, ‎Yoav Zemel - 2020 - ‎No preview

This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures when endowed with the geometry of optimal transportation.

Cited by 27
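As a toy illustration of this book's central object (a sketch of a standard one-dimensional fact, not code from the book): on the real line, the p-Wasserstein distance between two empirical measures with equally many atoms reduces to matching sorted samples.

```python
# Minimal sketch (standard 1-D fact, not from the book): for empirical measures
# (1/n) * sum_i delta_{x_i} and (1/n) * sum_j delta_{y_j} on the real line, the
# optimal transport plan matches sorted samples, so W_p has a closed form.
def wasserstein_1d(xs, ys, p=1):
    """p-Wasserstein distance between two equal-size empirical measures on R."""
    assert len(xs) == len(ys), "equal numbers of atoms assumed"
    pairs = zip(sorted(xs), sorted(ys))
    return (sum(abs(x - y) ** p for x, y in pairs) / len(xs)) ** (1.0 / p)

print(wasserstein_1d([0, 1, 2], [1, 2, 3]))  # every atom shifts by 1, so W_1 = 1.0
```

For p = 2 the same sorted matching is optimal, so `wasserstein_1d([0, 1, 2], [1, 2, 3], p=2)` is also 1.0; in higher dimensions no such closed form exists and one solves an optimal transport problem.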

2020 2
 Mastering Machine Learning Algorithms: 

Expert techniques to implement popular machine learning algorithms and fine-tune your models.

Giuseppe Bonaccorso - 2020

Expert techniques for implementing popular machine learning algorithms, fine-tuning your models, and understanding how ... Wasserstein GAN. As explained in the previous section, one of the most difficult problems with standard GANs is ...

 

2020 3

Wasserstein Distributionally Robust Learning


Soroosh Shafieezadeh Abadeh · 2020 · No preview

Author keywords: Distributionally robust optimization; Wasserstein distance; Regularization; Supervised Learning; Inverse optimization; Kalman filter; Frank-Wolfe algorithm.


2020 4

Diffusions on Wasserstein Spaces


Lorenzo Dello Schiavo · 2020

Lp-Wasserstein and Flux-limited Gradient Flows: Entropic Discretization, Convergence Analysis and Numerics

By Benjamin Söllner · 2020


2020 5

Wasserstein Barycenters: Statistics and Optimization


Austin James Stromme · 2020 · ‎No preview

We study a geometric notion of average, the barycenter, over 2-Wasserstein space.


2020 6
[PDF] oapen.org

[BOOK] An Invitation to Statistics in Wasserstein Space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures when endowed with the geometry of optimal transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

Cited by 16


2020 7

使用WGAN-GP對臉部馬賽克進行眼睛補圖 [Using WGAN-GP to inpaint eyes in mosaicked faces]

[CITATION] 使用 WGAN-GP 對臉部馬賽克進行眼睛補圖

HT Chang - 2020 - 長庚大學 [Chang Gung University]

吳承軒 · 2020 · No preview


2020 8

Probability Forecast Combination Via Entropy Regularized Wasserstein Distance

Ryan Cumings-Menon, Minchul Shin · 2020



2020 9

Structure-preserving Variational Schemes for Fourth Order Nonlinear Partial Differential Equations with a Wasserstein Gradient Flow Structure

Blake Ashworth · 2020 · No preview

2020 10

[BOOK] An invitation to statistics in Wasserstein space

VM Panaretos, Y Zemel - 2020 - library.oapen.org

This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures when endowed with the geometry of optimal transportation. Further to reviewing state-of-the-art aspects, it also provides an accessible …

Cited by 21

2020 11
Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation
Authors: Wei Wang, Chuang Wang, Tao Cui, Yue Li
Summary: Some recent studies have suggested using Generative Adversarial Networks (GANs) for numeric data over-sampling, that is, generating data to complete imbalanced numeric data. Compared with conventional over-sampling methods such as SMOTE, the recently proposed GAN schemes fail to generate distinguishable augmentation results for classifiers. In this paper, we discuss the reason for such failures, study the restrained conditions between G and D theoretically, and propose a quantitative indicator of the restrained structure, called Similarity of the Restrained Condition (SRC), to measure the restrained conditions. Practically, we propose several candidate solutions: isomorphic (IWGAN), mirror (MWGAN) and self-symmetric (SWGAN) restrained WGANs. The restrained WGANs enhance classification performance in AUC on five classifiers compared with the original data as the baseline, conventional SMOTE, and other GANs, across 20 groups of experiments on four datasets. The restrained WGANs outperform all others in 17/20 groups, among which IWGAN accounts for 15/17 groups, and the SRC is an effective measure for evaluating the restraints, so that further GAN structures with G-D restraints could be designed on SRC. Multidimensional scaling (MDS) is introduced to eliminate the impact of datasets and evaluate the AUC in a composite index; IWGAN decreases the MDS distance by 20% to 40%. Moreover, the convergence speed of IWGAN is increased, and the initial error of the loss function is reduced.
Article, 2020
Publication:IEEE Access, 8, 2020, 89812
Publisher:2020

2020 12
Study of Restrained Network Structures for Wasserstein Generative Adversarial Networks (WGANs) on Numeric Data Augmentation
Authors: Wei Wang, Chuang Wang, Tao Cui, Yue Li
Article, 2020
Publication:IEEE access, 8, 2020, 89812
Publisher:2020

2020 13

Parameter-Transferred Wasserstein Generative Adversarial Network (PT-WGAN) for Low-Dose PET Image Denoising

Authors: Yu Gong, Hongming Shan, Yueyang Teng, Ning Tu, Ming Li, Guodong Liang, Ge Wang, Shanshan Wang

Summary:Due to the widespread use of positron emission tomography (PET) in clinical practice, the potential risk of PET-associated radiation dose to patients needs to be minimized. However, with the reduction in the radiation dose, the resultant images may suffer from noise and artifacts that compromise diagnostic performance. In this paper, we propose a parameter-transferred Wasserstein generative adversarial network (PT-WGAN) for low-dose PET image denoising. The contributions of this paper are twofold: i) a PT-WGAN framework is designed to denoise low-dose PET images without compromising structural details, and ii) a task-specific initialization based on transfer learning is developed to train PT-WGAN using trainable parameters transferred from a pretrained model, which significantly improves the training efficiency of PT-WGAN. The experimental results on clinical data show that the proposed network can suppress image noise more effectively while preserving better image fidelity than recently published state-of-the-art methods. We make our code available at https://github.com/90n9-yu/PT-WGAN


Book, 2020

Publication:arXiv.org, Aug 26, 2020, n/a

Publisher:Cornell University Library, arXiv.org, Ithaca, 2020



2020 14

Many-Objective Estimation of Distribution Optimization Algorithm Based on WGAN-GP

Authors: Zhenyu Liang, Yunfan Li, Zhongwei Wan

Summary: Estimation of distribution algorithms (EDAs) are stochastic optimization algorithms. An EDA establishes a probability model that describes the distribution of solutions, viewing the population macroscopically through statistical learning, and then randomly samples the probability model to generate a new population. EDAs can solve multi-objective optimization problems (MOPs) well. However, the performance of EDAs decreases on many-objective optimization problems (MaOPs), which contain more than three objectives. The Reference Vector Guided Evolutionary Algorithm (RVEA), based on the EDA framework, can better solve MaOPs. In our paper, we use the framework of RVEA but generate the new population with Wasserstein Generative Adversarial Networks with Gradient Penalty (WGAN-GP) instead of crossover and mutation. WGAN-GP has the advantages of fast convergence, good stability and high sample quality. WGAN-GP learns the mapping from a standard normal distribution to the distribution of a given data set, so it can quickly generate populations with high diversity and good convergence. To measure performance, RM-MEDA, MOPSO and NSGA-II are selected for comparison experiments over the DTLZ and LSMOP test suites with 3, 5, 8, 10 and 15 objectives


Book, Mar 16, 2020

Publication:arXiv.org, Mar 16, 2020, n/a

Publisher:Mar 16, 2020


2020 15

Accelerated WGAN update strategy with loss change rate balancing

Authors: Xu Ouyang, Gady Agam

Summary: Optimizing the discriminator in Generative Adversarial Networks (GANs) to completion in the inner training loop is computationally prohibitive, and on finite datasets would result in overfitting. To address this, a common update strategy is to alternate between k optimization steps for the discriminator D and one optimization step for the generator G. This strategy is repeated in various GAN algorithms, where k is selected empirically. In this paper, we show that this update strategy is not optimal in terms of accuracy and convergence speed, and propose a new update strategy for Wasserstein GANs (WGAN) and other GANs using the WGAN loss (e.g. WGAN-GP, Deblur GAN, and Super-resolution GAN). The proposed update strategy is based on a loss change ratio comparison of G and D. We demonstrate that the proposed strategy improves both convergence speed and accuracy.


Book, 2020

Publication:arXiv.org, Nov 3, 2020, n/a

Publisher:Cornell University Library, arXiv.org, Ithaca, 2020
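A rough sketch of the idea behind this entry (our paraphrase with mock loss values, not the authors' exact rule): instead of a fixed k:1 schedule, compare the relative loss-change rates of D and G and give the next step to the player whose loss is currently changing less.

```python
# Toy sketch of a loss-change-rate balancing rule (our paraphrase, not the
# paper's exact algorithm): the player whose loss moved less, relatively,
# receives the next optimization step.
def choose_update(prev_d, cur_d, prev_g, cur_g, eps=1e-8):
    """Return 'D' or 'G': the player with the smaller relative loss change."""
    rate_d = abs(cur_d - prev_d) / (abs(prev_d) + eps)
    rate_g = abs(cur_g - prev_g) / (abs(prev_g) + eps)
    return 'D' if rate_d < rate_g else 'G'

print(choose_update(1.0, 0.99, 1.0, 0.5))  # D's loss barely moved -> 'D'
```

In a real WGAN training loop this decision would be made each iteration, replacing the empirically chosen constant k.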


2020 16
[PDF] Risk Measures Estimation Under Wasserstein Barycenter

https://www.semanticscholar.org › paper

Aug 13, 2020 — Model Risk Measurement Under Wasserstein Distance ... This book provides the most comprehensive treatment of the theoretical concepts and ...



2020 17  Book
Distributed optimization with quantization for computing Wasserstein barycenters
Authors: Roman Krawtschenko, César A. Uribe, Alexander Gasnikov, Pavel Dvurechensky, Weierstraß-Institut für Angewandte Analysis und Stochastik
Summary: We study the problem of the decentralized computation of entropy-regularized semi-discrete Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we propose a sampling gradient quantization scheme that allows efficient communication and computation of approximate barycenters where the factor distributions are stored distributedly on arbitrary networks. The communication and algorithmic complexity of the proposed algorithm are shown, with explicit dependency on the size of the support, the number of distributions, and the desired accuracy. Numerical results validate our algorithmic analysis.
eBook, 2020
English
Publisher:Weierstraß-Institut für Angewandte Analysis und Stochastik Leibniz-Institut im Forschungsverbund Berlin e.V, Berlin, 2020

2020 18 eBook
Probability forecast combination via entropy regularized Wasserstein distance
Authors: Ryan Cumings-Menon, Minchul Shin
eBook, 2020
English, Revised August 2020
Publisher:Research Department, Federal Reserve Bank of Philadelphia, Philadelphia, PA, 2020

 




2020 19  book
CONLON: A pseudo-song generator based on a new pianoroll, Wasserstein autoencoders, and optimal interpolations
Authors: Luca Angioloni, V.A.J. (Tijn) Borghuis, Lorenzo Brusci, Paolo Frasconi, Julie Cummings, Jin Ha Lee, Brian McFee, Markus Schedl, Johanna Devaney, Corey McKay
Summary: We introduce CONLON, a pattern-based MIDI generation method that employs a new lossless pianoroll-like data description in which velocities and durations are stored in separate channels. CONLON uses Wasserstein autoencoders as the underlying generative model. Its generation strategy is similar to interpolation, where MIDI pseudo-songs are obtained by concatenating patterns decoded from smooth trajectories in the embedding space, but aims to produce a smooth result in the pattern space by computing optimal trajectories as the solution of a widest-path problem. A set of surveys enrolling 69 professional musicians shows that our system, when trained on datasets of carefully selected and coherent patterns, is able to produce pseudo-songs that are musically consistent and potentially useful for professional musicians. Additional materials can be found at https://paolo-f.github.io/CONLON/
Book, 2020-10
Publication:2020
Publisher:International Society for Music Information Retrieval, 2020-10
 

2020 20  book
Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference
Authors: Christoph Alexander Weitkamp, Katharina Proksch, Carla Tameling, Axel Munk
Summary: In this paper, we aim to provide a statistical theory for object matching based on the Gromov-Wasserstein distance. To this end, we model general objects as metric measure spaces. Based on this, we propose a simple and efficiently computable asymptotic statistical test for pose invariant object discrimination. This is based on an empirical version of a $\beta$-trimmed lower bound of the Gromov-Wasserstein distance. We derive for $\beta\in[0,1/2)$ distributional limits of this test statistic. To this end, we introduce a novel $U$-type process indexed in $\beta$ and show its weak convergence. Finally, the theory developed is investigated in Monte Carlo simulations and applied to structural protein comparisons.
Book, 2020-06-22
Publication:2020
Publisher:arXiv.org, 2020-06-22

2020 21 book
Pattern-Based Music Generation with Wasserstein Autoencoders and PRC Descriptions
Authors: V.A.J. (Tijn) Borghuis, Luca Angioloni, Lorenzo Brusci, Paolo Frasconi, Christian Bessiere
Summary: We demonstrate a pattern-based MIDI music generation system with a generation strategy based on Wasserstein autoencoders and a novel variant of pianoroll descriptions of patterns which employs separate channels for note velocities and note durations and can be fed into classic DCGAN-style convolutional architectures. We trained the system on two new datasets (in the acid-jazz and high-pop genres) composed by musicians in our team with music generation in mind. Our demonstration shows that moving smoothly in the latent space allows us to generate meaningful sequences of four-bar patterns.
Book, 2020-7
Publication:2020
Publisher:International Joint Conference on Artificial Intelligence (IJCAI), 2020-7


2020 22  eBook
Statistical inference for Bures-Wasserstein barycenters
Authors: Alexey Kroshnin, Vladimir G. Spokojnyj, Alexandra Suvorikova, Weierstraß-Institut für Angewandte Analysis und Stochastik
eBook, 2020
English
Publisher:Weierstraß-Institut für Angewandte Analysis und Stochastik Leibniz-Institut im Forschungsverbund Berlin e.V, Berlin, 2020


<—-  67 titles before 2020

+ 22 titles in 2020

= 89 books till 2020

end 2020

start 2021


 

2021  1

EMS - European Mathematical Society Publishing House

https://www.ems-ph.org › books › book

An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows. August 2021, 144 pages, hardcover, 16.5 x 23.5 cm. This book provides a self-contained introduction to optimal transport, and it is intended as a starting point for any researcher who wants to enter into this beautiful subject.

[CITATION] An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows

A FigalliF Glaudo - 2021 - ems-ph.org

The presentation focuses on the essential topics of the theory: Kantorovich duality, existence

and uniqueness of optimal transport maps, Wasserstein distances, the JKO scheme, Otto's

calculus, and Wasserstein gradient flows. At the end, a presentation of some selected …

Cited by 2

online

An Invitation to Optimal Transport, Wasserstein Distances, and Gradient Flows

ProtoView, 09/2021

Book Review, Full Text Online

Zbl 1472.49001
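For reference, the Kantorovich duality at the core of this book's presentation can be stated in its standard form (our summary of a textbook fact, not a quotation from the book):

```latex
W_c(\mu,\nu)
  \;=\; \min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, \mathrm{d}\pi(x,y)
  \;=\; \sup_{\varphi(x) + \psi(y) \,\le\, c(x,y)}
        \left( \int \varphi \, \mathrm{d}\mu + \int \psi \, \mathrm{d}\nu \right),
```

where Π(μ, ν) is the set of couplings of μ and ν; the p-Wasserstein distance is the case c(x, y) = d(x, y)^p, with the optimal cost raised to the power 1/p.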


2021 2

[PDF] carloalberto.org

[BOOK] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - 2021 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the

most active research lines in the last two decades, with random vectors of measures

representing a natural and popular tool to define them. Nonetheless a principled approach …

Cited by 1


 2021 3

[BOOK] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - 2021 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the

most active research lines in the last two decades, with random vectors of measures

representing a natural and popular tool to define them. Nonetheless a principled approach …

Cited by 4


2021 4

 An Invitation to Optimal Transport, Wasserstein Distances, and ...

https://www.amazon.com › Invitation-Transport-Wasser...

The presentation focuses on the essential topics of the theory: Kantorovich duality, existence and uniqueness of optimal transport maps, Wasserstein distances, ...

[CITATION] An invitation to Optimal Transport, Wasserstein Distances and Gradient Flows. EMS Textbooks in Mathematics

A Figalli, F Glaudo - 2021 - EMS Press (accepted, 2020)

Cited by 2

2021 eBook
An invitation to optimal transport, Wasserstein distances, and gradient flows
Authors:Alessio Figalli (Author), Federico Glaudo (Author)
English
Publisher:European Mathematical Society, [Zürich, Switzerland], 2021
Also available asPrint Book
View AllFormats & Editions

2021 5   [PDF] carloalberto.org

[BOOK] Measuring dependence in the Wasserstein distance for Bayesian nonparametric models

M Catalano, A Lijoi, I Prünster - 2021 - carloalberto.org

The proposal and study of dependent Bayesian nonparametric models has been one of the most active research lines in the last two decades, with random vectors of measures representing a natural and popular tool to define them. Nonetheless a principled approach …

 Cited by 4 Related articles 


2021 6

Wasserstein Perturbations of Markovian Transition Semigroups

books.google.com › books

Sven Fuhrmann, Michael Kupper, Max Nendel · 2021 · No preview

In this paper, we deal with a class of time-homogeneous continuous-time Markov processes with transition probabilities bearing a nonparametric uncertainty.


2021 7
Trust the Critics: Generatorless and Multipurpose WGANs with Initial Convergence Guarantees
Authors:Milne, Tristan (Creator), Bilocq, Étienne (Creator), Nachman, Adrian (Creator)
Summary:Inspired by ideas from optimal transport theory, we present Trust the Critics (TTC), a new algorithm for generative modelling. This algorithm eliminates the trainable generator from a Wasserstein GAN; instead, it iteratively modifies the source data using gradient descent on a sequence of trained critic networks. This is motivated in part by the misalignment which we observed between the optimal transport directions provided by the gradients of the critic and the directions in which data points actually move when parametrized by a trainable generator. Previous work has arrived at similar ideas from different viewpoints, but our basis in optimal transport theory motivates the choice of an adaptive step size which greatly accelerates convergence compared to a constant step size. Using this step size rule, we prove an initial geometric convergence rate in the case of source distributions with densities. These convergence rates cease to apply only when a non-negligible set of generated data is essentially indistinguishable from real data. Resolving the misalignment issue improves performance, which we demonstrate in experiments that show that given a fixed number of training epochs, TTC produces higher quality images than a comparable WGAN, albeit at increased memory requirements. In addition, TTC provides an iterative formula for the transformed density, which traditional WGANs do not. Finally, TTC can be applied to map any source distribution onto any target; we demonstrate through experiments that TTC can obtain competitive performance in image generation, translation, and denoising without dedicated algorithms.
Downloadable Archival Material, 2021-11-29
Undefined
Publisher:2021-11-29
Access Free


2021 8   ebook  book
MultiMedia Modeling : 27th International Conference, MMM 2021, Prague, Czech Republic, June 22-24, 2021 : proceedings. Part I
Authors:International Conference on Multi-Media Modeling; Jakub Lokoč (Editor), Tomas Skopal (Editor), Klaus Schoeffmann (Editor), Vasileios Mezaris (Editor), Xirong Li (Editor), Stefanos Vrochidis (Editor), Ioannis Patras (Editor)
2021  Summary:The two-volume set LNCS 12572 and 12573 constitutes the thoroughly refereed proceedings of the 27th International Conference on MultiMedia Modeling, MMM 2021, held in Prague, Czech Republic, in June 2021. Of the 211 submitted regular papers, 40 papers were selected for oral presentation and 33 for poster presentation; 16 special session papers were accepted as well as 2 papers for a demo presentation and 17 papers for participation at the Video Browser Showdown 2021. The papers cover topics such as: multimedia indexing; multimedia mining; multimedia abstraction and summarization; multimedia annotation, tagging and recommendation; multimodal analysis for retrieval applications; semantic analysis of multimedia and contextual data; multimedia fusion methods; multimedia hyperlinking; media content browsing and retrieval tools; media representation and algorithms; audio, image, video processing, coding and compression; multimedia sensors and interaction modes; multimedia privacy, security and content protection; multimedia standards and related issues; advances in multimedia networking and streaming; multimedia databases, content delivery and transport; wireless and mobile multimedia networking; multi-camera and multi-view systems; augmented and virtual reality, virtual environments; real-time and interactive multimedia applications; mobile multimedia applications; multimedia web applications; multimedia authoring and personalization; interactive multimedia and interfaces; sensor networks; social and educational multimedia applications; and emerging trends.
eBook, 2021
English
Publisher:Springer, Cham, 2021
Also available asPrint Book
View AllFormats & Editions


2021 9   ebook
Posterior asymptotics in Wasserstein metrics on the real line

Authors:Minwoo Chae, Pierpaolo De Blasi, Stephen G. Walker
eBook, 2021
English
Publisher:CCA, Fondazione Collegio Carlo Alberto, Torino, 2021


2021 10

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks : *Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence
Authors:Massimiliano Lupo Pasini, Junqi Yin; Oak Ridge National Lab (ORNL), Oak Ridge, TN (United States); The 2021 International Conference on Computational Science and Computational Intelligence (CSCI'21), Las Vegas, Nevada, United States of America, 12/15/2021-12/17/2021
Summary:We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and enhances scalability by using multiple generators that are concurrently trained, each one of them focusing on a single data label. The use of the Wasserstein metric reduces the risk of cycling by stabilizing the training of each generator. We apply the approach on the CIFAR10 and the CIFAR100 datasets, two standard benchmark datasets with images of the same resolution but a different number of classes. Performance is assessed using the inception score, the Fréchet inception distance, and image quality. An improvement in inception score and Fréchet inception distance is shown in comparison to previous results obtained by performing the parallel approach on deep convolutional conditional generative adversarial neural networks (DC-CGANs). Weak scaling is attained on both datasets using up to 100 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
Book, 2021
Publication:20211201
Publisher:2022


<––
89 books before 2021

 + 10 books in 2021

 = 99 books till 2021

end 2021  e21

start 2022 


2022 1

Optimal 1-Wasserstein Distance for WGANs
Authors:Stéphanovitch, Arthur (Creator), Tanielian, Ugo (Creator), Cadre, Benoît (Creator), Klutchnikoff, Nicolas (Creator), Biau, Gérard (Creator)
Summary:The mathematical forces at work behind Generative Adversarial Networks raise challenging theoretical issues. Motivated by the important question of characterizing the geometrical properties of the generated distributions, we provide a thorough analysis of Wasserstein GANs (WGANs) in both the finite sample and asymptotic regimes. We study the specific case where the latent space is univariate and derive results valid regardless of the dimension of the output space. We show in particular that for a fixed sample size, the optimal WGANs are closely linked with connected paths minimizing the sum of the squared Euclidean distances between the sample points. We also highlight the fact that WGANs are able to approach (for the 1-Wasserstein distance) the target distribution as the sample size tends to infinity, at a given convergence rate and provided the family of generative Lipschitz functions grows appropriately. We derive in passing new results on optimal transport theory in the semi-discrete setting.
Publisher:2022-01-08


2022 2

DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks

Authors:Hristo Petkov, Colin Hanley, Dong Feng

Summary:The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint, allowing for the exploration of deep generative models to better capture data sample distributions and support the discovery of Directed Acyclic Graphs (DAGs) that faithfully represent the underlying data distribution. However, so far no study has investigated the use of Wasserstein distance for causal structure learning via generative models. This paper proposes a new model named DAG-WGAN, which combines the Wasserstein-based adversarial loss, an auto-encoder architecture together with an acyclicity constraint. DAG-WGAN simultaneously learns causal structures and improves its data generation capability by leveraging the strength from the Wasserstein distance metric. Compared with other models, it scales well and handles both continuous and discrete data. Our experiments have evaluated DAG-WGAN against the state-of-the-art and demonstrated its good performance

Book, Apr 1, 2022

Publication:arXiv.org, Apr 1, 2022, n/a

Publisher:Apr 1, 2022

  


2022 3

Radio Galaxy Classification with wGAN-Supported Augmentation

Authors:Janis Kummer, Lennart Rustige, Florian Griese, Kerstin Borras, Marcus Brüggen, Patrick L S Connor, Frank Gaede, Gregor Kasieczka, Peter Schleper

Summary:Novel techniques are indispensable to process the flood of data from the new generation of radio telescopes. In particular, the classification of astronomical sources in images is challenging. Morphological classification of radio galaxies could be automated with deep learning models that require large sets of labelled training data. Here, we demonstrate the use of generative models, specifically Wasserstein GANs (wGAN), to generate artificial data for different classes of radio galaxies. Subsequently, we augment the training data with images from our wGAN. We find that a simple fully-connected neural network for classification can be improved significantly by including generated images into the training set


Book, Oct 7, 2022

Publication:arXiv.org, Oct 7, 2022, n/a

Publisher:Oct 7, 2022




2022 4

Application of WGAN-GP in recommendation and Questioning the relevance of GAN-based approaches

Author:Oussama Boudjeniba

Summary:Many neural-based recommender systems were proposed in recent years and part of them used Generative Adversarial Networks (GAN) to model user-item interactions. However, the exploration of Wasserstein GAN with Gradient Penalty (WGAN-GP) on recommendation has received relatively less scrutiny. In this paper, we focus on two questions: 1- Can we successfully apply WGAN-GP on recommendation and does this approach give an advantage compared to the best GAN models? 2- Are GAN-based recommender systems relevant? To answer the first question, we propose a recommender system based on WGAN-GP called CFWGAN-GP which is founded on a previous model (CFGAN). We successfully applied our method on real-world datasets on the top-k recommendation task and the empirical results show that it is competitive with state-of-the-art GAN approaches, but we found no evidence of significant advantage of using WGAN-GP instead of the original GAN, at least from the accuracy point of view. As for the second question, we conduct a simple experiment in which we show that a well-tuned conceptually simpler method outperforms GAN-based models by a considerable margin, questioning the use of such models


Book, Apr 28, 2022

Publication:arXiv.org, Apr 28, 2022, n/a

Publisher:Apr 28, 2022


2022 5   [PDF] columbia.edu

[BOOK] Computational Inversion with Wasserstein Distances and Neural Network Induced Loss Functions

W Ding - 2022 - search.proquest.com

… functions defined with the Wasserstein metrics and with deep … the general impacts of different Wasserstein metrics and the … in the reconstructions with the Wasserstein metrics as well as …

Related articles All 3 versions


2022 6

A Wasserstein GAN Based Framework for Adversarial Attacks Against Intrusion Detection Systems.

books.google.com › books


Fangda Cui · 2022 · No preview

An intrusion detection system (IDS) detects malicious activities in network flows and is essential for modern communication networks.


<— 99 titles before 2022

     + 6 titles in 2022

   = 105  book titles with my name total  

end 2022  e22

Start 2023 s23


2023 1 ebook
Data Generation Scheme for Photovoltaic Power Forecasting Using Wasserstein GAN with Gradient Penalty Combined with Autoencoder and Regression Models
Authors:Sungwoo Park, Jaeuk Moon, Eenjun Hwang
Summary:Machine learning and deep learning (DL)-based forecasting models have shown excellent predictive performances, but they require a large amount of data for model construction. Insufficient data can be augmented using generative adversarial networks (GANs), but these are not effective for generating tabular data. In this paper, we propose a novel data generation scheme that can generate tabular data for photovoltaic power forecasting (PVPF). The proposed scheme consists of the Wasserstein GAN with gradient penalty (WGAN-GP), autoencoder (AE), and regression model. AE guides the WGAN-GP to generate input variables similar to the real data, and the regression model guides the WGAN-GP to generate output variables that well reflect the relationship with the input variables. We conducted extensive comparative experiments with various GAN-based models on different datasets to verify the effectiveness of the proposed scheme. Experimental results show that the proposed scheme generates data similar to real data compared to other models and, as a result, improves the performance of PVPF models. Especially the deep neural network showed 62% and 70% improvements in mean absolute error and root mean squared error, respectively, when using the data generated through the proposed scheme, indicating the effectiveness of the proposed scheme in DL-based forecasting models
eBook, 2023


2023 2 see 2021

An invitation to optimal transport, Wasserstein distances, and gradient flows
Authors:Alessio Figalli, Federico Glaudo
Computer File, 2023
English, Second edition
Publisher:EMS Press, Berlin, 2023



2023 3

Figalli, Alessio; Glaudo, Federico

An invitation to optimal transport, Wasserstein distances, and gradient flows. 2nd edition. (English) Zbl 1527.49001

EMS Textbooks in Mathematics. Berlin: European Mathematical Society (EMS) (ISBN 978-3-98547-050-1/hbk; 978-3-98547-550-6/ebook). vi, 146 p. (2023).

This is the second edition of a graduate text giving an introduction to optimal transport theory. The first chapter surveys the historical roots of optimal transport in the work of Gaspard Monge and Leonid Kantorovich, recalls the basic notions of measure theory and Riemannian geometry, and gives some examples of transport maps.

Chapter 2 presents the core of optimal transport theory: the solution of Kantorovich's problem for general costs and the solution of Monge's problem for suitable costs. Further applications are presented, such as the polar decomposition and an application to the Euler equation of fluid dynamics.

Chapter 3 presents some connections between optimal transport, gradient flows and partial differential equations. The Wasserstein distances and gradient flows in Hilbert spaces are introduced. Then the authors show that the gradient flow of the entropy functional in the Wasserstein space coincides with the heat equation, following the seminal approach of Jordan, Kinderlehrer and Otto.

Chapter 4 is devoted to an analysis of optimal transport from the differential point of view; in particular, several important partial differential equations are interpreted as gradient flows with respect to the 2-Wasserstein distance.

The last chapter, Chapter 5, suggests further reading on optimal transport.

The book also contains two appendices: Appendix A presents some exercises on optimal transport, and in Appendix B the authors sketch the proof of a disintegration theorem, referring to a book by Luigi Ambrosio, Nicola Fusco, and Diego Pallara for a complete proof.

For the first edition of the book, see [A. Figalli and F. Glaudo, An invitation to optimal transport, Wasserstein distances, and gradient flows. Berlin: European Mathematical Society (EMS) (2021; Zbl 1472.49001)].

Reviewer: Antonio Masiello (Bari)

MSC:

49-01

Introductory exposition (textbooks, tutorial papers, etc.) pertaining to calculus of variations and optimal control

49-02

Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control

49Q22

Optimal transportation

60B05

Probability measures on topological spaces

28A33

Spaces of measures, convergence of measures

35A15

Variational methods applied to PDEs

35Q35

PDEs in connection with fluid mechanics

49N15

Duality theory (optimization)

28A50

Integration and disintegration of measures

Keywords:

optimal transport; Wasserstein distance; duality; gradient flows; measure theory; displacement convexity

Citations:

Zbl 1472.49001


2023 4

Integration of Metropolis-Hastings Algorithm and WGAN Model ...

books.google.com › books


2023 · No preview


<— 105 titles till 2022

     + 4 titles in 2023

   = 109  book titles with my name total  

end 2023  e23










ti:Wasserstein & py:2023 & dt:b







WGAN = Wasserstein GAN

Wasserstein = Vaserstein
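
For reference, the standard definition of the metric these titles share (not taken from any one entry above): the p-Wasserstein distance between two probability measures mu and nu on a metric space (X, d) is

```latex
W_p(\mu,\nu) \;=\; \left( \inf_{\gamma \in \Gamma(\mu,\nu)} \int_{X \times X} d(x,y)^p \,\mathrm{d}\gamma(x,y) \right)^{1/p},
```

where \Gamma(\mu,\nu) is the set of couplings of \mu and \nu (joint measures with marginals \mu and \nu). WGANs use the p = 1 case through its Kantorovich-Rubinstein dual, the supremum of \int f\,d\mu - \int f\,d\nu over 1-Lipschitz functions f.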
————————————————————yyy

_____________________________________________

Search links

 Advanced Book Search - Google Books


Google books


WorldCat


zbMATH (Zbl)  dt:b

 

————————————

_____________________bbb


 
